Comparison portals lure customers with cheap hotels; the Chinese online retailer Temu relies on rock-bottom prices. Anyone shopping online is quickly overwhelmed by the sheer range on offer. Companies exploit this and use algorithms to steer their customers' attention, says Paul Heidhues.
ZEIT ONLINE: Mr. Heidhues, you research how firms lead consumers into making mistakes. The Chinese discount shopping app Temu is especially popular with younger people. Many snap up its offers even though they are so cheap that something about them cannot be right. Why?
Paul Heidhues: Cheap is great at first glance. But some offers are too good to be true. The quality may be poor, the delivery terms bad, or the products may not meet European safety standards. Many consumers are not aware of these problems. Consumers are human beings, after all. And human beings are fallible.
ZEIT ONLINE: What do you mean by that?
Heidhues: None of us knows everything or can do everything. In research we speak of fallible consumers. We often buy things not purely on the basis of rational considerations but because we make mistakes, for instance by overlooking the high fees of a credit card or a mobile phone contract, or by falling for psychological tricks. Each of us can have our attention steered. An online retailer like Temu wants to maximize its profit and takes advantage of that.
"Retailers push us toward a quick purchasing decision, for instance through time pressure."
ZEIT ONLINE: What might that look like?
Heidhues: Online retailers can design their websites to influence our behavior in their favor. The British Competition and Markets Authority (CMA) has identified factors that online companies can exploit. For example, retailers push us toward quick purchasing decisions by creating a sense of urgency with messages like "only two seats or items left at this price." They place positive information strategically where we look first, for instance in the top left corner or in bold type, while important negative information is hidden in the fine print. Or a buyer is nudged into a subscription that is free at first by being made to choose between "No, I don't want any free benefits" and "Yes, I accept the three-month free offer." Any choice that runs against the online retailer's interests is made to look foolish. When companies use such tactics to steer consumers in their favor, research calls them dark patterns.
ZEIT ONLINE: How do these patterns come about?
Heidhues: The big digital companies run experiments with millions of users and billions of data points to see which strategy works. In this way they steer our purchases and direct our attention, for example through push notifications or targeted advertising.
ZEIT ONLINE: What is the problem with that?
Heidhues: The fact is that we are fallible and companies can deliberately exploit this. When we shop online, companies control which products even come into consideration. It makes sense that we are mainly shown products we might actually be interested in. But which products I am shown also depends on what I have searched for online before. One example: the British financial regulator FCA examined the advertisements shown to search engine users who wanted to find out how to get rich quickly with so-called high-yield bonds. The search engine then served those users ads for a series of thoroughly unscrupulous financial investments.
ZEIT ONLINE: How do advertisers know that these people are particularly susceptible to such offers?
Heidhues: Anyone who is financially literate knows that the higher the return or interest, the higher the risk. There is no quick road to wealth. Conversely, if you are searching for quick money, advertisers assume that you know little about finance. That is why ads for dubious financial investments pop up in your browser. This year, for example, there was a major data leak at the advertising intermediary Xandr. Xandr sorts consumers into more than 650,000 categories and sells this data to advertisers. Advertisers can pick a profile, ranging from "frequent pregnancy test buyers" to "fragile seniors," "prone to depression," or "ignorant in financial matters." The intermediaries also hold details such as illnesses or political views, alongside more harmless categories like "Dunkin' Donuts visitors." These advertising intermediaries then run an auction in which, put simply, the advertiser who can extract the most money from the consumer wins.
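To make the auction Heidhues sketches here a little more concrete, a minimal Python illustration follows. The advertisers, the segment label, and all bid amounts are invented; this is not Xandr's actual mechanism, only the "highest expected extraction wins" logic he describes.

```python
# Minimal sketch of the simplified ad auction described above; all advertiser
# names, segment labels and bid amounts are hypothetical.

from dataclasses import dataclass

@dataclass
class Bid:
    advertiser: str
    amount_eur: float  # what this advertiser is willing to pay for the impression

def run_auction(bids: list[Bid]) -> Bid:
    """Put simply: the advertiser who expects to extract the most money
    from this consumer bids highest and wins the ad slot."""
    return max(bids, key=lambda b: b.amount_eur)

# A consumer tagged with a sensitive profile category ("ignorant in financial
# matters") attracts high bids from advertisers targeting that very weakness.
bids_for_segment = [
    Bid("reputable_bank", 0.40),
    Bid("dubious_high_yield_fund", 2.10),
    Bid("budget_airline", 0.15),
]

winner = run_auction(bids_for_segment)
print(f"Ad slot goes to: {winner.advertiser} at {winner.amount_eur:.2f} EUR")
```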
ZEIT ONLINE: What did your research reveal about how companies control us with sensitive data on the internet?
Heidhues: The more companies know about us, the better they can exploit our weaknesses and sell us things we only think we need. The earlier economic literature on digital markets assumed that consumers are fully rational. If an advertising intermediary shows a rational consumer a dubious investment, there must be a good reason for it, because by assumption such a consumer makes no mistakes. In one of our studies we argue that an approach with fallible consumers is far more realistic, and we show that in that case it can indeed be harmful when consumers are steered toward questionable financial service providers. But the problem goes far beyond this one example. A person with a gambling addiction should not receive ads for online casinos. Other studies show that companies tend to recommend the product that earns them the highest commission, and consumers are often too trusting and follow those recommendations. In online retail, consumers are far more likely to choose the products listed first on a sales platform. And a hotel booking platform like Booking.com often ranks hotels higher simply because they pay a larger commission.
ZEIT ONLINE: How do bad purchasing decisions come about?
Heidhues: When judging the value of a good, many of us project our current state into the future. If we would like to have the good right now, we overestimate its value and underestimate how much circumstances will change. This is known as projection bias. Hence the English saying: never go shopping on an empty stomach. And although we all know that the weather changes, it still affects our purchasing decisions. The economist Meghan Busse and her team showed, for example, that people are more likely to buy convertibles when the sun is shining. In the long run, though, the convertible is not worth any more just because the sun happens to be out today. Companies can exploit this projection bias by tailoring advertising to the weather with specific algorithms. On top of that, algorithms can steer us toward expensive products even when cheaper ones would do just as well.
ZEIT ONLINE: Can you give an example?
Heidhues: The cheapest pair of headphones may not appear until the third page of search results, even though it does well in tests. But the algorithm favors expensive headphones for which a sales platform collects more commission or whose manufacturer has bought more advertising.
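The ranking effect Heidhues describes can be illustrated with a small, purely hypothetical scoring function: if relevance is mixed with the commission the platform earns, the well-reviewed budget product slides to the bottom. Product names, test scores, commissions, and the weight below are all invented.

```python
# Hypothetical sketch of a commission-weighted ranking; product names, test
# scores, commissions and the weight are invented for illustration.

products = [
    # (name, test_score in 0..1, commission_eur for the platform)
    ("budget_headphones_good_reviews", 0.90, 1.50),
    ("midrange_headphones",            0.80, 6.00),
    ("premium_headphones_heavy_ads",   0.75, 18.00),
]

def platform_score(test_score: float, commission_eur: float, weight: float = 0.05) -> float:
    """Relevance plus a bonus for every euro of commission the platform earns."""
    return test_score + weight * commission_eur

ranked = sorted(products, key=lambda p: platform_score(p[1], p[2]), reverse=True)
for rank, (name, test_score, commission_eur) in enumerate(ranked, start=1):
    print(rank, name)
# The well-reviewed budget model ends up last, even though it scores best in tests.
```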
"Under stress, people are likely more prone to impulse purchases."
ZEIT ONLINE: But algorithms are not bad per se. Take a music app that automatically creates playlists, with music its users really like.
Heidhues: I like those playlists too. It seems completely harmless, but beyond simple playlists, companies increasingly analyze how you are feeling. Your mood influences how you shop. Under stress or when drunk, people probably make more impulse purchases. Advertisers could exploit this if your inputs into a search engine reveal that you are not doing well.
ZEIT ONLINE: Where do companies get this information?
Heidhues: Tech companies store everything you do on the internet. That goes beyond the cookies and trackers that follow your activities. Users leave behind large amounts of data through their emails and accounts. We know that these companies link devices such as projectors and mobile phones to an ID and thus to a customer profile.
ZEIT ONLINE: On the video platform YouTube, the parent company Google asks more and more often whether an ad is relevant to me. Where is the problem with that?
Heidhues: Targeted advertising as such is not the problem. It becomes difficult when companies exploit vulnerable target groups, for instance by offering dubious loans to someone who is heavily in debt, or by showing unqualified coaching instead of psychotherapy to someone suffering from depression. These people, as well as children and adolescents, need stronger protection.
ZEIT ONLINE: Liberals would now argue for more personal responsibility.
Heidhues: That only goes so far. The more informed and empowered users are, the less easily they can be steered on the internet. But the reality is: either you live under a rock without internet, or you are being tracked. You can use a virtual private network (VPN) for a more secure connection; it hides your internet address and lets you visit websites anonymously and seemingly from another country, even if you are surfing from Germany. But that helps only to a limited extent and takes effort. Instead, we should target the platforms directly and prevent the misuse of the data they collect.
ZEIT ONLINE: What has the Datenschutz-Grundverordnung (GDPR) achieved since it took effect in Germany in 2018?
Heidhues: It may well be progress. Legally, however, it still rests on the idea that with a single click I consent to very long and complicated terms and conditions. Research shows that only a tiny share of users actually read them; active, conscious consent under the GDPR is a legal fiction. Instead, there should be something like the traffic-light labels on food: on every website, users could see at a glance, from red to green, how much data they are sharing. Users should then be able to set across all platforms how much data they want to share. And data should not simply be monetized, least of all the data of vulnerable groups susceptible to addiction or illness.
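Heidhues' traffic-light idea might, in a rough and purely hypothetical sketch, look like this: map the categories of data a website requests to red, yellow, or green. Category names and thresholds below are invented.

```python
# Hypothetical sketch of a data-sharing "traffic light"; category names and
# thresholds are invented for illustration.

SENSITIVE = {"health", "finances", "location_history", "contacts"}

def traffic_light(requested: set[str]) -> str:
    """Green: little data, nothing sensitive. Yellow: a lot of data.
    Red: sensitive categories are requested."""
    if requested & SENSITIVE:
        return "red"
    if len(requested) > 3:
        return "yellow"
    return "green"

print(traffic_light({"email"}))                                  # green
print(traffic_light({"email", "device_id", "browsing", "age"}))  # yellow
print(traffic_light({"email", "health"}))                        # red
```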
ZEIT ONLINE: How can the data of particularly vulnerable people be reliably separated from that of other users?
Heidhues: I don't think we can, or necessarily should, separate it. If the data shows that I have a gambling addiction, it is fine for me to be offered reputable help. But I should not be tempted to gamble through push notifications. And if my fitness tracker detects a problem with my heart rate, it makes sense for me and my doctor to be informed, but not my employer or a job portal. So it is the use of data that should be regulated, not data collection itself. Though I admit that this is not easy.
ZEIT ONLINE: By 2024, the Digital Services Act (DSA) and the Digital Markets Act (DMA) will come into force; they are meant to protect the fundamental rights of EU citizens in the digital sphere. For very large providers such as Google and Meta, the rules of the new EU laws already apply. How does the EU use them to improve data protection?
Heidhues: Search engines and online services will then have to present their own risk analyses, explaining, for instance, whether their algorithms are susceptible to disinformation. They will also have to explain to their users what the recommendations on their sites are based on. Today, many of these companies keep much of that secret. If it is implemented well, the new data access right will allow independent researchers to analyze the algorithms and uncover their dangers and problems. With that knowledge, regulation can be improved. The beauty of the DMA and DSA is that they already provide for the possibility of adjusting the rules if their goals are not met. That is important, because we need a learning system.
ZEIT ONLINE: What specific changes could the DSA and DMA bring?
Heidhues: Companies could be required to disclose how search results change when someone of a different gender or age searches for the same thing. That could prevent discrimination. If companies refuse to provide this information, they should not be allowed to use the algorithm. In addition, recommendation systems should be geared to users' preferences instead of trying to predict their mistakes. It would already help if my search queries and click decisions actually determined my search results, rather than my seeing the results of whichever company pays the most. That would be a simple first step.
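The disclosure duty Heidhues proposes could, in a purely hypothetical sketch, boil down to comparing the rankings a platform returns for otherwise identical users who differ in a single attribute. The ranking logic and profiles below are invented for illustration.

```python
# Hypothetical sketch: compare rankings returned for two user profiles that
# differ only in one attribute (here, age) to surface possible discrimination.
# The ranking function and profiles are invented for illustration.

def ranking_for(profile: dict) -> list[str]:
    """Stand-in for a platform's (undisclosed) ranking logic."""
    results = ["low_fee_loan", "standard_loan", "high_interest_loan"]
    if profile.get("age", 0) >= 65:
        # Example of the kind of bias a disclosure duty would make visible.
        results = ["high_interest_loan", "standard_loan", "low_fee_loan"]
    return results

young = {"query": "loan", "age": 30}
senior = {"query": "loan", "age": 70}

r_young, r_senior = ranking_for(young), ranking_for(senior)
if r_young != r_senior:
    print("Rankings differ between otherwise identical profiles:")
    print("  age 30:", r_young)
    print("  age 70:", r_senior)
```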