Personal data not only allows major businesses to track individuals' behaviour, it also allows them to drive up prices
In 2019, the US Federal Trade Commission (FTC) voted to fine Facebook five billion dollars for "deceiving" its users about its ability to keep their data and information private. The fine was the culmination of the US Government's investigation into Facebook's privacy practices, sparked by the Cambridge Analytica scandal that saw tens of millions of Facebook profiles leaked and used to predict US voters' choices and target them with political campaign material.
Although this was hailed as a sign of regulators' increasing willingness to clamp down on the dubious privacy practices of Big Tech companies, the fine will not necessarily change much. Facebook anticipated the charge and had already set aside funds to cover it, indicating that even a five billion dollar fine was something the organisation could take in its stride.
More important than the money, though, was the lack of any deeper measures taken by the FTC. "The settlement imposes no meaningful changes to the company's structure or financial incentives, which led to these violations," said FTC Commissioner Rohit Chopra. "Nor does it include any restrictions on the company's mass surveillance or advertising tactics." In other words, it was a slap on the wrist (Facebook is expected to make almost $100 billion selling ads in 2020 alone).
The EU, by comparison, has been more proactive when it comes to regulating Big Tech firms' handling of user data. This has been done partly through the General Data Protection Regulation, which came into force in 2018, but European authorities have also led the way in terms of addressing the problems related to digital platforms through antitrust enforcement.
Information oligopoly
What does this mean in a practical sense? It starts with recognising the state and goal of competition in the field. In an attempt to define its role in the technological boom of the past decade, data has variously been described as "the new oil" and "as common as water", indicating both its potential value and its abundance. Neither of these generalisations is particularly useful or accurate, however.
What's more important to note is that data is key to digital platforms because, when analysed closely, it can provide real-time knowledge of consumer behaviour across applications. This has led to an "attention economy", in which Big Tech players work to capture users' attention (and thus their data), build profiles of their choices and habits, and then sell those profiles to advertisers. Competition exists between these players in terms of keeping user attention on their own platforms and not on those of their competitors.
High-level mergers such as Facebook's acquisitions of Instagram and WhatsApp concentrate the digital-platform market. This reduces competition, making attention a scarce resource for advertisers. Since Facebook's business model (and indeed Google's) is based on selling ads to the highest bidder, this leads dominant digital platforms to favour advertising only high-cost items to consumers. In other words, the system can leave consumers worse off than if they had never given access to their data in the first place. On top of this, data privacy breaches, potential profiling and discrimination, and marketing that exploits personal vulnerabilities are all detrimental to consumers, while ensuring they receive no value in return for handing over their private data.
The question of how much we want technology companies to know about us has become of even greater interest since the coronavirus outbreak. Big Tech players such as Microsoft, Amazon and Palantir have been praised for stepping up to help the UK's National Health Service work out where important resources are most likely to be needed, but this has also raised privacy concerns.
However, it's important not to fall into the trap of a false dichotomy between health and privacy. Technology can and should be used during this emergency, and technology still can and should guarantee fundamental rights in our democracies.
It's academic
A frequently cited argument is the so-called "privacy paradox": the observation that consumers claim to care about preserving their privacy, but don't take steps to reduce the amount of data they share. However, consumer choices are complex: most have little to no understanding of data practices, and there's little transparency in such phrases as "we may collect your personal information for marketing purposes" or "we may share your personal data with affiliates and trusted businesses". As a result, consumers are unable to understand the future costs of their choices, so cannot be held to be expressing a preference when accepting opaque privacy terms.
Again, the issue of competition is key here. Digital markets are, by their nature, difficult for new entrants to break into, as network effects create barriers to entry. Elusive data practices serve to strengthen these barriers, giving platform incumbents a competitive advantage by concealing information that would allow consumers to compare alternatives. Academic access to anonymised data could help to remove these barriers, but it is another area in which Big Tech firms remain elusive. In the top-five economics journals, virtually no papers published in the past 10 years have used primary data from one of the five leading digital companies (i.e. Amazon, Apple, Facebook, Google or Microsoft) to tackle a question related to competition or competition policy. We currently struggle with very important policy questions in the digital space, and the digital giants have not helped us much in finding answers.
Competitive spirit
Within this challenge, however, lies opportunity. Lack of competition in a market leads to a reduction in quality, and rapidly eroding consumer data protections stand as the quality reduction in this sector: an example of competitive harm. Let's not forget, for instance, that in the mid-2000s, when Facebook was an upstart social media platform, it tried to differentiate itself from the then-market leader, Myspace. In particular, Facebook publicly pledged to protect privacy, but as its competition began to disappear, Facebook revoked its users' ability to vote on changes to its privacy policies. Hence, competition and privacy protection are part of the same process.
Under Article 102 of the Treaty on the Functioning of the European Union, consumers are protected against such harm caused by abuse of dominance, and this gives European authorities an antitrust enforcement rationale to investigate Big Tech firms. As pressure mounts, the firms in question may opt to increase transparency in order to deflect criticisms of their opaque data practices, which could also allow for greater academic scrutiny. This, in turn, could improve consumer understanding of the choices and costs associated with data privacy. Of course, this all depends on authorities choosing to take action: the twin pillars of antitrust enforcement and academic scrutiny must go together to have full effect.
This article was amended on 6 April 2020 to reflect the coronavirus pandemic. The accompanying video was recorded before the UK Government-initiated lockdown.