The managing director of Cambridge Analytica told an undercover reporter, “It doesn’t help to fight an election campaign with facts, because it’s all about emotions.” To reach U.S. voters and appeal to their hopes, neuroses, and fears, the political consulting firm trained its algorithm to predict and map personality traits. That required a great deal of personal information.
So Cambridge Analytica turned to a professor from Cambridge University, whose app collected data on about 50 million Facebook users and their friends, allowing the firm to build its psychographic profiles. At the time, Facebook permitted app developers to gather this kind of personal information. Facebook says that Cambridge Analytica and the professor violated its data-handling rules. But this was not the first time its rules were broken, and it is unlikely to be the last.
The scandal came on the heels of Russia’s use of Facebook, Google, and Twitter to “sow discord in the U.S. political system, including the 2016 U.S. presidential election,” and it has heightened concerns about the power of today’s tech giants.
Part of that power comes from data. Companies like Facebook, Google, and Amazon are “data-opolies”: companies that control a key platform which, like a coral reef, attracts users, sellers, advertisers, software developers, app makers, and accessory makers to its ecosystem.
Apple and Google, for example, each control a popular mobile phone operating system platform and some of the most important apps on that platform; Amazon controls the largest online merchant platform; and Facebook controls the largest social network platform. A vast amount and variety of personal data flows through their most popular platforms, and the speed and scale with which these companies can collect and exploit that data can give them significant market power.
Should a few companies hold so much data, and so much power as a result? At least in the U.S., antitrust officials so far seem ambivalent about the data-opolies. The thinking goes, “Their services are free, so what’s the harm?” But that thinking is mistaken. Data-opolies pose serious dangers to consumers, workers, competition, and the overall health of our democracy. Here’s why.
European competition authorities have recently taken action against the data-opolies, namely Google, Apple, Facebook, and Amazon (GAFA, for short). The European Commission, for example, fined Google a record €2.42 billion for leveraging its search monopoly to favor its own comparison shopping service.
The Commission has also preliminarily found that Google abused its dominant position with both Android and AdSense. And Germany’s competition authority has found that Facebook abused its dominant position by conditioning the use of its social network on permission to collect all kinds of data from third-party websites and combine it with the user’s Facebook account.
More fines and remedies from the Europeans are likely in the coming years. In the U.S., by contrast, neither the Obama nor the Bush administration paid much attention to the data-opolies. Notably, the European Commission concluded that Google’s search bias harmed competition; the U.S. Federal Trade Commission did not. Since 2000, the Department of Justice has brought only one monopolization case. (By comparison, from 1970 to 1972 the DOJ sued monopolies and oligopolies 39 times in civil court and 3 times in criminal court.)
The current head of the DOJ’s Antitrust Division has acknowledged that the U.S. and Europe diverge in their enforcement. His agency, he said, has “special worries about digital markets.” But absent “demonstrable harm to competition and consumers,” the DOJ is “reluctant to impose special duties on digital platforms,” out of concern that such duties could stifle the innovation that has kept competition lively and benefited consumers.
The divergence in antitrust enforcement may thus reflect differing views about how harmful the data-opolies are. Ordinarily, monopolies harm consumers through higher prices, reduced output, or degraded quality. At first glance, data-opolies appear to pose little, if any, risk of these harms. Unlike some pharmaceutical companies, data-opolies do not charge consumers exorbitant prices.
Most of the consumer products Google and Facebook offer appear to be “free.” And the data-opolies’ scale can also yield better products: the more people use a particular search engine, the more its algorithm learns about users’ preferences, the more relevant its results are likely to be, and the more users it is likely to attract, and so on.
Robert Bork said, “There is no logical case for monopolization because consumers can use Google for free and switch to another search engine with one click.”
But powerful companies can hurt their customers and the rest of society in other ways besides raising prices. When examined further, data-opolies can cause at least eight different types of harm.
Degraded quality. Antitrust officials increasingly recognize that companies can compete on privacy and data security. But without competition, data-opolies face less pressure to do so. They can degrade privacy protection below, and collect personal information beyond, what a competitive market would sustain. Excessive data extraction can be the equivalent of charging an excessive price.
Data-opolies can also obscure what information they collect and how they intend to use it, and they face little market pressure to clarify their vague privacy policies. Even a better privacy statement would matter little: the current notice-and-consent regime offers scant protection when users have no viable alternatives and far weaker bargaining power.
Surveillance and security risks. In a monopolized market, personal data is concentrated in a few firms, and consumers have limited outside options that offer better privacy protection. That concentration heightens both the risk of surveillance and the potential fallout from any security breach.
Wealth transfer to data-opolies. Even when their products and services are ostensibly “free,” data-opolies can extract significant wealth in several ways that they otherwise couldn’t in a competitive market:
First, data-opolies can extract wealth by getting personal data without having to pay for the data’s fair market value. The personal data collected may be worth far more than the cost of providing the “free” service. The fact that the service is “free” does not mean we are fairly compensated for our data. Thus, data-opolies have a strong economic incentive to maintain the status quo, in which users, as the MIT Technology Review put it, “have little idea how much personal data they have provided, how it is used, and what it is worth.” If the public knew, and if they had viable alternatives, they might hold out for compensation.
Second, something similar happens with the content users create. Data-opolies can extract wealth by getting creative content from users for free. In a competitive market, users could conceivably demand compensation not only for their data but also for their contributions to YouTube and Facebook. With no viable alternatives, they cannot.
Third, data-opolies can extract wealth from sellers upstream. One example is when data-opolies scrape valuable content from photographers, authors, musicians, and other websites and post it on their own platforms. In this case, the wealth of the data-opolies comes at the expense of other businesses in their value chain.
Fourth, data-opolies can extract our wealth indirectly, when their higher advertising fees are passed along in the prices for the advertised goods and services. If the data-opolies faced more competitors for their advertising services, ads could cost even less — and therefore so might the products being advertised.
Finally, data-opolies can extract wealth from both sellers upstream and consumers downstream by facilitating or engaging in “behavioral discrimination,” a form of price discrimination based on past behavior — like, say, your internet browsing. They can use personal data to get people to buy things they did not necessarily want at the highest price they are willing to pay.
As data-opolies expand their platforms to digital personal assistants, the Internet of Things, and smart technologies, the concern is that their data advantage will increase their competitive advantage and market power. As a result, the data-opolies’ monopoly profits will likely increase, at our expense.
Loss of trust. Market economies rely on trust. For online markets to deliver their benefits, people must trust firms and their use of personal data. But as technology evolves and more personal data is collected, we are increasingly aware that a few powerful firms are using our personal information for their own benefit, not ours. When data-opolies degrade privacy protections below competitive levels, some consumers will choose not “to share their data, to limit their data sharing with companies, or even to lie when providing information,” as the UK’s Competition and Markets Authority put it. Consumers may forgo the data-opolies’ services, which they otherwise would have used if privacy competition were robust. This loss would represent what economists call a deadweight welfare loss. In other words, as distrust increases, society overall becomes worse off.
Significant costs on third parties. Additionally, data-opolies that control a key platform, like a mobile phone operating system, can cheaply exclude rivals that depend on that platform to reach users.
Data-opolies can also impose costs on companies seeking to protect our privacy interests. My book with Ariel Ezrachi, Virtual Competition, discusses, for example, how Google kicked the privacy app Disconnect out of its Android app store.
Less innovation in markets dominated by data-opolies. Data-opolies can chill innovation with a weapon that earlier monopolies lacked. Allen Grunes and I call it the “now-casting radar.” Our book Big Data and Competition Policy explores how some platforms have a relative advantage in accessing and analyzing data to discern consumer trends well before others. Data-opolies can use their relative advantage to see what products or services are becoming more popular. With their now-casting radar, data-opolies can acquire or squelch these nascent competitive threats.
Social and moral concerns. Historically, antitrust has also been concerned with how monopolies can hinder individual autonomy, and data-opolies pose that risk as well. To start with, they can direct (and limit) opportunities for startups that subsist on their super platform. This includes third-party sellers that rely on Amazon’s platform to reach consumers; newspapers and journalists that depend on Facebook and Google to reach younger readers; and, as the European Commission’s Google Shopping case explores, companies that depend on traffic from Google’s search engine.
But the autonomy concerns go beyond the constellation of app developers, sellers, journalists, musicians, writers, photographers, and artists dependent on the data-opoly to reach users. Every individual’s autonomy is at stake. In January, the hedge fund Jana Partners joined the California State Teachers’ Retirement System pension fund in demanding that Apple do more to address the effects of its devices on children. As The Economist noted, “You know you are in trouble if a Wall Street firm is lecturing you about morality.” The concern is that the data-opolies’ products are purposefully addictive, thereby eroding individuals’ ability to make free choices.
There is an interesting counterargument that’s worth noting, based on the interplay between monopoly power and competition. On the one hand, in monopolized markets, consumers have fewer competitive options. So, arguably, there is less need to addict them. On the other hand, data-opolies, like Facebook and Google, even without significant rivals, can increase profits by increasing our engagement with their products. So, data-opolies can have the incentive to exploit behavioral biases and imperfect willpower to addict users — whether watching YouTube videos or posting on Instagram.
Political concerns. Economic power often translates into political power. Unlike earlier monopolies, data-opolies, given how they interact with individuals, possess a more powerful tool: namely, the ability to affect the public debate and our perception of right and wrong.
Many people now receive their news from social media platforms. But the news isn’t just passively transmitted; data-opolies can affect how we feel and think. Facebook, for example, in an “emotional contagion” study, manipulated 689,003 users’ emotions by altering their news feeds. Similar risks arise wherever a data-opoly can shape what we see and, with it, the public debate.
Upon closer examination, data-opolies can actually be more dangerous than traditional monopolies. They can affect not only our wallets but also our privacy, autonomy, democracy, and well-being.
Markets dominated by these data-opolies will not necessarily self-correct. Network effects, high switching costs for consumers (given the lack of data portability and user rights over their data), and weak privacy protection help data-opolies maintain their dominance.
Luckily, global antitrust enforcement can help. The Reagan administration, espousing the then-popular beliefs of the Chicago School of economics, discounted concerns over monopolies. The Supreme Court, relying on faulty economic reasoning, surmised that charging monopoly prices was “an important element of the free market system.” With the rise of a progressive, anti-monopoly New Brandeis School, the pendulum is swinging the other way. Given the emergence of the data-opolies, this is a welcome change.
Nonetheless, global antitrust enforcement, while a necessary tool to deter these harms, is not sufficient. Antitrust enforcers must coordinate with privacy and consumer protection officials to ensure that the conditions for effective privacy competition and an inclusive economy are in place.