Big tech, regulation, and the Spence distortion

Regulation of big tech arrived in 2020. In California, the Consumer Privacy Act went into effect in January, and in November, a second privacy bill passed by referendum. The House Antitrust Subcommittee released its long-awaited report, setting the stage for reforms in the next Congress. Meanwhile, the Department of Justice brought a case against Google last month, and the Federal Trade Commission is expected to file a lawsuit against Facebook before the year is over.

These changes may feel new, but policymakers of all stripes should be raiding the past for lessons. Indeed, Nobel Prize-winning economist A. Michael Spence published “Monopoly, Quality, and Regulation” in 1975. It is a chronically underappreciated paper that offers real insight into the current moment in tech policy.

As Spence pointed out, the marginal consumer, the buyer just on the fence about purchasing, sets the price a monopolist can charge. But Spence was interested not only in price; he was also interested in the quality the monopolist offers. Because “prices [don’t] convey information about the value attached to [the] quality” of a good, the monopolist’s quality choice is distorted: it tracks the marginal consumer’s valuation of quality rather than the average consumer’s. As a result, quality can end up either lower or higher than what would be optimal for the average consumer. Economists today call this the Spence distortion.
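To make the mechanism concrete, here is a stylized numerical sketch. The taste and cost functions below are illustrative assumptions of mine, not Spence’s: consumer types are uniform on [0, 1], willingness to pay is w(θ, q) = θ + θ²q (so higher types value quality increments more), and each unit of quality q costs q²/2 to produce. Both a monopolist and a social planner choose a quality level and a cutoff type; the monopolist prices at the marginal buyer’s willingness to pay, while the planner counts the surplus of every buyer served.

```python
import numpy as np

# Illustrative assumptions (not from Spence's paper):
# types theta ~ Uniform[0, 1]; willingness to pay w(theta, q) = theta + theta**2 * q;
# per-unit cost of quality c(q) = q**2 / 2.
def wtp(theta, q):
    return theta + theta**2 * q

def cost(q):
    return q**2 / 2

# Both the monopolist and the planner pick a quality q and a cutoff type t:
# only consumers with theta >= t buy.
ts = np.linspace(0.0, 0.99, 199)   # candidate cutoff types
qs = np.linspace(0.0, 1.0, 201)    # candidate quality levels
T, Q = np.meshgrid(ts, qs)

# Monopolist: price equals the MARGINAL buyer's willingness to pay, w(t, q),
# so profit is (price - unit cost) times the mass of buyers, (1 - t).
profit = (wtp(T, Q) - cost(Q)) * (1 - T)

# Planner: total surplus of ALL buyers served, i.e. the integral of
# (w(theta, q) - c(q)) over theta in [t, 1], in closed form for the uniform case.
welfare = (1 - T**2) / 2 + Q * (1 - T**3) / 3 - cost(Q) * (1 - T)

q_monopoly = Q.flat[np.argmax(profit)]
q_planner = Q.flat[np.argmax(welfare)]
print(f"monopolist quality ~ {q_monopoly:.2f}, planner quality ~ {q_planner:.2f}")
```

Under these particular assumptions the marginal buyer values quality less than the average buyer does, so the monopolist’s chosen quality comes out below the planner’s. Reverse the taste pattern, so that the marginal buyer cares more about quality than the average buyer, and the monopolist over-provides instead: the distortion can run in either direction, which is exactly Spence’s point.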

Regulators trying to correct this problem face a complex task: it is difficult to measure how the average consumer in a market values quality, and harder still to steer the market toward that quality and price level. In practice, regulators are forced to settle for second-best solutions.

If social media companies are truly monopolists, as some claim, then the Spence distortion may well apply. Consider, for example, privacy. Although some users may be ill-informed about privacy issues, firms cannot easily identify these individuals and offer them a separate privacy policy. Instead, a platform will likely set the quality of its service to attract the marginal consumer, who may care a great deal about privacy. A committed minority of individuals might thereby push a platform to offer a level of privacy protection above and beyond what the average user would want. In other words, platforms might be setting their privacy offerings too high.

Moreover, regulators trying to correct this privacy problem will face the near-impossible task of discovering the optimal privacy level and then adjusting the market to reflect that preference. Even in an ideal world, the best-case scenario will be a second-best solution.