State and federal tech policy

Reversing economic stagnation

I’m convinced the answer isn’t degrowth; the answer is abundance. Excessive vetoes and inaction slow down development, so one of the big projects of our generation will be dismantling our system of vetocracy.

If we want to do better, we should adopt the abundance agenda I have laid out.

Cities are changing and everyone wants to know the impact. My Urbanism FAQ is a collection of research on urban economics.

Artificial intelligence

If the rumors are to be believed, Senator Chuck Schumer is working on a new bill to regulate AI. What’s interesting is that nearly all of the “four guardrails” steering the conversation are already being implemented by the company leading this charge, OpenAI. And the other major company in this race, Anthropic, is competing with even more safety protections built in.

To me, the effort to regulate AI seems misplaced. Congress has for years failed to pass privacy legislation, which would do a lot to accomplish the goals Sen. Schumer has laid out. Members of Congress already have bill language. What’s needed isn’t another grand vision. What’s needed is bill movement.

Meanwhile, some of the biggest names in AI are calling for a six-month pause on all new experiments in the tech. In a newly released letter, business leaders like Elon Musk and Apple co-founder Steve Wozniak, alongside tech critics like Tristan Harris, are calling “on all AI labs to immediately pause for at least 6 months the training of AI systems more powerful than GPT-4.”

Altogether, the letter shows just how siloed Silicon Valley is from the rest of the world. It misreads industrial history and then offers a set of remedies that the industry is already adopting.

On top of this, if groups and teams working on AI don’t comply, the letter argues that “governments should step in and institute a moratorium.” Arguably, though, because all of the tech is happening in the United States, only the US federal government would need to step in.

The National Telecommunications and Information Administration (NTIA) recently opened a proceeding to better understand how commercial entities collect and use data. Importantly, it was seeking to understand how “specific data collection and use practices potentially create or reinforce discriminatory obstacles for marginalized groups regarding access to key opportunities, such as employment, housing, education, healthcare, and access to credit.” What the NTIA seeks to tackle is a wicked problem in Rittel and Webber’s classic definition.

The public interest comments I filed argued for a twist on that theme.

Wicked problems, which plague public policy and planning, are distinct from natural problems because “natural problems are definable and separable and may have solutions that are findable (while) the problems of governmental planning and especially those of social or policy planning are ill-defined.” But the case of fairness in AI shows that such problems can also be over-defined. The reason social problems are “never solved” but “only re-solved, over and over again” is that there are many possible solutions.

When the NTIA issues its final report, it should resist the tendency to reduce wicked problems into natural ones. Rather, the agency should recognize, as one report described it, the existence of a hidden universe of uncertainty about AI models.

To address this problem holistically, the comments proceed in three sections:

  • The first section explains how data-generating processes can create legibility but never solve the problem of illegibility.
  • The second section explains what is meant by bias, breaks down the problems in model selection, and walks through the problem of defining fairness.
  • The third section explores why people have a distaste for the kind of moral calculations made by machines and why we should focus on impact.

Teens and tech

I did a deep dive into teen mental health and the impact of social media. You might be surprised by what the research says; I was. See “Teens and the impact of social media: a deep dive into recent work from Haidt.”


Thoughts on what the CDC YRBS data means for social media, teens, and mental health
– March 14, 2023 – Now+Next – On February 13, the Centers for Disease Control and Prevention (CDC) released “The Youth Risk Behavior Survey Data Summary & Trends Report: 2011–2021,” a summary of the latest findings from the Youth Risk Behavior Survey (YRBS). What is most worrying about this release is the mental health decline of adolescent Americans, especially teenage girls. But when you look back to the late 1960s, when the CDC began collecting data, it becomes clear that the US has already survived at least one great wave of poor teen mental health.

Broadband and infrastructure

I just finished an estimate of the real extent of broadband availability in the US using data from Georgia. It is driven by an economic model I developed and is a central piece of my recent work in broadband.

The result was an estimate of where broadband truly exists.

Some other open questions:

I have been thinking about the cost of time, especially the value of one-dollar delays. There is more to do here.

I have also been writing and thinking about excessive veto power, technological atonement, what happens when an online community migrates, the Millennial wealth gap, short-termism, the rural broadband penalty, COVID-19 and the relocalization of politics, the Spence distortion, and Noam Chomsky’s signing of the Harper’s Magazine letter.

The attention economy

I’ve been writing on the economics of the attention economy. One strand of this work charts the deep relationship between privacy on social media platforms and the value of the company. Another strand uses the Facebook and Instagram blackout in late 2021 to understand how competition works in reality.

What follows is a selected list of my work as it relates to policy.

How do we bridge the digital divide?

The digital divide is far more encompassing than simply getting broadband into every home. And yet, leaders at both the federal and state levels have tended to focus on the supply of broadband, on plugging the broadband availability gap.

Completely closing the broadband availability gap hinges on the last two percent. This group of locations, which I’ve dubbed economically stubborn, is unique because subscriber revenues won’t be sufficient to cover yearly operational costs. In the language of finance, these projects will be cash-flow negative on an operational basis, meaning they lose money just by operating. The current broadband bills in Congress show a clear lack of understanding of these realities.
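To make that concrete, here is a minimal sketch with made-up numbers (the location count, take rate, monthly revenue per subscriber, and per-location operating cost are all hypothetical) of why a construction subsidy alone doesn’t fix the economics for these places:

```python
# Minimal sketch, hypothetical figures: an "economically stubborn" service area.
# Even if construction is fully subsidized, yearly subscriber revenue falls
# short of yearly operating cost, so the network is cash-flow negative on an
# operational basis.

def annual_operating_cash_flow(locations: int, take_rate: float,
                               monthly_arpu: float,
                               annual_opex_per_location: float) -> float:
    """Yearly subscriber revenue minus yearly operating cost for a service area."""
    revenue = locations * take_rate * monthly_arpu * 12
    operating_cost = locations * annual_opex_per_location
    return revenue - operating_cost

# A sparse rural area: few subscribers, long lines to maintain (numbers invented).
shortfall = annual_operating_cash_flow(locations=200, take_rate=0.45,
                                       monthly_arpu=70.0,
                                       annual_opex_per_location=900.0)
print(shortfall)  # -104400.0: the network loses money every year it operates
```

Under assumptions like these, even a 100 percent construction subsidy leaves someone on the hook for the annual shortfall, which is why these locations stay unserved.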

Having spent much of my formative years in rural Illinois, in Olney, I am painfully aware that rural locations face hurdles beyond just broadband. Rural regions are shaped by their distance from an economic hub like a small town. Researchers sometimes call this the rural penalty, which, in its classic definition, is “the economic disadvantage of having to overcome some costs that are lower in other places that are less rural.” Having access to broadband can help rural communities reduce this economic remoteness, but that doesn’t mean the region will remain unchanged. To summarize the literature in a phrase, reinvigorating communities will take more than broadband.

Should we break up big tech?

I’m skeptical. In this Wall Street Journal op-ed, titled “Breaking Up Big Tech Is Hard to Do”, I explain why structural separation will face challenges. Breaking up tech companies means that the government would have to split up their teams and their underlying technology. It would also require a legal and regulatory system to keep each targeted company separate from the others’ markets. These restrictions would pose challenges for any company. But for highly integrated tech firms, they’d be a death sentence.

In a companion piece to my WSJ op-ed above, “Breaking Up Tech Companies Means Breaking Up Teams And The Underlying Technology”, I explore in more depth the difficulties of breaking up tech companies, especially as they relate to each company’s tech stack.

Is there a kill zone in tech?

Recently, Noah Smith explored an emerging question in tech. Is there a kill zone where new and innovative upstarts are being throttled by the biggest players? While there is good evidence that such a thing happens in pharma, there is little evidence of it in other industries, most importantly tech. Since I am not clever with titles, my take is simply “Is There a Kill Zone in Tech?”

Should consumers “own” their data?

The properties of data make ownership a difficult prospect. Knowing this, regulators have opted for simple restrictions on data use rather than making data an individual’s property outright. Consumer behavior confirms part of this thesis, as many people are willing to trade their information for services. Not surprisingly, data ownership, like other data restriction plans, is a costly endeavor. While there is a natural inclination to push for data-ownership policies, implementing these kinds of policies would have a detrimental effect on innovation. I break it all down in “The Law & Economics of ‘Owning Your Data.’”

What’s the value of data?

A Dive Into Digital Dividends – California Governor Gavin Newsom recently called for a data dividend “because we recognize that your data has value, and it belongs to you.” The notion that platforms should pay out a portion of their profits to their users has been gaining steam more broadly, as well. But the idea is misplaced for three primary reasons. First, advertising revenue does not equal the value of data. Second, even if data is jointly created, joint control isn’t the most efficient outcome. And finally, consumers already benefit from ad-supported platforms to the tune of $7 trillion a year.

Why A Data Portability Act Might Not Be An Effective Policy Path – Voices in tech who are more reluctant to regulate directly have come up with an idea to increase competition: data portability. Most of the proposals for creating data portability, however, miss what makes data valuable, and thus what gives these companies such power. Data isn’t the key to Facebook, Amazon, and Google. Rather, it is the structure and processing tools around the data that drive these platforms. Further, the United States already tried an open-access regime to foster competition in the telecom sector, with lackluster results. Contrary to the standard bearers of this idea, there are very good reasons to think that regulations to create data portability won’t be effective at all.

Network neutrality

When I first started in tech policy, I vehemently supported network neutrality, but I have since lost my confidence in the need for this kind of rule.

The basic economics are telling. Companies want to maximize expected future profits, which means a natural inclination toward openness. The more content that flows over a network, the more consumers value access to it, and thus the higher the access price can be. It is a version of a two-part tariff problem.
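A minimal sketch of that logic, with hypothetical demand parameters, treats the network operator as choosing a per-unit usage price plus an access fee; richer content shifts demand out and raises the fee the operator can charge:

```python
# Minimal sketch, hypothetical parameters: two-part tariff pricing for network access.
# Consumers have linear demand q = a - b*p for usage; richer content raises a.
# Pricing usage at marginal cost c and setting the access fee equal to consumer
# surplus maximizes profit, and that fee rises with a.

def optimal_two_part_tariff(a: float, b: float, c: float) -> dict:
    """Surplus-extracting two-part tariff for linear demand q = a - b*p."""
    per_unit_price = c                         # usage priced at marginal cost
    usage = a - b * per_unit_price             # quantity consumed at that price
    access_fee = (a - b * c) ** 2 / (2 * b)    # consumer surplus (triangle area)
    return {"per_unit_price": per_unit_price, "usage": usage, "access_fee": access_fee}

# More available content (a higher demand intercept a) supports a higher access fee.
for a in (10.0, 12.0, 15.0):                   # hypothetical demand intercepts
    print(a, optimal_two_part_tariff(a, b=1.0, c=2.0))
```

The point of the sketch is simply that the profit-maximizing access fee rises with the amount of content flowing over the network, which is why profit maximization pushes providers toward openness.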

Section 230

The Precedents for Forcing Neutrality on Tech Platforms – This Congress, Republican Senator Josh Hawley proposed legislation that would fundamentally change a key law underpinning the tech economy, Section 230 of the Communications Decency Act. Currently, Section 230 limits the liability of tech companies for the content that traverses their platforms. This proposal arises from a longstanding concern about media companies and platforms: that they are biased. In fact, if enacted, this proposal would not be the first law to attempt to force such neutrality. But such enforced neutrality has met opposition in the Supreme Court, and ironically it could stifle the free speech it seeks to support.

Reviving the OTA

The Political Economy of Expertise – I’ve been skeptical of reviving the OTA in the past and I remain so. In this Cato Unbound piece, I explore the reasons why.

Will AI steal jobs?

Understanding Job Loss Predictions From Artificial Intelligence – Worries about artificial intelligence (AI) tend to emanate from concerns about the impact of the new technology on work. Many fear that automation will destabilize labor markets, depress wage growth, and lead to long-term secular decline in the labor market and economy as a whole. Policymakers should know that (1) similar models charting AI job loss can produce widely different job loss predictions; (2) most AI job loss predictions aren’t compared against current economic baselines; and (3) implementing AI-based systems isn’t costless and is likely to take some time to accomplish. Not only is there a lack of consensus on the best way to model AI-based labor changes, but more important, there is no consensus on the best policy path to help us prepare for these changes.
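To illustrate the first point, here is a toy sketch with invented employment counts and automation probabilities: the same occupational baseline, run through two different probability assumptions, yields very different headline “jobs at risk” totals.

```python
# Toy sketch, invented numbers: the same occupational employment baseline run
# through two different automation-probability assumptions produces very
# different headline "jobs at risk" totals.

employment_millions = {"retail": 4.0, "trucking": 2.0, "food service": 3.0}  # hypothetical

# Two plausible-but-different probability assignments (e.g., occupation-level
# vs. task-level styles of analysis).
model_a = {"retail": 0.9, "trucking": 0.8, "food service": 0.9}
model_b = {"retail": 0.3, "trucking": 0.4, "food service": 0.2}

at_risk_a = sum(employment_millions[o] * model_a[o] for o in employment_millions)
at_risk_b = sum(employment_millions[o] * model_b[o] for o in employment_millions)

print(f"Model A: {at_risk_a:.1f}M at risk; Model B: {at_risk_b:.1f}M at risk")
# Same baseline, same occupations, roughly a threefold gap in the headline number.
```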

Tracing the impact of automation on workers and firms – August 14, 2020 – The Benchmark – Automation will be a slow process in many sectors. So far, the productivity data is uneven, firms are reluctant to change, and only some industries seem to be affected by robotics or other automation methods.

Is AI biased?

Mandating AI Fairness May Come At The Expense Of Other Types of Fairness – In 2016, ProPublica sparked a conversation over the use of risk assessment algorithms when they concluded that a widely used “score proved remarkably unreliable in forecasting violent crime” in Florida. Their examination of the racial disparities in scoring has been cited countless times, often as a proxy for the power of automation and algorithms in daily life. As this examination continues, two precepts are worth keeping in mind. First, the social significance of algorithms needs to be considered, not just their internal model significance. While the accuracy of algorithms is important, more emphasis should be placed on how they are used within institutional settings. And second, fairness is not a single idea. Mandates for certain kinds of fairness could come at the expense of other forms of fairness. As always, policymakers need to be cognizant of the tradeoffs.
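As a rough illustration of the second point, here is a toy sketch (the confusion-matrix counts are invented, not drawn from the ProPublica analysis): when two groups have different underlying base rates, a risk score can equalize false positive rates across groups or equalize the share of each group labeled high-risk, but generally not both.

```python
# Toy sketch, invented counts: two fairness definitions applied to the same
# risk-score outcomes can disagree when the groups' base rates differ.

def rates(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Share labeled high-risk and false positive rate for one group."""
    total = tp + fp + fn + tn
    return {
        "share_labeled_high_risk": (tp + fp) / total,
        "false_positive_rate": fp / (fp + tn),
    }

# Group A has a 50% reoffense base rate; Group B has 20% (hypothetical).
group_a = rates(tp=40, fp=10, fn=10, tn=40)
group_b = rates(tp=16, fp=16, fn=4, tn=64)

print(group_a)  # {'share_labeled_high_risk': 0.5, 'false_positive_rate': 0.2}
print(group_b)  # {'share_labeled_high_risk': 0.32, 'false_positive_rate': 0.2}
# Equal false positive rates but unequal selection rates; forcing the selection
# rates to match would push the false positive rates apart instead.
```

Which of those notions of fairness should take priority is an institutional question, not one the model can answer on its own.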

An AI Innovation Agenda – The AI race is on. Countries, entrepreneurs, and firms across the globe are jockeying to capitalize on artificial intelligence (AI). While it has received much negative publicity, AI creates many opportunities for the economy and country. To ensure that the United States can benefit from the possibilities of these new technologies, policymakers should work toward implementing this AI agenda.

Are U.S. industries too concentrated?

The question of whether there is an “optimal” level of market power in any given market (one that could differ from the two polar cases of monopoly and perfect competition) is, in my opinion, one of the most important questions in all of economics.

As I see it, there are two parallel questions.

  1. Have changes in antitrust enforcement led to higher markups and/or concentration?
  2. Can antitrust tools take on big tech companies?

Concerning the first question, there are only a handful of papers on this issue, so it remains an open question. As for the second, the shift really isn’t away from the consumer welfare standard so much as it is about incorporating non-price changes (i.e., innovation) into the analysis. That has been part of the “Chicago school” conversation for some time. As a matter of history, though, I’m just not sure there is a coherent through line in the Chicago school other than the focus on consumer welfare.