Digital garden (research)

I once heard the term digital garden, and I thought it described what I was trying to accomplish with my own work and research. Clusters of ideas. A place for my ideas to grow and be harvested.


Reversing economic stagnation

I’m convinced the answer isn’t degrowth; the answer is abundance. Excessive vetoes and inaction slow down development, so one of the big projects of our generation will be the dismantling of our system of vetocracy.

The attention economy

I’ve been writing on the economics of the attention economy. One strand of this work charts the deep relationship between privacy on social media platforms and the value of the company. Another strand of this research uses the Facebook and Instagram blackout in late 2021 to understand how competition works in practice.

Artificial intelligence

If the rumors are to be believed, Senator Chuck Schumer is working on a new bill to regulate AI. What’s interesting is that nearly all of the “four guardrails” that are steering the conversation are already being implemented by the company that is leading this charge, OpenAI. And the other major company in this race, Anthropic, is competing with even more safety and protections built in.

To me, the effort to regulate AI seems misplaced. Congress has for years failed to pass privacy legislation, which would do a lot to accomplish the goals Sen. Schumer has laid out. Members of Congress already have bill language. What’s needed isn’t another grand vision. What’s needed is bill movement.

Meanwhile, some of the biggest names in AI are calling for a six-month pause on all new experiments in the technology. In a newly released letter, business leaders like Elon Musk and Apple co-founder Steve Wozniak, alongside tech critics like Tristan Harris, are calling “on all AI labs to immediately pause for at least 6 months the training of AI systems more powerful than GPT-4.”

Altogether, the letter shows just how siloed Silicon Valley is from the rest of the world. It misreads industrial history and then offers a set of remedies that are already being adopted by the industry.

On top of this, if groups and teams working on AI don’t comply, the letter argues that “governments should step in and institute a moratorium.” Arguably, though, because nearly all of this work is happening in the United States, only the US federal government would need to step in.

The National Telecommunications and Information Administration (NTIA) recently put out a proceeding to better understand how commercial entities collect and use data. Importantly, it was seeking to understand how “specific data collection and use practices potentially create or reinforce discriminatory obstacles for marginalized groups regarding access to key opportunities, such as employment, housing, education, healthcare, and access to credit.” What the NTIA seeks to tackle is a wicked problem in Rittel and Webber’s classic definition.

The public interest comments I filed argued for a twist on that theme.

Wicked problems, which plague public policy and planning, are distinct from natural problems because “natural problems are definable and separable and may have solutions that are findable (while) the problems of governmental planning and especially those of social or policy planning are ill-defined.” But the case of fairness in AI shows that wicked problems can also be over-defined. The reason why “social problems are never solved” but “are only resolved, over and over again” is that there are many possible solutions.

When the NTIA issues its final report, it should resist the tendency to reduce wicked problems into natural ones. Rather, the agency should recognize, as one report described it, the existence of a hidden universe of uncertainty about AI models.

To address this problem holistically:

  • The first section explains how data-generating processes can create legibility but never solve the problem of illegibility.
  • The second section explains what is meant by bias, breaks down the problems in model selection, and walks through the problem of defining fairness.
  • The third section explores why people have a distaste for the kind of moral calculations made by machines and why we should focus on impact.

Teens and tech

I did a deep dive into teen mental health and the impact of social media. You might be surprised by what the research says; I was.

Teens and the impact of social media: a deep dive into recent work from Haidt


Thoughts on what the CDC YRBS data means for social media, teens, and mental health
– March 14, 2023 – Now+Next – On February 13, the Centers for Disease Control and Prevention (CDC) released “The Youth Risk Behavior Survey Data Summary & Trends Report: 2011–2021,” a summary of the latest findings from the Youth Risk Behavior Survey (YRBS). What is most worrying about this release is the mental health decline of adolescent Americans, especially teenage girls. But looking back to the late 1960s, when the CDC began collecting data, it becomes clear that the US has already survived at least one great wave of poor teen mental health.

Broadband and infrastructure

I just finished an estimation of the real extent of broadband availability in the US using data from Georgia. It is driven by an economic model I developed and is a central piece of my recent work on broadband. Some other open questions:

I have been thinking about the cost of time, especially the value of one-dollar delays. There is more to do here.

I also have been writing and thinking about excessive veto power, technological atonement, when an online community migrates, the Millennial wealth gap, short-termism, the rural broadband penalty, COVID-19 and the relocalization of politics, the Spence distortion, and the fact that Noam Chomsky signed the Harper’s Magazine letter.