In this grounding conversation, the first of two parts on “What is AI?”, we go on a deep dive with Meredith Whittaker on the history of artificial intelligence (AI), from hype to current day-to-day impacts (see our companion conversation with Lucy Suchman here).

Meredith had a front-row seat to some of the most important developments and growth in so-called AI over the past 15 years. Here, she dives into the historical context that gave rise to AI, emphasizing how its emergence was bound together with the privatization of the internet and with the expansion of the surveillance advertising business model now at the core of the tech industry. But facts alone won’t counter narratives meant to either disillusion us or desensitize us to rampant data collection, surveillance, and the automation of incredibly important, life-altering decisions and resource allocation by companies and governments the world over. We need new strategies and responses to address the prevailing myths and misperceptions surrounding AI.


Meredith Whittaker is the President of Signal. She is the current Chief Advisor and the former Faculty Director and Co-Founder of the AI Now Institute. Her research and advocacy focus on the social implications of artificial intelligence and the tech industry responsible for it, with a particular emphasis on power and the political economy driving the commercialization of computational technology. Prior to founding AI Now, she worked at Google for over a decade, where she led product and engineering teams, founded Google’s Open Research Group, and co-founded M-Lab, a globally distributed network measurement platform that now provides the world’s largest source of open data on internet performance. She has advised the White House, the FCC, the FTC, the City of New York, the European Parliament, and many other governments and civil society organizations on artificial intelligence, internet policy, measurement, privacy, and security.


  • What is the path to democratize AI?
    • Meredith: How do you democratize a technology that itself, in the form we’re seeing it now, is a product of concentrated power?…We can begin to demand democratic control over these institutions and structures and material resources. But, of course, that’s a fight, right? That’s not something that you do by simply being right about your argument for long enough. Right now, saying the same thing in a room but not having power means you’re just going to get ushered out of that room at the end of the day, and they’re going to make the same decision anyway.
  • What can we learn about AI when we look at its material dimensions as someone who has thought about this, not just theoretically, but very much materially, in the present and earlier as well?
    • Meredith: The reason OpenAI ultimately needed to form a relationship with one of these big companies is because they can’t do what they do without that infrastructure. That is a relationship built on the fact that Microsoft owns the Azure Cloud and OpenAI needs that computational resource in order to develop its massive generative AI systems. This is still an ecology dominated by the same players, dominated by the same logics. It’s data, it’s compute, it’s market reach, it’s capital.
  • You said recently that we can’t fight these insidious narratives and the brilliant marketing simply by calmly and rigorously reciting the facts. We need new responses. What might these other responses or strategies be?
    • Meredith: I think we need to begin much more boldly making demands for a world we want to live in…We can’t simply critique. Being right is not a strategy. We need to be right, and we need to be, I think, much bolder in shaping alternative visions and much more uncompromising about demanding those visions. If we don’t want to be interpolated by AI systems, if we don’t want to be surveilled or displaced in our jobs, so that instead of being a writer who uses my embodied experience and creativity to produce communication for other human beings, I am an editor of regurgitated ChatGPT output: I don’t want that. I don’t think most people want that. And that is not inevitable.

Further reading: