
The “common sense” around artificial intelligence has become potent over the past two years, imbuing the technology with a sense of agency and momentum that makes the current trajectory of AI appear inevitable, and certainly essential to US economic prosperity and global dominance. In this section, we break down the narratives propping up this “inevitability,” explaining why it is particularly challenging—but still necessary—to contest the current trajectory of AI, especially at this moment in global history.

The fusing of economic and national security goalposts under the banner of the US-China AI arms race is a critical asset for US AI firms: It affords them patronage not just from their own government, but potentially from the many other nation-states vying for a fighting chance at national competitiveness in this market; it insulates them from regulatory friction by framing any calls for accountability as not just anti-innovation but harming national interests; and—as we explore in Chapter 1.2: Too Big To Fail—is a key factor in positioning them as not just too big, but too strategically important, to fail.
Nation-states have developed their own flavors of “AI Nationalisms,” embarking on initiatives designed simultaneously to support homegrown development and sovereign infrastructures free of dependency on US tech firms, and to attract AI investment.1 Amba Kak et al., AI Nationalism(s): Global Industrial Policy Approaches to AI, AI Now Institute, March 12, 2024, https://ainowinstitute.org/ai-nationalisms. But though AI nationalism is on the rise globally, the rhetoric around the AI arms race remains centered on two poles: the US and China. Since the mid-2010s, the notion of a US-China AI arms race has been deployed primarily by industry-motivated actors to push back against regulatory friction. A frequent motif in policy discussions at moments when the industry has sought to stem the tide of regulation, the notion of an arms race was one of the key arguments made against the introduction of a federal data protection law, a package of antitrust reforms targeting the tech industry in 2022, and an omnibus AI Accountability Bill considered before Congress.2 AI Now Institute, Tracking the US and China AI Arms Race, April 11, 2023, https://ainowinstitute.org/publications/tracking-the-us-and-china-ai-arms-race.
In the past two years, this so-called race has taken on a new character (let’s call it the “AI arms race 2.0”), taking shape as a slate of measures that go far beyond deregulation to incorporate direct investment, subsidies, and export controls in order to boost the interests of dominant AI firms, under the argument that their advancement is in the national interest (what we refer to as AI industrial policy3 Amba Kak and Sarah M. West, “A Modern Industrial Strategy for AI?: Interrogating the US Approach,” AI Now Institute, March 12, 2024, https://ainowinstitute.org/publications/a-modern-industrial-strategy-for-aiinterrogating-the-us-approach.). Such an approach predates the Trump administration. Arguably, a number of the core measures propping up the AI arms race 2.0 were outlined under the Biden administration; Jake Sullivan, in particular, was a vocal proponent of the logics of economic security.4 Jake Sullivan, “Remarks by National Security Advisor Jake Sullivan on Renewing American Economic Leadership at the Brookings Institution” (speech, Washington, DC, April 27, 2023), The White House, https://bidenwhitehouse.archives.gov/briefing-room/speeches-remarks/2023/04/27/remarks-by-national-security-advisor-jake-sullivan-on-renewing-american-economic-leadership-at-the-brookings-institution/; Jake Sullivan, “Remarks by APNSA Jake Sullivan at the Brookings Institution” (speech, Washington, DC, October 23, 2024), The White House, https://bidenwhitehouse.archives.gov/briefing-room/speeches-remarks/2024/10/23/remarks-by-apnsa-jake-sullivan-at-the-brookings-institution/. The Biden administration’s AI Executive Order,5 “Executive Order 14110 of October 30, 2023, Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence,” Federal Register 88 (2023): 75191–75226, https://www.federalregister.gov/documents/2023/11/01/2023-24283/safe-secure-and-trustworthy-development-and-use-of-artificial-intelligence. National Security memo,6 White House, “Memorandum on Advancing the United States’ Leadership in Artificial Intelligence; Harnessing Artificial Intelligence to Fulfill National Security Objectives; and Fostering the Safety, Security, and Trustworthiness of Artificial Intelligence,” October 24, 2024, https://bidenwhitehouse.archives.gov/briefing-room/presidential-actions/2024/10/24/memorandum-on-advancing-the-united-states-leadership-in-artificial-intelligence-harnessing-artificial-intelligence-to-fulfill-national-security-objectives-and-fostering-the-safety-security. and export controls7 “Export Control Framework for Artificial Intelligence Diffusion,” Code of Federal Regulations, title 15 (2024): 740, 774, https://www.govinfo.gov/content/pkg/CFR-2024-title15-vol2/pdf/CFR-2024-title15-vol2-part740.pdf. all established an intent for the US government to widely adopt AI and to clear the pathway for the industry to expand through infrastructure build-out, while simultaneously hindering the advancement of strategic adversaries like China by limiting the export of leading-node chips. Unsurprisingly, this stance ran parallel to the lobbying platforms of firms like OpenAI that have sought government cooperation, with a narrow list of conditionalities such as the use of renewable energy and compliance with security measures.8 Cade Metz and Tripp Mickle, “Behind OpenAI’s Audacious Plan to Make A.I. Flow Like Electricity,” New York Times, September 25, 2024, https://www.nytimes.com/2024/09/25/business/openai-plan-electricity.html. OpenAI specifically has threatened to relocate its business absent commensurate support from the US government.9 Jordan Wolman and Mohar Chatterjee, “White House Weighing Executive Action to Spur Data Centers,” Politico, December 11, 2024, https://subscriber.politicopro.com/article/2024/12/white-house-weighing-executive-action-to-spur-data-centers-00193846. Since taking office, the Trump administration has escalated support for the AI industry, rolling back the conditionalities articulated by the Biden administration by repealing the AI Executive Order and replacing it with a blanket assertion: “It is the policy of the United States to sustain and enhance America’s global AI dominance in order to promote human flourishing, economic competitiveness, and national security.”10 White House, “Removing Barriers to American Leadership in Artificial Intelligence,” January 23, 2025, https://www.whitehouse.gov/presidential-actions/2025/01/removing-barriers-to-american-leadership-in-artificial-intelligence.

A New Silicon Valley Consensus: Beyond Targeted Ads to Targeted AI Weapons?

While the Trump administration has firmly asserted AI as a strategic national asset, it is likely to expect the industry to act in ways that align more closely with state interests. The specifics of what that means are left deliberately hazy, but a popular refrain has been that companies should be devoted less to targeted advertising and more to AI that would bolster national security—and defense tech is increasingly front and center at events like the Hill & Valley Forum,11 “The Hill & Valley Forum 2025,” The Hill & Valley Forum, April 30, 2025, https://www.thehillandvalleyforum.com. an annual gathering of Silicon Valley elites and DC lawmakers that first convened in March 2023 to combat China’s influence on the American tech industry.12 Elizabeth Dwoskin, “Tech Leaders Were on the Outside Looking In. Now They Own Washington,” Washington Post, March 6, 2025, https://www.washingtonpost.com/politics/2025/03/06/tech-leaders-were-outside-looking-now-they-own-washington. Cofounded by Palantir’s Jacob Helberg, the Hill & Valley Forum is more aligned than ever before with state national security interests,13 Mohar Chatterjee, “Silicon Valley Comes to Washington — Singing Trump’s Tune,” Politico, April 30, 2025, https://www.politico.com/newsletters/digital-future-daily/2025/04/30/silicon-valley-comes-to-washington-singing-trumps-tune-00319538; and Dwoskin, “Tech Leaders.” as Helberg,14 Jacob Helberg (@jacobhelberg), “Thank you, Mr. President. I am deeply honored and humbled by the trust that you have placed in me.” X, December 10, 2024, https://x.com/jacobhelberg/status/1866626272203509803. like Michael Kratsios and David Sacks, is one of many industry representatives who find themselves in key policy roles under the Trump administration.15 Helberg has been nominated to become Under Secretary for Economic Growth, Energy and the Environment. Michael Kratsios, from Scale AI, is Director of the Office of Science and Technology Policy. David Sacks is the White House’s “AI Czar.” See Nitasha Tiku, Cat Zakrzewski, and Elizabeth Dwoskin, “A Podcast Star Rallied Silicon Valley to Back Trump. Now He’s the Nation’s Tech Czar,” Washington Post, April 13, 2025, https://www.washingtonpost.com/technology/2025/04/13/david-sacks-ai-crypto-trump; and Madison Alder, “Trump Taps Michael Kratsios, Lynne Parker for Tech and Science Roles,” Fedscoop, December 23, 2024, https://fedscoop.com/trump-taps-michael-kratsios-lynne-parker-tech-science-roles.
So far, the industry seems to support this vision. This is best seen in the rhetoric of Palantir’s CEO Alex Karp, who has long framed the company’s mission as addressing a civilizational need to support democratic and Western supremacy through leading-edge technology. But emboldened by Trump’s intent to scale up mass deportations and police surveillance, Karp has escalated this rhetoric, saying in an investor call in early 2025: “We are dedicating our company to the service of the West and the United States of America, and we’re super-proud of the role we play, especially in places we can’t talk about. Palantir is here to disrupt. And, when it’s necessary, to scare our enemies and, on occasion, kill them.”16 Sophie Hurwitz, “The Gleeful Profiteers of Trump’s Police State,” Mother Jones, February 6, 2025, https://www.motherjones.com/politics/2025/02/palantir-alex-karp-trump-private-prisons-profiteers.
Karp isn’t alone. Since the Biden administration’s shift toward the securitization of AI in 2024, companies that have historically distanced themselves from the military have also doubled down on national security. After amending its permissible-use policy to allow its tools to be used by militaries,17 Sam Biddle, “OpenAI Quietly Deletes Ban on Using ChatGPT for ‘Military and Warfare’,” Intercept, January 12, 2024, https://theintercept.com/2024/01/12/open-ai-military-ban-chatgpt. OpenAI has increasingly leaned into making policy arguments on security grounds,18 OpenAI, “OpenAI’s Approach to AI and National Security,” October 24, 2024, https://openai.com/global-affairs/openais-approach-to-ai-and-national-security. going so far as to assert that expanding fair use under copyright law to include AI development is a security imperative.19 Emma Roth, “OpenAI and Google Ask the Government to Let Them Train AI on Content They Don’t Own,” The Verge, March 14, 2025, https://www.theverge.com/news/630079/openai-google-copyright-fair-use-exception. In February 2025, Google amended its guidelines to allow its AI technologies to be used for military weapons and surveillance, despite ongoing protests by its employees and a long-standing ban on the use of its technology for weapons that followed the Project Maven protests of 2018.20 Lucy Hooker and Chris Vallance, “Concern Over Google Ending Ban on AI Weapons,” BBC, February 5, 2025, https://www.bbc.com/news/articles/cy081nqx2zjo; Ina Fried, “Google’s Hassabis Explains Shift on Military Use of AI,” Axios, February 14, 2025, https://www.axios.com/2025/02/14/google-hassabis-ai-military-use; Scott Shane and Daisuke Wakabayashi, “‘The Business of War’: Google Employees Protest Work for the Pentagon,” New York Times, April 4, 2018, https://www.nytimes.com/2018/04/04/technology/google-letter-ceo-pentagon-project.html. And in November 2024, Meta announced that it would make its Llama models available to the US government for national security use.21 Patrick Moorhead, “Meta Extends Llama Support to U.S. Government For National Security,” Forbes, November 4, 2024, https://www.forbes.com/sites/patrickmoorhead/2024/11/04/meta-extends-llama-support-to-us-government-for-national-security.
Meanwhile, Anthropic’s CEO Dario Amodei recently wrote about the threat of authoritarian governments establishing military dominance in AI as a reason to accelerate US leadership.22 Anthropic, “Statement from Dario Amodei on the Paris AI Action Summit,” February 11, 2025, https://www.anthropic.com/news/paris-ai-summit. And the VC firm Andreessen Horowitz operates an “American Dynamism” practice expressly designed to support the national interest in strategically important sectors: aerospace, defense, public safety, education, housing, supply chain, industrials, and manufacturing.23 Andreessen Horowitz, “What We Believe,” accessed April 24, 2024, https://a16z.com/american-dynamism.

A Double-Edged Sword: Chip Diffusion and “Sovereign AI”

It’s worth noting that the AI arms race 2.0 has shifted from being an absolute policy advantage for the tech industry writ large to being a double-edged sword for some: Aggressive restrictions on the export of chips are closing off a huge market for US AI hardware companies and data center products, which has left firms like Nvidia and Oracle deeply unhappy.24 Ken Glueck, “Export Control Diffusion Confusion,” Oracle (blog), January 5, 2025, https://www.oracle.com/news/announcement/blog/export-control-diffusion-confusion-2025-01-05. During the Biden administration, the implementation of export controls restricting the sale of semiconductors to certain countries through the “diffusion framework” received the bulk of the criticism, with a number of firms invested in the global chip market particularly up in arms about the impact on their businesses.25 Information Technology & Innovation Foundation (ITIF), “AI Diffusion Rule Threatens US Leadership, Warns ITIF,” January 13, 2025, https://itif.org/publications/2025/01/13/ai-diffusion-rule-threatens-us-leadership-warns-itif. The Trump administration may make changes to the diffusion rule,26 Karen Freifeld, “Exclusive: Trump Officials Eye Changes to Biden’s AI Chip Export Rule, Sources Say,” Reuters, April 29, 2025, https://www.reuters.com/world/china/trump-officials-eye-changes-bidens-ai-chip-export-rule-sources-say-2025-04-29. and is internally fragmented between factions that are supportive of tariffs and hawkish toward China, and those that are interested in the global expansion of the AI market.27 Sydney J. Freedberg Jr., “White House Tries to Tighten AI Export Controls Amidst Industry Outrage,” Breaking Defense, January 14, 2025, https://breakingdefense.com/2025/01/white-house-tries-to-tighten-ai-export-controls-amidst-industry-outrage.
For its part, Nvidia—the leading semiconductor firm, which is most directly affected by the export controls—has embarked on a push for “sovereign AI,” a term coined by the company to refer to nations’ abilities to produce their own AI using some combination of homegrown infrastructures, data, workforces, and business networks.28 Angie Lee, “What Is Sovereign AI?” NVIDIA (blog), February 28, 2024, https://blogs.nvidia.com/blog/what-is-sovereign-ai.
Nvidia’s stance is an example of a play at market expansion. As the provider of computing chips for the data center infrastructures central to sovereignty initiatives, the company stands to benefit from nation-states’ growing interest in building out their own homegrown industries and attracting AI investment. For chip manufacturers, the push toward sovereign AI can be seen as a way of diversifying their customer base away from the hyperscalers and hedging their business against a potential slump in demand from these companies.29 Frederike Kaltheuner, Leevi Saari, and AI Now Institute, “German Election, China’s AI Rise, and the MAGAfication of Big Tech,” EU AI Industrial Policy Monitor (blog), January 14, 2025, https://euaipolicymonitor.substack.com/i/154749386/nvidia-parachutes.
The European Union and its member states have also espoused interest in sovereign investment in AI in a bid to compete at the frontier. The European Commission has gradually repurposed its existing European high-performance supercomputing capacity toward training large-scale AI models.30 In 2024, these supercomputers, combined with a supplementary ecosystem of data processing and talent, were rebranded as “AI Factories” to work as the key nodes of European AI development. At least thirteen such factories are expected to be operational by 2026. European Commission, “AI Factories,” accessed May 1, 2025, https://digital-strategy.ec.europa.eu/en/policies/ai-factories. To further up the ante, the Commission announced a €20 billion InvestAI initiative to establish European “gigafactories” that would house one hundred thousand GPUs, with the objective of facilitating the training of models with “hundreds of trillions” of parameters.31 European Commission, “AI Continent Action Plan,” last updated May 7, 2025, https://digital-strategy.ec.europa.eu/en/factpages/ai-continent-action-plan. Investment has also picked up in the member states. In February 2025, France hosted the Paris AI Action Summit, during which President Emmanuel Macron announced around €110 billion in investment pledges to boost France’s AI sector, with a focus on infrastructure investments.32 AI Now Institute, Frederike Kaltheuner, and Leevi Saari, “The Week When Decades Happened,” EU AI Industrial Policy Monitor (blog), February 21, 2025, https://euaipolicymonitor.substack.com/i/155921632/investments-it-is-about-the-money-stupid. 33 “Details of 110 Billion Euros in Investment Pledges at France’s AI Summit,” Reuters, February 10, 2025, https://www.reuters.com/technology/artificial-intelligence/details-110-billion-euros-investment-pledges-frances-ai-summit-2025-02-10. In Germany, the new government coalition has agreed to house at least one of the gigafactories, complemented by a commitment to develop a sovereign tech stack, as well as support for a budding “Eurostack” movement, an informal coalition34 “EuroStack – Why, What and How,” EuroStack, accessed April 24, 2025, https://euro-stack.eu. at the European level that aims to reduce European tech dependencies by developing domestic alternatives.35 Max Helleberg, “The Role of Artificial Intelligence in the 21st Legislative Period: An Evaluation of the Coalition Agreement,” Noerr, April 11, 2025, https://www.noerr.com/en/insights/the-role-of-artificial-intelligence-in-the-21st-legislative-period-an-evaluation-of-the-coalition-agreement.
These investments at the level of the EU and its member states still pale in comparison to the scale of private investment plans in the US, such as the $500 billion joint venture fund Stargate announced in January 2025, which arguably cements monopoly dominance by a cartel of US-based firms.36 Madhavi Singh, “Stargate or StarGatekeepers? Why This Joint Venture Deserves Scrutiny,” Berkeley Technology Law Journal 41 (forthcoming), https://dx.doi.org/10.2139/ssrn.5184657. Meanwhile, the UAE and Saudi Arabia are geopolitical swing states, given the financial capital they can deploy to sustain infrastructural build-out, and have been flooding the market with money via the AI funds MGX, G42, and the Saudi Public Investment Fund (PIF),37 See Kate Rooney and Kevin Schmidt, “Middle Eastern Funds Are Plowing Billions of Dollars Into Hottest AI Startups,” CNBC, September 22, 2024, https://www.cnbc.com/2024/09/22/middle-eastern-funds-plowing-billions-into-the-hottest-ai-start-ups-.htm; and Adam Satariano and Paul Mozur, “‘To the Future’: Saudi Arabia Spends Big to Become an A.I. Superpower,” New York Times, April 26, 2024, https://www.nytimes.com/2024/04/25/technology/saudi-arabia-ai.html. money that the leaders of AI firms are avidly seeking.38 Jason Karaian, “Elon Musk, Sam Altman and Other C.E.O.s Join Trump at U.S.-Saudi Lunch,” New York Times, May 13, 2025, https://www.nytimes.com/2025/05/13/us/politics/trump-saudi-business-lunch-musk-altman.html.
Nationalism thus remains a critical shaping force in AI policymaking: The “AI arms race” has, if anything, become increasingly complex in a moment of geopolitical uncertainty, and is wielded by firms both to avert regulation and to court investment.