OpenAI’s latest deal to buy chips from AMD signals that the company is betting demand for generative artificial intelligence will keep rising, even as some observers caution that the market may be overheated.
The company said on Monday that it will purchase several data centers’ worth of AMD hardware in a deal that could ultimately give OpenAI roughly a 10 percent stake in the chipmaker. The arrangement is being billed as a major supply pact that links compute purchases to an equity option.
“Excited to partner with AMD to use their chips to serve our users!” OpenAI CEO Sam Altman wrote on X, adding that the firm will ramp up its investments in Nvidia chips as well. “The world needs much more compute …” he wrote.
In a blog post, OpenAI committed to buying 6 gigawatts’ worth of AMD chips over the coming years. The first deployment will deliver a gigawatt of AMD Instinct MI450 GPUs in the second half of 2026, according to the company. Under terms outlined by AMD, OpenAI has the right to purchase up to 160 million shares of AMD common stock, a stake of about 10 percent; OpenAI may acquire those shares as it brings the chips online.
“Our view is that we’re nowhere near the top of the demand curve as every major enterprise, cloud provider, and sovereign initiative is scaling AI infrastructure in parallel,” says Forrest Norrod, an executive vice president at AMD. “We see this as the foundation of a long-term growth cycle for AI infrastructure, not a short-term surge.”
That endorsement from AMD executives matters because the agreement gives the chipmaker a clearer path to challenge Nvidia for business tied to training and running large AI models. “This is a breakthrough achievement for AMD,” Patrick Moorhead, a prominent chip industry analyst, wrote on X in response to the announcement. “It will be a strategic supplier for the leading AI company and this will attract even more tier 1 customers.”
The pact is one piece of a much larger wave of investments in compute and data centers. Shortly after his inauguration in January, US President Donald Trump announced that OpenAI, SoftBank, and Oracle would invest $100 billion to build AI data centers on US soil, and that the participants could pour up to $500 billion into AI infrastructure over time.
A central technical case for this spending is the expectation that model quality improves as systems are trained with more compute and more data. “There’s this basic story that scale is the thing, and that’s been true for some time, and it might still be true,” Jonathan Koomey, a visiting professor at UC Berkeley who studies computing and data center efficiency, said in September while discussing an OpenAI-Oracle plan to create three new data center sites. “That’s the bet that many of the US AI companies are making.”
The scale argument has drawn scrutiny from observers tracking the money flows. Derek Thompson, a prominent economics reporter, pointed out in a recent post that the tech industry may spend roughly $400 billion on AI infrastructure this year, compared with about $12 billion a year in US consumer demand for AI services, according to one study cited in his note.
OpenAI’s arrangement with AMD builds on a prior working relationship. At AMD’s Advancing AI event in Silicon Valley in June, Altman joined AMD CEO Lisa Su onstage. Su said the company had been collecting customer feedback over several years as it designed the MI400 family of accelerators, and that OpenAI was among its marquee customers.
Onstage, Altman warned that the sector’s shift toward reasoning-style models has put new pressure on model developers to run their systems efficiently and to support much longer contexts, and he described OpenAI’s hardware needs bluntly: “tons of compute, tons of memory, and tons of CPUs,” in addition to the Nvidia GPUs that much of the generative AI stack depends on. Su called Altman a “great friend” and an “icon in AI.”
Analysts and executives see the OpenAI-AMD pact as part of a competitive scramble among cloud providers, chip designers, and model developers to secure the compute that future systems will run on. The deal links a large, guaranteed purchase of processors to a potential equity position, a structure that both locks in demand for AMD and gives OpenAI another lever for securing capacity as it grows its services and research efforts.