OpenAI is leaving few stones unturned in the race to build compute capacity for its AI efforts.
The ChatGPT maker on Wednesday said it had struck agreements with two of the world’s biggest memory chip manufacturers, Samsung Electronics and SK Hynix, to make DRAM wafers for the Stargate AI infrastructure project and to build data centers in South Korea.
The companies signed the letters of intent following a meeting in Seoul between OpenAI CEO Sam Altman, South Korea’s president Lee Jae-myung, Samsung Electronics’ executive chairman Jay Y. Lee, and SK chairman Chey Tae-won.
Under the deal, Samsung and SK Hynix plan to scale their manufacturing to produce up to 900,000 high-bandwidth DRAM wafers per month for use in Stargate and AI data centers. SK Group noted in a separate statement that this would be more than double the current industry capacity for high-bandwidth memory chips.
Stargate is a massive infrastructure project by OpenAI, Oracle, and SoftBank that seeks to spend $500 billion to build data centers dedicated to AI development in the United States.
Wednesday’s agreements follow a month of frenetic investment in AI compute capacity, and OpenAI has been the locus of a lot of that activity. Just a couple of weeks ago, Nvidia said it would invest up to $100 billion in OpenAI as part of a deal that would give the ChatGPT maker access to more than 10 gigawatts of compute capacity via Nvidia’s AI training systems. The following day, OpenAI said it would build out five data centers with SoftBank and Oracle for the Stargate project, aiming to increase its total compute capacity to 7 gigawatts.
Earlier in September, Oracle agreed to sell $300 billion of compute capacity to OpenAI over five years.
OpenAI said it is also working with the Korean Ministry of Science and ICT to find opportunities to build AI data centers outside Seoul, and that it had struck a separate deal with SK Telecom to build an AI data center. The AI company also signed a few other agreements with Samsung subsidiaries to explore avenues for building more data centers in the country.
Samsung and SK Group will also integrate ChatGPT Enterprise and OpenAI APIs into their operations as part of the deal.
