
OpenAI CEO Sounds Alarm On China's Next-Gen AI Advances: "I Am Worried"

He admitted that China's progress, particularly with open-source models like DeepSeek and Kimi K2, influenced OpenAI's decision to release its open-weight models.

Mr Altman also questioned the effectiveness of US export controls on semiconductors.
  • Sam Altman warned that the US may underestimate China’s AI advancements in multiple areas
  • China’s open-source AI models influenced OpenAI to release its gpt-oss-120b and gpt-oss-20b models
  • The gpt-oss-120b model runs on a single 80GB GPU and rivals OpenAI’s o4-mini in performance benchmarks

Sam Altman, CEO of OpenAI, has expressed concerns that the United States may be underestimating China's advancements in next-generation artificial intelligence. In a recent media briefing, he highlighted the complexity of the US-China AI race, suggesting it's not just about who's ahead but involves multiple layers like inference capacity, research, and product development. 

"I'm worried about China," he said.

"There's inference capacity, where China probably can build faster. There's research, there's product; a lot of layers to the whole thing. I don't think it'll be as simple as: Is the U.S. or China ahead?" he added, as reported by CNBC.

Mr Altman also admitted that China's progress, particularly with open-source models like DeepSeek and Kimi K2, influenced OpenAI's decision to release its open-weight models, gpt-oss-120b and gpt-oss-20b. "It was clear that if we didn't do it, the world was gonna be mostly built on Chinese open source models. That was a factor in our decision, for sure. Wasn't the only one, but that loomed large," the CEO revealed.

Notably, these text-only models are designed to be lower-cost options, allowing developers, researchers, and companies to download, run locally, and customise them. The larger model, gpt-oss-120b, has 117 billion parameters and can run on a single 80GB GPU, matching or exceeding the performance of OpenAI's o4-mini model on key benchmarks. The smaller model, gpt-oss-20b, has 21 billion parameters and can operate on devices with as little as 16GB of RAM, making it accessible for developers with limited hardware resources. 
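The hardware figures above follow from simple arithmetic on parameter count and numeric precision. A rough sketch, assuming roughly 4 bits per parameter (OpenAI has described the gpt-oss weights as quantized; the exact per-parameter footprint here is an illustrative assumption, and real-world usage adds KV-cache and activation overhead on top of the weights):

```python
# Back-of-envelope estimate of the memory needed just to hold a model's
# weights at a given precision. Illustrative only: inference also needs
# memory for the KV cache and activations.

def weight_memory_gb(params_billion: float, bits_per_param: float) -> float:
    """Approximate weight footprint in GB (10^9 bytes)."""
    return params_billion * 1e9 * bits_per_param / 8 / 1e9

# gpt-oss-120b: 117B parameters at ~4 bits/param -> ~58.5 GB,
# which is why the weights can fit on a single 80GB GPU.
print(weight_memory_gb(117, 4))

# The same model at 16-bit precision would need ~234 GB, i.e. several GPUs.
print(weight_memory_gb(117, 16))

# gpt-oss-20b: 21B parameters at ~4 bits/param -> ~10.5 GB,
# leaving headroom on a 16GB device.
print(weight_memory_gb(21, 4))
```

This is why quantization, not just parameter count, determines whether a model is runnable on commodity hardware.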

During the briefing, Mr Altman also questioned the effectiveness of US export controls on semiconductors, noting that China could find workarounds, such as building its own chip fabrication facilities.

"My instinct is that doesn't work. You can export-control one thing, but maybe not the right thing… maybe people build fabs or find other workarounds," he said. "I'd love an easy solution. But my instinct is: That's hard," he added. 

Mr Altman's comments come as the US government fine-tunes its approach to limiting China's advancements in AI. In response, China's tech giants are pivoting towards self-reliance, investing heavily in domestic semiconductor development. One notable example is Huawei's push into high-end AI chips, particularly the Ascend 910C. The chip is designed to match the performance of Nvidia's flagship H100 and is poised to fill the gap left by US export restrictions.

Industry experts warn that these export controls may ultimately harm US companies more than China, driving innovation in China's semiconductor sector while limiting US firms' access to the lucrative Chinese market. 
