Former Google China head Kai-Fu Lee is pivoting his AI startup 01.AI to fully embrace DeepSeek's open-source models, describing them as an existential challenge to OpenAI's business model.
Lee's company has abandoned its previous strategy of training proprietary large language models and now relies entirely on DeepSeek's open-source offerings, he explains in an interview with the South China Morning Post.
According to Lee, DeepSeek's release triggered what he calls a "ChatGPT moment" in China, generating widespread enthusiasm for AI applications. The shift has prompted many Chinese hardware and software providers to align their services with DeepSeek models.
"It became imperative for us to … embrace [DeepSeek] as our primary bet," Lee says. The decision followed surging demand for Deepseek models from Chinese CEOs in late January.
Lee believes DeepSeek's free, open-source approach presents a fundamental challenge to OpenAI. "The biggest nightmare for Sam Altman is that his competitor is free," Lee says. "I've already met many people who have canceled their ChatGPT subscriptions because DeepSeek is free."
Lee's 200-employee startup 01.AI now plans to concentrate on customizing DeepSeek models for corporate clients in the financial, gaming, and legal sectors.
"A pre-trained model can only be justified by amassing hundreds of millions of users," Lee said. "So Alibaba can justify it, Google can justify it, DeepSeek can justify it and ByteDance can just justify it, but the rest of us can't."
Lee maintains that his company's technical expertise remains valuable despite the strategic shift. "How do you train it, how do you tune it, how do you do reinforcement learning and how do you do fast inference? This last part can only be done by companies with LLM capabilities," he explains.
Lee projects revenue of 100 million yuan ($13 million) for the first quarter of 2025, matching the company's entire 2024 revenue. Despite the growth, 01.AI has yet to achieve profitability.
OpenAI and Anthropic recently urged the US government to ban DeepSeek models, characterizing the Chinese startup as "state-controlled." Lee interprets this as a "sign of paranoia."
"They're watching the house of cards they built up starting to fall apart because someone has built an equally good house for free," Lee says.
Consolidation in progress
In a separate Bloomberg interview, Lee highlighted the ongoing consolidation of pre-trained models in both the US and China. He predicts that open-source approaches will ultimately prevail, while pre-training of large language models will remain limited to a few major companies.
Lee points to the stark contrast in operating costs between OpenAI and DeepSeek: while OpenAI reportedly spent $7 billion on operations in 2024, Lee claims DeepSeek requires only about two percent of that amount.
"So the issue really isn't whose model is 1% better? I think they're all very good. But the issue is is OpenAI's model even sustainable," Lee says.
Lee describes Deepseek as "infinitely lasting" because its founder has sufficient funding to maintain current operations and has reduced computing costs by a factor of 5 to 10. "So with that kind of formidable competitor, I think Sam Altman is probably not sleeping well," says Lee.