Meta has established multiple emergency response teams after Chinese AI company Deepseek demonstrated AI models that are both more efficient and significantly cheaper to operate than Western alternatives.
According to The Information, Meta's AI division has entered crisis mode as company executives worry their upcoming Llama model won't match Chinese competitor Deepseek's capabilities.
Meta has set up four specialized "war rooms" to analyze and respond to Deepseek's technology. Two teams are studying how to replicate Deepseek's cost-effective training and operational methods, while a third investigates what training data the Chinese startup might be using. The fourth team is exploring how to restructure Meta's models to match Deepseek's efficiency.
Meta's internal scramble was first revealed through an employee leak on the anonymous platform "Blind," where a staffer described panic within Meta's AI department. Meta spokesperson Jon Carvill downplayed these concerns, telling The Information that regularly evaluating competing AI models is standard practice.
Deepseek challenges Western AI dominance
What sets Deepseek apart is its aggressive pricing: the company's cloud API costs a fraction of comparable OpenAI services, with prices 17 to 27 times lower. On top of that, the strong performance of R1, Deepseek's first "reasoning" model and a counterpart to OpenAI's o1, has propelled the company's app to the top of the iPhone charts, surpassing even the widely popular ChatGPT.
The company's success is already rippling through US markets. Nvidia and other AI-related chip stocks fell several percent overnight as investors digested Deepseek's ability to run powerful AI models on fewer chips.
While not directly addressing Deepseek, Microsoft CEO Satya Nadella weighed in on the situation through the lens of the Jevons Paradox, the economic principle that greater efficiency in using a resource often increases, rather than reduces, total demand for it. Jevons originally observed the effect in the 19th century, when more efficient steam engines drove up Britain's overall coal consumption.
Nadella applied this concept to AI, stating, "As AI gets more efficient and accessible, we will see its use skyrocket, turning it into a commodity we just can't get enough of."
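To make the dynamic Nadella describes concrete, here is a minimal sketch with invented numbers (none of the prices or usage figures below come from any real vendor): if the cost per unit of AI usage falls sharply while usage grows even faster, total spending on the resource rises rather than falls.

```python
# Hypothetical illustration of the Jevons Paradox applied to AI usage.
# All figures are invented for the example; they are not real vendor prices.

old_cost_per_million_tokens = 10.00  # dollars, before an efficiency gain
new_cost_per_million_tokens = 0.50   # dollars, after a 20x efficiency gain

old_monthly_usage = 100     # million tokens consumed per month
new_monthly_usage = 5_000   # usage grows 50x as cheaper AI finds new applications

old_spend = old_cost_per_million_tokens * old_monthly_usage  # $1,000
new_spend = new_cost_per_million_tokens * new_monthly_usage  # $2,500

print(f"Unit cost fell {old_cost_per_million_tokens / new_cost_per_million_tokens:.0f}x, "
      f"but total spend rose from ${old_spend:,.0f} to ${new_spend:,.0f}.")
```

In this toy scenario the unit cost drops twentyfold, yet overall spending more than doubles, which is the sense in which cheaper, more efficient AI could expand rather than shrink the market.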