AI startup Groq has raised $640 million in its latest funding round. The company develops specialized chips for AI applications such as language models.
Groq Inc., an AI chip and software startup founded in 2016 by Jonathan Ross, has raised $640 million in its latest funding round. The round was led by BlackRock Inc. funds and supported by the investment arms of Cisco Systems Inc. and Samsung Electronics Co. The company is now valued at $2.8 billion.
Ross, who previously worked on Google's TPU chips, is building specialized hardware at Groq to accelerate today's foundation models, particularly language models. The company has developed LPUs (Language Processing Units), chips optimized for running language models. These can reach speeds of hundreds of tokens per second even with larger models such as Meta's Llama 2 70B, which corresponds to hundreds of words per second. That speed makes the hardware attractive for real-time applications, although Nvidia's hardware now reaches comparable speeds, albeit at significantly higher prices.
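Groq makes these LPUs accessible through a hosted API with a Python client, so a rough way to gauge the quoted tokens-per-second figure is to time a chat completion yourself. The minimal sketch below assumes the `groq` package is installed, a valid GROQ_API_KEY is set in the environment, and that the referenced model ID is still offered; the wall-clock measurement also includes network latency and prompt processing, so it understates the raw generation speed.

```python
# Minimal sketch: time a chat completion against Groq's hosted API and
# estimate generation throughput in tokens per second.
# Assumptions: `pip install groq`, GROQ_API_KEY set in the environment,
# and the model ID below still being available (check Groq's model list).
import time

from groq import Groq

client = Groq()  # picks up GROQ_API_KEY from the environment

start = time.perf_counter()
response = client.chat.completions.create(
    model="llama3-70b-8192",  # assumed model ID, subject to change
    messages=[{"role": "user", "content": "Explain how a transistor works."}],
    max_tokens=512,
)
elapsed = time.perf_counter() - start

generated = response.usage.completion_tokens  # tokens produced by the model
print(f"{generated} tokens in {elapsed:.2f} s ≈ {generated / elapsed:.0f} tokens/s")
```

Reading the token count from the usage counters keeps the example short; streaming the response and timing only the generation phase would give a figure closer to the throughput numbers Groq advertises.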
Groq builds chips for inference, not training
According to Groq, LPUs are more energy-efficient than conventional graphics processors (GPUs) because they cut the overhead of managing many parallel threads and avoid leaving cores underutilized. Groq's chip design also lets multiple specialized cores be linked together without the bottlenecks that typically occur in GPU clusters.
The hardware could thus offer an alternative to Nvidia's chips, which are currently in high demand, but only for inference, i.e., running trained AI models. For training, companies still need conventional hardware from Nvidia or comparable chips. Meta, for example, plans to train Llama 4 with ten times the computing power used for Llama 3.