Mixtral: French startup Mistral releases what is essentially a small GPT-4
French startup Mistral AI has released its new language model, Mixtral 8x7B, via a torrent link. Mixtral is a mixture-of-experts (MoE) model, the architecture OpenAI is rumored to use for GPT-4, though presumably at a much larger scale there.
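Mixtral's weights shipped without documentation, but the general shape of a sparse MoE layer is well established: a small router network scores each token, only the top-k experts actually run on it, and their outputs are combined using the routing weights. The following is a minimal, illustrative PyTorch sketch with assumed dimensions and top-2 routing over eight experts (suggested by the "8x7B" name); it is not Mistral's actual implementation.

```python
# Minimal sketch of a sparse mixture-of-experts (MoE) feed-forward layer,
# the building block MoE transformers use in place of the dense FFN in each
# block. All dimensions and the top-2 routing here are illustrative
# assumptions, not Mistral's actual code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoE(nn.Module):
    def __init__(self, dim=512, hidden=2048, num_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        # One small feed-forward network per expert.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, hidden), nn.SiLU(), nn.Linear(hidden, dim))
            for _ in range(num_experts)
        )
        # Router: scores each token against every expert.
        self.router = nn.Linear(dim, num_experts, bias=False)

    def forward(self, x):  # x: (tokens, dim)
        scores = self.router(x)                          # (tokens, num_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)   # top-k experts per token
        weights = F.softmax(weights, dim=-1)             # normalize over chosen experts
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e                    # tokens routed to expert e in slot k
                if mask.any():
                    out[mask] += weights[mask, k, None] * expert(x[mask])
        return out

tokens = torch.randn(4, 512)
print(SparseMoE()(tokens).shape)  # torch.Size([4, 512])
```

The appeal of the design: because only k experts run per token, compute per token stays close to that of a single dense FFN, while the total parameter count scales with the number of experts.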
There are no benchmarks, blog posts, or articles about the model yet, but Mistral 7B, the company's first model, generally performed very well and was quickly adopted by the open-source community. Mistral is thought to have used the MegaBlocks MoE library for training. The Paris-based company was recently valued at nearly $2 billion.
