
Mixtral: French start-up Mistral releases what is essentially a small GPT-4

French startup Mistral AI has released its new language model Mixtral 8x7B via a torrent link. Mixtral is a mixture-of-experts (MoE) model, following an architecture that OpenAI is rumored to use for GPT-4, though at a much smaller scale.
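Mistral has published no technical details yet, so the following is only a minimal sketch of the general mixture-of-experts idea in PyTorch, not Mixtral's actual implementation. The MoELayer name, the eight experts, and the top-2 routing are illustrative assumptions, the latter two suggested by the "8x7B" name.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    """Sketch of a sparse mixture-of-experts layer (illustrative, not Mixtral's code).

    A learned router scores each token against every expert, keeps the
    top-k experts per token, and mixes their outputs by the renormalized
    gate weights.
    """

    def __init__(self, dim: int, num_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        self.gate = nn.Linear(dim, num_experts, bias=False)  # the router
        # Each expert is an ordinary feed-forward block (assumed shape).
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, dim). Score each token against every expert.
        scores = self.gate(x)                           # (tokens, num_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)  # keep the top-k experts per token
        weights = F.softmax(weights, dim=-1)            # renormalize over the k picks
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e in range(len(self.experts)):
                mask = idx[:, k] == e                   # tokens routed to expert e in slot k
                if mask.any():
                    out[mask] += weights[mask, k, None] * self.experts[e](x[mask])
        return out
```

The appeal of the design: only top_k of num_experts expert blocks run per token, so a model can carry the parameter count of all experts combined while spending the compute of just a fraction of them on each token.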

There are no benchmarks, blog posts, or articles about the model yet, but Mistral-7B, the startup's first model, performed very well and was quickly adopted by the open-source community. Mistral is thought to have used the MegaBlocks MoE library for training. The Paris-based company was recently valued at nearly $2 billion.
