Mixtral 8x22B: AI startup Mistral releases new open language model
Paris-based AI startup Mistral has released Mixtral 8x22B, a new open language model, via a torrent link. An official announcement with more details is to follow. According to early users, the model offers a 64,000-token context window and requires 258 gigabytes of VRAM to run. Like its predecessor, Mixtral 8x7B, the new model uses a mixture-of-experts (MoE) architecture. You can try out the new model at Together AI.
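For readers who want to try the model through Together AI, the platform exposes an OpenAI-compatible API. Below is a minimal sketch of one way to query it; the model identifier and the API key placeholder are assumptions not confirmed by the article, so check Together AI's model list for the exact name.

```python
# Minimal sketch: querying Mixtral 8x22B via Together AI's
# OpenAI-compatible endpoint, using the standard openai client.
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_TOGETHER_API_KEY",         # placeholder, not a real key
    base_url="https://api.together.xyz/v1",  # Together AI's OpenAI-compatible endpoint
)

response = client.chat.completions.create(
    model="mistralai/Mixtral-8x22B",         # assumed identifier for the new model
    messages=[
        {"role": "user", "content": "Summarize mixture-of-experts in one sentence."}
    ],
    max_tokens=128,
)
print(response.choices[0].message.content)
```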