
Paris-based AI startup Mistral has released Mixtral-8x22B, a new open language model, via a torrent link; an official announcement with more details is to follow. According to early users, the model offers a 64,000-token context window and requires 258 gigabytes of VRAM. Like Mixtral-8x7B, the new model uses a mixture-of-experts (MoE) architecture. You can try the new model at Together AI.
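In a mixture-of-experts model, each token is processed by only a small subset of the model's expert networks, selected per token by a learned router. The sketch below illustrates top-k routing in general terms; all names, shapes, and the toy experts are illustrative assumptions, not Mistral's actual implementation.

```python
import numpy as np

def moe_forward(x, gate_w, experts, top_k=2):
    """Route one token vector through a top-k mixture-of-experts layer.

    x: input vector; gate_w: (num_experts, dim) router weights;
    experts: list of callables, one per expert network.
    Illustrative sketch only -- not Mixtral's real code.
    """
    logits = gate_w @ x                   # router score for each expert
    top = np.argsort(logits)[-top_k:]     # indices of the k highest-scoring experts
    weights = np.exp(logits[top])
    weights /= weights.sum()              # softmax over the selected experts only
    # Only the chosen experts run per token, which is why an MoE model
    # can be cheaper per token than a dense model of the same total size.
    return sum(w * experts[i](x) for w, i in zip(weights, top))

# Tiny demo: four "experts" that just scale the input.
rng = np.random.default_rng(0)
experts = [lambda v, s=s: s * v for s in (1.0, 2.0, 3.0, 4.0)]
gate_w = rng.normal(size=(4, 8))
x = rng.normal(size=8)
y = moe_forward(x, gate_w, experts, top_k=2)
```

Because the router picks only two of the four experts here, only those two run; Mixtral-8x7B applies the same idea with eight experts per layer, two active per token.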

Image: Together AI | X.com

IBM is partnering with German AI startup Aleph Alpha to promote the use of generative AI in the public and private sectors in Europe. Through the new partnership, IBM and Aleph Alpha aim to simplify the adoption of generative AI by businesses and government agencies in Germany and across Europe. Aleph Alpha, based in Heidelberg, offers AI solutions for businesses and public administration; its "Explain" function makes the output of its own language model Luminous, as well as other open-source models, more understandable. IBM contributes its AI and data platform watsonx, which lets companies develop AI applications based on curated data and models. According to the company, this is complemented by the long-standing consulting and hands-on AI experience of the IBM Consulting division.
