French startup Mistral AI has released its new language model Mixtral 8x7B via a torrent link. Mixtral is a mixture-of-experts model, following the architecture OpenAI is rumored to use for GPT-4, though GPT-4 is believed to operate at a much larger scale.

There are no benchmarks, blog posts, or articles about the model yet, but Mistral-7B, the company's first model, generally performed very well and was quickly adopted by the open-source community. Mistral is thought to have used the MegaBlocks MoE library for training. The Paris-based company was recently valued at nearly $2 billion.
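
To illustrate the mixture-of-experts idea behind the model's name: instead of one large feed-forward block, the layer holds several smaller "expert" blocks and a router that sends each token to only a few of them. The sketch below is purely illustrative; the dimensions, the number of experts, and the top-2 routing are assumptions for demonstration, not confirmed details of Mixtral 8x7B.

```python
# Minimal, illustrative mixture-of-experts feed-forward layer (PyTorch).
# All sizes and the top-2 routing are assumptions, not Mixtral's actual design.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoEFeedForward(nn.Module):
    def __init__(self, d_model=512, d_ff=2048, num_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        # The router scores each token against every expert.
        self.router = nn.Linear(d_model, num_experts, bias=False)
        # Each expert is an ordinary feed-forward block.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.SiLU(), nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        )

    def forward(self, x):  # x: (batch, seq, d_model)
        scores = self.router(x)                           # (batch, seq, num_experts)
        weights, indices = scores.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)              # renormalize over the chosen experts
        out = torch.zeros_like(x)
        # Only the selected experts run for each token; the rest stay idle,
        # which keeps per-token compute well below the total parameter count.
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = indices[..., slot] == e            # tokens routed to expert e in this slot
                if mask.any():
                    out[mask] += weights[..., slot][mask].unsqueeze(-1) * expert(x[mask])
        return out

x = torch.randn(2, 16, 512)
print(MoEFeedForward()(x).shape)  # torch.Size([2, 16, 512])
```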

Image: Twitter.com

With Generative Power of Tens, researchers demonstrate a method enabling "extreme semantic zooms" from wide-angle views to macro shots. Unlike traditional super-resolution methods, the team from the University of Washington, Google Research, and UC Berkeley uses a text prompt for each scale, enabling deeper zoom levels. Compared to traditional outpainting techniques, the approach produces a coherent zoom in which the content at coarser and finer zoom levels remains consistent.

Video: Wang et al.
