AI start-up Liquid AI has unveiled new foundation models that are not based on the usual Transformer architecture. According to the company, these Liquid Foundation Models (LFMs) outperform models of comparable size such as Meta's Llama 3 or Microsoft's Phi-3. Even the smallest model, LFM-1.3B, is said to beat Meta's new Llama 3.2-1.2B in many benchmarks.

The LFMs come in three sizes: 1.3 billion, 3 billion, and 40 billion parameters. They are said to be particularly memory-efficient and to handle various data types such as text, audio, and video. Liquid AI was founded by former MIT researchers; team members were previously involved in projects such as StripedHyena.

Liquid AI is planning a product launch event at MIT on 23 October 2024. The models are not open source, but you can try them out on Liquid's Playground.