IBM has released version 3.1 of its open-source Granite LLMs, bringing several notable upgrades. The models were trained on roughly 12 trillion tokens spanning 12 natural languages and 116 programming languages, and a redesigned dense architecture extends the context window to 128,000 tokens. According to IBM, the Apache 2.0-licensed models are well suited to answering questions with external data (retrieval-augmented generation, or RAG), extracting information from unstructured text, and summarizing documents. Developers can access the models through Hugging Face. IBM first introduced the Granite family in May 2024.
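For readers who want to try the models, here is a minimal sketch of querying a Granite 3.1 instruct model through Hugging Face's transformers library. The repository id below is an assumption based on IBM's naming scheme and may differ from the actual listing; the `generate` helper is illustrative, not an official API.

```python
# Minimal sketch of prompting a Granite 3.1 model via Hugging Face transformers.
# MODEL_ID is an assumed repository name, not confirmed by the article.

MODEL_ID = "ibm-granite/granite-3.1-8b-instruct"  # assumed repository id

def build_chat(question: str) -> list[dict]:
    # Wrap a user question in the chat-message format that the
    # text-generation pipeline (and apply_chat_template) expects.
    return [{"role": "user", "content": question}]

def generate(question: str, model_id: str = MODEL_ID) -> str:
    # Import lazily so build_chat works even without transformers installed;
    # downloading the weights requires a Hugging Face-capable environment.
    from transformers import pipeline

    pipe = pipeline("text-generation", model=model_id, device_map="auto")
    result = pipe(build_chat(question), max_new_tokens=200)
    # The pipeline returns the full chat history; the last message is the reply.
    return result[0]["generated_text"][-1]["content"]

if __name__ == "__main__":
    print(generate("Summarize the key terms of the Apache 2.0 license."))
```

Note that the 8B variant still needs a GPU with sufficient memory; smaller Granite variants can be swapped in via `model_id`.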

Online journalist Matthias is the co-founder and publisher of THE DECODER. He believes that artificial intelligence will fundamentally change the relationship between humans and computers.