IBM has rolled out version 3.1 of its open-source Granite LLMs, bringing major improvements under the hood. The new models were trained on a dataset spanning 12 natural languages and 116 programming languages, totaling 12 trillion tokens. The latest version features a redesigned dense architecture with a context window of up to 128,000 tokens. According to IBM, these Apache 2.0-licensed models excel at tasks such as retrieval-augmented generation (RAG), extracting information from unstructured text, and summarizing documents. Developers can access the models through Hugging Face. IBM first introduced Granite in May 2024.