
Trillion Parameter Consortium: global scientists join forces for AI breakthroughs

Maximilian Schreiner


The Trillion Parameter Consortium aims to train massive, interdisciplinary AI models for science. Founding members include leading research institutions, national supercomputing centers, and companies.

A global consortium of scientists from federal laboratories, research institutes, academia, and industry has come together to advance AI models for scientific discovery, with a particular focus on giant models with a trillion or more parameters.

The Trillion Parameter Consortium (TPC) has identified several core challenges: developing scalable model architectures and training strategies, organizing and curating scientific data for model training, optimizing AI libraries for current and future exascale computing platforms, and building platforms for in-depth model evaluation.

The TPC aims to create an open community of researchers, many of whom already work in small groups, to develop large-scale generative AI models for scientific and engineering problems. Joint projects will be initiated to avoid duplication of effort and to share methods, approaches, tools, knowledge, and workflows. In this way, the consortium hopes to maximize the projects' impact on the broader AI and scientific community.

TPC bets on new exascale computing platforms

But collaboration between individual groups is only part of the picture: the TPC also aims to create a global network of resources, data, and expertise. Since its inception, the consortium has established several working groups to address the complexities of building large-scale AI models.

These groups will lead initiatives to exploit emerging exascale computing platforms for training large language models (LLMs) or alternative model architectures for scientific research. Models with a trillion or more parameters represent the frontier of today's AI, and only the largest commercial AI systems, such as GPT-4, currently reach this scale.

The exascale computing resources required for training will be provided by multiple US Department of Energy (DOE) national laboratories and by TPC founding partners in Japan, Europe, and other countries. Even with these resources, training will take several months.

New AI models should be able to work across disciplines

“At our laboratory and at a growing number of partner institutions around the world, teams are beginning to develop frontier AI models for scientific use and are preparing enormous collections of previously untapped scientific data for training,” said Rick Stevens, associate laboratory director for computing, environment and life sciences at DOE’s Argonne National Laboratory and professor of computer science at the University of Chicago.

“We collaboratively created TPC to accelerate these initiatives and to rapidly build the knowledge and tools needed to create AI models that can not only answer domain-specific questions but also synthesize knowledge across scientific disciplines.”

The list of founding partners includes many leading AI research institutions, companies, and hardware manufacturers.
