
Trillion Parameter Consortium: global scientists join forces for AI breakthroughs

Image: DALL-E 3 prompted by THE DECODER

Key Points

  • The Trillion Parameter Consortium (TPC) is a global consortium of academics, research institutes, and industry that has come together to advance AI models for scientific discovery, in particular trillion parameter models.
  • The TPC focuses on challenges such as scalable model architectures, training strategies, organization and curation of scientific data, optimization of AI libraries for exascale computing platforms, and development of evaluation platforms.
  • The consortium aims to create an open community of researchers who will initiate joint projects to avoid duplication of effort, share methods, approaches, tools, insights, and workflows, and build a global network of resources, data, and expertise.

The Trillion Parameter Consortium aims to train massive, interdisciplinary AI models for science. Founding members include leading research institutions, national supercomputing centers, and companies.

A global consortium of scientists from federal laboratories, research institutes, academia, and industry has come together to advance AI models for scientific discovery - with a particular focus on giant models of one trillion or more parameters.

The Trillion Parameter Consortium (TPC) has identified several key challenges: developing scalable model architectures and training strategies, organizing and curating scientific data for model training, optimizing AI libraries for current and future exascale computing platforms, and building deep evaluation platforms.

The TPC aims to create an open community of researchers, often already working in small groups, to develop large-scale generative AI models for scientific and engineering problems. In particular, joint projects will be initiated to avoid duplication of effort and to share methods, approaches, tools, knowledge, and workflows. In this way, the consortium hopes to maximize the impact of the projects on the wider AI and scientific community.


TPC bets on new exascale computing platforms

But it is not just about individual groups working together: The TPC also aims to create a global network of resources, data and expertise. Since its inception, the consortium has established a number of working groups to address the complexities of building large-scale AI models.

These working groups will lead initiatives to exploit emerging exascale computing platforms for training LLMs (Large Language Models) or alternative model architectures for scientific research. Models with a trillion or more parameters sit at the upper limit of today's AI systems; only the largest commercial models, such as GPT-4, currently reach this scale.

The exascale computing resources required for training will be provided by several US Department of Energy (DOE) national laboratories and several TPC founding partners in Japan, Europe, and other countries. Even with these resources, the training will take several months.

New AI models should be able to work across disciplines

"At our laboratory and at a growing number of partner institutions around the world, teams are beginning to develop frontier AI models for scientific use and are preparing enormous collections of previously untapped scientific data for training," said Rick Stevens, associate laboratory director of computing, environment and life sciences at DOE's Argonne National Laboratory and professor of computer science at the University of Chicago.


"We collaboratively created TPC to accelerate these initiatives and to rapidly create the knowledge and tools necessary for creating AI models with the ability to not only answer domain-specific questions but to synthesize knowledge across scientific disciplines."

The list of founding partners includes many leading AI research institutions, companies and hardware manufacturers.


Source: anl.gov | tpc.dev