
Google Cloud and Hugging Face join forces to advance open-source AI

Jonathan Kemper


Google Cloud will soon power more open-source models. Hugging Face is expanding its ties to Google's high-performance infrastructure, and Google is getting its foot in the door for open-source AI.

Google has announced a partnership with AI startup Hugging Face to advance open-source AI. Hugging Face users will be able to host their models on Google Cloud, which could ultimately benefit the open-source community.

"We will collaborate with Google across open science, open source, cloud, and hardware to enable companies to build their own AI with the latest open models from Hugging Face and the latest cloud and hardware features from Google Cloud," reads the official blog post.

Hundreds of thousands of Hugging Face users are already active on Google Cloud servers. The partnership is intended to make it easier to train and host new models within Google Kubernetes Engine and Vertex AI.
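The announcement does not spell out what the integration will look like in practice. As a rough illustration, the existing google-cloud-aiplatform Python SDK can already register a custom serving container and deploy it to a Vertex AI endpoint; the project ID, region, image URI, and names below are placeholders, not details from the partnership.

```python
# Minimal sketch: registering and deploying a custom serving container on
# Vertex AI with the google-cloud-aiplatform SDK. All identifiers are
# placeholders for illustration only.
from google.cloud import aiplatform

aiplatform.init(project="my-gcp-project", location="us-central1")

# Register a model backed by a serving container, e.g. one that wraps a
# Hugging Face text-generation model behind an HTTP prediction route.
model = aiplatform.Model.upload(
    display_name="open-llm-demo",
    serving_container_image_uri="us-docker.pkg.dev/my-gcp-project/serving/open-llm:latest",
    serving_container_ports=[8080],
)

# Deploy the registered model to a managed endpoint for online prediction.
endpoint = model.deploy(machine_type="n1-standard-8")
print(endpoint.resource_name)
```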

This collaboration is particularly significant for Google, as it signals a stronger commitment to open-source AI. Until now, the tech giant has relied exclusively on proprietary models, both in its own projects and through its investment in Anthropic, which is now under scrutiny by the FTC.

Last year, a Google engineer expressed concern that the company was lagging behind open-source developers in AI advancements.

Expanding Google's open-source horizons

Founded in New York in 2016, Hugging Face has established itself as a cornerstone of the open-source AI community, providing an important platform for developers to share and advance AI models.

Beyond Google, the AI startup has built relationships with several other key players in the technology industry to expand and improve its infrastructure and services.

For example, Hugging Face uses Amazon servers equipped with Amazon's proprietary Trainium AI chips to power the development of its open-source language model BLOOM. A partnership with Nvidia allows Hugging Face users to leverage Nvidia's DGX Cloud AI infrastructure to train or fine-tune AI models.

These alliances are critical to scaling and distributing open-source alternatives to proprietary AI models. For example, Hugging Face's HuggingChat is designed to be an open-source counterpart to OpenAI's ChatGPT, but has yet to match its success. Modified versions of Meta's LLaMA language models also find a home on Hugging Face.
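As a concrete illustration of what hosting on Hugging Face looks like in practice, the sketch below pulls a small, openly licensed BLOOM variant from the Hugging Face Hub using the transformers library. The model ID is only an example; some repositories, such as Meta's LLaMA weights, are gated and require accepting a license before download.

```python
# Minimal sketch: loading an open model from the Hugging Face Hub with the
# transformers library. bigscience/bloom-560m is a small, openly available
# BLOOM variant used here purely as an example.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "bigscience/bloom-560m"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Generate a short continuation from a prompt.
inputs = tokenizer("Open-source AI is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```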