
Advances in AI language processing are making an old sci-fi dream seem within reach: a real-time universal translation system that removes all language barriers. Meta is working on it.

At an event about its current AI projects, Meta CEO Mark Zuckerberg announced plans to use artificial intelligence to develop a universal AI translation system for many languages. “AI will make that possible in our lifetime,” Zuckerberg said.

Advanced language AI also plays a role in Meta's metaverse plans: such models are intended to enable voice control and assist in the generation of virtual worlds in the future.

AI for speech: Meta is working on a real-time universal translator

Meta CEO Mark Zuckerberg describes the universal translation technology that his company has been researching for years as a “superpower that people have always dreamed of”.


“Eliminating language barriers would be profound, making it possible for billions of people to access information online in their native or preferred languages,” Meta’s AI researchers write, highlighting the importance of the project.

Meta’s ultimate goal is to translate speech directly into speech, without the intermediate transcription step that current systems usually require, and across hundreds of languages. Science fiction veterans know such a system as the “Babel fish”; Meta calls it the Universal Speech Translator.

In the context of VR and AR, AI translation tools should help people “to do everyday activities — hosting a book club or collaborating on a work project — with anyone, anywhere, just as they would with someone next door”. Zuckerberg sees the AI Universal Translator as a potential killer app for future AR glasses, which Meta is also researching.

AI translation and voice input for the Metaverse

Zuckerberg also emphasizes the importance of universal translation for large virtual worlds – his Metaverse vision – where even more people from diverse cultures are expected to meet than already do in reality. The No Language Left Behind initiative therefore aims to develop AI models that can learn to translate rare languages from only a small amount of training data.

Meta also wants to use speech AI to facilitate the creation of virtual worlds for the metaverse: Zuckerberg demonstrated a “builder bot” in a recorded demo that generates a virtual beach scene in response to voice commands.


Video: Meta, recording via The Verge

Meta’s Babel fish ambitions have been known for years: In the fall of 2020, the company unveiled its M2M-100 AI system, which can translate directly between any pair of 100 languages without relying on English as a so-called “bridge language”.
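Meta released M2M-100 as open source, so the direct translation described above can be tried out via the Hugging Face `transformers` library. A minimal sketch of French-to-German translation with the published `facebook/m2m100_418M` checkpoint (the example sentence is our own; no English intermediate step is involved):

```python
# Direct French -> German translation with Meta's open-sourced M2M-100.
# Requires: pip install transformers sentencepiece torch
from transformers import M2M100ForConditionalGeneration, M2M100Tokenizer

model = M2M100ForConditionalGeneration.from_pretrained("facebook/m2m100_418M")
tokenizer = M2M100Tokenizer.from_pretrained("facebook/m2m100_418M")

tokenizer.src_lang = "fr"  # tell the tokenizer the source language
encoded = tokenizer("La vie est comme une boîte de chocolat.", return_tensors="pt")

# Force the decoder to start generating in the target language (German).
generated = model.generate(**encoded, forced_bos_token_id=tokenizer.get_lang_id("de"))
print(tokenizer.batch_decode(generated, skip_special_tokens=True)[0])
```

Swapping `src_lang` and the `forced_bos_token_id` language code is all it takes to pick any other of the 100 supported language pairs.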

In 2021, Meta demonstrated the more advanced WMT2021 variant: for the first time, a multilingual AI model outperformed bilingually trained language models in translation quality. Google is also working on a Babel fish-style AI with Translatotron 2.


Online journalist Matthias is the co-founder and publisher of THE DECODER. He believes that artificial intelligence will fundamentally change the relationship between humans and computers.