Meta has released MMCSG (Multi-Modal Conversations in Smart Glasses), a dataset of two-sided conversations recorded with Aria smart glasses. Each recording combines multi-channel audio from the glasses' seven microphones with video and inertial measurement unit (IMU) data from the built-in accelerometer and gyroscope. The dataset is aimed at supporting research in areas such as automatic speech recognition, activity detection, and speaker diarization, and could also enable applications like real-time language translation. All data was collected from consenting participants and has been anonymized to protect privacy. More details are available in the accompanying research paper, and the dataset can be downloaded under Meta's Data License.
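
For a sense of what working with such recordings looks like, here is a minimal sketch of loading one session, assuming a hypothetical layout of a multi-channel WAV file plus a CSV of IMU samples; the actual MMCSG file format, paths, and field order may differ.

```python
# Hypothetical loader sketch for an MMCSG-style session.
# File names and the IMU column order are assumptions, not the official layout.
import numpy as np
import soundfile as sf


def load_session(audio_path: str, imu_path: str):
    # Multi-channel audio: array of shape (num_samples, num_channels);
    # the Aria glasses record from seven microphones.
    audio, sample_rate = sf.read(audio_path)
    # IMU stream: columns assumed to be timestamp, accel x/y/z, gyro x/y/z.
    imu = np.loadtxt(imu_path, delimiter=",", skiprows=1)
    return audio, sample_rate, imu


if __name__ == "__main__":
    audio, sr, imu = load_session("session_000/audio.wav", "session_000/imu.csv")
    print(f"{audio.shape[1]} audio channels, {audio.shape[0] / sr:.1f} s")
    print(f"{imu.shape[0]} IMU samples")
```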
