
Elevenlabs is expanding its portfolio with 11ai, a voice-activated AI assistant designed to actively participate in digital workflows. The alpha version is meant to showcase what’s possible with voice-first technology and API integrations.


According to Elevenlabs, most voice assistants struggle to move beyond simple conversation and rarely take productive actions. 11ai aims to change that: users give spoken commands, and the system carries out tasks on their behalf.

Video: Elevenlabs

Voice-first productivity with direct tool integrations

Through a web interface at 11.ai/app/eleven, users can issue commands like "Plan my day and add my most important tasks to Linear," or "Use Perplexity to research our prospect meeting today and summarize their recent funding."


A promotional video shows 11ai available in every room during a morning routine, though Elevenlabs hasn’t specified what hardware powers the responses.

The system can handle sequential actions and understands context across different tools. For example, when researching a customer, 11ai searches connected systems, finds relevant data, and processes it through CRM updates or team messages.

Built-in integrations and custom servers

11ai’s functionality is built on the Model Context Protocol (MCP), an open standard for connecting AI assistants to external tools and data sources. Elevenlabs’ conversational AI platform supports MCP natively, connecting to services like Salesforce, HubSpot, Gmail, and Zapier.

At launch, Elevenlabs offers prebuilt integrations for Perplexity, Linear, Slack, HackerNews, and Google Calendar, with more planned to roll out weekly.

The underlying Elevenlabs conversational AI platform is designed for low latency in real-time conversations and supports multimodal voice and text interaction.


The platform also includes RAG functionality for accessing external knowledge bases and can automatically detect languages for multilingual conversations. Users can choose from more than 5,000 voices or create their own voice clones for a personalized experience.

Image: Experimental integration management in ElevenLabs, showing the Google Calendar, HackerNews, Linear, Perplexity, and Slack services. The selection of ready-made integrations is still very limited, but this is likely to change quickly thanks to the Model Context Protocol. | Image: Screenshot by THE DECODER

11ai also supports custom MCP servers. Teams can connect internal tools or specialized software to 11ai through their own MCP servers, extending the assistant’s functionality to fit their workflows.
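Under the hood, MCP is built on JSON-RPC 2.0: a client discovers a server's tools with `tools/list` and invokes one with `tools/call`. As a rough illustration only (the `create_ticket` tool and its arguments are invented for this sketch and are not part of 11ai), the messages a custom MCP server would receive look roughly like this:

```python
import json

def make_request(req_id, method, params):
    """Build a JSON-RPC 2.0 request of the kind MCP clients send."""
    return {"jsonrpc": "2.0", "id": req_id, "method": method, "params": params}

# 1. The assistant's client asks the custom server which tools it exposes.
list_tools = make_request(1, "tools/list", {})

# 2. It then calls one of those tools with structured arguments
#    (tool name and fields here are hypothetical).
call_tool = make_request(2, "tools/call", {
    "name": "create_ticket",
    "arguments": {"title": "Follow up with prospect", "priority": "high"},
})

print(json.dumps(list_tools))
print(json.dumps(call_tool))
```

In practice, teams would implement these handlers with an MCP server SDK rather than raw JSON-RPC; the point is that any internal tool reachable this way becomes callable by voice.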

According to Elevenlabs, 11ai uses a permissions model, letting users specify what actions the assistant can take for each application.

11ai is currently available as a free, experimental alpha. Elevenlabs is using the alpha phase to collect feedback on integrations, desired MCP servers, voice interaction versus traditional interfaces, and new features for daily routines.


Actionable agents are becoming the norm

By entering the market for actionable voice assistants, Elevenlabs is competing directly with a growing number of similar products. Perplexity recently launched a mobile assistant that can handle tasks like restaurant reservations. Amazon has introduced Alexa+, an upgraded, agent-like Alexa focused on voice-first interactions.

Anthropic's Claude is also compatible with MCP, though it's more focused on B2B use cases. Only recently did Claude get a voice mode, which relies on Elevenlabs' technology. Google's Gemini supports voice interaction, but its ability to take action and integrate with tools is still limited - even within Google's own ecosystem.

Summary
  • Elevenlabs has introduced 11ai, a voice-activated assistant that carries out digital tasks by following spoken commands, aiming to move beyond simple conversation and directly handle actions within user workflows.
  • The system works through a web interface, connects to popular tools like Salesforce, Slack, and Google Calendar using the Model Context Protocol, and allows users to integrate custom servers for internal tools, with support for over 5,000 voices and multilingual conversations.
  • 11ai enters a competitive field of actionable voice assistants, with its alpha release focused on gathering user feedback and expanding integrations, as companies like Perplexity, Amazon, and Google also advance their own voice-driven agents for productivity.
Jonathan writes for THE DECODER about how AI tools can make our work and creative lives better.