Apple is working on bringing visual AI capabilities to its wearable devices, according to Bloomberg.
The company aims to integrate cameras into Apple Watch models, allowing the devices to analyze the wearer's surroundings and provide relevant information. The camera placement would vary by model.
Bloomberg's Mark Gurman reports that standard Apple Watch versions would embed the camera in the display, while Ultra models would feature cameras near the digital crown and side button.
This expansion builds on Apple's "Visual Intelligence" feature, currently exclusive to the iPhone 16. The technology, which relies on AI from Google and OpenAI, can analyze images to provide information about landmarks or translate text. Apple plans to extend these capabilities beyond watches to other devices, including future AirPods equipped with cameras.
The timeline remains uncertain. Gurman says these camera-equipped Apple Watches are still generations away, though they appear on the company's product roadmap. Long-term, Apple intends to replace Google and OpenAI's technology with its own AI models for image analysis.
Integrating AI into existing products
While standalone AI devices like the AI Pin have failed to gain market traction, to put it nicely, Apple's strategy focuses on integrating AI capabilities into its existing product line. The company's recent struggles with AI-powered Siri illustrate the technical challenges of implementing generative AI in consumer products.
Apple isn't alone in pursuing wearable AI cameras. Meta's Ray-Ban smart glasses offer similar image analysis capabilities, while Google and OpenAI already provide these features through smartphone apps. Moving these functions to wearables could make AI assistance more accessible, though it also raises privacy concerns about cameras that constantly observe their surroundings.