Apple is rolling out new AI tools for developers, embedding generative models directly into Xcode and iOS apps with a strong emphasis on privacy and user control.
At WWDC 2025, Apple introduced the Foundation Models framework, a centralized set of tools that let developers add Apple's own AI models straight into their apps. According to Apple, these models run entirely on the device, require no cloud connection, cost nothing to use, and are designed to protect user privacy.
The framework is built right into Swift, giving developers access to Apple Intelligence with just three lines of code. Features like "Guided Generation" and "Tool Calling" are already included, making it easier to add generative AI to existing apps.
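To give a sense of the shape of the API: the sketch below follows the type names Apple has published for the Foundation Models framework (`LanguageModelSession`, `respond(to:)`, and the `@Generable` and `@Guide` annotations for Guided Generation), but the prompt and the `EntryHighlights` type are made-up examples, not Apple's code.

```swift
import FoundationModels

// The headline "three lines": create a session, prompt it, read the reply.
// Runs on-device via Apple Intelligence (iOS 26 / macOS 26 SDKs).
let session = LanguageModelSession()
let response = try await session.respond(to: "Summarize my day in one sentence.")
print(response.content)

// Guided Generation: constrain the model's output to a typed Swift value
// instead of free-form text. This struct is an illustrative example.
@Generable
struct EntryHighlights {
    @Guide(description: "A one-line title for the entry")
    var title: String

    @Guide(description: "Up to three key moments mentioned in the entry")
    var moments: [String]
}

let highlights = try await session.respond(
    to: "Pull the highlights from today's notes.",
    generating: EntryHighlights.self
)
print(highlights.content.title, highlights.content.moments)
```

Because the output of the guided call is an ordinary Swift value rather than a string to parse, apps can feed it straight into their UI, which is presumably what makes the framework attractive for apps like Day One.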
Apple says Automattic's journaling app "Day One" is using the framework to deliver "privacy-centric intelligent features."
App Intents now supports visual search
Apple has also upgraded the App Intents framework, which lets developers make their app's features available system-wide. With the latest update, App Intents now supports visual intelligence, allowing apps to present visual search results directly within the operating system in places like Siri, Spotlight, or widgets. Users can jump from a visual search result straight to the corresponding app.
Etsy is already taking advantage of this feature to improve product search in its iOS app. According to CTO Rafe Colburn, many items on Etsy are hard to describe in words, so visual search is helping buyers discover creative goods from small sellers.
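For apps that want their content to surface this way, the items are modeled with the App Intents framework's established `AppEntity` and `AppIntent` protocols. The sketch below shows the general pattern; the `Product`, `ProductQuery`, and `OpenProductIntent` types are hypothetical, and the specific wiring to visual intelligence results is omitted here.

```swift
import AppIntents

// A hypothetical catalog item exposed to the system as an App Entity,
// so Siri, Spotlight, and search surfaces can reference it.
struct Product: AppEntity {
    static var typeDisplayRepresentation: TypeDisplayRepresentation = "Product"
    static var defaultQuery = ProductQuery()

    var id: String
    var name: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(name)")
    }
}

// Resolves entity identifiers back to concrete products when the
// system hands a search hit to the app.
struct ProductQuery: EntityQuery {
    func entities(for identifiers: [String]) async throws -> [Product] {
        // In a real app this would query the product database.
        identifiers.map { Product(id: $0, name: "Item \($0)") }
    }
}

// An intent that deep-links from a search result into the app.
struct OpenProductIntent: AppIntent {
    static var title: LocalizedStringResource = "Open Product"
    static var openAppWhenRun = true

    @Parameter(title: "Product")
    var product: Product

    func perform() async throws -> some IntentResult {
        // Navigate to the product's detail screen here.
        .result()
    }
}
```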
Xcode 26: ChatGPT and on-device models come to the IDE
Developers can now access ChatGPT directly inside Xcode 26, even without a personal OpenAI account. Other models can be added via API keys or run locally on Macs with Apple Silicon.
The new coding tools cover everything from automatic code generation and test writing to documentation, bug analysis, and iterating on design ideas. Context-aware suggestions, such as creating a playground, fixing a bug, or generating preview views, appear directly in the code editor.
Apple is also revamping the Xcode interface with better navigation, an enhanced localization catalog, and improved accessibility. Developers can now dictate Swift code or navigate the entire Xcode interface using their voice.