The technological conversation of recent months has an undisputed protagonist: Apple's new AI engine built into Siri, capable of searching the web and offering immediate answers in the style of modern generative assistants. Leaks point to a system that will not only answer questions, but also generate concise summaries, images, and videos directly from iOS, iPadOS, and macOS, with Safari and Spotlight as natural gateways.
The most striking thing is who is reportedly behind this big bet: Google would provide a customized version of Gemini running on private servers controlled by Apple itself, thus preserving privacy. Meanwhile, Apple is accelerating the roadmap for Apple Intelligence, its personal AI platform, which is now available in Spanish with a solid batch of new features on iPhone, iPad, Mac, Apple Watch, and even Apple Vision Pro.
Apple's plan: an AI-powered search engine for Siri, Safari, and Spotlight

As Mark Gurman reported in Bloomberg, Apple is working on an AI-powered answer engine, reportedly known internally as Answers, designed to feed Siri with internet results and return condensed information with context, images, and videos. This approach would take search beyond the assistant, embedding it in Safari and Spotlight as well, so that results appear right where users start their queries.
The news does not come alone: Apple has reportedly already closed an agreement with Google to use a variant of Gemini that would run on the Cupertino company's private infrastructure. The idea is to keep processing under enhanced privacy guarantees, preventing data from persisting in the cloud and reducing exposure to third parties.
Before deciding on this path, Apple considered alternatives such as Anthropic, OpenAI, and even acquiring Perplexity, but the deal with Google reportedly proved more attractive and cost-effective. Even so, the company will move cautiously: after the delays to the new Siri in 2024, it will avoid announcing anything that is not ready and will only take the step when the product meets its standards.
Internal sources speak of an LLM-powered Siri whose performance would come close to what we expect from ChatGPT, although the rollout would be gradual and conditioned on stability. The roadmap points to milestones over the next year, with room to slip into 2026 if technical obstacles force priorities to be adjusted.

Apple Intelligence Today: New, Useful, and Private Features
In parallel with the new search engine, Apple has announced a substantial package of Apple Intelligence features that extends across iPhone, iPad, Mac, Apple Watch, and Apple Vision Pro. The premise is an AI that understands the context of the device and the user, runs processing on the device itself, and scales to the cloud privately when the task demands larger models.
The big star of the moment is Live Translation, natively integrated into Messages, FaceTime, and Phone. In Messages, texts are automatically translated into the recipient's preferred language as you type; in FaceTime, live translated captions appear on screen; and on phone calls you can hear the translation spoken aloud during the conversation, all powered by models designed by Apple and run on the device.
Creativity also gets a boost: Image Playground and Genmoji now let you mix emojis and descriptions to create new combinations, adjust the appearance of people in images inspired by your photo library, and use additional styles such as oil painting or vector art. Additionally, the Any Style option lets you describe exactly what you want and generate a custom image, with the option to ask ChatGPT for help, always with your explicit consent.
Another qualitative leap comes with Visual Intelligence, which is no longer limited to what the camera sees: it now understands and acts on what is on your screen in any app. You can ask ChatGPT about the content you are viewing, search for similar products or images on Google, Etsy, and other apps, highlight an object to refine your search, or add an event to Calendar when it detects a date. You access it by pressing the same buttons you use to take a screenshot and then choosing Explore with AI.
On Apple Watch, Workout Buddy makes its debut: an AI coach that analyzes your heart rate, pace, distance, milestones, and more in real time to offer personalized motivational messages. A new generative voice, built from the voices of Fitness+ trainers, turns your stats into dynamic guidance throughout the session.

Siri Gets Serious: Context, Writing, and External Model Support
The new Siri has been rebuilt from scratch on Apple Intelligence to understand natural language, follow along even if you stumble over your words, and maintain context between requests. It can understand what is on the screen and act on it: if someone sends you an address, just ask Siri to save it and it will know what you mean, with no intermediate steps.
In addition to voice, Siri gains a typing mode: you can type requests by tapping the bottom of the screen, which lights up the edge of the iPhone and brings up the text field. For more open-ended or creative queries, Siri can lean on ChatGPT when you approve it, even linking your ChatGPT account if you want more advanced answers.
Writing, Images, and Photos: Productivity with a Spark
The writing tools let you rewrite, proofread, and summarize text, change the tone, and adjust grammar and structure. With Describe Your Change, you indicate exactly what modification you want and the AI takes care of the rest, whether in Mail, Notes, Pages, or even Safari content.
On the visual side, Image Playground creates images in seconds and Genmoji lets you craft unique emojis from a description. If you use an Apple Pencil, Image Wand transforms a quick sketch into a polished illustration that you can carry into your favorite apps.
The Photos app adds the Clean Up tool to remove distracting objects or people without sacrificing fidelity, and enhances Memories with a more coherent narrative thread. Natural language search understands requests like "photos with stickers on faces" and fits in better with the way you organize your library.
There are also email summaries, Smart Replies in Mail and Messages, rich previews, and a Priority Messages inbox so that urgent items don't get lost in the noise.
Visual Intelligence, Shortcuts, and a Framework for Developers
The expansion of Visual Intelligence is more than a curiosity: it turns the screen into an interactive surface that can recognize objects, extract data to create events, or search for similar products. Everything is designed to save steps and reduce friction in tasks that previously took several taps.
Shortcuts now connects directly with Apple Intelligence and offers intelligent actions to summarize text, create images with Image Playground, or combine information in complex flows. You can feed model responses into the rest of a shortcut, either on device or with Private Cloud Compute a tap away, while maintaining privacy.
For those who build products, Apple is opening up the Foundation Models framework: any app can use the large on-device model that powers Apple Intelligence, with native Swift support. According to Apple, as little as three lines of code are enough to incorporate guided generation, tool calling, and more, with no API costs and no dependence on a connection.
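As a rough illustration of that three-line claim, here is a minimal Swift sketch modeled on Apple's published Foundation Models examples; the helper function and the prompt text are placeholders of our own, not part of Apple's API, and exact names may shift between beta releases.

```swift
import FoundationModels

// Illustrative sketch only: open a session with the on-device model and ask it
// to turn study notes into a quiz question (the education example Apple cites).
func quizQuestion(from notes: String) async throws -> String {
    let session = LanguageModelSession()
    let response = try await session.respond(
        to: "Write one short quiz question based on these notes: \(notes)"
    )
    return response.content
}
```

Because the model runs entirely on the device, a call like this can work offline and incurs no per-request fees, which is precisely the framework's pitch.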
The use cases are varied: from an educational app that generates a personalized quiz from your notes without leaving the device, to a hiking app that adds natural language search even without coverage.
Apple Watch and workouts: AI-powered personal trainer
With watchOS 26, Workout Buddy analyzes your activity on the fly and offers useful cues based on your history: heart rate, pace, distance, personal milestones, and more. Thanks to a new text-to-speech model, you'll hear a generative voice that captures the style and energy of Fitness+ trainers to keep you motivated.
This feature processes data privately and securely with Apple Intelligence. It requires a compatible Apple Watch paired with Bluetooth headphones and an Apple Intelligence-capable iPhone nearby. The initial rollout is in English and covers popular workout types such as outdoor and indoor running, walking, cycling, HIIT, and strength training.
Compatibility, languages and availability
Apple Intelligence is now arriving in Spanish and will keep expanding. Later this year, eight new languages will be added: Danish, Dutch, Norwegian, Portuguese (Portugal), Swedish, Turkish, Traditional Chinese, and Vietnamese, in addition to those already supported. Availability may vary by region and local regulations.
As for devices, compatibility covers all iPhone 16 models, iPhone 15 Pro and Pro Max, iPad mini with A17 Pro, and iPad and Mac models with M1 or later. To activate it, the Siri language and the system language must match. Some features have specific requirements: for example, adding events to Calendar with Visual Intelligence is available in English on iPhone 16 and iPhone 15 Pro/Pro Max, and Live Translation's language coverage differs between Messages and Phone or FaceTime.
Developers can already test the new features through the Apple Developer Program, and a public beta is being distributed through the Apple Beta Software Program. The various features will roll out over the course of the year to users with supported devices and a supported language configured.
iOS 26, macOS Tahoe, and the new Liquid Glass design
Apple has given its systems a visual and functional overhaul: the new Liquid Glass aesthetic arrives, with translucent icons and floating controls that appear when needed, without getting in the way of what you're doing. The change also unifies version numbering: every platform now ends in 26 to reflect the release cycle.
In iOS 26, calls, video, and social media communications come together in a single list from which you can take action. Call Screening shows who is calling and the message they are leaving, chats support dynamic backgrounds, and AI powers group features. Apple Music translates lyrics and helps with pronunciation; Maps learns your routines and suggests the best route, taking your usual stops into account.
Visual Intelligence works with a simple press on photos, objects, or addresses: you can capture a garment spotted on a social network and search for similar ones, or grab a concert poster and add the event to your calendar with a tap. CarPlay evolves with more customization and debuts CarPlay Ultra, which extends the experience beyond the center console.
On the Mac, macOS Tahoe makes better use of the larger canvas with translucent controls, smoother Continuity, new Shortcuts, and a Spotlight deeply integrated with Apple Intelligence. It also introduces Quick Keys, which trigger actions when you type short sequences, and a Games app that turns the machine into a console with spectacular graphics.
iPadOS 26 boosts multitasking with multiple apps in view, contextual control buttons, and a per-app menu bar at the top; tvOS 26 simplifies playback with translucent controls and improved profiles, and incorporates Apple Intelligence to increase immersion while reducing visual noise.
Privacy by design: from device to private cloud
Apple's proposal rests on a key idea: your information is processed on your device whenever possible. When a larger model is needed, Private Cloud Compute comes into play, an Apple silicon environment whose code can be audited by independent experts to verify that data is not retained.
The company promises that it does not store or share personal data in these processes and that access to external services, such as ChatGPT, happens only with your explicit permission. It is an important step toward delivering useful AI without sacrificing the confidentiality many users expect from the ecosystem.
The new AI: Apple finally goes all in
The overall picture is clear: Apple is shaping an AI engine that will unite search, on-screen context, and actionable responses within Siri, Safari, and Spotlight, while extending Apple Intelligence across all its systems.
Between live translations, a Visual Intelligence that understands what you're looking at, a generative-voice trainer on your wrist, writing and creation tools that save time, and a framework that lets developers add AI with just a few lines of code, the ecosystem becomes more useful without neglecting privacy.
There are challenges ahead, with Siri and AI search setting the pace, but the direction is set and the pieces are already landing in users' hands. Will this be the next big revolution Tim Cook is hoping for?
