
Apple is preparing a real-time translation feature for AirPods with the arrival of iOS 26, designed to enable face-to-face conversations in different languages without missing a beat. Evidence found in the latest betas of the system suggests that development is already well advanced.
A system image has been found in iOS 26 beta 6 showing AirPods alongside greetings in several languages, along with an associated activation gesture. Specialized outlets such as 9to5Mac point out that this experience would be integrated directly with Apple's Translate app.
What has been discovered in the beta

iOS 26 beta 6 internals include a graphic with the text "Hello" in multiple languages and the file name "Translate", which leaves little room for interpretation: this is a translation mode designed for the headphones. The illustration suggests a gesture on the stem to activate it.
Currently, the feature is not active for the public, and there is no toggle in the settings. Everything indicates that Apple will reveal it when it's ready, in line with the expansion of Live Translation already present in Phone, Messages, and FaceTime within iOS 26.
In addition to the visual clue, the code links the new feature to the iOS Translate app, so the AirPods would act as the input and output interface (microphone and speaker), while the iPhone would handle the processing.
The lack of an announcement at WWDC does not invalidate the evidence: multiple leaks agree that Apple has worked on this face-to-face mode for months and that its public unveiling is imminent.
How translation would work with AirPods

Activation would occur with a double tap on the headphones' stem, which would automatically launch Translate on the iPhone to convert the incoming voice and return the translated audio to the user's ear. In other words, the AirPods don't translate on their own; they depend on the paired device.
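The reported flow can be sketched in a few lines. This is a purely illustrative model, not Apple's implementation: every name here (`translate_phrase`, `conversation_relay`, the toy dictionary) is hypothetical, and it only demonstrates the division of labor described above, with the AirPods streaming audio in and out while the phone performs the translation step.

```python
# Hypothetical sketch of the reported relay pipeline.
# The AirPods only capture and play audio; the paired iPhone
# (here, translate_phrase) does the actual translation work.

def translate_phrase(text: str, table: dict[str, str]) -> str:
    """Stand-in for the iPhone-side translation step."""
    return table.get(text, text)  # fall back to the original phrase

def conversation_relay(incoming: list[str], table: dict[str, str]) -> list[str]:
    """Phrases heard by the AirPods go in; translated audio comes back."""
    return [translate_phrase(phrase, table) for phrase in incoming]

# Toy dictionary standing in for the Translate app's language models.
es_to_en = {"hola": "hello", "gracias": "thank you"}
print(conversation_relay(["hola", "gracias"], es_to_en))
```

The point of the structure is the dependency it encodes: remove the phone-side function and the headphones have nothing to play back, which matches the reporting that the AirPods cannot translate standalone.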
The models referenced in the internal clues are the AirPods Pro (2nd generation) and the future AirPods (4th generation). The feature is expected to arrive via a firmware update for compatible models, coordinated with iOS 26.
Since Apple is consolidating Apple Intelligence across its platforms, it is very likely that this translation experience will require an Apple Intelligence-compatible iPhone, both for computing power and for privacy and security needs.
Latency will be key: for natural conversations, the system needs to respond almost instantly. This requirement would explain the dependence on the latest hardware and Apple's caution in enabling the feature publicly.
Compatibility, languages and requirements

Apple has not confirmed the list of compatible iPhones or the countries of availability. Sources indicate that the new feature would be gated by Apple Intelligence and the performance requirements of live translation, so support may be limited to the latest devices.
As for languages, everything points to a start with those already supported by the Translate app: Spanish (Spain), English (US and UK), French, German, Italian, Portuguese (Brazil), Japanese, Simplified Chinese, and Korean. It is reasonable to expect the catalog to expand over time.
It remains to be seen how Apple handles the privacy and processing of voice data. The company has been emphasizing on-device execution and hybrid processing, so it will be important to know whether translation happens on the device or with cloud support.
Beyond the technical details, configuration could be integrated into the Translate app and the AirPods settings, with options to select languages, activate gestures, and define conversation modes.
Planned schedule and deployment

The feature is not yet available in beta 6, but the evidence is strong enough to anticipate an announcement during the launch of iOS 26. If Apple sticks to its usual schedule, the new features would be unveiled in September alongside the next-generation iPhone.
Depending on the state of the software, real-time translation could be activated on day one of iOS 26 or arrive shortly thereafter via a firmware update for compatible AirPods. Apple typically staggers these releases to ensure stability.
For users who travel or work in multilingual environments, direct in-ear translation could be a leap in accessibility, reducing screen usage and making interaction in different languages more natural.
With testing underway and the pieces falling into place—Live Translate in system apps, integration with Apple Intelligence, and planned support for AirPods Pro 2 and AirPods 4—everything indicates that Apple is preparing to turn its headphones into a discreet and effective ally when it comes to understanding and making ourselves understood without barriers.