
AirPods live translation tested: promises, setup and a real-world verdict
AirPods live translation is finally coming to Europe, after being delayed by negotiations between Apple and the European Commission around the DMA. We were able to test this functionality with the developer beta of iOS 26.2, before a public release scheduled for December 2025. It is currently limited to a few languages: French, English, German, Portuguese, Spanish, Italian, Japanese, Korean and Chinese.
A feature reserved for recent AirPods under iOS 26.2 with Apple Intelligence
Apple reserves this new feature for the AirPods Pro 3, AirPods Pro 2 and AirPods 4 with ANC, and it requires an iPhone compatible with Apple Intelligence, i.e. an iPhone 15 Pro or later. Designed to facilitate face-to-face conversations, the feature uses the earbuds' microphones to translate out loud what the other person is saying into the user's language, while relying on noise reduction to keep the exchange fluid and natural.
To access the feature, we installed the iOS 26.2 developer beta on an iPhone 16. Without any specific update for our AirPods Pro 3, a new “Translation” tab appeared in the earbuds' settings. This is where you download the languages the microphones will need to recognize. To activate live translation, you then pair the earbuds with the Translate app: its “Live” tab lets you select the other person's language and the one into which their remarks will be rendered.
Setup and usage constraints: what you need to know
Once set up, translation is started with a long press on the stems of both earbuds, and stopped the same way. Usage does impose some constraints, however. Both earbuds must be worn at all times and the paired iPhone must remain unlocked. If it is on standby, tucked in a pocket or even just lying nearby, the earbuds refuse to start translating. This is a rather painful limitation when conversing on the move, since pressing both stems simultaneously with your phone in hand turns out to be almost impossible. Fortunately, translation can also be started manually from the Translate app, a more practical method in this scenario. Overall, activation was smooth, even while audio content was playing or another app was open.
AirPods Pro 3 © Les Numériques
Note that the feature makes the most sense when both participants are wearing AirPods, which allows a genuine dialogue to take place. If one person has an iPhone but no AirPods, they can instead read their interlocutor's words in the Translate app. It is also possible to have your iPhone read your own translated words out loud for someone with no device at all, even while the earbuds are connected to the iPhone.
During use, the AirPods apply a moderate, non-adjustable noise reduction that is well enough calibrated to keep the focus on the voice while still letting the surrounding environment through.
Tests in practice: strengths and obstacles
In practice, the voice generated by the translation is fairly natural and fluid, punctuated by silences, despite a signal prone to some saturation. In a quiet, undisturbed environment, the feature occasionally misses its target, but in the vast majority of cases Siri delivers a complete translation, with slight variations in wording and a rather realistic vocabulary. Under these conditions, the AirPods' microphones can even pick up a relatively quiet voice.
If the other person launches into a long speech without pausing, the translation starts roughly 5 seconds after they begin speaking. It is therefore not strictly instantaneous, but this has the advantage of letting you hear the real start of the conversation before the translated voice takes over.
At a distance of 5 or 6 meters, translation remains effective, a definite asset for following a talk or conference. Note, however, that comprehension is only good if the speaker has a natural accent in their mother tongue. When someone speaks English with a heavy French accent, the AirPods misinterpret, invent words and render the translation unusable.
AirPods 4 ANC © Les Numériques
Limits appear as soon as the environment becomes a little chaotic. When several people are talking at once in the same room, even with the main interlocutor right in front, the translation stalls completely and renders only one word in four, trying to translate all the voices at the same time. Outdoors, the feature remains surprisingly stable as long as the ambient noise is continuous, such as heavy traffic. On the other hand, as soon as passers-by start chatting nearby, the earbuds' microphones lose focus. And if an external noise stops and then suddenly resumes in the middle of a conversation, the microphones interpret it as a disturbance and trigger a voice notification asking you to move away from the noise. Suffice to say that the experience quickly becomes frustrating in a noisy environment.
AirPods Pro 3 © Les Numériques
Beyond being designed for quiet environments, the feature is better suited to simple exchanges. During long speeches, it sometimes loses the thread, repeats a word several times, picks up again haphazardly, and so on. The result is poorly structured sentences with punctuation defects. In addition, poorly articulated or rapidly spoken words can also be translated badly. Note that swear words are reproduced in full, but slang and idiomatic expressions are very poorly understood. Despite all this, the output is generally enough to grasp the overall meaning.
Finally, live translation can also be used to translate simple audio content (a podcast, an interview or a speech) without too much difficulty, provided it comes from a source other than the smartphone to which the earbuds are connected.
Apple delivers a more advanced feature than Google
Ultimately, the AirPods' live translation remains imperfect, at times capricious and still far from replacing a natural conversation in every context. It stumbles in noise and sometimes runs out of breath during overly long monologues. But despite these very real limits, Apple has delivered the most accomplished offering of its kind to date: a translation genuinely carried by the earbuds' microphones, designed for a continuous, hands-free and seamless exchange.
Google Pixel Buds Pro 2 © Les Numériques
In comparison, although Google does offer real-time, continuous translation, it is essentially textual and segmented, without really exploiting the microphones of the Pixel Buds. Google supports more languages, but having the transcription read aloud within the Google Translate app temporarily pauses the translation, which considerably limits the fluidity of the exchange. The AirPods stand out as the first system to come close to true simultaneous interpretation for the general public.




