
Ray-Ban Meta: live translation and “vision” finally arrives in France

Until now, the Meta assistant built into the Ray-Ban Meta glasses was quite limited in France. It has certainly gained some features since the product launched in November 2024, but it was still restricted to voice responses, as we reported in our review. If you own these glasses, you will probably be delighted to learn that their AI will finally be able to use the integrated camera to see what you are looking at.
This is a major change: Meta AI will be able to “see” your visual surroundings and answer your questions in context. Concretely? You can simply look at a monument or an object and ask “Hey Meta, what is this building?” without having to pull out your smartphone or take a photo. This function already existed in the United States, but it was long awaited in Europe, largely because of stricter rules on the protection of personal data.
The rollout begins next week in France, as well as in Belgium and Germany, where the assistant has just been activated. Despite the delay, this arrival shows that Meta intends to weave its AI into our daily lives in Europe, even though regulations there are stricter than in the United States.
Built-in translation, even without a network
Another very practical new feature, especially when traveling: instant translation. The glasses will be able to translate between English, Italian, Spanish and French, even without an internet connection. You just need to download the language packs in advance to use it anywhere. In theory, this feature should prove very handy when you are walking through areas with poor coverage, or when you want to avoid blowing through your data plan abroad.
On the strategy side, Meta is clearly multiplying the ways to interact with its AI, whether on Facebook, Instagram, WhatsApp or via devices like these glasses. Even at the risk of drawing criticism, as was recently the case when it made it harder to find the options for disabling Meta AI in the apps mentioned above.
Of course, analyzing what you see is not a revolution in itself, since ChatGPT and Gemini also offer it. The difference here is that the feature is built directly into an accessory the user wears every day. This allows for a more natural interaction with the world, without constantly reaching for your phone.
Moreover, Meta attaches particular importance to appearance, since its glasses can easily be mistaken for classic frames and are quite light (between 44 and 49 grams). This discretion is undoubtedly essential for the general public to adopt them, and for their societal acceptance. The ill-fated Google Glass learned that the hard way, even if the web giant hasn’t said its last word…
