
iOS 26: With Visual Intelligence, Apple reinterprets Google's Circle to Search feature
Until now, Apple's Visual Intelligence let you point the camera at an object or text to learn more about it. But with the upcoming iOS 26 update, the technology takes on a whole new dimension: it can now analyze what is happening directly on your iPhone's screen, not just in the real world.
Visual Intelligence is a built-in iOS feature that uses the camera and Apple's AI to automatically detect elements such as:
- Text to translate or copy.
- Objects to identify.
- Clickable elements to launch a search.
Until now, this only worked on what the camera captured live. Apple is extending the capability to everything displayed on the screen, via a simple screenshot.
An iOS version of Circle to Search
With iOS 26, Apple turns the iPhone into a true intelligent visual assistant, capable of understanding any content displayed on the screen. Take a screenshot of a conversation, an image, or an app, and the iPhone automatically detects the important elements and suggests actions. The device also recognizes event dates visible on a poster, an Instagram story, or a message, and offers to add them directly to your calendar. Ideal for never missing a concert or festival spotted while scrolling.
A new button also appears on screenshots, opening a chat window with ChatGPT so you can ask for more details about what you see and get an instant answer without leaving what you're doing.
A tool reminiscent of Circle to Search on Android, or the Essential Space feature of the recent CMF Phone 2 Pro, Nothing Phone (3a) and (3a) Pro.