iOS 26 gets a Visual Intelligence update with support for screen analysis

Apple unveiled enhanced Visual Intelligence capabilities at WWDC 2025. The new feature lets you analyze iPhone screen content and perform actions on displayed objects.
Previously, Visual Intelligence worked only with the camera, recognizing real-world objects. Now it can analyze everything a user sees on the screen, from interface elements to images inside apps. You can ask ChatGPT questions about the current screen content, or search for similar products and objects through Google, Etsy, and other supported platforms.
Recognizing objects and events in apps
Visual Intelligence automatically detects individual interface elements. For example, if you highlight an image of a lamp, the system offers to find similar models online. If an event is shown on the screen, the technology extracts the date, time, and location, and offers to add the appointment to your Calendar.
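Apple hasn't documented how this Calendar handoff works internally. As a rough illustration only, here is a minimal Swift sketch of how a third-party app could turn extracted event details into a Calendar entry using the public EventKit framework; the title, dates, and location passed in are hypothetical placeholders, and this is not Apple's actual Visual Intelligence implementation.

```swift
import EventKit

// Hedged sketch: create a Calendar event from details that an on-screen
// analysis step is assumed to have extracted. All values are placeholders.
func addDetectedEvent(title: String, start: Date, end: Date, location: String) async throws {
    let store = EKEventStore()

    // iOS 17+: ask the user for write-only access to Calendar.
    guard try await store.requestWriteOnlyAccessToEvents() else { return }

    let event = EKEvent(eventStore: store)
    event.title = title
    event.startDate = start
    event.endDate = end
    event.location = location
    event.calendar = store.defaultCalendarForNewEvents

    // Save the single event into the user's default calendar.
    try store.save(event, span: .thisEvent)
}
```

In the shipped feature this flow is handled by Apple's own system UI; the sketch just shows the public API an app would use to achieve the same result.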
The feature also integrates with other apps. Users can take a screenshot with the standard button combination and then choose to save or share it, or run a deeper analysis of it through Visual Intelligence.
Universal screen search with privacy protection
The update turns Visual Intelligence into a universal tool for searching and interacting with content on the iPhone. Apple says the technology is built on Apple Intelligence, processing data on the device to protect user privacy. This lets you receive personalized suggestions without sending information to the cloud.
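Apple hasn't published the internals of this on-device pipeline. As a hedged sketch of the general idea, the public Vision framework can already recognize text in a screenshot entirely on the device, with nothing sent to the cloud; the following illustrates that kind of local analysis and is not Apple's actual Visual Intelligence implementation.

```swift
import Vision
import UIKit

// Hedged sketch: on-device text recognition on a screenshot image,
// illustrating analysis that never leaves the phone. This uses the
// public Vision API, not Apple's Visual Intelligence pipeline.
func recognizeText(in screenshot: UIImage, completion: @escaping ([String]) -> Void) {
    guard let cgImage = screenshot.cgImage else {
        completion([])
        return
    }

    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        // Keep the best candidate string for each detected text region.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        completion(lines)
    }
    request.recognitionLevel = .accurate

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```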