When it debuted, Apple’s Visual Intelligence feature let you point your iPhone camera at things around you and either run a Google image search or ask questions through ChatGPT. At WWDC 2025, the company demonstrated updates that expand the utility of Visual Intelligence by extending it to screenshots. According to the company’s press release, “Visual intelligence already helps users learn about objects and places around them using their iPhone camera, and now enables users to do more, faster, with the content on their iPhone screen.”
It reminded me of “onscreen awareness,” which Apple described as one of Siri’s capabilities when it announced Apple Intelligence last year. In that press release, the company said, “With onscreen awareness, Siri will be able to understand and take action with users’ content in more apps over time.” Though it isn’t quite the same thing, the new screenshot-based Visual Intelligence lets your iPhone take contextual action on your screen’s content, just not via Siri.
In a way, it makes sense. Most people are already accustomed to taking a screenshot when they want to share or save important information they see on a website or an Instagram post. Integrating Apple Intelligence actions here should, in theory, put the tools where you already expect them, rather than making users talk to Siri (or wait for an update).
Basically, in iOS 26 (on devices that support Apple Intelligence), pressing the power and volume down buttons to take a screenshot will pull up a new page. Instead of a thumbnail of your saved image appearing in the bottom left, you’ll see the picture you just took filling nearly the entire display, surrounded by options for editing, sharing or saving the file, as well as for getting Apple Intelligence-based answers and actions. Sitting in the bottom left and right corners are shortcuts for ChatGPT and Google image search.
Depending on what’s in your screenshot, Apple Intelligence can suggest various actions below your image. It might offer to look up where to buy an item similar to one pictured, add an event to your calendar or identify types of plants, animals or food, for example. If there’s a lot going on in your screenshot, you can highlight an object (much as you would select something to erase in Photos) and get information about just that part of the image.
Third-party apps and services that have adopted App Intents, like Google, Etsy and Pinterest, can also show up here, letting you take action within those apps. For example, if you’ve spotted something you like, you can take a screenshot, highlight the item, then buy it on Etsy or pin it to a Pinterest board.
One catch with this Visual Intelligence update is that, for people like me who screenshot reflexively and don’t want to do anything other than save receipts, it could add a frustrating step between capturing a screenshot and saving it to Photos. It sounds like you may be able to turn off this interface and stick with the existing screenshot flow.
The examples Apple gave for Siri’s ability to understand what’s on your screen were similar. In its press release from last year, Apple said, “For example, if a friend texts a user their new address in Messages, the receiver can say, ‘Add this address to their contact card.’”
Like Visual Intelligence in screenshots, that involves scanning your screen’s content for relevant information and helping you put it somewhere (like Contacts or Calendar) where it’s most useful. The promise of Siri’s new era, however, was more about interacting with every part of your phone, first- and third-party apps alike. So you could ask the assistant to pull up an article you had added to your Reading List in Safari, or to send photos from a particular event.
Clearly, Apple has yet to deliver that upgrade to Siri, and as Craig Federighi said during the WWDC 2025 keynote, the company will only have more to share later this year. Still, while we wait for that long-delayed revamp, the changes coming to screenshots might serve as a preview.
If you buy something through a link in this article, we may earn a commission.