
At its I/O developer conference today, Google announced two new ways to access its AI-powered “Live” mode, which lets users search for and ask about anything they can point their camera at. The feature will arrive in Google Search as part of its expanded AI Mode and is also coming to the Gemini app on iOS, having been available in Gemini on Android for around a month.
The camera-sharing feature debuted at Google I/O last year as part of the company’s experimental Project Astra, before an official rollout as part of Gemini Live on Android. It allows the company’s AI chatbot to “see” everything in your camera feed, so you can have an ongoing conversation about the world around you — asking for recipe suggestions based on the ingredients in your fridge, for example.
Google is now bringing that functionality directly into Search’s new AI Mode, building on its existing visual search tool, Google Lens. By tapping the “Live” icon, users will be able to share their camera feed with Search and ask questions about what’s in front of them.
The feature, branded Search Live, isn’t arriving until “later this summer” and will roll out first as a beta for Labs testers. It’s one of several upcoming AI Mode features, including Deep Search for in-depth research, an AI agent that can take actions on the web on your behalf, and a variety of new shopping tools.
The Gemini app on iOS is being updated with the same capability, along with the option to talk to Gemini about what’s on your screen rather than what’s in your camera feed. Camera and screen sharing launched in Gemini Live on the Pixel 9 and Galaxy S25 last month, before expanding to all Android devices a few weeks later. Google had originally planned to lock the feature behind its paid Gemini Advanced subscription but changed its mind, so it will be free for iOS users, just as it is on Android.