The Search Live feature is now available in the Google app in the U.S., combining camera, voice, and AI in search
Google officially launched Search Live, a feature that combines the phone's camera with voice to perform real-time searches.
This is one of the company's major bets on generative artificial intelligence, after the feature was first showcased at the Google I/O event in May.
How Search Live works
The idea is for the search engine to see exactly what the phone's camera sees while the user converses with Gemini. The answers are then delivered in natural language, accompanied by web links for further exploration.
The difference from Gemini Live, which shares the screen, is that here the result is enriched with visual information and direct access to different sources.
Where it's available
Search Live is available starting today in the Google app for Android and iOS, although only in English and in the United States. It is activated by tapping the new "Live" icon below the search bar.
It also appears as an option within Google Lens, integrated at the bottom of the screen next to the camera viewfinder.
Practical uses
Google showed several examples of how Search Live can be used:
During a trip, point the camera at a monument or building and receive historical data.
Learn a hobby, such as watercolor painting techniques, guided step by step by AI.
Solve everyday problems, from a flat tire to a malfunctioning computer.
A more immersive experience
The videos shared by Google show that more than half of the screen is devoted to the camera feed. Below it appear the recommended links, along with a row of icons indicating how many sources are available to expand on the information.