Google Lens lets you search the internet using self-recorded videos

Google Lens is a great way to search the web, especially in the age of AI. You can use your phone's camera to capture what's around you and then search the internet for related information. Google Lens also supports multisearch, so you can combine text and images to find more details about something you just saw in the real world or online.

However, since a photo may not always be enough, Google has decided to take things to the next level. At I/O 2024, Google introduced a Search with Video feature that lets you upload videos to Google Lens to search the web based on their content. This is something that not even ChatGPT can currently do.

That said, the new Google Lens feature should not be confused with Google's Project Astra. The latter was also demonstrated at I/O 2024 and represents a major upgrade for Gemini. Project Astra lets Gemini "see" through your phone's camera, allowing it to react in real time to your surroundings.

According to Android Authority, Google Lens' new video search feature is finally rolling out to users. You'll soon be able to record a video of your surroundings and use it to query Google for information.

As you'll see in Mishaal Rahman's demo on X, the feature is straightforward. You launch the Google Lens app on your phone, then tap and hold the shutter button to record a short video of the object of your curiosity.

In the clip, Rahman also asks Google via voice for information about a smartwatch he is holding. This demonstrates Google Lens' multimodal capabilities, building on its previous multi-search functionality.

In the past, Google enabled the use of voice and text to perform Google searches with Google Lens. Combining video and voice is the natural evolution of this, especially in a world where we can talk to AI chatbots.

As Rahman notes, you can get AI answers to your Google Lens video searches if AI Overviews are available in Google Search in your region. Otherwise, you will still receive relevant results for your query.

The results may not be perfect, but they can still be helpful. In the example above, Google doesn't identify the exact smartwatch model, but it does get the manufacturer and operating system right: Google guesses the wearable in the video is the OnePlus Watch 2R, while Rahman is actually using a Watch 2. Still, Google Lens isn't far off, and the results should improve over time.

The new Google Lens feature is intended to complement your device's search capabilities. You can always use the Circle to Search feature on Android phones to find more details about what's on your screen.

The feature will likely be available to Android users first, although I wouldn't be surprised if Google makes it available to iPhone users soon.

Apple has also developed a function similar to Google Lens for the iPhone 16. It's called Visual Intelligence and is meant to give the AI eyes. You tap the Camera Control button on the iPhone 16 so the AI can see what you see and respond to relevant prompts. Visual Intelligence may be more of a competitor to Project Astra than to Google Lens.