Google Lens Video Search Tool

The Google Lens video search tool, introduced at Google I/O 2024, appears to be rolling out to more Android users. The tool helps people find content using a photograph or a video. Searching with a video, for example, returns related content that can help you identify an object, discover similar products, or find similar images.

This means that to get answers, a user can record a short video and ask the tool questions about it. AI-generated responses then follow for those in supported regions. The video search feature is integrated with AI Overviews, so it is available in regions where AI-powered search is available.

Although the feature can be used to quickly translate street signs in an unfamiliar language or to find the name of a flower, it has some constraints: users could not search for a moving object or type a more specific request about something.

This remains one aspect of the Google Lens video search tool that could still be optimized. To use Google Lens, launch the app on your smartphone and point your smartphone's camera at the object.

Then, rather than tapping the search button, which is depicted as a magnifying glass, hold your finger down on it to record a video.

How does the new Google Lens Video Search Tool work?

The search tool lets users find material using a photograph or a video. Searching with a video, for example, returns results that help you identify objects, get recommendations for similar products, or find images in a similar style.

Users can record a short video with the tool and then ask it questions about that video; AI-generated responses follow for those in supported regions.

The entered query is processed by the AI Overviews system, which combines it with computer vision to generate a response. During tests, the feature also proved useful for detecting moving objects and describing their color, shape, and material.


By Goldy Choudhary

Goldy Choudhary serves as the Manager of Clinic Beauty Store in Raleigh, North Carolina, USA, where she leverages AI tools such as Lumen5, ChatGPT, and Gemini to drive innovation and enhance operational efficiency. With a deep-seated passion for the AI revolution, Goldy contributes to AyuTechno as a part-time author, where she plays a crucial role in content creation. Her commitment to the field of artificial intelligence is evident through her daily experiences and research, which she translates into valuable content for AyuTechno. Goldy's role is instrumental in providing readers with comprehensive insights into AI and guiding them on secure and effective usage of these technologies. Her mission is to empower individuals with knowledge and ensure they are well informed about the latest advancements in AI, reflecting her dedication to making AI accessible and safe for all.
