
Google Lens takes visual search to a new level

By ANI | Updated: September 30, 2021 19:05 IST

With a new, more advanced AI system, Google Lens can now understand and answer questions about an image.


According to Mashable, Google has unveiled what seems like a really useful and almost scarily advanced new way to search with images.

Google Lens already lets you search based on an image. For example, if you take a picture of an elephant, you'll probably get Google Lens search results back for "elephant".

But now you can tap a picture you've taken, or one that you've saved in your library, and ask a question about it.

Take the elephant: just tap the photo, choose "Add Questions", and a text box pops up where you can ask Google for more information about that specific image, like "What kind of elephant is this?" or "How many of these elephants are left in the world?"

That involves so many layers of AI processing it's actually hard to comprehend.

It understands what's in the picture, it understands your question, and it understands how your question relates to the picture. And, of course, it (ostensibly) gives you the answers you're looking for.

Making it all possible is a new, more advanced AI system called Multitask Unified Model (or MUM), announced in May, that is beginning to power Search.
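Google hasn't published MUM's internals, but the underlying idea, answering a natural-language question about a specific image, is known as visual question answering. The sketch below is not Google's Lens or MUM code; it uses an openly available ViLT model through the Hugging Face transformers library purely to illustrate how a photo and a typed question can be combined into a single query (the model name and the local file "elephant.jpg" are illustrative choices).

```python
# Minimal visual question answering (VQA) sketch: combine an image and a
# natural-language question into one query. Illustrative only; this is an
# open-source model, not Google's MUM.
from transformers import pipeline

# Load a publicly available VQA model.
vqa = pipeline("visual-question-answering",
               model="dandelin/vilt-b32-finetuned-vqa")

# Ask a question about a local photo, echoing the elephant example above.
results = vqa(image="elephant.jpg",
              question="What kind of elephant is this?")

# The pipeline returns candidate answers ranked by confidence.
for candidate in results:
    print(candidate["answer"], round(candidate["score"], 3))
```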

Google has been slowly rolling out applications for the new tech that's capable of processing queries in more complex ways, and delivering results that Google believes will be more relevant or instructive than before.

The change to Lens is one of the most eye-popping yet, and the example queries Google provides show just how smart MUM is. You can ask Google Lens about a pattern on a shirt, and whether that same pattern is available in socks: et voila, you get the exact product listing.

Or take the example of someone snapping a picture of a broken bicycle component. Showing Google Lens the photo and asking "how to fix" returns both the name of the exact broken part and instructions for repairing it.

Both of these sorts of questions would be difficult to answer without the visual component. For the pattern, a text-only search for, say, socks in a floral pattern would likely return non-specific results.

And for the bike example, you'd probably have to figure out what the specific bike part that's broken is called before you could even start worrying about how to fix it.

The machines are getting smarter, which is a possibly scary prospect for the world, but great news for finding answers to tricky questions.

( With inputs from ANI )

