Google will now let you search for things you can’t even describe

Andrew Griffin
Thursday 07 April 2022 11:41 EDT

Google will now let people search for things that they can’t even describe.

The new feature, called Google Lens Multisearch, lets people combine an image with text to describe the thing they are looking for.

If someone has a picture of a dress they like in yellow but want it in green, for instance, they can upload that picture and tell Google to search for “green”. If they have a plant they want to know how to look after but don’t know its name, they can take a picture of it and add “care instructions” to find out how it should be treated.

Google describes the tool as “an entirely new way to search”. It said that it is part of an attempt to get people to “go beyond the search box and ask questions about what you see”.

The feature is powered by Google’s updated artificial intelligence, which it says will make it easier for people to find what they are looking for. In the future, the feature could be improved by “MUM”, or Multitask Unified Model, a new technology that Google says will make searching much easier.

Google has been working on that tool for some time, and announced a number of changes built around it at a “Search On” event last year.

The feature is rolling out in beta now, and can be found in the iOS and Android versions of the app. It will only be available to users in the US.

To find it, users can just open up the app and will be presented with the option to use their camera or voice as well as text.
