Google says ‘Lens’ can now search for skin conditions based on images. Here’s how

Tool will also identify clothes, food and other objects in images

Andrew Griffin
Friday 16 June 2023 01:14 EDT


Google says its “Lens” image search can now help people understand what is going on with their skin.

The tool is intended as a smart image search: users can take pictures and use them to search for whatever is in them. It has previously suggested it is useful for finding the details of the clothes that make up an outfit, for instance, or looking up certain items of food.

But Lens can also be used for looking up skin conditions or other unusual things on the body, the company suggested.

It warns that the tool is “informational only and not a diagnosis” and urges users to consult authorities for advice. But it suggested that it could be a useful way of starting to look up certain things on the body that might be otherwise hard to put into words.

“Describing an odd mole or rash on your skin can be hard to do with words alone,” Google said. “This feature also works if you’re not sure how to describe something else on your body, like a bump on your lip, a line on your nails or hair loss on your head.”

The feature was described in a more wide-ranging Google blog that focused on other more obvious uses, such as pointing the camera at a “cool building or landmark” or to translate street signs or menus.

Google said the feature was new within Lens, but did not specify when it had been released.

The company has tried to use artificial intelligence to help with skin conditions before. In 2021, it released a new tool called “DermAssist”.

Google says it sees “billions of skin-related searches each year”. DermAssist was built to assist with those, though it too includes a disclaimer indicating it is only intended “for informational purposes” and not for a medical diagnosis.

Since that DermAssist feature is more specifically focused on helping with medical conditions, it is subject to more stringent regulation. As such, Google has still only made it available in a “limited release” and asks people to sign up to be part of that testing on its website.

DermAssist requires users to answer a few questions and upload three photos. Lens, on the other hand, simply appears to use Google’s algorithms to match a picture with similar images of skin conditions and give some indication of what that condition might be.
