
You can now use your phone to ID skin conditions

Dr Google will examine your rash and suggest matching conditions and treatments

While it’s not at all intended to replace medical screening, the visual search function of Google Lens has moved beyond identifying plants and birds and can now act as a preliminary skin-check tool.

Simply take a snap in the Google app, or upload an earlier shot from your library, and it will return image-based links that best match your photo. Sure, it’s not faultless – when we tried it out, it suggested a slightly raised mole was a wart – but it does mean you can now look up something you may not have the right words for in order to narrow down search results.

And, let’s face it, no-one wants to trawl through pages and pages of images of skin conditions looking for their own specific issue. This certainly takes some of the pain out of the process.

“Describing an odd mole or rash on your skin can be hard to do with words alone,” said Google in a statement. “Fortunately, there’s a new way Lens can help, with the ability to search skin conditions that are visually similar to what you see on your skin.

“This feature also works if you're not sure how to describe something else on your body, like a bump on your lip, a line on your nails or hair loss on your head.”

While it’s certainly not meant to be used instead of professional diagnosis and treatment, it can provide a lot more information about any concerns, particularly if seeing a doctor is difficult, and it offers a degree of privacy around the kinds of minor conditions many people are otherwise reluctant to seek help for.


However, activity is saved to the cloud, so if you're sensitive about medical history being archived in this domain, you’ll need to turn off Google Lens saves in the Web & App Activity section of your Google Account first.

The company also announced that Google Lens will be integrated into generative AI chatbot Bard, which will allow for real-time feedback on image prompts.

Google Lens already assists with visual cues such as translating foreign street signs, directions and menus. Among the new features: upload an image of a dish or food item with a “near me” prompt, and it will return a list (with pictures) of local spots where you can potentially find it.

The Google Lens app is available on Android, while iOS users can access it through the Google app.

For more on how Google Lens works, check out this video from the archives.

How Google Lens helps you search what you see | Search

Source: Google

2 comments
Jay Gatto
I've given up on the GPs, maybe this AI app has more knowledgeability.
Gill Picard
For people with hearing problems, the above video was horrible. The speech is very hard to understand through the overbearing pseudo music. I had to guess at much of it and hope that I got it right.