During a recent Paris showcase, Google announced a new update to Google Lens coming to Android mobile devices. "In the coming months," Google will let anyone using Lens on Android search anything displayed on their screen. As Google put it during the presentation: "If you can see it, you can search it".
In the coming months, we’re introducing a ✨ major update ✨ to help you search what’s on your mobile screen.
You’ll soon be able to use Lens through Assistant to search what you see in photos or videos across websites and apps on Android. #googlelivefromparis pic.twitter.com/UePB421wRY
— Google Europe (@googleeurope) February 8, 2023
Users will be able to search for a building name, a food recipe, a car model, or any on-screen image that contains searchable information. The feature will also work across websites and apps, so you won’t need to leave the screen to perform a Lens search.
At the same event, Google announced an update to its Multisearch feature, which lets users perform an image search and add text to refine the results, specifying the kind of results they’re looking for alongside an image.
Multisearch is now live globally! Try out this new way to search with images and text at the same time.
So if you see something you like, but want it in a different style, colour or fit, just snap or upload a photo with Lens then add text to find it. #googlelivefromparis pic.twitter.com/4yT6voiJkn
— Google Europe (@googleeurope) February 8, 2023
In the tweet above, Google demonstrates how you can perform an image search with Lens and then refine the results by adding text, such as specifying a different style.
At the same event, Google also unveiled Bard, its ChatGPT rival, which will soon be featured in Search. Just this week, Microsoft announced AI-powered enhancements coming to Bing and the Edge browser.
Author: Enrique
Source: GSMArena