Google Search’s recently introduced multisearch feature is about to get way more useful


Local multisearch and scene exploration will help you up your Lens searching game

Source: Google

Search was Google’s very first product when the company was born in 1998, and it remains its most used to date, serving billions of people and processing countless queries every day. It’s also one of the services Google is constantly working to improve. Google I/O 2022 has kicked off, and Google has announced a handful of changes to its most recent search innovation, multisearch, that should make your searching and online shopping experience massively better.

Just a month ago, Google introduced us to the concept of multisearch. Leveraging Lens’ image recognition abilities, multisearch lets you search with a picture and then refine the query with text for extra context, so you can steer your search in the right direction. While it hasn’t reached everyone yet, it’s already getting significant improvements at Google’s developer conference.


Things are going local with local multisearch

Multisearch was pitched as the virtual equivalent of pointing to something and asking a friend if they’ve seen it anywhere. But a friend would likely let you know if they saw it at a store nearby rather than pointing you to a link to buy it online. While online shopping is more popular than ever, you might still want to check out some things in the flesh, so Lens wants to do the same thing your friend would do.

When using multisearch, you’ll now be able to search for local information across Google’s vast catalog of millions of businesses. Just search with a picture or screenshot of what you’re looking for and add “near me” as the text context, and Google will try to find it at businesses physically close to you.

This is great if you want to try on a dress or suit before purchasing, or if you’ve spotted a meal or dessert that looks tasty but don’t know what it’s called. Now, you can use Lens to see if any places near you sell what you want, and walk or drive over to check it out yourself.

Explore your surroundings with scene exploration

Source: Google

Currently, you can use Lens to recognize a single object in a frame and see if Google can find it in some corner of the internet. While that’s already massively useful, especially with multisearch and local multisearch, if you want to identify multiple things, you have to scan them one by one, and that can quickly get tiring. That’s why Google is adding a whole new angle to multisearch with scene exploration.


Scene exploration will let you pan your camera across a wider scene and get context about multiple objects in the frame at once. One example Google cites of when this could come in handy: if you’re shopping for a chocolate bar without nuts for a friend and don’t know exactly which one to buy, you can scan the entire candy shelf to find the right one.


Local multisearch will be available later this year in English, and you can expect it to roll out to more languages from 2023 onwards. As for scene exploration, Google isn’t giving a timeline for its release, only saying that it hopes to roll it out in the future.

