Google's Multisearch Feature Enhanced to Show Nearby Products and Restaurants

Earlier this year, Google introduced Multisearch in Google Lens, a feature that lets users add more context to a visual search. For example, you might snap a picture of a shirt you like and use Google Lens to identify the brand, then use Multisearch to find the same shirt in other colors. By combining images and text in a single query, Google Search gains a better understanding of what you are looking for.

At Google I/O, Google enhanced Multisearch further with a "near me" option. After a user performs a search, Search can find where the searched item is available nearby. In the demo, a user looking for lunch searched for a specific dish; Search then recommended nearby restaurants serving it, complete with review scores and Google Maps integration.

Google also launched a "scene exploration" feature for Google Lens that overlays information about grocery-store products in augmented reality (AR). Simply point your smartphone camera at the products with the Google Lens app open, and review scores and nutrition information appear on screen. This surfaces important details about a product before you buy it, such as ingredients that may cause allergies, without requiring a manual search.
