Google Lens ‘Multisearch’ Lets You Search Using Text and Images at the Same Time

BY Chandraveer Mathur

Published 7 Apr 2022

Google has finally started rolling out “multisearch” in Google Lens as a beta test on iPhone and Android. The feature grabbed headlines when it was first showcased at the Search-centric keynote address in September.

To get you up to speed, multisearch is what Google calls a "new way to search" for things you can't easily describe in words. Integrated into Lens, it lets you tweak specific search parameters until you eventually arrive at what you were looking for. The best way to explain what it does is to walk through how it works.

To use the new feature in the beta version of Lens, you start by capturing an image with the Lens camera or by importing one from the Photos app on your iPhone. Then, you can swipe up to see search results that resemble the image. To activate multisearch, tap the new Add to your search button at the top of the results screen.

After you tap the button, you can "refine your search by color, brand," and other visual attributes of the image. For instance, if your photo shows a pair of yellow heeled slippers, adding the query "flats" lets multisearch find visually similar footwear with a flat sole. The company says advancements in AI have enabled this novel search tool.

Google also highlighted other examples where multisearch can come in handy. You could take a photo of your dining set and query "coffee table" to find matching furniture, or photograph a plant and find care instructions for it in a similar manner. With multisearch, the company hopes to replace the clunky two-step workflow most people use today, where they first identify the plant or the type of footwear using Lens and then run a separate, conventional Search to find sole variations or care instructions.

The search giant says that, for now, multisearch works best with shopping and fashion-related searches. However, it probably won't handle complex search queries during the beta because it doesn't yet use the Multitask Unified Model (MUM).

To get started with multisearch on iPhone, make sure you're running the latest version of the Google app from the App Store.

[Via Google Blog]