As this example shows, Google's new multisearch capability, powered by its evolving AI and machine learning tools, lets searchers add extra parameters to visual search queries, refining their results and homing in on exactly what they're looking for.
As per Google:
“With multisearch, you can ask a question about an object in front of you or refine your search by color, brand or a visual attribute.”
That could provide a whole new way to improve your shopping search matches, using visual cues to sharpen your results. And with people shopping across Google more than a billion times a day, the platform remains a key facilitator of eCommerce discovery, which Google is also looking to enhance with 'more browsable search results' for fashion and apparel shopping queries.
Visual search qualifiers could also become a much bigger element in the future, as AR glasses and other visual tools come into play. As people grow more accustomed to using visual references, and to capturing visuals via their glasses, the capacity to add those same elements into a search could become a far more significant discovery tool.
It's interesting to see Google getting ahead of this shift, while also adding immediate utility for refining search queries and leaning into the continued growth of online shopping.
Multisearch is now available as a beta feature in the Google app for English language searches in the US.