Google’s impressive multisearch tool will be available in a lot more languages

Google’s impressive multisearch tool, which lets you search using both an image and some text, will be expanding to more than 70 global languages in the next few months, the company announced at its Search On event on Wednesday.

Multisearch uses Google Lens to make it easier to search for things that are tricky to describe with text alone. Say you see a jacket you like but want to find it in a different color. With multisearch, you can open Google Lens in Google’s Android or iOS apps, snap a picture of the jacket, type the color you’re looking for, and search. Making multisearch available in many more languages means far more people will be able to use it; the tool initially rolled out in April as a US-only beta and is currently available globally in English.

In a GIF demonstrating the tool, Google Lens captures the print on a shirt, and the user then searches for a tie in a similar pattern. GIF: Google

However, there is a new multisearch feature on the way that will be available first in the US. At this year’s Google I/O, the company previewed what it calls “multisearch near me,” which lets you find things locally. That could be useful if you’re looking for a certain food dish that might be available at a restaurant nearby, for example. At Search On, Google announced that this feature will be coming to the US sometime this fall.

Google showed off a few other handy-looking search features as well. Starting Wednesday, Google’s iOS app will show shortcuts under the search bar that surface powerful things search can already do, like translating text with your camera or humming to search for a song. (These shortcuts will be coming to the Android app “soon,” Cathy Edwards, Google’s VP and GM of search, said in a press briefing.)

In a GIF of the feature, a set of shortcuts appears under Google’s search bar, including options like “Shop for products in your screenshots” and “Translate text with camera.” GIF: Google

The company is also introducing some new tools to help you discover more about a particular topic. For example, Google is building a feature that, as you start typing in the search box, will suggest keywords and topic options you can tap to help build out your query.

That might sound like autocomplete, but it seems like it will be a bit different in practice. You can get an idea of how it works in the GIF below. And Google can also show you information like the weather right under these suggestions. This feature will be launching in English in the US on mobile in the coming months.

In a GIF of the feature, individual words appear below the search box as the person types, suggesting ways to complete the query; these suggestions appear above the familiar autocomplete recommendations. GIF: Google

And in the actual results, Google plans to show things in a more visual way. In one example shown to the press, instead of just a list of links about Oaxaca, Google showed info boxes with things like the weather and a video taken in the city. These changes will also arrive on mobile, in English in the US, in the coming months.



Source: The Verge
