Google’s New ‘Multisearch’ Features Suggest Future Of AR Glasses – TechCrunch

In April, Google introduced a new “multisearch” feature that offered a way to search the web using both text and images. Today at Google’s I/O developer conference, the company announced an extension to this feature, called “Multisearch Near Me.” The addition, coming later in 2022, will let Google app users combine an image or screenshot with the text “near me” to be directed to local retailers or restaurants that carry the clothing, household items, or food they’re looking for. Google also announced an upcoming development for multisearch that appears to be built with AR glasses in mind: the ability to visually search for multiple objects in a scene based on what you’re currently “seeing” through a smartphone camera’s viewfinder.

With the new “near me” multisearch query, you’ll be able to find local options related to your current combination of visual and text search. For example, if you were working on a DIY project and came across a part you needed to replace, you could take a picture of the part with your phone’s camera to identify it, then find a local hardware store that has a replacement in stock.

It’s not that different from the way multisearch already works, Google says; it’s just adding the local component.

Image Credits: Google

The idea with multisearch was originally to allow users to ask questions about an object in front of them and narrow down the results by color, brand, or other visual attributes. The feature works best with shopping searches today, as it lets users narrow down product searches in a way that standard text-based web searches can sometimes struggle to manage. For example, a user could take a photo of a pair of sneakers, then add text asking to see them in blue, so they’re only shown shoes in the specified color. They could then visit the website for the sneakers and purchase them right away. The extension to include the “near me” option simply limits the results in order to point users to a local retailer where the given product is available.

In terms of helping users find local restaurants, the feature works much the same way. In this case, a user could search based on a photo they found on a food blog or elsewhere on the web to find out what the dish is and which local restaurants might have it on their menu for dine-in, pickup, or delivery. Here, Google Search combines the image with the intent that you’re looking for a nearby restaurant, then scans millions of images, reviews, and community contributions on Google Maps to find the local spot.

The new “near me” feature will be available globally in English and will roll out to more languages over time, Google said.

The most interesting addition to multisearch is the ability to search within a scene. Going forward, Google says, users will be able to pan their camera around and learn more about multiple objects within that larger scene.

Google suggests the feature could be used to scan the shelves of a bookstore and then see several useful pieces of information superimposed in front of you.

Image Credits: Google

“To make this possible, we bring together not only computer vision and natural language understanding, but we also combine that with knowledge of the web and on-device technology,” said Nick Bell, senior director of Google Search. “So the possibilities and capabilities of that are going to be huge and significant,” he noted.

The company, which came to the AR market early with its Google Glass release, hasn’t confirmed that it has some sort of new AR glasses-like device in the works, but it has hinted at the possibility.

“With AI systems now, what’s possible today — and will be in the next few years — kind of opens up a lot of opportunities,” Bell said. In addition to voice search, desktop and mobile search, the company believes that visual search will also be a bigger part of the future, he noted.

“There are 8 billion visual searches on Google with Lens each month now, and that number is three times greater than just a year ago,” Bell continued. “What we’re certainly seeing from people is that the appetite and the desire to search visually is there. And what we’re trying to do now is look at the use cases and identify where it’s most useful,” he said. “I think when we think about the future of search, visual search is definitely a key part of that.”

The company, of course, is reportedly working on a secret project, codenamed Project Iris, to build a new AR headset with a slated release date of 2024. It’s easy to imagine not only how this scene-scanning capability might work on such a device, but also how any sort of image-plus-text (or voice!) search function might be used on an AR headset. Imagine looking at that pair of sneakers you liked again, for example, then asking a device to point you to the nearest store where you could make the purchase.

“Looking further out, this technology could be used beyond day-to-day needs to help address societal challenges, such as helping conservationists identify plant species that need protection, or aiding disaster relief workers in quickly sorting through donations in times of need,” suggested Prabhakar Raghavan, senior vice president of Google Search, speaking on stage at Google I/O.

Unfortunately, Google didn’t offer a timeline for when it expected to get the scene scanning capability into users’ hands, as the feature is still “in development.”

"Read


Leave a Reply

Your email address will not be published.