With our next generation of AI-powered technology, we’re making it more visual, natural and intuitive to explore information.
Our products at Google have a singular goal: to be as helpful to you as possible, in moments big and small. And we’ve long believed that artificial intelligence can supercharge how we deliver on that goal.
Since the early days of Search, AI has helped us understand language, making results more helpful. Over the years, we’ve deepened our investment in AI and can now understand information in its many forms — from language to images, video and even the real world.
Today, we’re sharing a few new ways we’re applying our advancements in AI to make exploring information even more natural and intuitive.
If you can see it, you can search it
Cameras have become a powerful way to explore and understand the world around you. In fact, Lens is now used more than 10 billion times per month as people search what they see using their camera or images.
With Lens, we want to connect you to the world’s information, one visual at a time. You can already use Lens to search from your camera or photos, right from the Search bar. Now, we’re introducing a major update to help you search what’s on your mobile screen.
In the coming months, you’ll be able to use Lens to “search your screen” on Android globally. With this technology, you can search what you see in photos or videos across the websites and apps you know and love, like messaging and video apps — without having to leave the app or experience.
Say your friend sends you a message with a video of them exploring Paris. If you want to learn more about the landmark you spot in the background, you can simply long-press the power or home button on your Android phone (which invokes your Google Assistant) and then tap “search screen.” Lens identifies it as Luxembourg Palace — and you can tap to learn more.
Mix and match ways to search
With multisearch, you can search with a picture and text at the same time — opening up entirely new ways to express yourself. Today, multisearch is available globally on mobile, in all languages and countries where Lens is available.
We recently took multisearch even further by adding the ability to search locally. You can take a picture and add “near me” to find what you need, whether you’re looking to support neighborhood businesses or just need to find something in a hurry. This is currently available in English in the U.S., and in the coming months, we’ll be expanding globally.
And sometimes, you might already be searching when you find something that catches your eye and inspires you. In the next few months, you’ll be able to use multisearch globally on any image you see on the search results page on mobile.
For example, you might be searching for “modern living room ideas” and see a coffee table that you love, but you’d prefer it in another shape — say, a rectangle instead of a circle. You’ll be able to use multisearch to add the text “rectangle” to find the style you’re looking for.
We’re creating search experiences that are more natural and visual — but we’ve only scratched the surface. In the future, with the help of AI, the possibilities will be endless.
By: Elizabeth Reid
Source: Google Blog