Tag Archives: Google Lens

Seniors search what they see, using a new Lens

Technology shines when it helps us get things done in our daily lives, and that’s exactly why a group of around 100 eager seniors gathered in Odense, Denmark. All older than 65, and many as old as 85, they had decided to stay on top of the latest technological tricks and tools. On this March day, the eye-opener was the often overlooked potential of searching for information with visual tools like Google Lens.

Now the seniors searched their surroundings directly: they scanned trees, plants, animals and buildings; used Translate to make sense of Turkish menus and Japanese sayings; and looked up product information by scanning barcodes.

The group was taking part in a training session set up by Faglige Seniorer, an organization representing 300,000 seniors. It first partnered with Google back in 2019 to train seniors in searching by voice, and now the time had come to search with live images.

“Often, when I go for a walk, I stumble upon an unknown flower or a tree. Now I can just take a picture to discover what kind of plant I am standing before,” Verner Madsen, one of the participants, remarked. “I don’t need to bring my encyclopedia. It is really smart and helpful.”

Seniors in a country like Denmark are generally very tech savvy, but with digitization constantly advancing — accelerating even faster during two years of COVID-19 — some seniors risk being left behind, creating gaps between generations. During worldwide lockdowns, technological tools have helped seniors stay connected with their family and friends, and smartphone features have helped improve everyday life. One key element of that is delivering accurate and useful information when needed. And for that, typed words on a smartphone keyboard can often be substituted with a visual search, using a single tap on the screen.

Being able to "search what you see" in this way was an eye-opener to many. As the day ended, another avid participant, Henrik Rasmussen, declared he was heading straight home to continue his practice.

“I thought I was up to speed on digital developments, but after today I realize that I still have a lot to learn and discover,” he said.

Search your world, any way and anywhere

People have always gathered information in a variety of ways — from talking to others, to observing the world around them, to, of course, searching online. Though typing words into a search box has become second nature for many of us, it’s far from the most natural way to express what we need. For example, if I’m walking down the street and see an interesting tree, I might point to it and ask a friend what species it is and if they know of any nearby nurseries that might sell seeds. If I were to express that question to a search engine just a few years ago… well, it would have taken a lot of queries.

But we’ve been working hard to change that. We've already started on a journey to make searching more natural. Whether you're humming the tune that's been stuck in your head, or using Google Lens to search visually (which now happens more than 8 billion times per month!), there are more ways to search and explore information than ever before.

Today, we're redefining Google Search yet again, combining our understanding of all types of information — text, voice, visual and more — so you can find helpful information about whatever you see, hear and experience, in whichever ways are most intuitive to you. We envision a future where you can search your whole world, any way and anywhere.

Find local information with multisearch

The recent launch of multisearch, one of our most significant updates to Search in several years, is a milestone on this path. In the Google app, you can search with images and text at the same time — similar to how you might point at something and ask a friend about it.

Now we’re adding a way to find local information with multisearch, so you can uncover what you need from the millions of local businesses on Google. You’ll be able to use a picture or screenshot and add “near me” to see options for local restaurants or retailers that have the apparel, home goods and food you’re looking for.

An animation of a phone showing a search. A photo is taken of Korean cuisine, then Search scans it for restaurants near the user that serve it.

Later this year, you’ll be able to find local information with multisearch.

For example, say you see a colorful dish online you’d like to try – but you don’t know what’s in it, or what it’s called. When you use multisearch to find it near you, Google scans millions of images and reviews posted on web pages, and from our community of Maps contributors, to find results about nearby spots that offer the dish so you can go enjoy it for yourself.

Local information in multisearch will be available globally later this year in English, and will expand to more languages over time.

Get a more complete picture with scene exploration

Today, when you search visually with Google, we’re able to recognize objects captured in a single frame. But sometimes, you might want information about a whole scene in front of you.

In the future, with an advancement called “scene exploration,” you’ll be able to use multisearch to pan your camera and instantly glean insights about multiple objects in a wider scene.

In the future, “scene exploration” will help you uncover insights across multiple objects in a scene at the same time.

Imagine you’re trying to pick out the perfect candy bar for your friend who’s a bit of a chocolate connoisseur. You know they love dark chocolate but dislike nuts, and you want to get them something of quality. With scene exploration, you’ll be able to scan the entire shelf with your phone’s camera and see helpful insights overlaid in front of you. Scene exploration is a powerful breakthrough in our devices’ ability to understand the world the way we do – so you can easily find what you’re looking for – and we look forward to bringing it to multisearch in the future.

These are some of the latest steps we’re taking to help you search any way and anywhere. But there’s more we’re doing, beyond Search. AI advancements are helping bridge the physical and digital worlds in Google Maps, and making it possible to interact with the Google Assistant more naturally and intuitively. To ensure information is truly useful for people from all communities, it’s also critical for people to see themselves represented in the results they find. Underpinning all these efforts is our commitment to helping you search safely, with new ways to control your online presence and information.

Go beyond the search box: Introducing multisearch

How many times have you tried to find the perfect piece of clothing, a tutorial to recreate nail art or even instructions on how to take care of a plant someone gifted you — but you didn’t have all the words to describe what you were looking for?

At Google, we’re always dreaming up new ways to help you uncover the information you’re looking for — no matter how tricky it might be to express what you need. That’s why today, we’re introducing an entirely new way to search: using text and images at the same time. With multisearch in Lens, you can go beyond the search box and ask questions about what you see.

Let’s take a look at how you can use multisearch to help with your visual needs, including style and home decor questions. To get started, simply open up the Google app on Android or iOS, tap the Lens camera icon and either search one of your screenshots or snap a photo of the world around you, like the stylish wallpaper pattern at your local coffee shop. Then, swipe up and tap the "+ Add to your search" button to add text.

Multisearch allows people to search with both images and text at the same time.

With multisearch, you can ask a question about an object in front of you or refine your search by color, brand or a visual attribute. Give it a go yourself by using Lens to:

  • Screenshot a stylish orange dress and add the query “green” to find it in another color
  • Snap a photo of your dining set and add the query “coffee table” to find a matching table
  • Take a picture of your rosemary plant and add the query “care instructions”

All this is made possible by our latest advancements in artificial intelligence, which make it easier to understand the world around you in more natural and intuitive ways. We’re also exploring ways in which this feature might be enhanced by MUM – our latest AI model in Search – to improve results for all the questions you could imagine asking.

This is available as a beta feature in English in the U.S., with the best results for shopping searches. Try out multisearch today in the Google app, the best way to search with your camera, voice and now text and images at the same time.

Here’s how online shoppers are finding inspiration

People shop across Google more than a billion times a day — and we have a pretty good sense of what they’re browsing for. For instance, our Search data shows that the early 2000s are having a moment. We’re seeing increased search interest in “Y2K fashion” and products like bucket hats and ankle bracelets. Also popular? The iconic Clinique “Happy” perfume, Prada crochet bags and linen pants.

While we know what’s trending, we also wanted to understand how people find inspiration when they’re shopping for lifestyle products. So we surveyed 2,000 U.S. shoppers of apparel, beauty and home decor for our first Inspired Shopping Report. Read on to find out what we learned.

Shopping isn’t always a checklist

According to our findings, most fashion, beauty and home shoppers spend up to two weeks researching products before they buy them. Many, though, are shopping online just for fun — 65% say they often or sometimes shop or browse online when they’re not looking for anything in particular. To help make online shopping even easier and more entertaining, we recently added more browsable search results for fashion and apparel shopping queries. So when you search for chunky loafers, a lime green dress or a raffia bag on Google, you’ll scroll through a visual feed with various colors and styles — alongside other helpful information like local shops, style guides and videos.

Phone screens show animations of a Google search for various clothing items with visual scrolling results

Apparel queries on Search show a more visual display of products

Inspiration can strike anywhere

We know shopping inspiration can strike at any moment. In fact, 60% of shoppers say they often or sometimes get inspired or prompted to buy something even when they aren’t actively shopping. That can come from spotting great street style: 39% of shoppers say they often or sometimes look for a specific outfit online after they see someone wearing it. Or it can come from browsing online: 48% of shoppers have taken a screenshot of a piece of clothing, accessory or home decor item they liked (and 70% of them say they’ve searched for or bought it afterwards). Google Lens can help you shop for looks as soon as you spot them. Just snap a photo or screenshot and you’ll find exact or similar results to shop from.

Sometimes words aren’t enough

We know it can be hard to find what you’re looking for using words alone, even when you do have an image — like that multi-colored, metallic floral wallpaper you took a photo of that would go perfectly with your living room rug. Half of shoppers say they often or sometimes have failed to find a specific piece of clothing or furniture online after trying to describe it with just words. And 66% of shoppers wished they could find an item in a different color or print.

To help you track down those super specific pieces, we’re introducing an entirely new way to search — using text and images at the same time. With multisearch on Lens, you can better uncover the products you’re looking for even when you don’t have all the words to describe them. For example, you might be on the lookout for a scarf in the same pattern as one of your handbags. Just snap a photo of the patterned handbag on Lens and add the query “scarf” to complete your look. Or take a photo of your favorite heels and add the query “flats” to find a more comfortable version.

Phone screen shows the ability to search for a flat version of a pair of yellow high heels, using text and images at the same time.

With multisearch on Lens, you can search with both images and text at the same time

Trying before you buy matters

It’s not always possible to make it to the store and try something on before you buy it — but it matters. Among online beauty shoppers, more than 60% have decided not to purchase a beauty or cosmetic item online because they didn’t know what color or shade to choose, and 41% have decided to return an item because it was the wrong shade. With AR Beauty experiences, you can virtually discover and “try on” thousands of products from brands like Maybelline New York, M.A.C. and Charlotte Tilbury — helping you make more informed decisions. And now, shoppers can try on cosmetics from a variety of brands carried at Ulta Beauty right in Google Search. Just search for a product, like the Morphe Matte Liquid Lipstick or Kylie Cosmetics High Gloss, and find the best shade for you.

Phone screens show animations of models virtually trying on various lipstick and eyeshadow shades.

Google’s AR Beauty experience features products from Ulta Beauty

No matter where you find your shopping inspiration, we hope these features and tools help you discover new products, compare different options and ultimately make the perfect purchase.

5 tips to finish your holiday shopping with Chrome

We’re coming down to the wire with holiday shopping, and many of us are frantically searching online for last-minute stocking stuffers. Luckily, a few new features are coming to Chrome that will make these final rounds of shopping easier — helping you keep track of what you want to buy and finally hit "order."

Here are five ways to use Chrome for a stress-free shopping experience.

1. Keep track of price drops: Are you waiting for a good deal on that pair of headphones, but don’t have time to constantly refresh the page? A new mobile feature, available this week on Chrome for Android in the U.S., will show an item’s updated price right in your open tabs grid so you can easily see if and when the price has dropped. This same feature will launch on iOS in the coming weeks.

Screenshot showing a grid of four tabs in Chrome. Two tabs are product pages and show a price drop on top of the tab preview, highlighted in green.

2. Search with a snapshot from the address bar: If something catches your eye while you’re out window shopping, you can now search your surroundings with Google Lens in Chrome for Android. From the address bar, tap the Lens icon and start searching with your camera.

Coming soon, you’ll also be able to use Lens while you’re browsing in Chrome on your desktop. If you come across a product in an image and want to find out what it is, just right-click and select the “Search images with Google Lens” option.

3. Rediscover what’s in your shopping cart: You know you have items in your shopping cart, but you can't remember where exactly. No need to search all over again. Starting with Chrome on Windows and Mac in the U.S., you can now open up a new tab and scroll to the “Your carts” card to quickly see any site where you’ve added items to a shopping cart. Some retailers, like Zazzle, iHerb, Electronic Express and Homesquare, might even offer a discount when you come back to check out.

4. Get passwords off your plate: Don’t worry about setting up and remembering your account details for your favorite shopping sites. Chrome can help create unique, secure passwords and save your login details for future visits.

5. Simplify the checkout process: By saving your address and payment information with Autofill, Chrome can automatically fill out your billing and shipping details. And when you enter info into a new form, Chrome will ask if you’d like to save it.

How AI is making information more useful

Today, there’s more information accessible at people’s fingertips than at any point in human history. And advances in artificial intelligence will radically transform the way we use that information, with the ability to uncover new insights that can help us both in our daily lives and in the ways we are able to tackle complex global challenges.


At our Search On livestream event today, we shared how we’re bringing the latest in AI to Google’s products, giving people new, more natural and intuitive ways to search and explore information.


Making multimodal search possible with MUM

Earlier this year at Google I/O, we announced we’ve reached a critical milestone for understanding information with Multitask Unified Model, or MUM for short.


We’ve been experimenting with using MUM’s capabilities to make our products more helpful and enable entirely new ways to search. Today, we’re sharing an early look at what will be possible with MUM. 


In the coming months, we’ll introduce a new way to search visually, with the ability to ask questions about what you see. Here are a couple of examples of what will be possible with MUM.




With this new capability, you can tap on the Lens icon when you’re looking at a picture of a shirt, and ask Google to find you the same pattern — but on another article of clothing, like socks. This helps when you’re looking for something that might be difficult to describe accurately with words alone. You could type “white floral Victorian socks,” but you might not find the exact pattern you’re looking for. By combining images and text into a single query, we’re making it easier to search visually and express your questions in more natural ways. 



Some questions are even trickier: Your bike has a broken thingamajig, and you need some guidance on how to fix it. Instead of poring over catalogs of parts and then looking for a tutorial, the point-and-ask mode of searching will make it easier to find the exact moment in a video that can help.


Helping you explore with a redesigned Search page

We’re also announcing how we’re applying AI advances like MUM to redesign Google Search. These new features are the latest steps we’re taking to make searching more natural and intuitive.


First, we’re making it easier to explore and understand new topics with “Things to know.” Let’s say you want to decorate your apartment, and you’re interested in learning more about creating acrylic paintings.



If you search for “acrylic painting,” Google understands how people typically explore this topic, and shows the aspects people are likely to look at first. For example, we can identify more than 350 topics related to acrylic painting, and help you find the right path to take.


We’ll be launching this feature in the coming months. In the future, MUM will unlock deeper insights you might not have known to search for — like “how to make acrylic paintings with household items” — and connect you with content on the web that you wouldn’t have otherwise found.

Second, to help you further explore ideas, we’re making it easy to zoom in and out of a topic with new features to refine and broaden searches. 


In this case, you can learn more about specific techniques, like puddle pouring, or art classes you can take. You can also broaden your search to see other related topics, like other painting methods and famous painters. These features will launch in the coming months.


Third, we’re making it easier to find visual inspiration with a newly designed, browsable results page. If puddle pouring caught your eye, just search for “pour painting ideas” to see a visually rich page full of ideas from across the web, with articles, images, videos and more that you can easily scroll through.

This new visual results page is designed for searches that are looking for inspiration, like “Halloween decorating ideas” or “indoor vertical garden ideas,” and you can try it today.

Get more from videos

We already use advanced AI systems to identify key moments in videos, like the winning shot in a basketball game, or steps in a recipe. Today, we’re taking this a step further, introducing a new experience that identifies related topics in a video, with links to easily dig deeper and learn more. 


Using MUM, we can even show related topics that aren’t explicitly mentioned in the video, based on our advanced understanding of information in the video. In this example, while the video doesn’t say the words “macaroni penguin’s life story,” our systems understand that the topics covered in the video relate to it, like how macaroni penguins find their family members and navigate predators. The first version of this feature will roll out in the coming weeks, and we’ll add more visual enhancements in the coming months.


Across all these MUM experiences, we look forward to helping people discover more web pages, videos, images and ideas that they may not have come across or otherwise searched for. 


A more helpful Google

The updates we’re announcing today don’t end with MUM, though. We’re also making it easier to shop from the widest range of merchants, big and small, no matter what you’re looking for. And we’re helping people better evaluate the credibility of information they find online. Plus, for the moments that matter most, we’re finding new ways to help people get access to information and insights. 


All this work not only helps people around the world, but creators, publishers and businesses as well.  Every day, we send visitors to well over 100 million different websites, and every month, Google connects people with more than 120 million businesses that don't have websites, by enabling phone calls, driving directions and local foot traffic.


As we continue to build more useful products and push the boundaries of what it means to search, we look forward to helping people find the answers they’re looking for, and inspiring more questions along the way.


Posted by Prabhakar Raghavan, Senior Vice President




Rediscover your city through a new Lens this summer

With warmer weather upon us and many places reopening in the U.K., it’s the perfect time to go out and reconnect with your surroundings. Whether it’s soaking up that panoramic view of a city skyline that you’ve really missed, or wondering what that interesting tree species was that you pass every day on your park walk, many of us feel ready to reconnect with our cities in new ways.


British cities are especially ripe for rediscovery. As the country emerges from a long lockdown and people start to reintegrate with their cities this summer, we’re launching a campaign called Behind the Lens with Google Pixel, which aims to help people rediscover their cities using Google Lens on Pixel. We’ll do that through a series of events over the coming weeks, alongside some very special guests in London, Bristol and Liverpool.


Vibrant orange and purple flower shown on a Google Pixel 5 using Google Lens, which has identified the flower as a bird of paradise. The result shows information about the plant: “Strelitzia reginae, commonly called a crane flower or bird of paradise, is a genus of perennial plants, native to South Africa…”

Vibrant orange and purple flower shown on a Google Pixel 5 using Google Lens, which has identified the flower as a Bird of Paradise.

Behind the Lens with Google Pixel encourages people to search what they see using the magic of Lens, and rediscover some forgotten pockets of their city using its updated features. Identifying the species of that bird you keep seeing in the communal gardens of London has never been easier, while discovering new, secret ingredients at a farmer’s market in Liverpool can also be done in a snap. Or, perhaps you’ve always wanted to know more about that forgotten landmark from a viewpoint in Bristol. Lens can give you on-the-spot information about a subject with a single long tap on the Pixel camera viewfinder, which is handy since we often have our cameras open and ready to capture the moment. 


With restrictions being lifted in the U.K. this summer, Search trends reveal an opportunity to rediscover our cities through the interests we acquired over lockdown. From March 23, 2020 through April 21, 2021, Google searches for new skills and classes climbed steadily: “hiking trails near me” (+200%), “online gardening courses” (+300%) and “online cooking classes” (+800%).


This suggests not only that some of the hobbies the nation nurtured during lockdown are still very much of interest, but also that people can now rediscover them against the backdrop of their city, alongside their communities and friends.


Within Google Lens, the Places filter is selected and the view is showing a clock tower against a bright, cloudy sky. Lens identifies the clock tower as Big Ben and gives results, including a star rating, two alternative views of the tower and an option to search Google.

Within Google Lens, the Places filter is selected and the view is showing a clock tower against a bright, cloudy sky.

A new tool for rediscovery


Google Lens is now used over three billion times per month by people around the world, and with many ready to explore this summer and rediscover their cities, we’re officially launching the new Places filter in Lens. Now available globally, the Places filter makes it easy to identify buildings and landmarks using your phone camera, combining 3D models from Google Earth and Lens’ powerful image recognition technology to create an in-depth, real-time AR experience, similar to Live View on Google Maps.


The Google Lens app Places filter is open on a black Google Pixel 5, showing a view that scans the River Thames and settles on a large bridge with two towers. Upon identification of the structure as Tower Bridge, Lens results show the star rating, alternative images of Tower Bridge to scroll through, and the option to search Google for more information.

The Google Lens app Places filter is open on a Google Pixel 5, showing a view that scans the River Thames and settles on a large bridge with two towers.

Just open the Google app on your phone and tap the camera icon in the search bar to open Lens. Then, switch to the Places filter and point your camera at notable places around you.


We hope Lens makes rediscovering and learning about your city even more enjoyable.


Search, explore and shop the world’s information, powered by AI

AI advancements push the boundaries of what Google products can do. Nowhere is this clearer than at the core of our mission to make information more accessible and useful for everyone.

We've spent more than two decades developing not just a better understanding of information on the web, but a better understanding of the world. Because when we understand information, we can make it more helpful — whether you’re a remote student learning a complex new subject, a caregiver looking for trusted information on COVID vaccines or a parent searching for the best route home.

Deeper understanding with MUM

One of the hardest problems for search engines today is helping you with complex tasks — like planning what to do on a family outing. These often require multiple searches to get the information you need. In fact, we find that it takes people eight searches on average to complete complex tasks.

With a new technology called Multitask Unified Model, or MUM, we're able to better understand much more complex questions and needs, so in the future, it will require fewer searches to get things done. Like BERT, MUM is built on a Transformer architecture, but it’s 1,000 times more powerful and can multitask in order to unlock information in new ways. MUM not only understands language, but also generates it. It’s trained across 75 different languages and many different tasks at once, allowing it to develop a more comprehensive understanding of information and world knowledge than previous models. And MUM is multimodal, so it understands information across text and images and in the future, can expand to more modalities like video and audio.

Imagine a question like: “I’ve hiked Mt. Adams and now want to hike Mt. Fuji next fall, what should I do differently to prepare?” This would stump search engines today, but in the future, MUM could understand this complex task and generate a response, pointing to highly relevant results to dive deeper. We’ve already started internal pilots with MUM and are excited about its potential for improving Google products.

Information comes to life with Lens and AR

People come to Google to learn new things, and visuals can make all the difference. Google Lens lets you search what you see — from your camera, your photos or even your search bar. Today we’re seeing more than 3 billion searches with Lens every month, and an increasingly popular use case is learning. For example, many students might have schoolwork in a language they aren't very familiar with. That’s why we’re updating the Translate filter in Lens so it’s easy to copy, listen to or search translated text, helping students access education content from the web in over 100 languages.

Animated GIF showing Google Lens’s Translate filter applied to homework.

AR is also a powerful tool for visual learning. With the new AR athletes in Search, you can see signature moves from some of your favorite athletes in AR — like Simone Biles’s famous balance beam routine.

Animated GIF showing Simone Biles’s balance beam routine surfaced by the AR athletes in Search feature.

Evaluate information with About This Result 

Helpful information should be credible and reliable, and especially during moments like the pandemic or elections, people turn to Google for trustworthy information. 

Our ranking systems are designed to prioritize high-quality information, but we also help you evaluate the credibility of sources, right in Google Search. Our About This Result feature provides details about a website before you visit it, including its description, when it was first indexed and whether your connection to the site is secure. 

Animated GIF showing the About This Result feature applied to the query "How to invest in ETFs."

This month, we’ll start rolling out About This Result to all English results worldwide, with more languages to come. Later this year, we’ll add even more detail, like how a site describes itself, what other sources are saying about it and related articles to check out. 

Exploring the real world with Maps

Google Maps transformed how people navigate, explore and get things done in the world — and we continue to push the boundaries of what a map can be with industry-first features like AR navigation in Live View at scale. We recently announced we’re on track to launch over 100 AI-powered improvements to Google Maps by the end of the year, and today, we’re introducing a few of the newest ones. Our new routing updates are designed to reduce the likelihood of hard-braking on your drive using machine learning and historical navigation information — which we believe could eliminate over 100 million hard-braking events in routes driven with Google Maps each year.

If you’re looking for things to do, our more tailored map will spotlight relevant places based on time of day and whether or not you’re traveling. Enhancements to Live View and detailed street maps will help you explore and get a deep understanding of an area as quickly as possible. And if you want to see how busy neighborhoods and parts of town are, you’ll be able to do this at a glance as soon as you open Maps.

More ways to shop with Google 

People are shopping across Google more than a billion times per day, and our AI-enhanced Shopping Graph — our deep understanding of products, sellers, brands, reviews, product information and inventory data — powers many features that help you find exactly what you’re looking for.

Because shopping isn’t always a linear experience, we’re introducing new ways to explore and keep track of products. Now, when you take a screenshot, Google Photos will prompt you to search the photo with Lens, so you can immediately shop for that item if you want. And on Chrome, we’ll help you keep track of shopping carts you’ve begun to fill, so you can easily resume your virtual shopping trip. We're also working with retailers to surface loyalty benefits for customers earlier, to help inform their decisions.

Last year we made it free for merchants to sell their products on Google. Now, we’re introducing a new, simplified process that helps Shopify’s 1.7 million merchants make their products discoverable across Google in just a few clicks.  

Whether we’re understanding the world’s information, or helping you understand it too, we’re dedicated to making our products more useful every day. And with the power of AI, no matter how complex your task, we’ll be able to bring you the highest quality, most relevant results. 

Source: Google LatLong


“L10n” – Localisation: Breaking down language barriers to unleash the benefits of the internet for all Indians

In July, at the Google for India event, we outlined our vision to make the Internet helpful for a billion Indians and power the growth of India’s digital economy. One critical challenge we need to overcome is India’s vast linguistic diversity, with dialects changing every hundred kilometres. More often than not, one language doesn’t map seamlessly to another: a word in Bengali can translate to a full sentence in Tamil, and there are expressions in Urdu with no adequately evocative equivalent in Hindi.


This poses a formidable challenge for technology developers, who rely on commonly understood visual and spoken idioms to make tech products work universally. 


We realised early on that there was no way to simplify this challenge - that there wasn’t any one common minimum that could address the needs of every potential user in this country. If we hoped to bring the potential of the internet within reach of every user in India, we had to invest in building products, content and tools in every popularly spoken Indian language. 


India’s digital transformation will be incomplete if English proficiency continues to be the entry barrier for basic and potent uses of the Internet such as buying and selling online, finding jobs, using net banking and digital payments or getting access to information and registering for government schemes.


The work, though underway, is far from done. We are driving a 3-point strategy to truly digitize India:


  1. Invest in ML & AI efforts at Google’s research center in India, to make advances in machine learning and AI models accessible to everyone across the ecosystem.

  2. Partner with innovative local startups who are building solutions to cater to the needs of Indians in local languages

  3. Drastically improve the experience of Google products and services for Indian language users


And so today, we are happy to announce a range of features to help deliver an even richer language experience to millions across India.

Easily toggling between English and Indian language results

Four years ago we made it easier for people in states with a significant Hindi-speaking population to flip between English and Hindi results for a search query, by introducing a simple ‘chip’ or tab they could tap to see results in their preferred language. In fact, since the launch of this Hindi chip and other language features, we have seen more than a 10X increase in Hindi queries in India.

We are now making it easier to toggle Search results between English and four additional Indian languages: Tamil, Telugu, Bangla and Marathi.

People can now tap a chip to see Search results in their local language

Understanding which language content to surface, when

Typing in an Indian language in its native script is typically more difficult, and can often take three times as long, compared to English. As a result, many people search in English even if they really would prefer to see results in a local language they understand.

Search will show relevant results in more Indian languages

Over the next month, Search will start to show relevant content in supported Indian languages where appropriate, even if the local language query is typed in English. This functionality will also better serve bilingual people who are comfortable reading both English and an Indian language. It will roll out in five Indian languages: Hindi, Bangla, Marathi, Tamil, and Telugu.

Enabling people to use apps in the language of their choice

Just like you use different tools for different tasks, we know (because we do it ourselves) people often select a specific language for a particular situation. Rather than guessing preferences, we launched the ability to easily change the language of Google Assistant and Discover to be different from the phone language. Today in India, more than 50 percent of the content viewed on Google Discover is in Indian languages. A third of Google Assistant users in India are using it in an Indian language, and since the launch of Assistant language picker, queries in Indian languages have doubled.

Maps will now enable people to select from nine Indian languages

We are now extending this ability to Google Maps, where users can quickly and easily change their Maps experience into one of nine Indian languages, by simply opening the app, going to Settings, and tapping ‘App language’. This will allow anyone to search for places, get directions and navigation, and interact with the Map in their preferred local language.

Homework help in Hindi (and English)

Meaning is also communicated through images, and this is where Google Lens can help. From street signs to restaurant menus, shop names to signboards, Google Lens lets you search what you see, get things done faster, and understand the world around you—using just your camera or a photo. In fact, more people use Google Lens in India every month than in any other country. As an example of its popularity, over 3 billion words were translated in India with Lens in 2020.

Lens is particularly helpful for students wanting to learn about the world. If you’re a parent, you’ll be familiar with your kids asking you questions about homework. About stuff you never thought you’d need to remember, like... quadratic equations.

Google Lens can now help you solve math problems by simply pointing your camera 

Now, right from the Search bar in the Google app, you can use Lens to snap a photo of a math problem and learn how to solve it on your own, in Hindi (or English). To do this, Lens first turns an image of a homework question into a query. Based on the query, we will show step-by-step guides and videos to help explain the problem.
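
The paragraph above describes a simple pipeline: photograph the problem, turn the image into a text query, then surface explanations for that query. As a rough illustration only, the Python sketch below stands in for that flow using an off-the-shelf OCR library (pytesseract) and a plain web-search URL; the library choice, the file name and the language codes are assumptions for the example, and this is not how Lens itself is implemented.

```python
# Illustrative sketch of the "photo of a homework question -> text query ->
# search" flow described above. pytesseract is a stand-in OCR step, not the
# technology Lens uses; the image path and language codes are assumptions.
from urllib.parse import urlencode

import pytesseract
from PIL import Image


def homework_image_to_query(image_path: str) -> str:
    """OCR the photographed problem and collapse it into a single-line query."""
    # "eng+hin" asks Tesseract to consider English and Hindi text, assuming
    # both language packs are installed locally.
    text = pytesseract.image_to_string(Image.open(image_path), lang="eng+hin")
    return " ".join(text.split())  # drop line breaks and repeated whitespace


def query_to_search_url(query: str) -> str:
    """Build an ordinary web-search URL for the extracted question."""
    return "https://www.google.com/search?" + urlencode({"q": query})


if __name__ == "__main__":
    query = homework_image_to_query("math_problem.jpg")  # hypothetical file
    print(query_to_search_url(query))
```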

Helping computer systems understand Indian languages at scale

At Google Research India, we have spent a lot of time helping computer systems understand human language. As you can imagine, this is quite an exciting challenge. The new approach we developed in India is called Multilingual Representations for Indian Languages (or ‘MuRIL’). Among the many benefits of this powerful multilingual model that scales across languages, MuRIL also provides support for transliterated text, such as Hindi written in Roman script, which was missing from previous models of its kind.

One of the many tasks MuRIL is good at is determining the sentiment of a sentence. For example, “Achha hua account bandh nahi hua” (roughly, “good thing the account didn’t get closed”) would previously be interpreted as negative, but MuRIL correctly identifies it as a positive statement. Or take the ability to distinguish a person from a place: ‘Shirdi ke sai baba’ (Sai Baba of Shirdi) would previously be interpreted as a place, which is wrong, but MuRIL correctly interprets it as a person.

MuRIL currently supports 16 Indian languages as well as English -- the widest coverage of Indian languages of any publicly available model of its kind.

MuRIL is free & Open Source,

available on TensorFlow Hub

https://tfhub.dev/google/MuRIL/1



We are thrilled to announce that we have made MuRIL open source, and it is currently available to download from the TensorFlow Hub, for free. We hope MuRIL will be the next big evolution for Indian language understanding, forming a better foundation for researchers, students, startups, and anyone else interested in building Indian language technologies, and we can’t wait to see the many ways the ecosystem puts it to use.
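
For readers who want to experiment with the model, here is a minimal sketch of loading MuRIL from the TensorFlow Hub handle given above and computing a sentence embedding. Only the Hub URL comes from this post; the BERT-style input names (input_word_ids, input_mask, input_type_ids), the pooled_output key and the toy token ids are assumptions based on the common TF2 encoder convention, and a real pipeline would run MuRIL's own WordPiece tokenizer over text such as "Achha hua account bandh nahi hua".

```python
# Minimal sketch: load MuRIL from TensorFlow Hub and compute an embedding for
# one toy, pre-tokenized sentence. The input/output names follow the usual TF2
# BERT-encoder convention and are assumptions, as are the token ids below;
# real use requires MuRIL's matching WordPiece tokenizer.
import tensorflow as tf
import tensorflow_hub as hub

MURIL_HANDLE = "https://tfhub.dev/google/MuRIL/1"  # handle given in the post

encoder = hub.KerasLayer(MURIL_HANDLE, trainable=False)

encoder_inputs = {
    # Toy token ids padded to length 8; a real pipeline would produce these
    # with MuRIL's tokenizer.
    "input_word_ids": tf.constant([[101, 7592, 2088, 2003, 102, 0, 0, 0]], tf.int32),
    "input_mask":     tf.constant([[1,   1,    1,    1,    1,   0, 0, 0]], tf.int32),
    "input_type_ids": tf.constant([[0,   0,    0,    0,    0,   0, 0, 0]], tf.int32),
}

outputs = encoder(encoder_inputs)
sentence_embedding = outputs["pooled_output"]  # assumed shape: (1, hidden_size)
print(sentence_embedding.shape)
```

A task-specific head, for example the sentiment classification mentioned earlier, would typically be a small dense layer trained on top of this pooled output.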

We’re sharing this to give a flavor of the depth of work underway -- and still required -- to truly make a universally potent and accessible Internet a reality. That said, the Internet in India is the sum of the work of millions of developers, content creators, news media and online businesses, and it is only when this effort is undertaken at scale by the entire ecosystem that we will help fulfil the truly meaningful promise of the billionth Indian coming online.

Posted by the Google India team


Visual ways to search and understand our world

Whether you’re a student learning about photosynthesis or a parent researching the best cars for your growing family, people turn to Google with all sorts of curiosities. And we can help you understand in different ways—through text, your voice or even your phone’s camera. Today, as part of the SearchOn event, we’re announcing new ways you can use Google Lens and augmented reality (AR) while learning and shopping.

Visual tools to help you learn 

For many families, adjusting to remote learning hasn’t been easy, but tools like Google Lens can help lighten the load. With Lens, you can search what you see using your camera. Lens can now recognize 15 billion things—up from 1 billion just two years ago—to help you identify plants, animals, landmarks and more. If you’re learning a new language, Lens can also translate more than 100 languages, such as Spanish and Arabic, and you can tap to hear words and sentences pronounced out loud.


If you’re a parent, your kids may ask you questions about things you never thought you’d need to remember, like quadratic equations. From the search bar in the Google app on Android and iOS, you can use Lens to get help on a homework problem. With step-by-step guides and videos, you can learn and understand the foundational concepts to solve math, chemistry, biology and physics problems.

Lens Homework

Sometimes, seeing is understanding. For instance, visualizing the inner workings of a plant cell or the elements in the periodic table in 3D is more helpful than reading about them in a textbook. AR brings hands-on learning home, letting you explore concepts up close in your space. Here’s how Melissa Brophy-Plasencio, an educator from Texas, is incorporating AR into her lesson plans.

Melissa Brophy-Plasencio, an educator from Texas, shares how she’s using AR in her science lessons.

Shop what you see with Google Lens 

Another area where the camera can be helpful is shopping—especially when what you’re looking for is hard to describe in words. With Lens, you can already search for a product by taking a photo or screenshot. Now, we’re making it even easier to discover new products as you browse online on your phone. When you tap and hold an image on the Google app or Chrome on Android, Lens will find the exact or similar items, and suggest ways to style it. This feature is coming soon to the Google app on iOS.

Lens Shopping

Lens uses Style Engine technology, which combines the world’s largest database of products with millions of style images. It then uses pattern matching to understand concepts like “ruffle sleeves” or “vintage denim” and how they pair with different apparel.
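
Style Engine isn't documented here beyond that description, so the snippet below is only a generic illustration of the underlying idea of visual pattern matching: represent images as embedding vectors and rank catalog items by similarity to a query image. The vectors, catalog entries and labels are invented for the example and do not reflect how Lens actually implements this.

```python
# Generic illustration of visual pattern matching via embedding similarity.
# The catalog and vectors below are toy data invented for the example.
import numpy as np


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


# Pretend these vectors came from an image-embedding model.
catalog = {
    "ruffle-sleeve blouse": np.array([0.9, 0.1, 0.2]),
    "vintage denim jacket": np.array([0.1, 0.8, 0.3]),
    "plain white tee":      np.array([0.2, 0.2, 0.9]),
}
query_image_embedding = np.array([0.85, 0.15, 0.25])  # e.g., from a screenshot

# Rank catalog items by visual similarity to the query image.
ranked = sorted(
    catalog.items(),
    key=lambda item: cosine_similarity(query_image_embedding, item[1]),
    reverse=True,
)
for name, _ in ranked:
    print(name)
```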

Bring the showroom to you with AR

When you can’t go into stores to check out a product up close, AR can bring the showroom to you. If you’re in the market for a new car, for example, you’ll soon be able to search for it on Google and see an AR model right in front of you. You can easily check out what the car looks like in different colors, zoom in to see intricate details like buttons on the dashboard, view it against beautiful backdrops and even see it in your driveway. We’re experimenting with this feature in the U.S. and working with top auto brands, such as Volvo and Porsche, to bring these experiences to you soon.


AR experience of the 2020 Volvo XC40 Recharge

Everyone’s journey to understand is different. Whether you snap a photo with Lens or immerse yourself in AR, we hope you find what you’re looking for...


...and even have some fun along the way.

Source: Search