Tag Archives: Google Lens

How AI is making information more useful

Today, there’s more information accessible at people’s fingertips than at any point in human history. And advances in artificial intelligence will radically transform the way we use that information, with the ability to uncover new insights that can help us both in our daily lives and in the ways we are able to tackle complex global challenges.


At our Search On livestream event today, we shared how we’re bringing the latest in AI to Google’s products, giving people new ways to search and explore information in more natural and intuitive ways.


Making multimodal search possible with MUM

Earlier this year at Google I/O, we announced that we’d reached a critical milestone for understanding information with Multitask Unified Model, or MUM for short.


We’ve been experimenting with using MUM’s capabilities to make our products more helpful and enable entirely new ways to search. Today, we’re sharing an early look at what will be possible with MUM. 


In the coming months, we’ll introduce a new way to search visually, with the ability to ask questions about what you see. Here are a couple of examples of what will be possible with MUM.




With this new capability, you can tap on the Lens icon when you’re looking at a picture of a shirt, and ask Google to find you the same pattern — but on another article of clothing, like socks. This helps when you’re looking for something that might be difficult to describe accurately with words alone. You could type “white floral Victorian socks,” but you might not find the exact pattern you’re looking for. By combining images and text into a single query, we’re making it easier to search visually and express your questions in more natural ways. 



Some questions are even trickier: Your bike has a broken thingamajig, and you need some guidance on how to fix it. Instead of poring over catalogs of parts and then looking for a tutorial, the point-and-ask mode of searching will make it easier to find the exact moment in a video that can help.


Helping you explore with a redesigned Search page

We’re also announcing how we’re applying AI advances like MUM to redesign Google Search. These new features are the latest steps we’re taking to make searching more natural and intuitive.


First, we’re making it easier to explore and understand new topics with “Things to know.” Let’s say you want to decorate your apartment, and you’re interested in learning more about creating acrylic paintings.



If you search for “acrylic painting,” Google understands how people typically explore this topic, and shows the aspects people are likely to look at first. For example, we can identify more than 350 topics related to acrylic painting, and help you find the right path to take.


We’ll be launching this feature in the coming months. In the future, MUM will unlock deeper insights you might not have known to search for — like “how to make acrylic paintings with household items” — and connect you with content on the web that you wouldn’t have otherwise found.

Second, to help you further explore ideas, we’re making it easy to zoom in and out of a topic with new features to refine and broaden searches. 


In this case, you can learn more about specific techniques, like puddle pouring, or art classes you can take. You can also broaden your search to see other related topics, like other painting methods and famous painters. These features will launch in the coming months.


Third, we’re making it easier to find visual inspiration with a newly designed, browsable results page. If puddle pouring caught your eye, just search for “pour painting ideas” to see a visually rich page full of ideas from across the web, with articles, images, videos and more that you can easily scroll through.

This new visual results page is designed for searches that are looking for inspiration, like “Halloween decorating ideas” or “indoor vertical garden ideas,” and you can try it today.

Get more from videos

We already use advanced AI systems to identify key moments in videos, like the winning shot in a basketball game, or steps in a recipe. Today, we’re taking this a step further, introducing a new experience that identifies related topics in a video, with links to easily dig deeper and learn more. 


Using MUM, we can even show related topics that aren’t explicitly mentioned in the video, based on our advanced understanding of information in the video. In this example, while the video doesn’t say the words “macaroni penguin’s life story,” our systems understand that the video covers related subjects, like how macaroni penguins find their family members and navigate predators. The first version of this feature will roll out in the coming weeks, and we’ll add more visual enhancements in the coming months.


Across all these MUM experiences, we look forward to helping people discover more web pages, videos, images and ideas that they may not have come across or otherwise searched for. 


A more helpful Google

The updates we’re announcing today don’t end with MUM, though. We’re also making it easier to shop from the widest range of merchants, big and small, no matter what you’re looking for. And we’re helping people better evaluate the credibility of information they find online. Plus, for the moments that matter most, we’re finding new ways to help people get access to information and insights. 


All this work not only helps people around the world, but creators, publishers and businesses as well.  Every day, we send visitors to well over 100 million different websites, and every month, Google connects people with more than 120 million businesses that don't have websites, by enabling phone calls, driving directions and local foot traffic.


As we continue to build more useful products and push the boundaries of what it means to search, we look forward to helping people find the answers they’re looking for, and inspiring more questions along the way.


Posted by Prabhakar Raghavan, Senior Vice President




Rediscover your city through a new Lens this summer

With warmer weather upon us and many places reopening in the U.K., it’s the perfect time to go out and reconnect with your surroundings. Whether it’s soaking up that panoramic view of a city skyline you’ve really missed, or wondering about that interesting tree species you pass every day on your park walk, many of us feel ready to reconnect with our cities in new ways.


British cities are especially ripe for rediscovery. As the country emerges from a long lockdown and people start to reintegrate with their cities this summer, we’re launching a campaign called Behind the Lens with Google Pixel, which aims to help people rediscover their cities using Google Lens on Pixel. We’ll do that through a series of events over the coming weeks, alongside some very special guests in London, Bristol and Liverpool.


A vibrant orange and purple flower shown on a Google Pixel 5 using Google Lens, which has identified it as a bird of paradise. The result shows information about the plant: “Strelitzia reginae, commonly called a crane flower or bird of paradise, is a genus of perennial plants, native to South Africa…”

Behind the Lens with Google Pixel encourages people to search what they see using the magic of Lens, and rediscover some forgotten pockets of their city using its updated features. Identifying the species of that bird you keep seeing in the communal gardens of London has never been easier, while discovering new, secret ingredients at a farmer’s market in Liverpool can also be done in a snap. Or, perhaps you’ve always wanted to know more about that forgotten landmark from a viewpoint in Bristol. Lens can give you on-the-spot information about a subject with a single long tap on the Pixel camera viewfinder, which is handy since we often have our cameras open and ready to capture the moment. 


With restrictions being lifted in the U.K. this summer, Search trends reveal an opportunity to rediscover our cities through the interests we acquired over lockdown. From March 23, 2020 through April 21, 2021, Google searches for new skills and classes rose sharply: “hiking trails near me” (+200%), “online gardening courses” (+300%) and “online cooking classes” (+800%).


This suggests not only that some of the hobbies the nation nurtured during lockdown are still very much of interest, but also that people can now rediscover them against the backdrop of their city, alongside their communities and friends.


Within Google Lens, the Places filter is selected and the view shows a clock tower against a bright, cloudy sky. Lens identifies the clock tower as Big Ben and shows results, including a star rating, two alternative views of the tower and an option to search Google.

A new tool for rediscovery


Google Lens is now used over three billion times per month by people around the world, and with many ready to explore this summer and rediscover their cities, we’re officially launching the new Places filter in Lens. Now available globally, the Places filter makes it easy to identify buildings and landmarks using your phone camera, combining 3D models from Google Earth and Lens’ powerful image recognition technology to create an in-depth, real-time AR experience, similar to Live View on Google Maps.


The Google Lens Places filter is open on a black Google Pixel 5, showing a view that scans the River Thames and settles on a large bridge with two towers. Upon identifying the structure as Tower Bridge, Lens shows a star rating, alternative images of Tower Bridge to scroll through, and the option to search Google for more information.

Just open the Google app on your phone and tap the camera icon in the search bar to open Lens. Then, switch to the Places filter and point your camera at notable places around you.


We hope Lens makes rediscovering and learning about your city even more enjoyable.


Search, explore and shop the world’s information, powered by AI

AI advancements push the boundaries of what Google products can do. Nowhere is this clearer than at the core of our mission to make information more accessible and useful for everyone.

We've spent more than two decades developing not just a better understanding of information on the web, but a better understanding of the world. Because when we understand information, we can make it more helpful — whether you’re a remote student learning a complex new subject, a caregiver looking for trusted information on COVID vaccines or a parent searching for the best route home.

Deeper understanding with MUM

One of the hardest problems for search engines today is helping you with complex tasks — like planning what to do on a family outing. These often require multiple searches to get the information you need. In fact, we find that it takes people eight searches on average to complete complex tasks.

With a new technology called Multitask Unified Model, or MUM, we're able to better understand much more complex questions and needs, so in the future, it will require fewer searches to get things done. Like BERT, MUM is built on a Transformer architecture, but it’s 1,000 times more powerful and can multitask in order to unlock information in new ways. MUM not only understands language, but also generates it. It’s trained across 75 different languages and many different tasks at once, allowing it to develop a more comprehensive understanding of information and world knowledge than previous models. And MUM is multimodal, so it understands information across text and images and in the future, can expand to more modalities like video and audio.
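
MUM’s internals aren’t public, but the multimodal idea described above can be sketched in miniature: text tokens and image patches are embedded into one shared space, and a single Transformer encoder attends across both, so one query can mix modalities. Everything below (the class name, the dimensions, the toy inputs) is an illustrative assumption, not Google’s implementation.

```python
# Toy sketch of a multimodal encoder: NOT MUM, just the shared-sequence idea.
import torch
import torch.nn as nn

class TinyMultimodalEncoder(nn.Module):
    def __init__(self, vocab_size=32000, d_model=256, n_heads=4, n_layers=2):
        super().__init__()
        self.text_embed = nn.Embedding(vocab_size, d_model)
        # Project flattened 16x16 RGB image patches into the same space as text.
        self.patch_embed = nn.Linear(16 * 16 * 3, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)

    def forward(self, token_ids, patches):
        text = self.text_embed(token_ids)        # (batch, tokens, d_model)
        image = self.patch_embed(patches)        # (batch, patches, d_model)
        fused = torch.cat([text, image], dim=1)  # one joint sequence
        return self.encoder(fused)               # attention spans both modalities

model = TinyMultimodalEncoder()
tokens = torch.randint(0, 32000, (1, 8))    # stand-in for "same pattern, but socks"
patches = torch.randn(1, 36, 16 * 16 * 3)   # stand-in for the photo of the shirt
print(model(tokens, patches).shape)         # torch.Size([1, 44, 256])
```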

Imagine a question like: “I’ve hiked Mt. Adams and now want to hike Mt. Fuji next fall, what should I do differently to prepare?” This would stump search engines today, but in the future, MUM could understand this complex task and generate a response, pointing to highly relevant results to dive deeper. We’ve already started internal pilots with MUM and are excited about its potential for improving Google products.

Information comes to life with Lens and AR

People come to Google to learn new things, and visuals can make all the difference. Google Lens lets you search what you see — from your camera, your photos or even your search bar. Today we’re seeing more than 3 billion searches with Lens every month, and an increasingly popular use case is learning. For example, many students might have schoolwork in a language they aren't very familiar with. That’s why we’re updating the Translate filter in Lens so it’s easy to copy, listen to or search translated text, helping students access education content from the web in over 100 languages.

Animated GIF showing Google Lens’s Translate filter applied to homework.

AR is also a powerful tool for visual learning. With the new AR athletes in Search, you can see signature moves from some of your favorite athletes in AR — like Simone Biles’s famous balance beam routine.

Animated GIF showing Simone Biles’s balance beam routine surfaced by the AR athletes in Search feature.

Evaluate information with About This Result 

Helpful information should be credible and reliable, and especially during moments like the pandemic or elections, people turn to Google for trustworthy information. 

Our ranking systems are designed to prioritize high-quality information, but we also help you evaluate the credibility of sources, right in Google Search. Our About This Result feature provides details about a website before you visit it, including its description, when it was first indexed and whether your connection to the site is secure. 

Animated GIF showing the About This Result feature applied to the query "How to invest in ETFs."

This month, we’ll start rolling out About This Result to all English results worldwide, with more languages to come. Later this year, we’ll add even more detail, like how a site describes itself, what other sources are saying about it and related articles to check out. 

Exploring the real world with Maps

Google Maps transformed how people navigate, explore and get things done in the world — and we continue to push the boundaries of what a map can be with industry-first features like AR navigation in Live View at scale. We recently announced we’re on track to launch over 100 AI-powered improvements to Google Maps by the end of the year, and today, we’re introducing a few of the newest ones. Our new routing updates are designed to reduce the likelihood of hard-braking on your drive using machine learning and historical navigation information — which we believe could eliminate over 100 million hard-braking events in routes driven with Google Maps each year.
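
Google hasn’t published the routing logic, but the behavior described reads as a constrained choice: among candidate routes with comparable ETAs, prefer the one an ML model predicts will involve the fewest hard-braking events. A minimal sketch under that assumption (the Route fields and the 5% ETA slack are hypothetical, not Google Maps’ actual rules):

```python
# Hypothetical illustration of hard-braking-aware route selection.
from dataclasses import dataclass

@dataclass
class Route:
    name: str
    eta_minutes: float
    predicted_hard_brakes: float  # from an ML model over historical drives

def pick_route(candidates: list[Route], eta_slack: float = 0.05) -> Route:
    """Choose the gentlest route whose ETA is within `eta_slack` of the fastest."""
    fastest = min(r.eta_minutes for r in candidates)
    acceptable = [r for r in candidates if r.eta_minutes <= fastest * (1 + eta_slack)]
    return min(acceptable, key=lambda r: r.predicted_hard_brakes)

routes = [
    Route("highway", eta_minutes=22.0, predicted_hard_brakes=3.1),
    Route("surface streets", eta_minutes=22.8, predicted_hard_brakes=0.9),
]
print(pick_route(routes).name)  # "surface streets"
```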

If you’re looking for things to do, our more tailored map will spotlight relevant places based on time of day and whether or not you’re traveling. Enhancements to Live View and detailed street maps will help you explore and get a deep understanding of an area as quickly as possible. And if you want to see how busy neighborhoods and parts of town are, you’ll be able to do this at a glance as soon as you open Maps.

More ways to shop with Google 

People are shopping across Google more than a billion times per day, and our AI-enhanced Shopping Graph — our deep understanding of products, sellers, brands, reviews, product information and inventory data — powers many features that help you find exactly what you’re looking for.

Because shopping isn’t always a linear experience, we’re introducing new ways to explore and keep track of products. Now, when you take a screenshot, Google Photos will prompt you to search the photo with Lens, so you can immediately shop for that item if you want. And on Chrome, we’ll help you keep track of shopping carts you’ve begun to fill, so you can easily resume your virtual shopping trip. We're also working with retailers to surface loyalty benefits for customers earlier, to help inform their decisions.

Last year we made it free for merchants to sell their products on Google. Now, we’re introducing a new, simplified process that helps Shopify’s 1.7 million merchants make their products discoverable across Google in just a few clicks.  

Whether we’re understanding the world’s information, or helping you understand it too, we’re dedicated to making our products more useful every day. And with the power of AI, no matter how complex your task, we’ll be able to bring you the highest quality, most relevant results. 

Source: Google LatLong


“L10n” – Localisation: Breaking down language barriers to unleash the benefits of the internet for all Indians

In July, at the Google for India event, we outlined our vision to make the Internet helpful for a billion Indians, and power the growth of India’s digital economy. One critical challenge we need to overcome is India’s vast linguistic diversity, with dialects changing every hundred kilometres. More often than not, one language doesn’t seamlessly map to another. A word in Bengali roughly translates to a full sentence in Tamil, and there are expressions in Urdu which have no adequately evocative equivalent in Hindi.


This poses a formidable challenge for technology developers, who rely on commonly understood visual and spoken idioms to make tech products work universally. 


We realised early on that there was no way to simplify this challenge: there wasn’t any one common minimum that could address the needs of every potential user in this country. If we hoped to bring the potential of the internet within reach of every user in India, we had to invest in building products, content and tools in every popularly spoken Indian language.


India’s digital transformation will be incomplete if English proficiency continues to be the entry barrier for basic and potent uses of the Internet such as buying and selling online, finding jobs, using net banking and digital payments or getting access to information and registering for government schemes.


The work, though underway, is far from done. We are driving a 3-point strategy to truly digitize India:


  1. Invest in ML & AI efforts at Google’s research center in India, to make advances in machine learning and AI models accessible to everyone across the ecosystem.

  2. Partner with innovative local startups who are building solutions to cater to the needs of Indians in local languages.

  3. Drastically improve the experience of Google products and services for Indian language users.


And so today, we are happy to announce a range of features to help deliver an even richer language experience to millions across India.

Easily toggling between English and Indian language results

Four years ago we made it easier for people in states with a significant Hindi-speaking population to flip between English and Hindi results for a search query, by introducing a simple ‘chip’ or tab they could tap to see results in their preferred language. In fact, since the launch of this Hindi chip and other language features, we have seen more than a 10X increase in Hindi queries in India.

We are now making it easier to toggle Search results between English and four additional Indian languages: Tamil, Telugu, Bangla and Marathi.

People can now tap a chip to see Search results in their local language

Understanding which language content to surface, when

Typing in an Indian language in its native script is typically more difficult, and can often take three times as long, compared to English. As a result, many people search in English even if they really would prefer to see results in a local language they understand.

Search will show relevant results in more Indian languages

Over the next month, Search will start to show relevant content in supported Indian languages where appropriate, even if the local language query is typed in English. This functionality will also better serve bilingual people who are comfortable reading both English and an Indian language. It will roll out in five Indian languages: Hindi, Bangla, Marathi, Tamil, and Telugu.

Enabling people to use apps in the language of their choice

Just like you use different tools for different tasks, we know (because we do it ourselves) that people often select a specific language for a particular situation. Rather than guessing preferences, we launched the ability to easily change the language of Google Assistant and Discover to be different from the phone language. Today in India, more than 50 percent of the content viewed on Google Discover is in Indian languages. A third of Google Assistant users in India are using it in an Indian language, and since the launch of the Assistant language picker, queries in Indian languages have doubled.

Maps will now enable people to select from nine Indian languages

We are now extending this ability to Google Maps, where users can quickly and easily change their Maps experience into one of nine Indian languages, by simply opening the app, going to Settings, and tapping ‘App language’. This will allow anyone to search for places, get directions and navigation, and interact with the Map in their preferred local language.

Homework help in Hindi (and English)

Meaning is also communicated with images, and this is where Google Lens can help. From street signs to restaurant menus, shop names to signboards, Google Lens lets you search what you see, get things done faster, and understand the world around you—using just your camera or a photo. In fact, more people use Google Lens in India every month than in any other country worldwide. As an example of its popularity, over 3 billion words have been translated in India with Lens in 2020.

Lens is particularly helpful for students wanting to learn about the world. If you’re a parent, you’ll be familiar with your kids asking you questions about homework. About stuff you never thought you’d need to remember, like... quadratic equations.

Google Lens can now help you solve math problems by simply pointing your camera 

Now, right from the Search bar in the Google app, you can use Lens to snap a photo of a math problem and learn how to solve it on your own, in Hindi (or English). To do this, Lens first turns an image of a homework question into a query. Based on the query, we will show step-by-step guides and videos to help explain the problem.
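
Lens’s actual pipeline isn’t public, but the image-to-query step described above can be roughly approximated with off-the-shelf OCR. A sketch assuming the Tesseract binary plus the pytesseract and Pillow packages are installed; the file name is hypothetical:

```python
# Rough analogue (not Lens's real pipeline): OCR a photo of a homework
# question, collapse the whitespace, and build a search query URL from it.
import urllib.parse

from PIL import Image
import pytesseract

def image_to_query(path: str) -> str:
    text = pytesseract.image_to_string(Image.open(path))
    question = " ".join(text.split())  # collapse newlines and extra spaces
    return "https://www.google.com/search?q=" + urllib.parse.quote(question)

print(image_to_query("homework_problem.png"))  # hypothetical input file
```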

Helping computer systems understand Indian languages at scale

At Google Research India, we have spent a lot of time helping computer systems understand human language. As you can imagine, this is quite an exciting challenge. The new approach we developed in India is called Multilingual Representations for Indian Languages (or ‘MuRIL’). Among the many benefits of this powerful multilingual model that scales across languages, MuRIL also supports transliterated text, such as Hindi written in Roman script, which was missing from previous models of its kind.

One of the many tasks MuRIL is good at is determining the sentiment of a sentence. For example, “Achha hua account bandh nahi hua” would previously be interpreted as having a negative meaning, but MuRIL correctly identifies this as a positive statement. Or take the ability to classify a person versus a place: ‘Shirdi ke sai baba’ would previously be interpreted as a place, which is wrong, but MuRIL correctly interprets it as a person.

MuRIL currently supports 16 Indian languages as well as English -- the highest coverage of Indian languages of any publicly available model of its kind.

MuRIL is free and open source, available on TensorFlow Hub: https://tfhub.dev/google/MuRIL/1



We are thrilled to announce that we have made MuRIL open source, and it is currently available to download from the TensorFlow Hub, for free. We hope MuRIL will be the next big evolution for Indian language understanding, forming a better foundation for researchers, students, startups, and anyone else interested in building Indian language technologies, and we can’t wait to see the many ways the ecosystem puts it to use.
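
As a quick, unofficial starting point, here is a minimal sketch of getting sentence embeddings from MuRIL. It uses the Hugging Face mirror of the checkpoint (assumed here to be named google/muril-base-cased; the TensorFlow Hub link above is the official release) and mean-pools the encoder outputs:

```python
# Minimal sketch: MuRIL sentence embeddings via mean pooling (unofficial recipe).
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("google/muril-base-cased")
model = AutoModel.from_pretrained("google/muril-base-cased")

sentences = [
    "Achha hua account bandh nahi hua",  # transliterated Hindi, per the post
    "Shirdi ke sai baba",
]
batch = tokenizer(sentences, padding=True, return_tensors="pt")
with torch.no_grad():
    hidden = model(**batch).last_hidden_state   # (batch, tokens, 768)
mask = batch["attention_mask"].unsqueeze(-1)    # zero out padding positions
embeddings = (hidden * mask).sum(1) / mask.sum(1)  # mean-pooled sentence vectors
print(embeddings.shape)  # torch.Size([2, 768])
```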

We’re sharing this to provide a flavor of the depth of work underway -- and still required -- to really make a universally potent and accessible Internet a reality. That said, the Internet in India is the sum of the work of millions of developers, content creators, news media and online businesses, and it is only when this effort is undertaken at scale by the entire ecosystem that we will help fulfil the truly meaningful promise of the billionth Indian coming online.

Posted by the Google India team


Visual ways to search and understand our world

Whether you’re a student learning about photosynthesis or a parent researching the best cars for your growing family, people turn to Google with all sorts of curiosities. And we can help you understand in different ways—through text, your voice or even your phone’s camera. Today, as part of the SearchOn event, we’re announcing new ways you can use Google Lens and augmented reality (AR) while learning and shopping.

Visual tools to help you learn 

For many families, adjusting to remote learning hasn’t been easy, but tools like Google Lens can help lighten the load. With Lens, you can search what you see using your camera. Lens can now recognize 15 billion things—up from 1 billion just two years ago—to help you identify plants, animals, landmarks and more. If you’re learning a new language, Lens can also translate more than 100 languages, such as Spanish and Arabic, and you can tap to hear words and sentences pronounced out loud.


If you’re a parent, your kids may ask you questions about things you never thought you’d need to remember, like quadratic equations. From the search bar in the Google app on Android and iOS, you can use Lens to get help on a homework problem. With step-by-step guides and videos, you can learn and understand the foundational concepts to solve math, chemistry, biology and physics problems.

Lens Homework

Sometimes, seeing is understanding. For instance, visualizing the inner workings of a plant cell or the elements in the periodic table in 3D is more helpful than reading about them in a textbook. AR brings hands-on learning home, letting you explore concepts up close in your space. Here’s how Melissa Brophy-Plasencio, an educator from Texas, is incorporating AR into her lesson plans.

Melissa Brophy-Plasencio, an educator from Texas, shares how she’s incorporating AR into her science lessons.

Shop what you see with Google Lens 

Another area where the camera can be helpful is shopping—especially when what you’re looking for is hard to describe in words. With Lens, you can already search for a product by taking a photo or screenshot. Now, we’re making it even easier to discover new products as you browse online on your phone. When you tap and hold an image on the Google app or Chrome on Android, Lens will find exact or similar items and suggest ways to style them. This feature is coming soon to the Google app on iOS.

Lens Shopping

Lens uses Style Engine technology, which combines the world’s largest database of products with millions of style images. Then it pattern-matches to understand concepts like “ruffle sleeves” or “vintage denim” and how they pair with different apparel.
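
Google hasn’t published Style Engine’s internals, but the general pattern-matching idea can be sketched as nearest-neighbor search in a shared embedding space: embed a style concept and candidate products, then rank products by cosine similarity. The vectors below are random stand-ins for learned embeddings:

```python
# Illustrative sketch only: rank products by similarity to a style concept.
import numpy as np

rng = np.random.default_rng(0)
concept = rng.normal(size=128)  # stand-in embedding for "ruffle sleeves"
products = {name: rng.normal(size=128) for name in ["blouse", "jeans", "scarf"]}

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

ranked = sorted(products, key=lambda n: cosine(concept, products[n]), reverse=True)
print(ranked)  # products most similar to the style concept come first
```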

Bring the showroom to you with AR

When you can’t go into stores to check out a product up close, AR can bring the showroom to you. If you’re in the market for a new car, for example, you’ll soon be able to search for it on Google and see an AR model right in front of you. You can easily check out what the car looks like in different colors, zoom in to see intricate details like buttons on the dashboard, view it against beautiful backdrops and even see it in your driveway. We’re experimenting with this feature in the U.S. and working with top auto brands, such as Volvo and Porsche, to bring these experiences to you soon.


AR experience of the 2020 Volvo XC40 Recharge

Everyone’s journey to understand is different. Whether you snap a photo with Lens or immerse yourself in AR, we hope you find what you’re looking for...


...and even have some fun along the way.

Source: Search


Use Google to read and translate text—now on KaiOS

Google’s philosophy has always been to build for everyone -- to break down language barriers, make knowledge accessible, and enable people to communicate how they want and what they want, effortlessly. In India, our rich diversity of languages presents an exciting challenge especially in the context of millions of new users coming online every day. Nine out of ten of these new users are non-English speakers. While many would be fluent at speaking and understanding their native language, there are others who might struggle when it comes to reading and writing it.


Google Assistant has made it easy for users in India to find answers and get things done on their devices using their voice. Since its launch at Google for India in 2017, we’ve worked hard to bring more helpful features like integrated voice typing on KaiOS, voice-based language selection, and support for Indian languages to help first-time internet users overcome barriers to literacy and interact with technology and their devices more naturally. 


At Google I/O in 2019, we brought camera-based translation to Google Lens to help you understand information you find in the real world. With Lens, you can point your camera at text you see and translate it into more than 100 languages. Lens can even speak the words out loud in your preferred language. We brought these Lens capabilities to Google Go, too, so even those on the most affordable smartphones can access them.




Today we are extending this capability to the millions of Google Assistant users on KaiOS devices in India. From Assistant, they can click the camera icon and simply point their phone at real-world text (like a product label, street sign, or document) to have it read back in their preferred language, translated, or defined. Just long press the center button from the home screen to get started with Assistant.

Within Google Assistant, KaiOS users can now use Google Lens to read, translate and define words in the real world


It is currently available for English and several Indian languages including Hindi, Bengali, Telugu, Marathi and Tamil, and will soon be available in Kannada and Gujarati. Users can simply press the right soft key once within Assistant to access and use this feature.


This is another step in our commitment to make language more accessible to everyone, and we hope this will enable millions of KaiOS users across the country to have a more seamless language experience.

Posted by Shriya Raghunathan, Product Manager Google Assistant, and Harsh Kharbanda, Product Manager Google Lens


New Google Lens features to help you be more productive at home

Lately, our family dining table has also become a work desk, a video conference room and … a kid’s playground. As I learn how to become a full-time kids entertainer, I welcome anything that can help me stay productive. And while I usually turn to Search when learning about new things, sometimes what I’m looking for is hard to describe in words.

This is where Google Lens can help. When my family’s daily activity involves a walk in the neighborhood, Lens lets me search what I see, like a flower in our neighbor’s front yard.

Google Lens identifying a bird of paradise flower

But it can also be a helpful tool for getting things done while working and learning from home. Today, we’re adding a few new features to make you more productive.

Copy text from paper to your laptop

You can already use Lens to quickly copy and paste text from paper notes and documents to your phone to save time. Now, when you select text with Lens, you can tap "copy to computer" to quickly paste it on another signed-in device with Chrome. This is great for quickly copying handwritten notes (if you write neatly!) and pasting them on your laptop without having to retype them.

Copying handwritten notes from paper to a computer with Lens

Copying text to your computer requires the latest version of Chrome, and for both devices to be signed into the same Google account.

Learn new words and how to pronounce them

Searches for “learn a new language” have doubled over the last few months. If you're using the extra time at home to pick up a new language, you can already use Lens to translate words in Spanish, Chinese and more than 100 other languages by pointing your camera at the text.

Lens translating Chinese text

Now, you can also use Lens to practice words or phrases that are difficult to say. Select the text with Lens and tap the new Listen button to hear it read out loud—and finally figure out how to say “hipopótamo!”

Lens read out loud feature

Quickly look up new concepts

If you come across a word or phrase you don’t understand in a book or newspaper, like “gravitational waves,” Google Lens can help. Now, with in-line Google Search results, you can select complex phrases or words to quickly learn more.

Selecting a phrase with Lens to see in-line Google Search results

These features are rolling out today, except for Listen, which is available on Android and coming soon to iOS. Lens is available in the Google app on iOS and the Google Lens app on Android.

Where to find Lens: the Google app on iOS and the Google Lens app on Android

We look forward to hearing about the ways you use Lens to learn new things and get stuff done while at home.

Go beyond the page with Google Lens and NYT Magazine

When you read an article online—including this one—videos, GIFs, photo galleries and links to related articles can help you get a better sense of the story. But in print, you’re limited by what can fit on the page. So how do you go beyond what’s on the physical page for more? 

Throughout the first half of this year, we’re working with The New York Times so that readers of the print edition of The New York Times Magazine can use Google Lens to unlock more information by simply pointing their smartphone camera at the pages. On Sunday, when The Times Magazine’s annual Music Issue hits newsstands, readers can use Lens to access videos, animations and in-depth digital content that goes beyond what’s included in print. Readers will also be able to access a playlist of all the music on the magazine’s list of “25 Songs That Matter Now” using Lens.

Point Lens at the cover of the magazine to learn about how it was designed. And as you're reading the magazine, you'll be able to get more information related to the articles that pique your curiosity and digitally save or share them by simply pointing Lens at the page. In addition to the cover and articles, we're excited to work with The New York Times and other brands to bring interactive elements to print ads in the magazine through Lens.

Using Lens, you can already find out what kind of flower you’re looking at, see what’s popular on the menu at a restaurant, translate text into another language or figure out where to get a pair of shoes like the ones you just saw. Now, Lens can help you go deeper into the stories you care about in The New York Times Magazine. 

Let Google be your holiday travel tour guide

When it comes to travel, I’m a planner. I’m content to spend weeks preparing the perfect holiday getaway: deciding on the ideal destination, finding the cheapest flights and sniffing out the best accommodations. I’ve been dreaming about a trip to Greece next year, and—true story—I’ve already got a spreadsheet to compare potential destinations, organized by flight length and hotel perks.

But the thing I don’t like to do is plot out the nitty-gritty details. I want to visit the important museums and landmarks, but I don’t want to write up a daily itinerary ahead of time. I’m a vegetarian, so I need to find veggie-friendly restaurants, but I’d prefer to stumble upon a good local spot than plan in advance. And, since I don’t speak Greek, I want to be able to navigate transportation options without having to stop and ask people for help all the time.

So I’ve come to rely on some useful Google tools to make my trips work for the way I like to travel. Here’s what I’ve learned so far.

Let Maps do the talking

Getting dropped into a new city is disorienting, and all the more so when you need to ask for help but don’t know how to pronounce the name of the place you’re trying to get to. Google Maps now has a fix for this: when you’ve got a place name up in Maps, just press the new little speaker button next to it, and it will speak the place’s name and address in the local lingo. And if you want to continue the conversation, Google Maps will quickly link you to the Google Translate app.

gif of Google Translate feature in Google Maps

Let your phone be your guidebook

New cities are full of new buildings, new foods and even new foliage. But I don’t want to just see these things; I want to learn more about them. That’s where Google Lens comes in as my know-it-all tour guide and interpreter. It can translate a menu, tell me about the landmark I’m standing in front of or identify a tree I’ve never seen before. So whenever I think, “I wonder what that building is for,” I can just use my camera to get an answer in real time. 

using Google Lens to identify a flower

Photo credit: Joao Nogueira

Get translation help on the go

The Google Assistant’s real-time translation feature, interpreter mode, is now available on Android and iOS phones worldwide, enabling you to have a conversation with someone speaking a foreign language. So if I say, “Hey Google, be my Greek translator,” I can easily communicate with, say, a restaurant server who doesn’t speak English. Interpreter mode works across 44 languages, and it features different ways to communicate suited to your situation: you can type using a keyboard for quiet environments, or manually select what language to speak.

gif of Google Assistant interpreter mode

Use your voice to get things done

Typing is fine, but talking is easier, especially when I’m on vacation and want to make everything as simple as possible. The Google Assistant makes it faster to find what I’m looking for and plan what’s next, like weather forecasts, reminders and wake-up alarms. It can also help me with conversions, like “Hey Google, how much is 20 Euros in pounds?”

Using Google Assistant to answer questions

Photo credit: Joao Nogueira

Take pics, then chill

When I’m in a new place, my camera is always out. But sorting through all those pictures is the opposite of relaxing. So I offload that work onto Google Photos: it backs up my photos for free and lets me search for things in them. And when I want to see all the photos my partner has taken, I can create an album that we can both add photos to. And Photos will remind me of our vacation in the future, too, with story-style highlights at the top of the app.

photo of leafy old town street

Photo credit: Joao Nogueira

Look up

I live in a big city, which means I don’t get to see the stars much. Traveling somewhere a little less built up means I can hone my Pixel 4 astrophotography skills. It’s easy to use something stable, like a wall, as a makeshift tripod, and then just let the camera do its thing.

a stone tower at night with a starry sky in the background

Photo credit: DDay

Vacation unplugged

As useful as my phone is, I try to be mindful about putting it down and ignoring it as much as I can. And that goes double for when I’m on vacation. Android phones have a whole assortment of Digital Wellbeing features to help you disconnect. My favorite is definitely flip to shhh: Just place your phone screen-side down and it silences notifications until you pick it back up.

someone sitting on a boat at sunset watching the shoreline

Photo credit: Joao Nogueira

Source: Google LatLong