Tag Archives: Google Lens

New Google Lens features to help you be more productive at home

Lately our family dining table has also become a work desk, a video conference room and … a kid’s playground. As I learn how to become a full-time kids entertainer, I welcome anything that can help me stay productive. And while I usually turn to Search when learning about new things, sometimes what I’m looking for is hard to describe in words.

This is where Google Lens can help. When my family’s daily activity involves a walk in the neighborhood, Lens lets me search what I see, like a flower in our neighbor’s front yard.


But it can also be a helpful tool for getting things done while working and learning from home. Today, we’re adding a few new features to make you more productive.

Copy text from paper to your laptop

You can already use Lens to quickly copy and paste text from paper notes and documents to your phone to save time. Now, when you select text with Lens, you can tap "copy to computer" to quickly paste it on another signed-in device with Chrome. This is great for quickly copying handwritten notes (if you write neatly!) and pasting them on your laptop without having to retype them.


Copying text to your computer requires the latest version of Chrome, and for both devices to be signed into the same Google account.

Learn new words and how to pronounce them

Searches for “learn a new language” have doubled over the last few months. If you're using the extra time at home to pick up a new language, you can already use Lens to translate words in Spanish, Chinese and more than 100 other languages, by pointing your camera at the text.


Now, you can also use Lens to practice words or phrases that are difficult to say.  Select the text with Lens and tap the new Listen button to hear it read out loud—and finally figure out how to say “hipopótamo!”

Lens read out loud feature

Quickly look up new concepts

If you come across a word or phrase you don’t understand in a book or newspaper, like “gravitational waves,” Google Lens can help. Now, with in-line Google Search results, you can select complex phrases or words to quickly learn more.


These features are rolling out today, except for Listen which is available on Android and coming soon to iOS. Lens is available in the Google app on iOS and the Google Lens app on Android.


We look forward to hearing about the ways you use Lens to learn new things and get stuff done while at home.

Go beyond the page with Google Lens and NYT Magazine

When you read an article online—including this one—videos, GIFs, photo galleries and links to related articles can help you get a better sense of the story. But in print, you’re limited by what can fit on the page. So how do you go beyond what’s on the physical page for more? 

Throughout the first half of this year, we’re working with The New York Times so that readers of the print edition of The New York Times Magazine can use Google Lens to unlock more information by simply pointing their smartphone camera at the pages. On Sunday, when The Times Magazine’s annual Music Issue hits newsstands, readers can use Lens to access videos, animations and in-depth digital content that goes beyond what’s included in print. Readers will also be able to access a playlist of all the music on the magazine’s list of “25 Songs That Matter Now” using Lens.

Point Lens at the cover of the magazine to learn about how it was designed. And as you're reading the magazine, you'll be able to get more information related to the articles that pique your curiosity and digitally save or share them by simply pointing Lens at the page. In addition to the cover and articles, we're excited to work with The New York Times and other brands to bring interactive elements to print ads in the magazine through Lens.

Using Lens, you can already find out what kind of flower you’re looking at, see what’s popular on the menu at a restaurant, translate text into another language or figure out where to get a pair of shoes like the ones you just saw. Now, Lens can help you go deeper into the stories you care about in The New York Times Magazine. 

Let Google be your holiday travel tour guide

When it comes to travel, I’m a planner. I’m content to spend weeks preparing the perfect holiday getaway: deciding on the ideal destination, finding the cheapest flights and sniffing out the best accommodations. I’ve been dreaming about a trip to Greece next year, and—true story—I’ve already got a spreadsheet to compare potential destinations, organized by flight length and hotel perks.

But the thing I don’t like to do is plot out the nitty-gritty details. I want to visit the important museums and landmarks, but I don’t want to write up a daily itinerary ahead of time. I’m a vegetarian, so I need to find veggie-friendly restaurants, but I’d prefer to stumble upon a good local spot than plan in advance. And, since I don’t speak Greek, I want to be able to navigate transportation options without having to stop and ask people for help all the time.

So I’ve come to rely on some useful Google tools to make my trips work for the way I like to travel. Here’s what I’ve learned so far.

Let Maps do the talking

Getting dropped into a new city is disorienting, and all the more so when you need to ask for help but don’t know how to pronounce the name of the place you’re trying to get to. Google Maps now has a fix for this: When you’ve got a place name up in Maps, just press the new little speaker button next to it, and it will say the place’s name and address aloud in the local lingo. And if you want to continue the conversation, Google Maps will quickly link you to the Google Translate app.

gif of Google Translate feature in Google Maps

Let your phone be your guidebook

New cities are full of new buildings, new foods and even new foliage. But I don’t want to just see these things; I want to learn more about them. That’s where Google Lens comes in as my know-it-all tour guide and interpreter. It can translate a menu, tell me about the landmark I’m standing in front of or identify a tree I’ve never seen before. So whenever I think, “I wonder what that building is for,” I can just use my camera to get an answer in real time. 

using Google Lens to identify a flower

Photo credit: Joao Nogueira

Get translation help on the go

The Google Assistant’s real-time translation feature, interpreter mode, is now available on Android and iOS phones worldwide, enabling you to have a conversation with someone speaking a foreign language. So if I say, “Hey Google, be my Greek translator,” I can easily communicate with, say, a restaurant server who doesn’t speak English. Interpreter mode works across 44 languages, and it features different ways to communicate suited to your situation: you can type using a keyboard for quiet environments, or manually select what language to speak.

gif of Google Assistant interpreter mode

Use your voice to get things done

Typing is fine, but talking is easier, especially when I’m on vacation and want to make everything as simple as possible. The Google Assistant makes it faster to find what I’m looking for and plan what’s next, like weather forecasts, reminders and wake-up alarms. It can also help me with conversions, like “Hey Google, how much is 20 Euros in pounds?”

Using Google Assistant to answer questions

Photo credit: Joao Nogueira

Take pics, then chill

When I’m in a new place, my camera is always out. But sorting through all those pictures is the opposite of relaxing. So I offload that work onto Google Photos: It backs up my photos for free and lets me search for things in them. And when I want to see all the photos my partner has taken, I can create an album that we can both add photos to. And Photos will remind me of our vacation in the future, too, with story-style highlights at the top of the app.

photo of leafy old town street

Photo credit: Joao Nogueira

Look up

I live in a big city, which means I don’t get to see the stars much. Traveling somewhere a little less built up means I can hone my Pixel 4 astrophotography skills. It’s easy to use something stable, like a wall, as a makeshift tripod, and then just let the camera do its thing.

a stone tower at night with a starry sky in the background

Photo credit: DDay

Vacation unplugged

As useful as my phone is, I try to be mindful about putting it down and ignoring it as much as I can. And that goes double for when I’m on vacation. Android phones have a whole assortment of Digital Wellbeing features to help you disconnect. My favorite is definitely Flip to Shhh: Just place your phone screen-side down and it silences notifications until you pick it back up.

someone sitting on a boat at sunset watching the shoreline

Photo credit: Joao Nogueira

Source: Google LatLong


7 ways Google Lens can help during the holidays

This holiday season, your phone’s camera can do more than just capture your favorite moments. Whether you’re jet-setting off to a new place, brainstorming gift ideas, or learning a family holiday recipe, here are 7 ways Google Lens can help: 

Get style recommendations

If you’re in need of style inspiration for your holiday festivities, look no further. Point Lens at a piece of clothing, like a dress or jacket, to get style ideas from across the web. Lens will show you how others are wearing—and pairing—similar pieces so you can make the most of your closet over the season. 


Get style inspiration

Find gifts in a snap

See something you like out in the real world? Use Lens to find similar products. You can even sort by price to help you get the best deal. Just remember to ask before taking a photo of a random person’s shoes.


Find similar items

Track your packages

Gift shopping can be fun, but it can also be difficult to keep track of all your orders and make sure they’ll arrive in time. Point Lens at a tracking number to quickly see the delivery status of your package.


Easily track packages

Make your camera your travel companion

If you’re taking a holiday trip where you don’t speak the local language, Lens can instantly translate the text in front of you, whether you’re looking at a menu or street sign. Point Lens at any text and it will automatically detect the language and overlay the translations right on top of the original words—in more than 100 languages.



Quickly translate text

Don’t just snap food pics, get dish recommendations too

When you’re out celebrating at a restaurant, Lens can help you decide what to order. Point your camera at a menu to see popular dishes highlighted. Tap on a dish to see what it actually looks like and what other customers are saying about it with photos and reviews from Google Maps.


See popular menu items

Settle the bill with ease

Unless you’re playing credit card roulette, splitting the bill can be a pain. With Lens, you can easily figure out everyone’s share of the tab or calculate the tip by pointing your camera at the receipt.


Split the tab

Copy and paste written text

If you don’t want to-do lists scattered everywhere, Lens makes it easy to copy handwritten or printed text directly to your phone. Whether you’re scanning a grocery list, a gift card code, a family recipe, or even a long Wi-Fi password you don’t want to manually enter, use Lens to copy it to your device.


Copy text to your device

To check out these features, download the Google Lens app on the Play Store or the Google app on the App Store. You can also find Lens in your Google Assistant or Google Photos. 

We hope these Lens tips provide you with new, fun ways to use your smartphone camera this holiday season and throughout the year.


Record a lecture, name that song: Pixel 4 uses on-device AI

Pixel 4 is our latest phone that can help you do a lot of stuff, like take pictures at night or multitask using the Assistant. With on-device AI, your camera can translate foreign text or quickly identify a song that’s playing around you. Everything needed to make these features happen is processed on your phone itself, which means that your Pixel can move even quicker and your information is more secure. 


Lens Suggestions

When you point your camera at a phone number, a URL, or an email address using Pixel, Google Lens already helps you take action by showing you Lens Suggestions. You can call the number, visit the URL or add the email address to your contacts with a single tap. Now, there are even more Lens Suggestions on Pixel 4. If you’re traveling in a foreign country and see a language you can’t read, just open your camera and point it at the text, and you’ll see a suggestion to Translate that text using Lens. For now, this works for English, Spanish, German, Hindi, and Japanese text, with more to come soon.


There are also Lens Suggestions for copying text and scanning a document, which are processed and recognized on-device as well. So if you point your camera at a full page document, you’ll see a suggestion to scan it and save it for later using Lens. 


Lens will prompt you with a suggestion to translate foreign text, which happens on device. Then, you’ll see the translation in your native language.

Recorder

Remember that time you were in a brainstorm, and everyone had good ideas, but no one could remember them the next day? Or that meeting when you weren’t paying attention because you were too busy taking notes? With the Recorder app on Pixel 4, you can record, transcribe and search for audio clips. It automatically transcribes speech and tags sounds like applause (say your great idea was met with cheers!), music or whistling, and more, so you can find exactly what you’re looking for. You can search within a specific recording, or your entire library of recordings, and everything you record stays on your phone and private to you. We're starting with English for transcription and search, with more languages coming soon.

Now Playing

Now Playing is a Pixel feature that identifies songs playing around you. If that song gets stuck in your head and you want to play it again later, Now Playing History will play it on your favorite streaming service (just find the song you want, tap it to listen to it on Spotify, YouTube Music and more). On Pixel 4, Now Playing uses a privacy-preserving technology called Federated Analytics, which figures out the most frequently-recognized songs on Pixel devices in your region, without collecting individual audio data from your phone. This makes Now Playing even more accurate because the database will update with the songs people are most likely to hear (without Google ever seeing what you listen to).
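The post doesn't detail how Federated Analytics works, but the core idea of learning regional trends without collecting individual data can be sketched. In this illustrative Python snippet, every name and every data value is hypothetical, and a real deployment adds secure aggregation and other privacy protections; the point is simply that each device reports only summary counts, never its raw recognition log:

```python
from collections import Counter

# Hypothetical on-device recognition logs; in a federated design the raw
# logs never leave the device that produced them.
device_logs = [
    ["song_a", "song_b", "song_a"],
    ["song_a", "song_c"],
    ["song_b", "song_a", "song_a"],
]

def local_report(log, top_k=2):
    """Each device reports only its top-k song counts, not the raw log."""
    return Counter(log).most_common(top_k)

# Server side: sum the per-device counts to find regionally popular songs.
# The server sees aggregate counts only, not who listened to what.
regional = Counter()
for log in device_logs:
    for song, count in local_report(log):
        regional[song] += count

print(regional.most_common(1))  # → [('song_a', 5)]
```

The aggregated ranking is then what ships back to devices as the updated on-device song database.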


With so much processing happening directly on your Pixel 4, it’s even faster to access the features that make you love being a #teampixel member. Pre-order Pixel 4 or head out to your local AT&T, Verizon, T-Mobile or Sprint store on October 24. 


Get outfit inspiration with style ideas in Google Lens

Whether you’re window shopping or searching for new clothes on your phone, it’s easy to identify what you like, but it’s not always easy to figure out how you’d wear it yourself. That’s where Google Lens can help. You can already use Lens to get similar item suggestions for clothing and home decor, and today we’re adding a new feature in the U.S. called “style ideas” to give you outfit inspiration from around the web.

So if you see a leopard print skirt you like on social media, take a screenshot and use Lens in Google Photos to see how other people have styled similar looks. See a winter coat that catches your eye in a store, but need some inspiration on how to rock it? Just open Lens and point your camera.

Lens style ideas on coat

Style ideas can also show you new ways to style clothes you already own. Give new life to that old sweater you haven’t picked up in a year—simply point Lens at it to see how others have worn a similar one and find pieces that might match it.

As the weather changes, get your wardrobe fall-ready with style ideas in Lens.

Pixel 3a helped me see my vacation through a new Lens

When I was a kid, my mom would tell me on every birthday she wanted me to have a big goal in life: Travel to as many countries as my years on Earth. And though I'm far from that ambitious target, my mom did instill a major travel bug in me. 

Briana Feigon in Oaxaca

Settling in at the Casa Oaxaca hotel. 


But no matter where I travel, I struggle with the same issues many people face: pricey phone bills, subpar photos, a language barrier and, well, getting extremely lost.

So when I traveled to Oaxaca, Mexico last month, I sought out ways to combat these typical tourist problems. And thanks to my Pixel 3a, I was able to make real progress for the next time I visit more countries on my bucket list. Here’s how I did it. 

Navigating on Maps without pricey data fees

Even when I’m traveling, I like to be able to use my phone the same way I would at home. (Meaning, a lot.) For this trip, I decided to set my phone up with Google Fi so I could have unlimited international usage and great coverage. At the end of my trip, my phone bill netted out to be a fraction of my typical charge when I travel internationally.

Thanks to my cheaper data plan, I was also able to navigate with help from Maps. I’d never admit it myself, but some people might say I’m bad at directions. (Okay, a lot of people might say that.) In any case, I really leaned into using Live View in Google Maps, a tool that literally has a big blue arrow staring at me on my screen, pointing me exactly in the direction I should go. Even in rural areas outside of cell service, I was grateful to be able to use Google Maps in offline mode—like when I visited the Monte Alban ruins.

Taking in the beauty of Monte Alban with friends.

Lens translate

When ordering a juice from a mercado stand, I was able to use Translate in Lens to decipher many of the blends, opting for a juice that promised benefits for my skin. 

A new way to break down the language barrier 

I’m ashamed to say my Spanish isn’t great, so I put the Pixel 3a to the test. Could it magically help me speak a new language? 

Within the camera app, there’s a nifty feature in Google Lens that allows you to hover over text in another language for real-time translations. This came in handy in bustling markets, local restaurants and juice stands that only had menus in Spanish. Even if you don’t have a Pixel phone, you can download the Google Lens app on other Android or iOS devices to try it out yourself.  

The Google Assistant also came in handy when I needed language help. It was easy to ask the Assistant questions like, “Hey Google, how do you say ‘where is the bathroom’ in Spanish?” and get help converting costs from pesos to dollars.

Taking my vacation photos to the next level

In a city as beautiful as Oaxaca, I knew I’d be leaning heavily on the camera quality of the Pixel 3a. I snapped photos throughout a cooking demo making tortillas from scratch, and used features like portrait mode and Night Sight to make the most out of my vacation pics. Here are just a few highlights: 

My Pixel 3a was the ultimate tour guide

I know, I know, it’s just a phone, but I have to say I feel indebted to my Pixel 3a for showing me such a special time in Oaxaca. I think I’ll take it to my next dream travel destination: Japan. 

Source: Google LatLong


With Google Lens, Things get Strange in today’s New York Times

Demogorgons. Mindflayers. Shadowy government agencies. Things aren’t always what they seem in Hawkins, Indiana, and Season 3 of Netflix’s “Stranger Things” is no different. It’s 1985, and the newly-opened Starcourt Mall is center stage. But for those adventurous enough to look beneath the surface, they’re bound to find a lot more than they bargained for.

We’re bringing a special, and strange, Google Lens experience to today’s print edition of The New York Times. How do you know it’s special? Because friends don’t lie. So grab your walkie, strap it to your bike, and pedal over to your local newsstand to pick one up. In it, you’ll find three ads for Starcourt Mall. Scan them with Google Lens, and you might find that things are stranger than they seem.


This “Stranger Things” project is the latest in our work with partners—like museums, magazines and retailers—to use Google Lens to overlay digital information on things you see in the real world. Share what you’re discovering in today’s paper with #StrangerThings3.


Google Lens is available in the Google Assistant on Android, and the Google app on iOS. It’s also directly integrated into the camera app on many Android devices. Learn more about what Lens can do by visiting g.co/Lens. Season 3 of "Stranger Things" is now streaming on Netflix.


Find the hidden stories behind art at the de Young with Google Lens

One of the privileges of working at the de Young museum in San Francisco is getting to regularly spend time in front of masterworks by artists like Ruth Asawa, Albert Bierstadt, and Mary Cassatt, and learn about the often fascinating stories surrounding their art. Spanning four centuries, the de Young museum’s American art collection includes paintings, sculpture, and decorative art from the 17th century to the present day. We have so many stories to tell.

As the museum’s director of digital strategy, it’s my job to find ways to make these stories more readily accessible for our visitors and to help people understand what the art says about the world, and the cultures, viewpoints, and moments in time that don’t always fit within the short labels in the galleries.

Our newest collaboration with Google Arts & Culture shows visitors the hidden stories behind the paintings in this collection. Now, using Google Lens, you can search what you see. Point your phone's camera at a work like Edmund Charles Tarbell’s The Blue Veil, and you’ll have a curator at the tap of your finger to tell you more about the artist’s origins, and his fascination with the veil.

Find out more with Google Lens

Learn more about art with Google Arts & Culture and Google Lens.

This is a way for artists to share their perspective, too. In a new exhibition, Detour, artist Ana Prvački takes you on a tour of the museum, guiding you to specific spots and asking you to rethink parts of the museum visitors may not normally consider, such as the material of the museum’s copper facade. Visitors can trigger Prvački’s short videos on mobile devices via Google Lens at sites throughout the free public spaces of the museum. When you watch the videos, it feels like you’re getting a personal tour from the artist herself.

If you can’t make it to San Francisco before the exhibition concludes in September, you can experience a version of Detour online on Google Arts & Culture.  

In a new exhibition, Detour, artist Ana Prvački takes you on a tour of the museum.

At I/O ’19: Building a more helpful Google for everyone

Today, we welcomed thousands of people to I/O, our annual developer conference. It’s one of my favorite events of the year because it gives us a chance to show how we’re bringing Google’s mission to life through new technological breakthroughs and products.

Our mission to make information universally accessible and useful hasn’t changed over the past 21 years, but our approach has evolved over time. Google is no longer a company that just helps you find answers. Today, Google products also help you get stuff done, whether it’s finding the right words with Smart Compose in Gmail, or the fastest way home with Maps.

Simply put, our vision is to build a more helpful Google for everyone, no matter who you are, where you live, or what you’re hoping to accomplish. When we say helpful, we mean giving you the tools to increase your knowledge, success, health, and happiness. I’m excited to share some of the products and features we announced today that are bringing us closer to that goal.

Helping you get better answers to your questions

People turn to Google to ask billions of questions every day. But there’s still more we can do to help you find the information you need. Today, we announced that we’ll bring the popular Full Coverage feature from Google News to Search. Using machine learning, we’ll identify different points of a story—from a timeline of events to the key people involved—and surface a breadth of content including articles, tweets and even podcasts.

Sometimes the best way to understand new information is to see it. New features in Google Search and Google Lens use the camera, computer vision and augmented reality (AR) to provide visual answers to visual questions. And now we’re bringing AR directly into Search. If you’re searching for new shoes online, you can see shoes up close from different angles and even see how they go with your current wardrobe. You can also use Google Lens to get more information about what you’re seeing in the real world. So if you’re at a restaurant and point your camera at the menu, Google Lens will highlight which dishes are popular and show you pictures and reviews from people who have been there before. In Google Go, a search app for first-time smartphone users, Google Lens will read out loud the words you see, helping the millions of adults around the world who struggle to read everyday things like street signs or ATM instructions.

Google Lens: Urmila’s Story

Helping to make your day easier

Last year at I/O we introduced our Duplex technology, which can make a restaurant reservation through the Google Assistant by placing a phone call on your behalf. Now, we’re expanding Duplex beyond voice to help you get things done on the web. To start, we’re focusing on two specific tasks: booking rental cars and movie tickets. Using “Duplex on the Web,” the Assistant will automatically enter information, navigate a booking flow, and complete a purchase on your behalf. And with massive advances in deep learning, it’s now possible to bring much more accurate speech and natural language understanding to mobile devices—enabling the Google Assistant to work faster for you.

We continue to believe that the biggest breakthroughs happen at the intersection of AI, software and hardware, and today we announced two Made by Google products: the new Pixel 3a (and 3a XL), and the Google Nest Hub Max. With Pixel 3a, we’re giving people the same features they love on more affordable hardware. Google Nest Hub Max brings the helpfulness of the Assistant to any room in your house, and much more.

Building for everyone

Building a more helpful Google is important, but it’s equally important to us that we are doing this for everyone. From our earliest days, Search has worked the same, whether you’re a professor at Stanford or a student in rural Indonesia. We extend this approach to developing technology responsibly, securely, and in a way that benefits all.

This is especially important in the development of AI. Through a new research approach called TCAV—or testing with concept activation vectors—we’re working to address bias in machine learning and make models more interpretable. For example, TCAV could reveal if a model trained to detect images of “doctors” mistakenly assumed that being male was an important characteristic of being a doctor because there were more images of male doctors in the training data. We’ve open-sourced TCAV so everyone can make their AI systems fairer and more interpretable, and we’ll be releasing more tools and open datasets soon.
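The doctor example above follows a pattern that can be sketched with toy data: learn a concept direction in the model's activation space, then measure how often the class prediction is sensitive to that direction. Everything below—the 2-D "activations," the toy logit—is a hypothetical illustration of that recipe, not Google's implementation, which operates on a real trained network:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a network's hidden-layer activations (2-D for clarity).
# Activations of "concept" examples cluster away from random examples.
concept_acts = rng.normal(loc=[3.0, 0.0], scale=0.5, size=(50, 2))
random_acts = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(50, 2))

# Step 1: fit a linear separator between concept and random activations.
# Its normal vector is the Concept Activation Vector (CAV).
X = np.vstack([concept_acts, random_acts])
y = np.array([1.0] * 50 + [0.0] * 50)
w, *_ = np.linalg.lstsq(np.c_[X, np.ones(len(X))], y, rcond=None)
cav = w[:2] / np.linalg.norm(w[:2])

# Step 2: per-input gradient of the class logit w.r.t. the activations.
# We assume a toy logit f(a) = a0 + 0.2*a1 + a0*a1, whose gradient at
# activation a is [1 + a1, 0.2 + a0].
inputs = rng.normal(size=(100, 2))
grads = np.c_[1.0 + inputs[:, 1], 0.2 + inputs[:, 0]]

# Step 3: TCAV score = fraction of inputs whose logit increases when the
# activations are nudged in the concept direction.
tcav_score = float((grads @ cav > 0).mean())
print(f"TCAV score: {tcav_score:.2f}")
```

A score near 1 would suggest the class prediction relies heavily on the concept—exactly the kind of signal that could flag the "doctors are male" bias described above.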

Another way we’re building responsibly for everyone is by ensuring that our products are safe and private. We’re making a set of privacy improvements so that people have clear choices around their data. Google Account, which provides a single view of your privacy control settings, will now be easily accessible in more products with one tap. Incognito mode is coming to Maps, which means you can search and navigate without linking this activity with your Google account, and new auto-delete controls let you choose how long to save your data. We’re also making several security improvements on Android Q, and we’re building the protection of a security key right into the phone for two-step verification.

As we look ahead, we’re challenging the notion that products need more data to be more helpful. A new technique called federated learning allows us to train AI models and make products smarter without raw data ever leaving your device. With federated learning, Gboard can learn new words like “zoodles” or “Targaryen” after thousands of people start using them, without us knowing what you’re typing. In the future, AI advancements will provide even more ways to make products more helpful with less data.
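The Gboard example rests on federated averaging: each device trains on its own private data, and only model updates travel to the server, where they are averaged into a global model. Here is a minimal sketch under simplifying assumptions—a one-parameter linear model and synthetic per-device data—whereas the production system handles neural networks, device sampling, and secure aggregation:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: each "device" holds private (x, y) pairs drawn from
# y = 2x + noise. The server never sees this data, only model weights.
true_w = 2.0
clients = []
for _ in range(10):
    x = rng.normal(size=20)
    y = true_w * x + rng.normal(scale=0.1, size=20)
    clients.append((x, y))

def local_update(w, x, y, lr=0.1, steps=5):
    """Run a few gradient steps on one device's private data; return the new weight."""
    for _ in range(steps):
        grad = 2 * np.mean((w * x - y) * x)  # d/dw of mean squared error
        w -= lr * grad
    return w

# Federated averaging: broadcast the global weight, let each device train
# locally, then average the returned weights. Only weights cross the wire.
w_global = 0.0
for _ in range(10):
    local_weights = [local_update(w_global, x, y) for x, y in clients]
    w_global = float(np.mean(local_weights))

print(f"learned weight: {w_global:.2f}")  # converges close to the true value 2.0
```

The averaged model ends up near the true relationship even though no single device's raw data ever left that device.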

Building for everyone also means ensuring that everyone can access and enjoy our products, including people with disabilities. Today we introduced several products with new tools and accessibility features, including Live Caption, which can caption a conversation in a video, a podcast or one that’s happening in your home. In the future, Live Relay and Euphonia will help people who have trouble communicating verbally, whether because of a speech disorder or hearing loss.

Project Euphonia: Helping everyone be better understood

Developing products for people with disabilities often leads to advances that improve products for all of our users. This is exactly what we mean when we say we want to build a more helpful Google for everyone. We also want to empower other organizations who are using technology to improve people’s lives. Today, we recognized the winners of the Google AI Impact Challenge, 20 organizations using AI to solve the world’s biggest problems—from creating better air quality monitoring systems to speeding up emergency responses.

Our vision to build a more helpful Google for everyone can’t be realized without our amazing global developer community. Together, we’re working to give everyone the tools to increase their knowledge, success, health and happiness. There’s a lot happening, so make sure to keep up with all the I/O-related news.

Source: Android