Tag Archives: Google Assistant

6 new features on Android this summer

From keeping your account password safe to scheduling text messages to send at the right moment, we’re constantly rolling out new updates to the 3 billion active Android devices around the world. Today, we’re welcoming summer with six updates for your Android that focus on safety — so you’re protected at every turn.


1. Android Earthquake Alerts System is rolling out globally

Earthquake alert screen that clicks through to an earthquake safety info screen

Last year, we embarked on a mission to build the world’s largest earthquake detection network, based on technology built into Android devices. With this free system, people in affected areas can get alerts seconds before an earthquake hits, giving you advance notice in case you need to seek safety. We recently launched the Android Earthquake Alerts System in New Zealand and Greece. Today, we’re introducing the Android Earthquake Alerts System in Turkey, the Philippines, Kazakhstan, Kyrgyz Republic, Tajikistan, Turkmenistan and Uzbekistan.

We’re prioritizing the rollout of Earthquake Alerts in countries with higher earthquake risk, and hope to launch in more countries over the coming year.


2. Star what’s important with the Messages app

With tons of messages from family, friends, colleagues and others, it’s easy for information to get lost. Now, you can star a message on your Messages app to keep track of what’s important, and easily find it later without scrolling through all of your conversations. Just tap and hold your message, then star it. And when you want to revisit a message, like your friend’s address or the photo from your family reunion, tap on the starred category. 


Starred messages will start to roll out more broadly over the coming weeks.


3. Find the perfect Emoji Kitchen sticker at the perfect time

After typing a message, relevant emoji mixes are proactively displayed at the top of the keyboard

In May, we introduced a new section for your recently used Emoji Kitchen stickers, so you can quickly get back to the ones you use most frequently. Soon you’ll also start to see contextual suggestions in Emoji Kitchen once you’ve typed a message. These will help you discover the perfect emoji combination at the exact moment you need it.


Contextual Emoji Kitchen suggestions are available in Gboard beta today and are coming to all Gboard users this summer for messages written in English, Spanish and Portuguese on devices running Android 6.0 and above.


4. Access more of your favorite apps with just your voice

Ask Google to open or search many of your favorite apps using just your voice. You can say things like “Hey Google, pay my Capital One bill” to jump right into the app and complete the task, or “Hey Google, check my miles on Strava” to quickly see your weekly progress right on the lock screen. See what else you can do by saying “Hey Google, shortcuts.”


5. Improved Password Input and gaze detection on Voice Access

A gaze detection icon on a screen changes from crossed out to active when a character turns its head towards the device to speak the "scroll down" command in Voice Access

Built with and for people with motor disabilities, and helpful for those without, Voice Access gives you quick and efficient phone and app navigation with just your voice.


With gaze detection, now in beta, you can ask Voice Access to work only when you are looking at the screen — so you can naturally move between talking to friends and using your phone. 


Voice Access now has enhanced password input. When it recognizes a password field, it will let you input letters, numbers and symbols. For example, you can say “capital P a s s w o r d” or names of symbols (like “dollar sign” to input a $), so it’s faster to safely enter your password.


6. More customization and new app experiences on Android Auto

After a user taps on the Messages app icon and + New, Google Assistant is activated to help send a new message from the launcher screen

You can now customize more of your Android Auto experience for easier use, like personalizing your launcher screen directly from your phone and manually setting dark mode. It’s also easier to browse content with new tabs in your media apps, a “back to top” option and an A to Z button in the scroll bar. And, if it’s your first time using Android Auto, you can now get started faster in your car with a few simple taps.


We’ve also added new app experiences to help enhance your drive. EV charging, parking and navigation apps are now available to use in Android Auto. Plus, we’ve improved the messaging experience, so you can access your favorite messaging apps from the launcher screen. You can easily read and send new messages directly from apps like WhatsApp or Messages — now available globally.


These Android Auto features are available on phones running Android 6.0 or above, and when connected to your compatible car.

16 updates from Google I/O that’ll make your life easier

Part of our mission is to help make your daily life easier. At I/O this year, we shared news about a wide range of products and services that’ll do just that, from starting your car with your phone to searching your screenshots using Google Lens. Here are just a few of the features you should keep an eye out for. 

Quickly view your notifications, invoke Google Assistant on Android.

Android 12 includes the biggest design change since 2014. We rethought the entire experience, from the colors to the shapes, light and motion, and made it easier to access some of the most used features:

  • To invoke Google Assistant wherever you are, long press the power button.
  • Swipe down to view your new notification shade, an at-a-glance view of all your app notifications in one place.
  • And to make it easier to access everything you need, Google Pay and Device Controls have been added to your customizable quick settings.

Learn about all the big changes in Android 12.

Manage your privacy settings more easily on Android.

On top of the new design changes, we’ve also launched a new Privacy Dashboard, giving you easy access to your permissions settings, visibility into what data is being accessed and the ability to revoke permissions on the spot. You also have new indicators that let you know when apps are using your microphone and camera, as well as a way to quickly shut off that access. And we’ve added new microphone and camera toggles into quick settings so you can easily remove app access to these sensors for the entire system. Learn about new privacy controls in Android 12.

Change the channel with your phone.

Lost your TV remote? Don’t sweat it — we’re building remote-control features directly into your Android phone. Another bonus: If you need to enter a long password to log into one of your many streaming services subscriptions, you can save time and use your phone’s keyboard to enter the text. This built-in remote control will be compatible with devices powered by Android TV OS, including Google TV, and it’ll roll out later this year. Learn more about how we’re helping your devices work better together.

GIF of a user typing a password onto a phone and that password appearing on a TV screen

Use your phone to enter your password for your streaming services.

And unlock your car with your phone while you’re at it.

We’re working with car manufacturers to develop a new digital car key in Android 12. This feature will enable you to use your phone to lock, unlock and even start your car — and in some cases you won’t even need to take it out of your pocket. And because it’s digital, you’ll also be able to securely and remotely share your car key with friends and family if needed. Read more about Android Auto.

Understand more about your Search results.

When you’re looking up information online, it’s important to check how credible a source is, especially if you aren’t familiar with the website. Our About This Result feature in Google Search provides details about a website before you visit it, including its description, when it was first indexed and whether your connection to the site is secure. This month, we’ll start rolling out About This Result to all English results worldwide, with more languages to come. And later this year, we’re going to add even more helpful contextual details — like how the site describes itself, what other sources are saying about it and related articles to check out.

Change your password using Chrome and Assistant.

Chrome on Android will help you change your passwords with a simple click. On supported sites, whenever you check your passwords and Chrome finds one that may have been compromised, you’ll see a “Change password” button from Assistant. Powered by Duplex on the Web, Assistant will not only navigate to the site, but actually go through the entire process of changing your password for you. Duplex on the Web is already available for tasks like purchasing movie tickets, ordering food and checking into flights.

Use Google Lens to translate your homework into a language you’re more comfortable with.

Google Lens enables you to search what you see — from your camera, your photos and even your search bar. For a lot of students, their schoolwork might be in a language they’re not as comfortable with. That’s why we’re updating the Translate filter in Lens, making it easy to copy, listen to or search translated text in over 100 languages. Learn more about how information comes to life with Lens and AR.

And search your screenshots with Google Lens.

Lots of people take screenshots of things they’re interested in buying — but it can be hard to follow up on those screenshots afterward. Now when you look at any screenshot in Google Photos, we’ll prompt you to search the photo with Lens. This will help you find that pair of shoes or wallpaper pattern that you liked so much. 

A GIF demonstrating using Google Lens to search a screen shot of a basketball player, returning results for his shoes

Search your screenshots using Google Lens.

When shopping online, keep track of your open carts when you open a new tab.

Raise your hand if this has ever happened to you: You’ve got a browser open to do some online shopping, but then you get distracted and open up two, three, or 10 other windows — and you forget what you were online to do in the first place. We’re introducing a new feature in Chrome that shows you your open carts when you open a new tab. No more lost shopping carts here.

And get the best value for products you’re buying online.

Coming soon, we’ll let you link your favorite loyalty programs from merchants like Sephora to your Google account to show you the best purchase options across Google. Learn more about all our latest shopping updates.

Explore unfamiliar neighborhoods with more detailed views in Maps.

If you’re traveling by foot, augmented reality in Live View will show you helpful details about the shops and restaurants around you – including how busy they are, and recent reviews and photos. And if you’re traveling, Live View will tell you where you are relative to your hotel – so you can always find your way back. 

Avoid the crowds with area busyness.

Maps already shows the busyness of specific places — in fact, more than 80 million people use the live busyness information on Google every day. Now we’re expanding that functionality to show the busyness of an entire area, allowing you to see just how bustling a neighborhood or part of town is at any given moment. This means that if you want to keep things low-key, you can use Maps to see the hotspots to avoid. And if you’re looking for the most popular places to visit, you can use area busyness to scope out the liveliest neighborhoods at a glance.

See breakfast spots in the morning and dinner joints at night. 

We’re updating Maps to show you more relevant information based on what time of day it is and whether you’re traveling. That means we’ll show you things like coffee shops in the morning, when you need that caffeine fix, and burger joints at night, when you’re hungry for dinner. And if you’re on a weekend getaway, we’ll make tourist attractions and local landmarks easier to spot. Learn more about our latest updates to Maps.

Discover unexpected Memories in Photos.

Starting later this summer, when we find a set of three or more photos with similarities like shape or color, we'll highlight these little patterns for you in your Memories. For example, Photos might surface a pattern of your family hanging out on the same couch over the years — something you wouldn’t have ever thought to search for, but that tells a deeply meaningful story about your daily life. Learn more about Little patterns in Photos.

Bring your pictures to life with Cinematic moments.

When you’re trying to get the perfect photo, you usually take the same shot two or three (or twenty) times. Using neural networks, we can take two nearly identical images and fill in the gaps by creating new frames in between. This creates vivid, moving images called Cinematic moments. Producing this effect from scratch would take professional animators hours, but with machine learning we can automatically generate these moments and bring them to your Recent Highlights. Learn more about Cinematic moments.

A GIF showing two similar pictures of a child and his baby sibling being converted into a moving image.

Cinematic moments will bring your photos to life.

Transform how you work with smart canvas in Google Workspace. 

As part of our mission to build the future of work, we’re launching smart canvas, a bunch of exciting updates across Docs, Sheets and Meet. New features include interactive building blocks—smart chips, templates, and checklists—as well as a new pageless format in Docs and emoji reactions. We're also bringing Meet closer to Docs, Sheets and Slides, and much more. See all of the big updates to Google Workspace.

Four Google smart home updates that Matter

Today, there are nine smart devices in the average smart home — in 2016, there were only three. While this is explosive growth, the industry is still evolving. Selecting the right devices or connecting them with the ones you already have can be frustrating. 

It’s up to us to simplify the smart home, and to start we must change the way device makers build products. There should be one standard that simplifies selection, setup and control, and makes it easy for our partners to create products and experiences for your home. Here’s how we’re making that happen:    

1. Google’s bringing Matter to Nest and Android

Google and other leading tech companies are working together to develop Matter, the new protocol that simplifies smart homes by using one standard across the industry — and we’re committed to supporting Matter. We’re bringing Matter to Android and capable Nest products, powering them with interoperable control and enabling simpler setups.

Android will be one of the leading operating systems with built-in support for Matter, letting you quickly set up devices with Google and link your favorite Android apps. You’ll only need a few taps to set up your Matter devices, and you’ll have lots of ways to instantly control them, such as Matter-enabled Android apps, Google Assistant, the Google Home app, Android Power Controls and compatible Google devices. This also brings simple setup and control of Matter-certified products to more than one billion Android devices.

Nest is committed to making our devices connect better and respond faster. Thread, a technology we cofounded in 2014 that helps smart home devices work faster and more securely, will work in conjunction with Matter. Devices with Thread built-in, like Nest Wifi, Nest Hub Max and the second-generation Nest Hub will become connection points for Matter devices, creating even stronger, faster connections across your home. All Nest displays and speakers, like the Nest Hub and Nest Mini, will be automatically updated to control Matter devices, giving you faster and more reliable experiences whether they use Wi-Fi, Thread or ethernet. 

Plus, we’ll update the newest Nest Thermostat to support Matter, meaning that for the first time it can be controlled by other platforms that have certified with Matter.

The bottom line: Matter devices will work everywhere your Google smart home does. 

2. One location for smart home information

Smart home information should be available in one trustworthy place. We’re unveiling a new Google smart home directory, an online destination to discover Google Assistant-compatible devices, answer your questions and learn from educational videos. You’ll find products across more than 30 categories, from brands like Philips Hue, Nanoleaf, Samsung, LG, Dyson, Netatmo, Wyze and more. It’s easy to search and filter compatible products, see product details, read reviews and find the best prices.

Google smart home directory

3. Better streaming

We’ve added support for WebRTC, an open-source communications protocol that reduces latency for an improved live video and audio streaming experience between security cameras, video doorbells, smart displays and mobile devices. Top device manufacturers, including Arlo, Logitech, Netatmo and Wyze, are among our first partners to integrate WebRTC with Google Assistant and more will join in the coming weeks.


4. Control your home, from anywhere 

We’re also using Google technology to improve Home & Away Routines, enabling automatic control of Nest cameras, Nest thermostats, smart lights, smart plugs and smart switches based on when you’re home or away. When you leave home, your Away Routine can automatically turn on your Nest cameras and turn off the lights and plugs. When someone arrives home, your Home Routine can turn off the cameras and turn on the lights. 


We’re committed to making the smart home more helpful. The Google smart home will keep finding ways to bring Google Assistant, Nest devices, industry-leading partners and new technology together to help you get things done, stay on track...and sometimes just sit back and enjoy your home.  


New for I/O: Assistant tools and features for Android apps and Smart Displays

Posted by Rebecca Nathenson, Director of Product for the Google Assistant Developer Platform

New Assistant tools at Google IO header

Today at I/O, we shared some exciting new product announcements to help you more easily bring Google Assistant to your Android apps and create more engaging content on smart displays.

Assistant development made easy with new Android APIs

App Actions helps you easily bring Google Assistant to your Android app and complete user queries of all kinds, from booking a ride to posting a message on social media. Companies such as MyFitnessPal and Twitter are already using App Actions to help their users get things done, just by using their voice. You can enable App Actions in Android Studio by mapping built-in intents to specific features and experiences within your apps. Here are new ways you can help users easily navigate your content through voice queries and proactive suggestions.

Better support for Assistant built-in intents with Capabilities

Capabilities is a new framework API available in beta today that lets you declare support for common tasks defined by built-in intents. By leveraging pre-built requests from our catalog of intents, you can offer users ways to jump to specific activities within your app.

For example, the Yahoo Finance app uses Capabilities to let users jump directly to the Verizon stock page just by saying “Hey Google, show me Verizon’s stock on Yahoo Finance.” Similarly, Snapchat users can use their voice to add filters and send them to friends: “Hey Google, send a snap with my Curry sneakers.”
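As a rough sketch of what such a declaration might look like, a capability for a built-in intent is typically defined in the app’s shortcuts.xml resource. The package, class, and parameter mapping below are illustrative assumptions (using the common OPEN_APP_FEATURE built-in intent), not taken from the apps mentioned above:

```xml
<!-- Hypothetical sketch of res/xml/shortcuts.xml: declaring support for a
     built-in intent so Assistant can route matching voice queries to the app.
     Package, class and extra names are illustrative. -->
<shortcuts xmlns:android="http://schemas.android.com/apk/res/android">
  <capability android:name="actions.intent.OPEN_APP_FEATURE">
    <intent
      android:action="android.intent.action.VIEW"
      android:targetPackage="com.example.finance"
      android:targetClass="com.example.finance.MainActivity">
      <!-- Map the built-in intent's "feature" parameter onto an intent extra -->
      <parameter
        android:name="feature"
        android:key="featureName" />
    </intent>
  </capability>
</shortcuts>
```

With a declaration like this, a matching voice query would launch the declared activity with the recognized feature name passed in the `featureName` extra, which the activity can read to jump straight to the right screen.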

Improved user discoverability with Shortcuts in Android 12

App shortcuts are already a popular way to automate common tasks on Android. Thanks to the new Shortcuts APIs in Android 12, it’s now easier to find all the Assistant queries that apps support. If you build an Android shortcut, it will automatically show up in the Assistant Shortcuts gallery, so users can choose to set up a personal voice command in your app when they say “Hey Google, shortcuts.”

3 phones showing shortcuts from Assistant

Google Assistant can also suggest relevant shortcuts to help drive traffic to your app. For example, when using the eBay app, people will see a suggested Google Assistant Shortcut appear on the screen and have the option to create a shortcut for "show my bids."

We also introduced the Google Shortcuts Integration library, which identifies shortcuts pushed by the Shortcuts Jetpack module and makes them available to Assistant for managing related voice queries.
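To make the flow concrete, here is a minimal Kotlin sketch of pushing a dynamic shortcut with a capability binding, so Assistant can surface it as a personal voice command. The shortcut ID, labels, deep link, and the built-in intent and parameter names are illustrative assumptions, not details from this post:

```kotlin
// Hypothetical sketch: publish a dynamic shortcut bound to a built-in
// intent so Assistant can offer it as a voice command.
// IDs, labels and the deep-link URL are illustrative.
import android.content.Context
import android.content.Intent
import android.net.Uri
import androidx.core.content.pm.ShortcutInfoCompat
import androidx.core.content.pm.ShortcutManagerCompat

fun pushRunShortcut(context: Context) {
    // Deep link into the app screen this shortcut should open.
    val deepLink = Intent(Intent.ACTION_VIEW, Uri.parse("https://example.com/run"))

    val shortcut = ShortcutInfoCompat.Builder(context, "start-run")
        .setShortLabel("Start a run")
        .setIntent(deepLink)
        // Bind the shortcut to a built-in intent parameter, so a voice
        // query like "start my run" can resolve to it. The capability and
        // parameter names here are assumptions for illustration.
        .addCapabilityBinding(
            "actions.intent.START_EXERCISE", // built-in intent
            "exercise.name",                 // BII parameter
            listOf("run", "running")         // values this shortcut handles
        )
        .build()

    // Publishes the shortcut; with the Google Shortcuts Integration library
    // on the classpath, it also becomes visible to Assistant.
    ShortcutManagerCompat.pushDynamicShortcut(context, shortcut)
}
```

The key design point is that the same `ShortcutInfoCompat` object serves both launcher shortcuts and Assistant voice commands, so there is one place to keep labels and deep links consistent.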

Get immediate answers and updates right from Assistant using Widgets, coming soon

Improvements to Android 12 also make it easier to discover glanceable content with widgets by mapping them to specific built-in intents using the Capabilities API. We’re also looking at how to bring driving-optimized widgets to Android Auto in the future. The integration with Assistant will enable one-shot answers, quick updates and multi-step interactions with the same widget.

For example, with Dunkin’s widget implementation, you can say “Hey Google, reorder from Dunkin’” to select from previous drinks and place the order. Strava’s widget helps users track how many miles they ran in a week: say “Hey Google, check my miles on Strava” and it will show up right on the lock screen.

Strava widget showing how many miles ran in a week

Build high quality Conversational Actions for smart displays

Last year, we introduced a number of improvements to the Assistant platform for smart displays, such as Actions Builder, Actions SDK and new built-in intents to improve the experience for both developers and users. Here are more improvements rolling out soon to make building conversational actions on smart displays even better.

New features to improve the developer experience

Interactive Canvas helps you build touch- and voice-controlled games and storytelling experiences for the Assistant using web technologies like HTML, CSS, and JavaScript. Companies such as CoolGames, Zynga, and GC Turbo have already used Canvas to build games for smart displays.

Since launch, we've gotten great feedback from developers that it would be simpler and faster to implement core logic in web code. To enable this, the Interactive Canvas API will soon provide access to text-to-speech (TTS), natural language understanding (NLU), and storage APIs that will allow developers to trigger these capabilities from client-side code. These APIs will provide experienced web developers with a familiar development flow and enable more responsive Canvas actions.

We’re also giving you a wider set of options around how to release your actions. Coming soon, in the Actions Console, you will be able to manage your releases by launching in stages. For example, you can launch to one country first and then expand to more later, or you can launch to just a smaller percentage and gradually roll out over time.

Improving the user experience on smart displays

You'll also see improvements that enhance visual experiences on the smart display. For example, you can now remove the persistent header, which lets you use the full real estate of the device and provide users with fully immersive experiences.

Before Interactive Canvas brought customized touch interfaces to smart displays, we provided a simple way to stop TTS from playing by tapping anywhere on the screen of the device. However, with more multi-modal experiences being released on smart displays, there are use cases where it is important to continue playing TTS while the user touches the display. Developers will soon have the option to enable persistent TTS for their actions.

We’ve also added support for long-form media sessions with updates to the Media API so you can start playback from a specific moment, resume where a previous session stopped, and adapt conversational responses based on media playback context.

Easier transactions for your voice experiences

We know how important it is to have the tools you need to build a successful business on our platform. In October of last year, we made a commitment to make it easier for you to add seamless voice-based and display-based monetization capabilities to your experience. On-device CVC and credit card entry will soon be available on smart displays. Both of these features make on-device transactions much easier, reducing the need to redirect users to their mobile devices.

We hope you are able to leverage all these new features to build engaging experiences and reach your users easily, both on mobile and at home. Check out our technical sessions, workshops and more from Google I/O on YouTube and get started with App Actions and Conversational Actions today!

An update on our COVID response priorities

 Our teams at Google continue to support the tireless work of hospitals, nonprofits, and public health service providers across the country. Right now, we’re focused on three priority areas: ensuring people can access the latest and most authoritative information; amplifying vital safety and vaccination messages; and providing financial backing for affected communities, health authorities and other organizations.

Providing critical and authoritative information

On all our platforms, we’re taking steps to surface the critical information families and communities need to care for their own health and look after others.

Searches on the COVID-19 vaccine display key information around side effects, effectiveness, and registration details, while treatment-related queries surface guidance from ministry resources

When people ask questions about vaccines on Google Search, they see information panels that display the latest updates on vaccine safety, efficacy and side-effects, plus registration information that directs users to the Co-WIN website. You will also find information about prevention, self-care, and treatment under the Prevention and Treatment tab, in easy-to-understand language sourced from authorised medical sources and the Ministry of Health and Family Welfare. 

On YouTube, we’re surfacing authoritative information in a set of playlists about vaccines, preventing the spread of COVID-19, and expert guidance on COVID-19 care.

Our YouTube India channel features a set of playlists to share tips and information on COVID-19 care 

Testing and vaccination center locations

In addition to showing 2,500 testing centers on Search and Maps, we’re now sharing the locations of over 23,000 vaccination centers nationwide, in English and eight Indian languages. And we’re continuing to work closely with the Ministry of Health and Family Welfare to make more vaccination center information available to users throughout India.

Searching for vaccines in Maps and Search now shows over 23,000 vaccination centers across the country, in English and eight Indian languages

Pilot on hospital beds and medical oxygen availability

We know that some of the most crucial information people are searching for is the availability of hospital beds and access to medical oxygen. To help them find answers more easily, we’re testing a new feature using the Q&A function in Maps that enables people to ask about and share local information on the availability of beds and medical oxygen in select locations. As this will be user-generated content rather than information from authorised sources, users may need to verify its accuracy and freshness before relying on it.

Amplifying vital safety and vaccination messages

As well as providing authoritative answers to queries, we’re using our channels to help extend the reach of health information campaigns. That includes the ‘Get the Facts’ around vaccines campaign, to encourage people to focus on authoritative information and content for vaccines. We’re also surfacing important safety messages through promotions on the Google homepage, Doodles and reminders within our apps and services.

Via the Google Search homepage and reminders within our apps and services, we are reminding people to stay safe and stay masked, and get authoritative information on vaccines

Supporting health authorities, organizations, and affected communities

Since the second wave began, we’ve been running an internal donation campaign to raise funds for nonprofit organizations helping those most in need, including GiveIndia, Charities Aid Foundation India, GOONJ, and United Way of Mumbai. This campaign has raised over $4.6 million (INR 33 crore) to date, and continues to generate much-needed support for relief efforts. 

We recognize that many more nonprofits need donations, and that Indians are eager to help where they can—so we’ve rolled out a COVID Aid campaign on Google Pay, featuring non-profit organizations like GiveIndia, Charities Aid Foundation, Goonj, Save the Children, Seeds, UNICEF India  (National NGOs) and United Way. We want to thank all our Google Pay users who have contributed to these organisations, and we hope this effort will make a difference where it matters most. 

On Google Pay people can contribute funds to non-profit organizations involved in COVID response

As India battles this devastating wave, we’ll keep doing all we can to support the selfless individuals and committed organizations on the front lines of the response. There’s a long way to go—but standing together in solidarity, working together with determination, we can and will turn the tide.  

Posted by the Covid Response team, Google India


Plan a perfect weekend with new Google Assistant features

Moms everywhere can likely agree that this year (and then some) has had us working overtime. As a mom of two who's working at home, I know that's how I've felt. Maybe that's why I'm extra excited for Mother's Day this year. And just in time, there are a few new Google Assistant features my family and I will be using to schedule the perfect weekend. 


First, I’ll Broadcast from my morning run

We’re extending one of our most popular Assistant features, Broadcast, so you can reach your family wherever they are, and they can respond from any device, including their phones. With Family Broadcast, when I get home from my Saturday morning run, I can broadcast to my newly created Google Family Group, “Hey Google, tell my family, how about lunch at noon?” across all our smart speakers and displays. The message will even reach my husband on his iPhone (or Android device) while he’s on the way home, and he can reply by voice or by tapping the "reply" button: “Hey Google, reply sounds good, stopping by grandma's house. See you in 15 minutes.” 


Family Broadcast from mobile

Then I’ll set a Family Bell reminder for some afternoon gardening

Two new Family Bell reminders I plan to set this weekend will remind me to water the plants (which I love, but often forget to do) as well as alert my kids to tidy up the house. It’s becoming a very popular feature. Since last summer, more than 20 million Family Bells have been rung to help families stay organized - that’s nearly 19 years’ worth of bells! As a quick hint, you’ll soon be able to just say “stop” to end the bell, starting in English. No need to use “Hey Google” again, just like with alarms and timers.

Over the coming weeks, we’ll be expanding Family Bell to nine new languages: Dutch, French, German, Hindi, Italian, Japanese, Korean, Portuguese and Spanish. Another highly requested feature we’re rolling out today is the ability to have Family Bells ring across multiple home devices at once (not just one smart speaker or display). 


Followed by winding down with new stories with the kids

Assistant is getting new stories and games that you can access from a smart display or Android device — this weekend, we plan to learn more about Quidditch from the Harry Potter stories with a simple “Hey Google, tell me a Quidditch Story.” We’ll be partnering with Pottermore Publishing to bring more stories later in the year, so stay tuned for more Wizarding World™ news.

We’re also bringing the “Who Was?” series from Penguin Random House to your smart display. Just say “Hey Google, talk to ‘Who Was Heroes’” and listen to stories about Ida B. Wells, Ruth Bader Ginsburg and over 100 others. To get a full list of all the stories that are available, simply say “Hey Google, tell me a story.” (With a parent's permission, children under 13, or the applicable age in their country, can have a personalized Google Assistant experience and access these games designed for kids and families, powered by Family Link.)

Who was?

And have some fun with new games

My husband and I love trivia, and will play the popular game show “Hey Google, talk to ‘Are You Smarter than a 5th Grader?’” on our Nest Hub. 


Are You Smarter Than a 5th Grader

Plus, a surprise or two 

We had to add a few easter eggs too. Try using a timer on Mother’s Day and see what happens!  

Since the handwashing song was so popular, we created new ones to help kids stay on task and do their chores. Try “Hey Google, Sing the clean up song,” “Hey Google, Sing the go to sleep song” or “Hey Google, Sing the brush your teeth song.”

Sing a clean up song

Hopefully this schedule gives you a little Mother’s Day inspiration — or even just a stress-free weekend. 


Loud and clear: AI is improving Assistant conversations

To get things done with the Google Assistant, it needs to understand you – it has to both recognize the words you’re saying, and also know what you mean. It should adapt to your way of talking, not require you to say exactly the right words in the right order.

Understanding spoken language is difficult because it’s so contextual, and varies so much from person to person. And names can bring up other language hiccups — for instance, some names that are spelled the same are pronounced differently. It’s this kind of complexity that makes perfectly understanding the way we speak so difficult. This is something we’re working on with Assistant, and we have a few new improvements to share.


Teach Google to recognize unique names 

Names matter, and it’s frustrating when you’re trying to send a text or make a call and Google Assistant mispronounces or simply doesn’t recognize a contact. We want Assistant to accurately recognize and pronounce people’s names as often as possible, especially those that are less common.

Starting over the next few days, you can teach Google Assistant to enunciate and recognize names of your contacts the way you pronounce them. Assistant will listen to your pronunciation and remember it, without keeping a recording of your voice. This means Assistant will be able to better understand you when you say those names, and also be able to pronounce them correctly. The feature will be available in English and we hope to expand to more languages soon.


A good conversation is all about context

Assistant’s timers are a popular tool, and plenty of us set more than one of them at the same time. Maybe you’ve got a 10-minute timer for dinner going at the same time as another to remind the kids to start their homework in 20 minutes. You might fumble and stop mid-sentence to correct how long the timer should be set for, or maybe you don’t use the exact same phrase to cancel it as you did to create it. Like in any conversation, context matters, and Assistant needs to be flexible enough to understand what you're referring to when you ask for help.

To help with these kinds of conversational complexities, we fully rebuilt Assistant's natural language understanding (NLU) models so it can now more accurately understand context while also improving its "reference resolution" — meaning it knows exactly what you’re trying to do with a command. This upgrade uses machine learning technology powered by state-of-the-art BERT, a technology we invented in 2018 and first brought to Search, which makes it possible to process words in relation to all the other words in a sentence, rather than one-by-one in order. Because of these improvements, Assistant can now respond nearly 100 percent accurately to alarm and timer tasks. And over time, we’ll bring this capability to other use cases, so Assistant can learn to better understand you.
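To make the "all the other words at once" point concrete, here is a toy illustration in plain NumPy (not the actual BERT or Assistant model): each word's vector is updated as an attention-weighted mix of every word in the sentence, so context from both directions informs each position simultaneously, rather than flowing strictly left to right.

```python
import numpy as np

def self_attention(x: np.ndarray) -> np.ndarray:
    """Toy single-head self-attention: every position attends to
    every other position, in both directions at once."""
    scores = x @ x.T / np.sqrt(x.shape[1])          # pairwise similarity
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)   # softmax over all words
    return weights @ x                              # context-mixed vectors

# Three 4-dimensional "word" vectors; each output row blends all three inputs.
words = np.array([[1.0, 0, 0, 0],
                  [0, 1.0, 0, 0],
                  [0, 0, 1.0, 0]])
mixed = self_attention(words)
```

Because every attention weight is positive, each output row carries some information from every other word, which is the property that lets a model like BERT resolve references such as "cancel it" against context on either side.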


These updates are now available for alarms and timers on Google smart speakers in English in the U.S. and expanding to phones and smart displays soon.


More natural conversations

We also applied BERT to further improve the quality of your conversations. Google Assistant uses your previous interactions and understands what’s currently being displayed on your smartphone or smart display to respond to any follow-up questions, letting you have a more natural, back-and-forth conversation.

Have a more natural, back and forth conversation with Google

If you’re having a conversation with your Assistant about Miami and you want more information, it will know that when you say “show me the nicest beaches” you mean beaches in Miami. Assistant can also understand questions that are referring to what you’re looking at on your smartphone or tablet screen, like [who built the first one] or queries that look incomplete like [when] or [from its construction]. 

There's a lot of work still to be done, and we look forward to continuing to advance our conversational AI capabilities as we move toward more natural, fluid voice interactions that truly make every day a little easier. 

Get ready for Hollywood’s big night with Google

Sure, spring is nice, and there’s so much to celebrate in the winter, but my favorite season is easily awards season! 2021 marks 93 years of Hollywood’s annual film celebration, and this year’s Academy Awards will be a combination of in-person and virtual.


Ahead of the big night every year, my friends and I try to watch all of the new nominees, along with some of our favorite past winners. With the help of Google Search, we’re able to keep track of everything we want to watch, as well as check titles off once the credits roll. Starting today, you’ll find a brand new carousel of 2021 nominated movies when searching for “what to watch.” And on Google TV, we’re featuring collections that highlight nominees and 20 years of award-winning women.

Oscars on What to Watch

During my movie list-making, I decided to take a look at Google Trends to see what  “Best Picture” winners have piqued our interest. And the award for most-searched goes to: 1997’s romantic drama about a maiden voyage across the Atlantic...with two hits from the 70s as the runners-up.

Google Trends Oscars

Best Picture winners from 1927 - 2020 ranked by global Google Search interest from 2004 to March 2021

Check out this visualization of how classic flicks have been searched over the years. 

Oscars Google Trends GIF

And here’s how fans across the U.S. have been searching for this year’s best picture nominees. 

Google Trends Oscars 2021

But Hollywood’s big night isn’t only about the movies — it’s also about the celebrities. Here’s what Google Trends revealed about our searches for award show stars. (Spoiler Alert: Leonardo DiCaprio is the most searched “Oscar snub” since 2004 in the U.S.)

Most-searched “How many Oscars does … have” since 2004 (U.S.)

  1. Leonardo DiCaprio

  2. Meryl Streep

  3. Tom Hanks

  4. Denzel Washington

  5. Brad Pitt


Most-searched Oscar duos since 2004 (U.S.)

  1. Lady Gaga and Bradley Cooper

  2. Justin Bieber and Selena Gomez

  3. Leonardo DiCaprio and Kate Winslet

  4. Ben Affleck and Jennifer Garner

  5. Angelina Jolie and Brad Pitt


Most-searched Red Carpet celebrities since 2004 (U.S.)

  1. Jennifer Lawrence

  2. Lady Gaga

  3. Angelina Jolie

  4. Jennifer Lopez

  5. Billy Porter

Search “Oscars” to stay up to date on this year’s nominees throughout the show. The list of winners in each category will be updated in real time and you’ll also find live clips, top stories, and other trending content. Starting Sunday April 25 at 5 p.m. PT, you’ll also be able to find ABC’s live stream in Search. 


To hear some predictions ahead of the red carpet, ask Google Assistant, “Hey Google, what are your Oscar predictions?” or “Hey Google, who do you think is best dressed at the Oscars?” Assistant also has the full list of nominees, of course, and plenty more to talk about. You can even join in on the fun at award night by asking, “Hey Google, give me an award.” 


Grab the popcorn...it’s almost showtime!



5 new ways Google Assistant can make the day a little easier

Spring is here, and with it, a helping hand from Google Assistant. Today we're introducing five new features that help you tackle small things around the house (and from the car).  

1) Can’t remember where you put down your phone? Don’t sweat it. “Hey Google, find my phone” is already one of the most popular Google Assistant features, and you can now use it from your Nest smart speaker or smart display for all devices, including iPhones. For iPhones, once you opt in to receiving notifications and critical alerts from the Google Home app, you’ll get a notification and hear a custom ringing sound (even when the phone is on silent or Do Not Disturb is enabled).

2) Get your takeout faster. Over the last year, more and more people started ordering takeout and delivery on Google, and more restaurants added the “order” button to their Business Profiles on Search and Maps. To make online food orders even easier, Assistant can now help you complete your purchase in only a few steps, powered by Duplex on the web. To get started, you’ll need to first search for a restaurant near you from the Google App on Android and select “Order Online” or “Order Pickup.” When you finish your online takeout order from a restaurant we partner with and click “check out,” Assistant will automatically navigate the site and fill out your contact and payment details saved in Google Pay and synced to Chrome Autofill. At launch, we’re partnering with select restaurant chains and will be adding more across the U.S. later this year.  

3) Try a new sunrise or sunset Routine for your smart home devices. Now available globally, these Routines are based on your location. For example, you can automatically have your living room lights turn on and the sprinklers start when the sun goes down. It’s easy to set up: 

  • Select the “New” routine tab in the Google Home app or Assistant settings. 
  • Under “how to start,” you’ll need to “add starter,” then you’ll see an option for “Sunrise/sunset.”
  • From there, you can customize the time and specific actions you want them to trigger. 

4) Need routine ideas? Assistant Routines make it easy to automatically perform multiple actions at once with a single command. We’ve included a dedicated section in Ready-Made Routines to highlight popular “suggested actions” to inspire you, such as “Tell me if my battery is low” or “Tell me what happened today in history.” You can also add a “shortcut” icon to your Android home screen for your favorite Routines: head to the overview screen for Routines in the Google Home app or Assistant settings and tap the “Add to Home Screen” icon in the top app bar.

5) Have questions about the Oscars? You can get the inside scoop from your Google Assistant. Just ask: “Hey Google, when are the Oscars?” or “Hey Google, who’s nominated for Animated Feature Film at the Oscars?” to hear the list of nominees. To hear some predictions ahead of the red carpet, try “Hey Google, what are your Oscar predictions?” or “Hey Google, who do you think is best dressed at the Oscars?” You can also join in on the award night fun by saying, “Hey Google, give me an award.”