If you're planning to travel this fall and enjoy some time off with the family, you're probably also doing everything you can to have fun while still protecting your health. To help guests feel more comfortable and safe, and have a more engaging experience during their hotel stay, we worked with the hospitality industry and Volara last year to introduce a hands-free, voice-first experience with Google Assistant on Nest Hub smart displays.
Our hotel solution is already available in thousands of hotel rooms in the U.S. and U.K., and it's now in all guest rooms at both LEGOLAND® Hotel and LEGOLAND® Castle Hotel at LEGOLAND® California Resort, as well as the new LEGOLAND® Hotel at LEGOLAND® New York Resort.
Here are 10 things visitors will be able to do with Google Assistant on Nest Hub smart displays at LEGOLAND® Hotels:
1. Find help right away from hotel staff by asking “Hey Google, call the front desk” or just saying “Hey Google, bring me fresh towels.” You can even check out of the room with just your voice. There’s no more need to handle the phone or stand in long lines at the front desk.
2. Stay entertained. Nest Hubs come with great speakers, so you can ask Google to play music or the news.
3. Get park information by saying “Hey Google, what time does LEGOLAND® open?” or “Hey Google, tell me about the theme park.”
“I want to push the envelope on extending the theme park experience in the room with the Nest Hubs” — James Barton, Group Head of Business Transformation, Hotels, Merlin Entertainment (LEGOLAND®)
4. Speak directly with your favorite LEGOLAND® characters inside your room. Just ask the Jester to set up a LEGO® alarm and get info about the park.
5. Receive recommendations for local restaurants from the hotel’s concierge, or you can ask Google what activities are nearby. Try “Hey Google, can you recommend a place for breakfast?”
6. Take a YouTube tour of the LEGOLAND® theme park on your Nest Hub before your visit, so you can make a beeline for your favorite rides — like the Dragon coaster!
7. Wake up on time. No need to mess around with knobs or settings. Just say “Hey Google, set a LEGO® alarm for 8 a.m.,” so you get the day started early.
8. Check the weather again before you head out of the hotel with “Hey Google, what’s the weather today at LEGOLAND® California Resort?”
9. Let Assistant be your interpreter for up to 30 languages, which is a great feature for international guests. Just say, “Hey Google, be my Italian interpreter” to kick off the experience.
10. And remember, we’re also dedicated to protecting privacy. You won’t need to sign into the device, and no activity will be linked to your personal account. There’s no camera on the Nest Hub, and a physical switch lets you turn off the mic for additional privacy. No audio is ever stored, and any activity is automatically wiped from the device when it’s reset for the next guest.
Our voice-first, hands-free hotel solution is a big plus for travelers right now and will make your stay at LEGOLAND® Hotels more convenient and fun.
Wherever your kids will be learning from this year, one thing is undeniable: the school year is fast approaching. Over the past two years, my family's adopted a few new routines — some that will stick, and others we'll need to adjust. Here's how I'm using Google Assistant and the latest educational Search features to help keep my family on track and connected — while still making time for some fun.
Make mornings more fun
I already use Family Bell on my smart speakers and smart displays to help us stay on top of our routines throughout the day, with bells that remind me when it's time for family dinner or to water the plants. In the coming weeks, Family Bell will also be able to ring on mobile devices, making it even easier to set up and use throughout your day. Also coming soon, Family Bell will offer new, customizable bells that initiate a checklist on your Nest Hub. I've been testing this feature, and it's been a fun way to remind my kids what they need to do before heading out the door for school: making their beds, getting dressed and brushing their teeth. When a task is completed, my kids enjoy the celebratory animations and sounds that appear on screen. I'm also planning to make a checklist for bedtime since it's been working so well!
To start my own day, I'll use a new feature coming soon that automatically starts my morning Routine, checks the weather and plays the news once I dismiss my morning alarm. To set this up, I'll simply add "dismiss an alarm" as the start of my morning Routine.
Feeling close to my kids, even when we’re not together
For the first time in a long time, my kids won't be home with me all day. I made sure my family group was set up so I could have peace of mind knowing where everyone is during the day. I can say "Hey Google, where's my family?" to make sure they made it to school safely or are still at soccer practice. I can also send messages to the group throughout the day with Family Broadcast. When I head back to the office later this year and I’m running a bit late, I'll be able to say “Hey Google, tell my family: I’m leaving in five!” to send a message to our personal and home devices so they know I’m on my way home.
Keep the learning going at home
I always carve out time for story hour with my kids. For Harry Potter fans, Assistant will soon have new stories from Pottermore Publishing that you can access from a smart display or Android device. Just say “Hey Google, tell me a Fantastic Beasts story” to enjoy the magic of the Wizarding World beasts. Discover a world map of beasts and wander the globe while you listen to audiobook highlights.
Having grown up in various countries, it’s important to me that my kids have access to all kinds of stories. I'm thrilled that we've partnered with The English Schoolhouse, an award-winning, Black woman-owned publishing house, to bring more diverse teaching tools and stories to Assistant in the coming weeks. Soon you’ll be able to say, “Hey Google, tell me a ‘Tallulah the Tooth Fairy CEO’ story” — or “Hey Google, tell me an ‘Elijah Everett, Kid Principal’ story.” I’ll even play one of my favorite guessing games with the kids by saying “Hey Google, talk to Guess the Drawing for Kids” on my Nest Hub.
Discover more fun stories by asking, “Hey Google, tell me a story” or find other fun family activities by saying “Hey Google, teach my family something new.” (With a parent's permission, children under 13, or the applicable age in their country, can have a personalized Google Assistant experience, powered by Family Link.)
While my own young kids aren’t quite ready for advanced math, chemistry and physics, older learners can use different Search tools to understand complex concepts — Search’s new interactive periodic table is a great example. These augmented reality and 3D models, viewable on a phone or computer, help anyone quickly visualize an atom. Zoom in and see the electrons orbiting the nucleus, and the protons and neutrons that make it up.
I can even brush up on my own Spanish with the “Live Translation” Assistant button in Search on my phone when I need to remember a specific word or phrase.
Hopefully this school year goes more smoothly than the last, but it's a tough transition every year. These new tools can help make the switch an easy one for the whole family.
Privacy and security are personal. They mean different things to different people, but our commitment is the same to everyone who uses our products: we will keep your personal information private, safe, and secure. We think everyone should be in the know about what data is collected, how their information is used and, most importantly, how they control the data they share with us.
Here are some of the top questions that people commonly ask us:
Q. Is Google Assistant recording everything I say?
No, it isn’t.
Google Assistant is designed to wait in standby mode until it is activated, for example when you say “Hey Google” or “Ok Google”. In standby mode, it processes short snippets of audio (a few seconds) to detect an activation (such as “Ok Google”). If no activation is detected, those audio snippets won’t be sent to or saved by Google. When an activation is detected, the Assistant comes out of standby mode to fulfill your request. The status indicator on your device lets you know when the Assistant is activated. When it’s in standby mode, the Assistant won’t send what you are saying to Google or anyone else. And to help keep you in control, we're constantly working to make the Assistant better at reducing unintended activations.
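For readers who like to see the pattern spelled out, here is a toy Kotlin sketch of the gating loop described above. It is purely illustrative (the three interfaces are hypothetical stand-ins, not Google's actual code), but it captures the key property: a snippet that contains no activation phrase is simply discarded on the device.

```kotlin
// Toy sketch of the standby pattern described above. Illustrative only;
// Microphone, HotwordDetector and Assistant are hypothetical interfaces
// defined here just for the example.
interface Microphone { fun nextSnippet(): ByteArray }            // a few seconds of audio
interface HotwordDetector { fun matches(snippet: ByteArray): Boolean }
interface Assistant { fun fulfillRequest() }                     // leaves standby, handles the query

fun standbyLoop(mic: Microphone, detector: HotwordDetector, assistant: Assistant) {
    while (true) {
        val snippet = mic.nextSnippet()     // processed locally, on the device
        if (detector.matches(snippet)) {
            assistant.fulfillRequest()      // only after activation does a request go out
        }
        // No activation: the snippet goes out of scope and is never sent or saved.
    }
}
```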
To better tailor Google Assistant to your environment, you can now adjust how sensitive your Assistant is to the activation phrase (like “Hey Google”) through the Google Home app for smart speakers and smart displays. We also provide controls to turn off cameras and mics, and when they’re active we’ll provide a clear visual indicator (like flashing dots on top of your device).
Deleting your Google Assistant activity is easy; you can do it with just your voice. Just say something like “Hey Google, delete this week’s activity” or “Hey Google, delete my last conversation,” and Google Assistant will delete your Assistant activity. This will be reflected on your My Activity page, which you can also use to review and delete activity across the Google products you use. And if you have people coming over, you can activate Guest Mode on Google Assistant: just say “Hey Google, turn on Guest Mode,” and your Google Assistant interactions will not be saved to your account.
Q. How does Google decide what ads it shows me? How can I control this?
The ads you see can be based on a number of things, such as your previous searches, the sites you visit, the ads you’ve clicked and more.
For example, you may discover that you are seeing a camera ad because you’ve searched for cameras, visited photography websites or clicked on ads for cameras before. The 'Why this ad?' feature helps you understand why you are seeing a given ad.
Data helps us personalise ads so that they're more useful to you, but we never use the content of your emails or documents, or sensitive information like health, race, religion or sexual orientation, to tailor ads to you.
It’s also easy to adjust the kinds of ads you’re shown, or even turn off ad personalization completely, from your Ad Settings page.
Q. Are you building a profile of my personal information across your products, for targeting ads?
We do not sell your personal information — not to advertisers, not to anyone. And we don’t use information in apps where you primarily store personal content — such as Gmail, Drive, Calendar and Photos — for advertising purposes.
We use information to improve our products and services for you and for everyone. And we use anonymous, aggregated data to do so.
A small subset of information may be used to serve you relevant ads (for things you may actually want to hear about), but only with your consent. You can always turn these settings off.
It’s also important to note that you can use most of Google’s products completely anonymously, without logging in: you can search in incognito mode or clear your search history, and you can watch YouTube videos and use Maps. However, when you do share your data with us, we can create a better experience across our products based on the information you’ve shared.
Q. Are you reading my emails to sell ads?
We do not scan or read your Gmail messages to show you ads.
In fact, we have a host of products, like Gmail, Drive and Photos, that are designed to store your personal content, and this content is never used to show ads. When you use your personal Google account and open the Promotions or Social tabs in Gmail, you'll see ads that were selected to be the most useful and relevant for you. The process of selecting and showing personalized ads in Gmail is fully automated, and the ads you see are based on data associated with your Google Account, such as your activity in other Google services like YouTube or Search. To remember which ads you've dismissed, avoid showing you the same ads, and show you ads you may like better, we save your past ad interactions, like which ads you've clicked or dismissed. Google does not use keywords or messages in your inbox to show you ads, and nobody reads your email in order to show you ads.
Also, if you have a work or school account, you will never be shown ads in Gmail.
Q. Why does Google collect my location information?
If you want to get from A to B, it’s quicker to have your phone tell us where you are than to have you figure out your address or location. Location information helps in many other ways too, like helping us figure out how busy traffic is. If you choose to enable location sharing, your phone will send anonymous bits of information back to Google, and this is combined with anonymous data from people around you to recognise traffic patterns.
This only happens for people who turn Location History on, and it is off by default. If you turn it on but then change your mind, you can visit Your Data in Maps, a single place to manage your Google account's location settings.
Q. What information does Google know about me? How do I control it?
You can see a summary of which Google services you use and the data saved in your account on your Google Dashboard. There are also powerful privacy controls, like Activity Controls and Ad Settings, which let you switch the collection and use of data on or off and decide how all of Google can work better for you.
We’ve made it easier for you to make decisions about your data directly within the Google services you use every day. For example, without ever leaving Search, you can review and delete your recent search activity, get quick access to relevant privacy controls from your Google Account, and learn more about how Search works with your data. You can quickly access these controls in Search, Maps, and the Assistant.
Privacy features and controls have always been built into our services, and we’re continuously working to make it even easier to control and manage your privacy and security. But we know that the web is a constantly evolving space, where new threats and bad actors will unfortunately emerge. There will always be more work to be done, and safeguarding people who use our products and services every day will remain our focus.
For more on how we keep you and your information private, safe and secure visit the Google Safety Center.
Now that we’ve packed up all of the virtual stages from Google I/O 2021, let's take a look at some of the highlights and new product announcements for App Actions, Conversational Actions and Smart Home Actions. We’ll also summarize the live events and meetups that took place during I/O.
App Actions
App Actions lets developers extend their Android apps to Google Assistant. For our Android developers, we are happy to announce that App Actions is now part of the Android framework. With the introduction of the beta shortcuts.xml configuration resource and our latest Google Assistant plugin for Android Studio, App Actions is moving closer to the Android platform.
Capabilities
Capabilities is a new Android framework API that allows you to declare the types of actions users can take to launch your app and jump directly to performing a specific task. Assistant provides the first available concrete implementation of the capabilities API. You can utilize capabilities by creating shortcuts.xml resources and defining your capabilities there. Each capability specifies two things: how it's triggered and what to do when it's triggered. To add a capability, use built-in intents (BIIs), which are pre-built intents that provide all the natural language understanding needed to map the user's input to individual fields. When a BII is matched by the user's speech, your capability triggers an Android intent that delivers the understood BII fields to your app, so you can determine what to show in response.
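To make the receiving side concrete, here is a minimal Kotlin sketch of an Activity handling a BII-triggered deep link. It assumes a capability in shortcuts.xml that maps the actions.intent.ORDER_MENU_ITEM BII to a deep link; the URL scheme, the menuItemName parameter and the two helper functions are hypothetical.

```kotlin
import android.os.Bundle
import androidx.appcompat.app.AppCompatActivity

// A minimal sketch, assuming a capability in shortcuts.xml maps the
// actions.intent.ORDER_MENU_ITEM BII to a deep link such as
// exampleapp://order{?menuItemName}. Scheme, parameter name and the
// helper functions are hypothetical.
class OrderActivity : AppCompatActivity() {

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // The BII field understood from the user's speech arrives as a query
        // parameter on the deep-link Intent that launched this Activity.
        val item = intent.data?.getQueryParameter("menuItemName")
        if (item != null) {
            showOrderScreen(item)  // jump straight to the task the user asked for
        } else {
            showMenu()             // normal entry point when no parameter was passed
        }
    }

    private fun showOrderScreen(itemName: String) { /* render the order flow */ }
    private fun showMenu() { /* render the default menu */ }
}
```

The same pattern applies to any BII: the capability declares the trigger, and your existing Activity reads the delivered fields.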
This framework integration is in the Beta release stage and will eventually replace the original implementation of App Actions that uses actions.xml. If your app provides both the new shortcuts.xml and the old actions.xml, the latter will be disregarded.
Voice shortcuts for Discovery
Google Assistant suggests relevant shortcuts to users and has made it easier for them to discover and add shortcuts by saying “Hey Google, shortcuts.”
You can use the Google Shortcuts Integration library, currently in beta, to push an unlimited number of dynamic shortcuts to Google, making them visible to users as voice shortcuts. Assistant can then suggest relevant shortcuts to make it more convenient for users to interact with your Android app.
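As a rough sketch of what pushing one of these shortcuts can look like, assuming the androidx.core:core-google-shortcuts artifact is on the classpath (that library is what forwards pushed shortcuts to Google); the id, labels, deep-link scheme and BII parameter name below are illustrative:

```kotlin
import android.content.Context
import android.content.Intent
import android.net.Uri
import androidx.core.content.pm.ShortcutInfoCompat
import androidx.core.content.pm.ShortcutManagerCompat

// Sketch: push a dynamic shortcut that Assistant can surface as a voice
// shortcut. The "exampleapp" scheme and "menuItem.name" parameter are
// illustrative assumptions, not a documented contract.
fun pushReorderShortcut(context: Context, itemName: String) {
    val deepLink = Intent(
        Intent.ACTION_VIEW,
        Uri.parse("exampleapp://order?menuItemName=$itemName"))
    val shortcut = ShortcutInfoCompat.Builder(context, "reorder_$itemName")
        .setShortLabel(itemName)
        .setLongLabel("Reorder $itemName")
        .setIntent(deepLink) // fired when the user runs the shortcut
        // Bind the shortcut to a built-in intent so Assistant can offer it
        // as a voice shortcut.
        .addCapabilityBinding(
            "actions.intent.ORDER_MENU_ITEM", "menuItem.name", listOf(itemName))
        .build()
    // Shortcuts pushed this way are reported to Google; there is no cap on
    // how many you push.
    ShortcutManagerCompat.pushDynamicShortcut(context, shortcut)
}
```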
In-App Promo SDK
Assistant isn't the only one that can suggest shortcuts: with the In-App Promo SDK, now in beta, you can proactively suggest shortcuts in your app for actions the user can later repeat with a voice command to Assistant. The SDK lets you check whether the shortcut you want to suggest already exists for that user and, if not, prompt the user to create it.
Google Assistant plugin for Android Studio
To support testing capabilities, we launched the Google Assistant plugin for Android Studio. It contains an updated App Action Test Tool that creates a preview of your App Action, so you can test an integration before publishing it to the Play Store.
Conversational Actions
During the What's New in Google Assistant keynote, Rebecca Nathenson, Director of Product for the Google Assistant Developer Platform, mentioned several upcoming updates and changes for Conversational Actions.
Updates to Interactive Canvas
Over the coming weeks, we’ll introduce new functionality to Interactive Canvas. Canvas developers will be able to manage intent fulfillment client-side, removing the need for intermediary webhooks in some cases. For use cases which require server-side fulfillment, like transactions and account linking, developers will be able to opt-in to server-side fulfillment as needed.
We’re also introducing a new function, outputTts(), which allows you to trigger text-to-speech (TTS) client-side. This should help reduce latency for end users.
Additionally, there will be updates to the APIs for getting and setting storage for both the home and individual users, allowing client-side storage of user information. You’ll be able to persist user information within your web app, something that was previously accessible only through a webhook.
These new features for Interactive Canvas will be made available soon as part of a developer preview for Conversational Actions Developers. For more details on these new features, check out the preview page.
Updates to Transaction UX for Smart Displays
Also coming soon to Conversational Actions: we’re updating the workflow for completing transactions, allowing users to complete transactions from their smart displays by confirming the CVC code from their chosen payment method. Watch our demo video showing the new transaction features on smart devices to get a feel for these changes.
Tips on Launching your Conversational Action
Make sure to catch our technical session Driving a successful launch for Conversational Actions to learn about some strategies for putting together a marketing team and go-to-market plan for releasing your Conversational Action.
AMA: Games on Google Assistant
If you’re interested in building Games for Google Assistant with Conversational Actions, you should check out the recording of our AMA, where Googlers answered questions from I/O attendees about designing, building, and launching games.
Smart Home Actions
The What's new in Smart Home keynote covered several updates for Smart Home Actions. Continuing our emphasis on quality smart home integrations following the updated policy launch, we added new features to help you build engaging, reliable Actions for your users.
Test Suite and Analytics
The updated Test Suite for Smart Home now supports automatic testing, without the use of TTS. Additionally, the Analytics dashboards have been expanded with more detailed logs and in-depth error reporting to help you more quickly identify any potential issues with your Action. For a deeper dive into these enhancements, try out the Debugging the Smart Home workshop. There are also two new debugging codelabs to help you get more familiar with using these tools to improve the quality of your Action.
Notifications
We expanded support for proactive notifications to include the device traits RunCycle and SensorState, so users can now be proactively notified about many different device events. We also announced the release of follow-up responses, which enable your smart devices to asynchronously notify users when a device change succeeds or fails.
WebRTC
We added support for WebRTC to the CameraStream trait. Smart camera users can now benefit from lower latency and half-duplex talk between devices. As mentioned in the keynote, we will also be making updates to the other currently supported protocols for smart cameras.
Bluetooth Seamless Setup
To improve the onboarding experience, developers can now enable BLE (Bluetooth Low Energy) for device onboarding with Bluetooth Seamless Setup. Google Home and Nest devices can act as local hubs to provision and register nearby devices for any Action configured with local fulfillment.
Matter
Project CHIP has officially rebranded as Matter. Once the IP-based connectivity protocol officially launches, we will support devices running the protocol. Watch the Getting started with Project CHIP tech session to learn more.
Ecosystem and Community
The women building voice AI and their role in the voice revolution
Voice AI is fundamentally changing how we interact with technology, and its future will be a product of the people who build it. Watch this session to hear about the talented women shaping the Voice AI field, including an interview with Lilian Rincon, Sr. Director of Product Management at Google. The session also covers strategies for achieving equal gender representation in Voice AI, an ambitious but essential goal.
AMA: How the Assistant Investment Program can help fund your startup
This "Ask Me Anything" session was hosted by the all-star team who runs the Google for Startups Accelerator: Voice AI. The team fielded questions from startups and investors around the world who are interested in building businesses based on voice technology. Check out the recording of this event here. The day after the AMA session, the 2021 cohort for the Voice AI accelerator had their demo day - you can catch the recording of their presentations here.
One of the perks of I/O being virtual this year was the ability to connect with students, hobbyists and developers around the globe to discuss the current state of Smart Home, as well as some of the upcoming features. We hosted three meetups for the APAC, Americas and EMEA regions and gathered some great feedback from the community.
Assistant Google Developers Experts Meetup
Every year we host an Assistant Google Developer Expert meetup to connect and share knowledge. This year we were able to invite everyone interested in building for Google Assistant to network and connect with one another. At the end, several attendees came together at the Assistant Sandbox for a virtual photo!
Thanks for reading! To share your thoughts or questions, join us on Reddit at r/GoogleAssistantDev.
Follow @ActionsOnGoogle on Twitter for more of our team's updates, and tweet using #AoGDevs to share what you’re working on. Can’t wait to see what you build!
1. Android Earthquake Alerts System is rolling out globally
Last year, we embarked on a mission to build the world’s largest earthquake detection network, based on technology built into Android devices. With this free system, people in affected areas can get alerts seconds before an earthquake hits, giving you advance notice in case you need to seek safety. We recently launched the Android Earthquake Alerts System in New Zealand and Greece. Today, we’re introducing the Android Earthquake Alerts System in Turkey, the Philippines, Kazakhstan, Kyrgyz Republic, Tajikistan, Turkmenistan and Uzbekistan.
We are prioritizing launching Earthquake Alerts in countries with higher earthquake risks, and we hope to launch in more countries over the coming year.
2. Star important messages so you can easily find them later
With tons of messages from family, friends, colleagues and others, it’s easy for information to get lost. Now you can star a message in your Messages app to keep track of what’s important and easily find it later without scrolling through all of your conversations. Just tap and hold a message, then star it. And when you want to revisit a message, like your friend’s address or the photo from your family reunion, tap on the starred category.
Starred messages will start to roll out more broadly over the coming weeks.
3. Find the perfect Emoji Kitchen sticker at the perfect time
In May, we introduced a new section in your recently used Emoji Kitchen stickers so you can quickly get back to the ones you use most frequently. Soon you’ll also start to see contextual suggestions in Emoji Kitchen once you’ve typed a message. These will help you discover the perfect emoji combination at the exact moment you need it.
Contextual Emoji Kitchen suggestions are available in Gboard beta today and are coming to all Gboard users this summer for messages written in English, Spanish and Portuguese on devices running Android 6.0 and above.
4. Access more of your favorite apps with just your voice
Ask Google to open or search many of your favorite apps using just your voice — you can say things like, “Hey Google, pay my Capital One bill” to jump right into the app and complete the task or “Hey Google, check my miles on Strava” to quickly see your weekly progress right on the lock screen. See what else you can do by saying “Hey Google, shortcuts.”
5. Improved password input and gaze detection on Voice Access
Built with and for people with motor disabilities, and helpful for those without, Voice Access gives you quick and efficient phone and app navigation with just your voice.
With gaze detection, now in beta, you can ask Voice Access to work only when you are looking at the screen — so you can naturally move between talking to friends and using your phone.
Voice Access now has enhanced password input: when it recognizes a password field, it lets you input letters, numbers and symbols. For example, you can say “capital P a s s w o r d” or the names of symbols (like “dollar sign” to input a $), so it’s faster to safely enter your password.
6. More customization and new app experiences on Android Auto
You can now customize more of your Android Auto experience for easier use, like personalizing your launcher screen directly from your phone and manually setting dark mode. It’s also easier to browse content with new tabs in your media apps, a “back to top” option and an A to Z button in the scroll bar. And, if it’s your first time using Android Auto, you can now get started faster in your car with a few simple taps.
We’ve also added new app experiences to help enhance your drive. EV charging, parking and navigation apps are now available to use in Android Auto. Plus, we’ve improved the messaging experience, so you can access your favorite messaging apps from the launcher screen. You can easily read and send new messages directly from apps like WhatsApp or Messages — now available globally.
These Android Auto features are available on phones running Android 6.0 or above, and when connected to your compatible car.
Part of our mission is to help make your daily life easier. At I/O this year, we shared news about a wide range of products and services that’ll do just that, from starting your car with your phone to searching your screenshots using Google Lens. Here are just a few of the features you should keep an eye out for.
Quickly view your notifications and invoke Google Assistant on Android.
Android 12 includes the biggest design change since 2014. We rethought the entire experience, from the colors to the shapes, light and motion, and made it easier to access some of the most used features:
To invoke Google Assistant wherever you are, long press the power button.
Swipe down to view your new notification shade, an at-a-glance view of all your app notifications in one place.
And to make it easier to access everything you need, Google Pay and Device Controls have been added to your customizable quick settings.
Manage your privacy settings more easily on Android.
On top of the new design changes, we’ve also launched a new Privacy Dashboard, giving you easy access to your permissions settings, visibility into what data is being accessed and the ability to revoke permissions on the spot. You also have new indicators that let you know when apps are using your microphone and camera, as well as a way to quickly shut off that access. And we’ve added new microphone and camera toggles into quick settings so you can easily remove app access to these sensors for the entire system. Learn about new privacy controls in Android 12.
Change the channel with your phone.
Lost your TV remote? Don’t sweat it: we’re building remote-control features directly into your Android phone. Another bonus: if you need to enter a long password to log into one of your many streaming service subscriptions, you can save time by using your phone’s keyboard to enter the text. This built-in remote control will be compatible with devices powered by Android TV OS, including Google TV, and it’ll roll out later this year. Learn more about how we’re helping your devices work better together.
And unlock your car with your phone while you’re at it.
We’re working with car manufacturers to develop a new digital car key in Android 12. This feature will enable you to use your phone to lock, unlock and even start your car — and in some cases you won’t even need to take it out of your pocket. And because it’s digital, you’ll also be able to securely and remotely share your car key with friends and family if needed. Read more about Android Auto.
Understand more about your Search results.
When you’re looking up information online, it’s important to check how credible a source is, especially if you aren’t familiar with the website. Our About This Result feature in Google Search provides details about a website before you visit it, including its description, when it was first indexed and whether your connection to the site is secure. This month, we’ll start rolling out About This Result to all English results worldwide, with more languages to come. And later this year, we’re going to add even more helpful contextual details — like how the site describes itself, what other sources are saying about it and related articles to check out.
Change your password using Chrome and Assistant.
Chrome on Android will help you change your passwords with a simple click. On supported sites, whenever you check your passwords and Chrome finds one that may have been compromised, you’ll see a “Change password” button from Assistant. Powered by Duplex on the Web (the same technology that already handles tasks like purchasing movie tickets, ordering food and checking into flights), Assistant will not only navigate to the site but actually go through the entire process of changing your password for you.
Use Google Lens to translate your homework into a language you’re more comfortable with.
Google Lens enables you to search what you see — from your camera, your photos and even your search bar. For a lot of students, their schoolwork might be in a language they’re not as comfortable with. That’s why we’re updating the Translate filter in Lens, making it easy to copy, listen to or search translated text in over 100 languages. Learn more about how information comes to life with Lens and AR.
And search your screenshots with Google Lens.
Lots of people take screenshots of things they’re interested in buying — but it can be hard to follow up on those screenshots afterward. Now when you look at any screenshot in Google Photos, we’ll prompt you to search the photo with Lens. This will help you find that pair of shoes or wallpaper pattern that you liked so much.
When shopping online, keep track of your open carts when you open a new tab.
Raise your hand if this has ever happened to you: you’ve got a browser open to do some online shopping, but then you get distracted and open up two, three or 10 other windows, and you forget what you were online to do in the first place. We’re introducing a new feature in Chrome that shows you your open carts when you open a new tab. No more lost shopping carts here.
And get the best value for products you’re buying online.
Coming soon, we’ll let you link your favorite loyalty programs from merchants like Sephora to your Google account to show you the best purchase options across Google. Learn more about all our latest shopping updates.
Explore unfamiliar neighborhoods with more detailed views in Maps.
If you’re exploring on foot, augmented reality in Live View will show you helpful details about the shops and restaurants around you, including how busy they are and recent reviews and photos. And if you’re traveling, Live View will tell you where you are relative to your hotel, so you can always find your way back.
Avoid the crowds with area busyness.
Maps already shows the busyness of specific places — in fact, more than 80 million people use the live busyness information on Google every day. Now we’re expanding that functionality to show the busyness of an entire area, allowing you to see just how bustling a neighborhood or part of town is at any given moment. This means that if you want to keep things low-key, you can use Maps to see the hotspots to avoid. And if you’re looking for the most popular places to visit, you can use area busyness to scope out the liveliest neighborhoods at a glance.
See breakfast spots in the morning and dinner joints at night.
We’re updating Maps to show you more relevant information based on what time of day it is and whether you’re traveling. That means we’ll show you things like coffee shops in the morning, when you need that caffeine fix, and burger joints at night, when you’re hungry for dinner. And if you’re on a weekend getaway, we’ll make tourist attractions and local landmarks easier to spot. Learn more about our latest updates to Maps.
Discover unexpected Memories in Photos.
Starting later this summer, when we find a set of three or more photos with similarities like shape or color, we'll highlight these little patterns for you in your Memories. For example, Photos might surface a pattern of your family hanging out on the same couch over the years — something you wouldn’t have ever thought to search for, but that tells a deeply meaningful story about your daily life. Learn more about Little patterns in Photos.
Bring your pictures to life with Cinematic moments.
When you’re trying to get the perfect photo, you usually take the same shot two or three (or twenty) times. Using neural networks, we can take two nearly identical images and fill in the gaps by creating new frames in between. This creates vivid, moving images called Cinematic moments. Producing this effect from scratch would take professional animators hours, but with machine learning we can automatically generate these moments and bring them to your Recent Highlights. Learn more about Cinematic moments.
Transform how you work with smart canvas in Google Workspace.
As part of our mission to build the future of work, we’re launching smart canvas, a set of exciting updates across Docs, Sheets and Meet. New features include interactive building blocks (smart chips, templates and checklists) as well as a new pageless format in Docs and emoji reactions. We're also bringing Meet closer to Docs, Sheets and Slides, and much more. See all of the big updates to Google Workspace.
Today, there are nine smart devices in the average smart home — in 2016, there were only three. While this is explosive growth, the industry is still evolving. Selecting the right devices or connecting them with the ones you already have can be frustrating.
It’s up to us to simplify the smart home, and to start we must change the way device makers build products. There should be one standard that simplifies selection, setup and control, and makes it easy for our partners to create products and experiences for your home. Here’s how we’re making that happen:
1. Google’s bringing Matter to Nest and Android
Google and other leading tech companies are working together to develop Matter, the new protocol that simplifies smart homes by using one standard across the industry — and we’re committed to supporting Matter. We’re bringing Matter to Android and capable Nest products, powering them with interoperable control and enabling simpler setups.
Android will be one of the leading operating systems with built-in support for Matter, letting you quickly set up devices with Google and link your favorite Android apps. You’ll only need a few taps to set up your Matter devices, and you’ll have lots of ways to instantly control them, such as Matter-enabled Android apps, Google Assistant, the Google Home app, Android Power Controls and compatible Google devices. This also enables over one billion Android devices to simply set up and control all Matter-certified products.
Nest is committed to making our devices connect better and respond faster. Thread, a technology we co-founded in 2014 that helps smart home devices work faster and more securely, will work in conjunction with Matter. Devices with Thread built in, like Nest Wifi, Nest Hub Max and the second-generation Nest Hub, will become connection points for Matter devices, creating even stronger, faster connections across your home. All Nest displays and speakers, like the Nest Hub and Nest Mini, will be automatically updated to control Matter devices, giving you faster and more reliable experiences whether they use Wi-Fi, Thread or Ethernet.
Plus, we’ll update the newest Nest Thermostat to support Matter, meaning that for the first time it can be controlled on other Matter-certified platforms.
The bottom line: Matter devices will work everywhere your Google smart home does.
2. One location for smart home information
Smart home information should be available in one trustworthy place. We’re unveiling a new Google smart home directory, an online destination to discover Google Assistant-compatible devices, get your questions answered and learn from educational videos. You’ll find products across more than 30 categories, from brands like Philips Hue, Nanoleaf, Samsung, LG, Dyson, Netatmo, Wyze and more. It’s easy to search and filter compatible products, see product details, read reviews and find the best prices.
3. Better streaming
We’ve added support for WebRTC, an open-source communications protocol that reduces latency for an improved live video and audio streaming experience between security cameras, video doorbells, smart displays and mobile devices. Top device manufacturers, including Arlo, Logitech, Netatmo and Wyze, are among our first partners to integrate WebRTC with Google Assistant and more will join in the coming weeks.
4. Control your home, from anywhere
We’re also using Google technology to improve Home & Away Routines, enabling automatic control of Nest cameras, Nest thermostats, smart lights, smart plugs and smart switches based on when you’re home or away. When you leave home, your Away Routine can automatically turn on your Nest cameras and turn off the lights and plugs. When someone arrives home, your Home Routine can turn off the cameras and turn on the lights.
We’re committed to making the smart home more helpful. The Google smart home will keep finding ways to bring Google Assistant, Nest devices, industry-leading partners and new technology together to help you get things done, stay on track...and sometimes just sit back and enjoy your home.
Posted by Rebecca Nathenson, Director of Product for the Google Assistant Developer Platform
Today at I/O, we shared some exciting new product announcements to help you more easily bring Google Assistant to your Android apps and create more engaging content on smart displays.
Assistant development made easy with new Android APIs
App Actions helps you easily bring Google Assistant to your Android app and complete user queries of all kinds, from booking a ride to posting a message on social media. Companies such as MyFitnessPal and Twitter are already using App Actions to help their users get things done, just by using their voice. You can enable App Actions in Android Studio by mapping built-in intents to specific features and experiences within your apps. Here are new ways you can help users easily navigate your content through voice queries and proactive suggestions.
Better support for Assistant built-in intents with Capabilities
Capabilities is a new framework API available in beta today that lets you declare support for common tasks defined by built-in intents. By leveraging pre-built requests from our catalog of intents, you can offer users ways to jump to specific activities within your app.
For example, the Yahoo Finance app uses Capabilities to let users jump directly to the Verizon stock page just by saying “Hey Google, show me Verizon’s stock on Yahoo Finance.” Similarly, Snapchat users can use their voice to add filters and send them to friends: “Hey Google, send a snap with my Curry sneakers.”
Improved user discoverability with Shortcuts in Android 12
App shortcuts are already a popular way to automate common tasks on Android. Thanks to the new Shortcuts APIs in Android 12, it’s now easier to find all the Assistant queries that apps support. If you build an Android shortcut, it will automatically show up in the Assistant Shortcuts gallery, so users can choose to set up a personal voice command for your app when they say “Hey Google, shortcuts.”
Google Assistant can also suggest relevant shortcuts to help drive traffic to your app. For example, when using the eBay app, people will see a suggested Google Assistant Shortcut appear on the screen and have the option to create a shortcut for "show my bids."
We also introduced the Google Shortcuts Integration library, which identifies shortcuts pushed through the Shortcuts Jetpack module and makes them available to Assistant for use in managing related voice queries. This lets Google Assistant suggest relevant shortcuts to users and help drive traffic to your app.
Get immediate answers and updates right from Assistant using Widgets, coming soon
Improvements to Android 12 also make it easier to discover glanceable content with widgets by mapping them to specific built-in intents using the Capabilities API. We're also looking at how to easily bring driving-optimized widgets to Android Auto in the future. The integration with Assistant will enable one-shot answers, quick updates and multi-step interactions with the same widget.
For example, with Dunkin’s widget implementation, you can say “Hey Google, reorder from Dunkin’” to select from previous drinks and place the order. Strava’s widget helps users track how many miles they ran in a week: say “Hey Google, check my miles on Strava,” and the answer will show up right on the lock screen.
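For a sense of the Android side, here is a plain-Android Kotlin sketch of the glanceable widget itself, in the spirit of the Strava example. The Assistant integration described above (mapping the widget to a built-in intent via the Capabilities API) was announced as coming soon and is not shown; the layout and view ids are hypothetical.

```kotlin
import android.appwidget.AppWidgetManager
import android.appwidget.AppWidgetProvider
import android.content.Context
import android.widget.RemoteViews

// A standard home-screen widget; R.layout.widget_miles and R.id.miles_text
// are hypothetical resources. The coming Assistant binding is not public
// yet and is deliberately omitted.
class MilesWidgetProvider : AppWidgetProvider() {

    override fun onUpdate(
        context: Context,
        appWidgetManager: AppWidgetManager,
        appWidgetIds: IntArray
    ) {
        for (widgetId in appWidgetIds) {
            val views = RemoteViews(context.packageName, R.layout.widget_miles)
            // In a real app this value would come from stored activity data.
            views.setTextViewText(R.id.miles_text, "12.4 miles this week")
            appWidgetManager.updateAppWidget(widgetId, views)
        }
    }
}
```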
Build high-quality Conversational Actions for smart displays
Last year, we introduced a number of improvements to the Assistant platform for smart displays, such as Actions Builder, Actions SDK and new built-in intents to improve the experience for both developers and users. Here are more improvements rolling out soon to make building conversational actions on smart displays even better.
New features to improve the developer experience
Interactive Canvas helps you build touch- and voice-controlled games and storytelling experiences for the Assistant using web technologies like HTML, CSS, and JavaScript. Companies such as CoolGames, Zynga, and GC Turbo have already used Canvas to build games for smart displays.
Since launch, we've gotten great feedback from developers that it would be simpler and faster to implement core logic in web code. To enable this, the Interactive Canvas API will soon provide access to text-to-speech (TTS), natural language understanding (NLU), and storage APIs that will allow developers to trigger these capabilities from client-side code. These APIs will provide experienced web developers with a familiar development flow and enable more responsive Canvas actions.
We’re also giving you a wider set of options for how to release your actions. Coming soon, in the Actions Console, you will be able to manage your releases by launching in stages. For example, you can launch to one country first and then expand to more later, or you can launch to just a small percentage of users and gradually roll out over time.
Improving the user experience on smart displays
You'll also see improvements that enhance visual experiences on the smart display. For example, you can now remove the persistent header, which lets you utilize the full real estate of the device and provide users with fully immersive experiences.
Before Interactive Canvas brought customized touch interfaces to the Smart Display, we provided a simple way to stop TTS from playing by tapping anywhere on the screen of the device. However, with more multi-modal experiences being released on Smart Displays, there are use cases where it is important to continue playing TTS while the user touches the display. Developers will soon have the option to enable persistent TTS for their actions.
We’ve also added support for long-form media sessions with updates to the Media API so you can start playback from a specific moment, resume where a previous session stopped, and adapt conversational responses based on media playback context.
Easier transactions for your voice experiences
We know how important it is to have the tools you need to build a successful business on our platform. In October of last year, we made a commitment to make it easier for you to add seamless voice-based and display-based monetization capabilities to your experience. On-device CVC confirmation and credit card entry will soon be available on smart displays; both features make on-device transactions much easier, reducing the need to redirect users to their mobile devices.
We hope you are able to leverage all these new features to build engaging experiences and reach your users easily, both on mobile and at home. Check out our technical sessions, workshops and more from Google I/O on YouTube and get started with App Actions and Conversational Actions today!
Our teams at Google continue to support the tireless work of hospitals, nonprofits, and public health service providers across the country. Right now, we’re focused on three priority areas: ensuring people can access the latest and most authoritative information; amplifying vital safety and vaccination messages; and providing financial backing for affected communities, health authorities and other organizations.
Providing critical and authoritative information
On all our platforms, we’re taking steps to surface the critical information families and communities need to care for their own health and look after others.
Searches on the COVID-19 vaccine display key information around side effects, effectiveness, and registration details, while treatment-related queries surface guidance from ministry resources
When people ask questions about vaccines on Google Search, they see information panels that display the latest updates on vaccine safety, efficacy and side effects, plus registration information that directs users to the Co-WIN website. You will also find information about prevention, self-care and treatment under the Prevention and Treatment tab, in easy-to-understand language drawn from authorised medical sources and the Ministry of Health and Family Welfare.
On YouTube, we’re surfacing authoritative information in a set of playlists about vaccines, preventing the spread of COVID-19 and facts from experts on COVID-19 care.
Our YouTube India channel features a set of playlists to share tips and information on COVID-19 care
Testing and vaccination center locations
In addition to showing 2,500 testing centers on Search and Maps, we’re now sharing the locations of over 23,000 vaccination centers nationwide, in English and eight Indian languages. And we’re continuing to work closely with the Ministry of Health and Family Welfare to make more vaccination center information available to users throughout India.
Searching for vaccines in Maps and Search now shows over 23,000 vaccination centers across the country, in English and eight Indian languages
Pilot on hospital beds and medical oxygen availability
We know that some of the most crucial information people are searching for is the availability of hospital beds and access to medical oxygen. To help them find answers more easily, we’re testing a new feature that uses the Q&A function in Maps to let people ask about and share local information on the availability of beds and medical oxygen in select locations. Because this is user-generated content rather than information from authorised sources, you may need to verify its accuracy and freshness before acting on it.
Amplifying vital safety and vaccination messages
As well as providing authoritative answers to queries, we’re using our channels to help extend the reach of health information campaigns. That includes the ‘Get the Facts’ vaccine campaign, which encourages people to focus on authoritative information and content about vaccines. We’re also surfacing important safety messages through promotions on the Google homepage, Doodles and reminders within our apps and services.
Via the Google Search homepage and reminders within our apps and services, we are reminding people to stay safe and stay masked, and to get authoritative information on vaccines
Supporting health authorities, organizations, and affected communities
Since the second wave began, we’ve been running an internal donation campaign to raise funds for nonprofit organizations helping those most in need, including GiveIndia, Charities Aid Foundation India, GOONJ, and United Way of Mumbai. This campaign has raised over $4.6 million (INR 33 crore) to date, and continues to generate much-needed support for relief efforts.
We recognize that many more nonprofits need donations, and that Indians are eager to help where they can—so we’ve rolled out a COVID Aid campaign on Google Pay, featuring nonprofit organizations like GiveIndia, Charities Aid Foundation India, Goonj, Save the Children, SEEDS, UNICEF India (national NGOs) and United Way. We want to thank all our Google Pay users who have contributed to these organisations, and we hope this effort will make a difference where it matters most.
On Google Pay people can contribute funds to non-profit organizations involved in COVID response
As India battles this devastating wave, we’ll keep doing all we can to support the selfless individuals and committed organizations on the front lines of the response. There’s a long way to go—but standing together in solidarity, working together with determination, we can and will turn the tide.