Tag Archives: Google Assistant

What’s for dinner? Order it with Google

French fries, lettuce wraps, massaman curry, chicken wings, cupcakes—I could go on. When I was pregnant with my son last year, my cravings were completely overpowering. Lucky for me, I didn’t have to jump into the car and go to my favorite restaurants to get my fill—food delivery services saved my bacon on more occasions than I’d be comfortable admitting to the world.

Ever since then, I’ve counted myself as one of the millions of people who regularly order food for home delivery. Starting today, we’re making it even easier to get food delivered to your doorstep.

Find food and order faster
Now you can use Google Search, Maps or the Assistant to order food from services like DoorDash, Postmates, Delivery.com, Slice, and ChowNow, with Zuppler and others coming soon. Look out for the “Order Online” button in Search and Maps when you search for a restaurant or type of cuisine. For participating restaurants, you can make your selections with just a few taps, view delivery or pickup times, and check out with Google Pay.  

Let the Google Assistant handle dinner
To use the Assistant on your phone to get your food fix, simply say, “Hey Google, order food from [restaurant].” You can also quickly reorder your go-to meal with some of our delivery partners by saying, “Hey Google, reorder food from [restaurant].” The Assistant pulls up your past orders, and in just a few seconds, you can select your usual dish.

Now's the perfect time to let Google help with your cravings. So, what are we ordering tonight?

Bose speakers get smarter with the Google Assistant

With help from the Google Assistant, you can customize your entertainment at home with just your voice: ask the Assistant to play your favorite part of a song, pause a favorite show on your Chromecast-enabled TV to grab some snacks or dim the lights before the movie starts. And when you have great hardware that integrates with the Assistant, there's even more you can do.

Starting today, Bose is bringing the Google Assistant to its line of smart speakers and soundbars. This includes the Bose Home Speaker 500, Bose Soundbar 500 and 700, and an all-new, compact smart speaker coming later this summer, the Bose Home Speaker 300.

With the Google Assistant built in, you can play music, find answers on Google Search, manage everyday tasks and control smart devices around your home—just by saying “Hey Google.” If you’re using the Assistant for the first time on your Bose device, here are a few tips to get started: 

  • Enjoy entertainment: Ask the Google Assistant to play music and radio from your speaker. Or, stream videos to your Chromecast-enabled TV with a simple voice command to your Bose smart speaker. Later this summer, you’ll be able to play the news and podcasts, too.
  • Get answers: Ask about sports, weather, finance, calculations and translations.
  • Control compatible smart home devices: Check that the lights are turned off when you leave home and adjust the thermostat when you return. The Assistant works with over 3,500 home automation brands and more than 30,000 devices.
  • Plan your day: With your permission, get help with things like your flight information, or your commute to work. Check on the latest weather and traffic in your area.
  • Manage tasks: With your permission, your Assistant can add items to your shopping list and stock up on essentials. Set alarms and timers hands-free.

How to pick the Assistant on your Bose speaker or soundbar 

If you already own one of these Bose smart speakers or soundbars, it’s easy to get the Assistant set up. Your speaker or soundbar will automatically receive a software update introducing the Google Assistant as a voice assistant option. Go to “Voice Settings” for the device in the Bose Music app, select the Google Assistant and follow the guided setup process.

And if you’re purchasing a Bose smart speaker for the first time, you’ll be able to select the Assistant right at setup.

Through our collaboration with Bose, we hope you enjoy your home audio paired with the helpfulness of the Google Assistant.


Search at Google I/O 2019

Google I/O is our yearly developer conference where we have the pleasure of announcing some exciting new Search-related features and capabilities. A good place to start is Google Search: State of the Union, which explains how to take advantage of the latest capabilities in Google Search.

We also gave more details on how JavaScript and Google Search work together and what you can do to make sure your JavaScript site performs well in Search.

Try out new features today

Here are some of the new features, codelabs, and documentation that you can try out today:
The Google I/O sign at Shoreline Amphitheatre in Mountain View, CA

Be among the first to test new features

Your help is invaluable to making sure our products work for everyone. We shared some new features that we're still testing and would love your feedback and participation.
A large crowd at Google I/O

Learn more about what's coming soon

I/O is a place where we get to showcase new Search features, so we're excited to give you a heads up on what's next on the horizon:
Two people posing for a photo at Google I/O, forming a heart with their arms

We hope these cool announcements help and inspire you to create even better websites that work well in Search. Should you have any questions, feel free to post in our webmaster help forums, contact us on Twitter, or reach out to us at any of our upcoming events.

Make your smart home more accessible with new tutorials

I’m legally blind, so from the moment I pop out of bed each morning, I use technology to help me go about my day. When I wake up, I ask my Google Assistant for my custom-made morning Routine, which turns on my lights, reads my calendar and plays the news. I use other products as well, like screen readers and a refreshable braille display, to help me be as productive as possible.

I bring my understanding of what it's like to have a disability to work with me, where I lead accessibility for Google Search, Google News and the Google Assistant. I work with cross-functional teams to help fulfill Google’s mission of building products for everyone—including those of us in the disabled community.

The Assistant can be particularly useful for helping people with disabilities get things done. So today, Global Accessibility Awareness Day, we’re releasing a series of how-to videos with visual and audible directions, designed to help the accessibility community set up and get the most out of their Assistant-enabled smart devices.

You can find step-by-step tutorials to learn how to interact with your Assistant, from setting up your Assistant-enabled device to using your voice to control your home appliances, in our YouTube playlist, which we’ll continue to update throughout the year.

Intro to Assistant Accessibility Videos

This playlist came out of conversations within the team about how we can use our products to make life a little easier. Many of us have some form of disability, or have a friend, co-worker or family member who does. For example, Stephanie Wilson, an engineer on the Google Home team, helped set up her parents’ smart home after her dad was diagnosed with Parkinson’s disease.

In addition to our own teammates, we're always listening to suggestions from the broader community on how we can make our products more accessible. Last week at I/O, we showed how we’re making the Google Assistant more accessible, how we’re using AI to improve products for people with speech impairments, and how Live Caption in Android Q gives the Deaf community automatic captions for media playing audio on their phones. All these changes were made after receiving feedback from people like you.

Head over to our Accessibility website to learn more, and if you have questions or feedback on accessibility within Google products, please share your feedback with us via our dedicated Disability Support team.

We hear you: updates to Works with Nest

Last week we announced that we would stop supporting the Works with Nest (WWN) program on August 31, 2019 and transition to the Works with Google Assistant platform (WWGA). The decision to retire WWN was made to unify our efforts around third-party connected home devices under a single platform for developers to build features for a more helpful home. The goal is to simplify the experience for developers and to give you more control over how your data is shared. Since the announcement, we’ve received a lot of questions about this transition. Today we wanted to share our updated plan and clarify our approach.


First, we’re committed to supporting the integrations you value and minimizing disruptions during this transition, so here’s our updated plan for retiring WWN:

  • Your existing devices and integrations will continue working with your Nest Account; however, you won’t have access to new features that will be available with a Google Account. If we make changes to the existing WWN connections available to you with your Nest Account, we will make sure to keep you informed.

  • We’ll stop accepting new WWN connections on August 31, 2019. Once your WWN functionality is available on the WWGA platform, you can migrate from a Nest Account to a Google Account with minimal disruption.

Second, we want to clarify how this transition will work for you. Moving forward, we’ll deliver a single consumer and developer experience through the Google Assistant. WWGA already works with over 3,500 partners and 30,000 devices, and integrates seamlessly with Assistant Routines. Routines allow anyone to quickly customize how their smart devices work together based on simple triggers—whether you’re leaving home or going to bed.


One of the most popular WWN features is automatically triggering routines based on Home/Away status. Later this year, we'll bring that same functionality to the Google Assistant and provide more device options for you to choose from. For example, you’ll be able to have your smart light bulbs automatically turn off when you leave your home. Routines can be created from the Google Home or Assistant apps using the hardware you already own. Plus, we’re making lots of improvements to setting up and managing Routines to make them even easier to use.

We recognize you may want your Nest devices to work with other connected ecosystems. We’re working with Amazon to migrate the Nest skill that lets you control your Nest thermostat and view your Nest camera livestream via Amazon Alexa. Additionally, we’re working with other partners to offer connected experiences that deliver more custom integrations.

For these custom integrations, partners will undergo security audits and we’ll control what data is shared and how it can be used. You’ll also have more control over which devices these partners will see by choosing the specific devices you want to share. For example, you’ll be able to share your outdoor cameras, but not the camera in your nursery, with a security partner.

We know we can't build a one-size-fits-all solution, so we're moving quickly to work with our most popular developers to create and support helpful interactions that give you the best of Google Nest. Our goal remains to give you the tools you need to make your home, and those of other Nest users, helpful in the ways that matter most to you.


Cathy Pearl has learned the art and science of conversation

Conversations can be tough. Whether you’re chit-chatting with a coworker or having an important talk with your partner, it’s easy to misinterpret, say the wrong thing, or accidentally offend someone. Now imagine teaching a computer how to avoid those minefields. That’s even tougher—and Googler Cathy Pearl knows exactly how difficult it is.

Cathy has made a career out of teaching computers how to talk to humans. She’s worked in the field of conversation design for decades, and now works in outreach at Google, where she helps spread the word about her field both within and outside of the company. She also served as a judge for this year’s Webby Awards, which introduced a category for voice user interfaces for the very first time. (Google ended up winning several awards, too, in categories Cathy didn't judge.)

For this installment of The She Word, Cathy tells us about the challenges of teaching computers to talk to humans, and what that’s taught her about her own conversations:

Designing conversations is trickier than you think. That’s because human conversations are really complicated.

“Basically, conversation design is about teaching computers how to communicate like humans, not the other way around. We all know how to talk from a young age, so now we need to build computers that can understand us where we are, instead of forcing people to speak some foreign computer language.

People may not realize how complex it really is. Think about something that seems like a simple yes or no question: What if you asked me, ‘Do you want a cup of coffee?’ Let’s say I replied, ‘Coffee will keep me awake.’ Is that a yes, or a no? Well, if you asked me first thing in the morning and I have a big presentation to write, it’s probably a yes. Ask me right before bed, and it’s probably a no. People say things like this all the time, but it’s hard for computers to understand.”

Voice recognition used to seem like the stuff of fiction. It's come a long way.

“I learned how to program when I was a kid, and I was really interested in learning to get the computer to talk back to me. I was really into movies like ‘War Games’ and TV shows like ‘Knight Rider’ that had these talking computers. Now, there was no such career at the time really, unless you were a researcher at Bell Labs or something like that. Coming out of grad school, I didn’t know of any jobs I could take in that field.

So really it was in 1999 when I saw a job opening for a company and they said, ‘Come work on speech recognition!’ And I said, ‘Well, that stuff doesn’t work, it’s still a science fiction thing.’ But they had a demo line you could call, and it was this fake banking demo where you could move money from checking to savings. It’s all you could do, really, but it worked. I was astounded. I spent eight years at the company learning the ins and outs of building voice user interfaces for phone systems for companies.”

When you find yourself at a career crossroads, don't limit your options.

“If you do something like IVF, it takes over your whole life. It’s a constant thing. That’s why I quit my job. You can’t plan vacations, you can’t plan work meetings, because you have to go to the doctor’s office. And it’s so disruptive. After nearly 3 years of trying, I had my son. I spent the next three years as a stay-at-home mom.

I think what was hardest for me was the point where I thought, I absolutely want to go back to work now, which was earlier than those three years, but I didn’t know what I was going to do. I didn’t know what resources to use to try and figure out what I should do to get back into a great career. I felt very alone in that way.

I went to a career counselor, and I just tried to start saying yes to more things. So when somebody asked me to give a talk, even if I didn’t think I was necessarily qualified, I said yes. I said yes to writing a book, which was just a terrifying prospect. It expanded my worldview of what was out there, and it opened a lot of doors to opportunities I wouldn’t have had otherwise. I think as women we often undersell ourselves.”

Teaching computers how to talk to us can teach us a lot about ourselves.

“So much of the time when we communicate, we want to be acknowledged. We don’t want you to try to solve problems. When I’m saying I had this really hard day, I don’t want my friend to say, ‘You know what you should do next time?’ No! I want you to say, ‘That sounds frustrating.’

That applies to voice user interfaces. With the Google Assistant, there’s a lot of stuff we can’t do yet. But it’s better to acknowledge the things we can’t do than just say, ‘I don’t understand.’ If someone says, ‘I want to rent a car,’ and we can’t do that, can we say, ‘I’m sorry, I can’t rent cars yet’? That’s more satisfying at a basic, human, primitive level, because at least they understood me.”

With the Google Assistant, your Sonos system gets even smarter

When your hands are full hosting a party with family and friends, or when you’re just chilling at home, your Google Assistant can help you find your favorite playlist, skip to the next song or turn up the volume. Today, we’re bringing the Assistant to your Sonos system so that you can easily play music on any speaker in your house. Plus, you'll get all the usual help from the Assistant to better manage your day, like traffic to work or your next appointment. With this software update, you can activate the Assistant on your Sonos One and Sonos Beam, or control any other Sonos product from a Google Assistant-enabled device, such as your phone, a smart speaker, or a Smart Display.

Here are a few things you can do with the Google Assistant on Sonos:

  • Listen to some tunes. Or news. Or podcasts: While you’re able to play music from all the services the Assistant already supports—including YouTube Music, Pandora, and Spotify—you can also use the Assistant to skip to the next track, pause the music and change the volume from the 100+ music services already available on Sonos. It’s also easy to catch up on the latest episode of your favorite podcast just by asking the Assistant.
  • Enjoy entertainment: With the Google Assistant on Sonos Beam and a Chromecast-enabled TV, you can turn on the TV, switch from music to TV, and adjust the volume. You can also stream videos from popular services.
  • Plan your day: Get help with things like your flight information, or traffic on your commute to work.
  • Manage tasks: Set alarms and timers, pull up your calendar appointments, or add items to your shopping list.
  • Get answers: Ask all your questions on sports, weather, calculations, translations and more.
  • Control your home: Ask your Assistant on your speakers to adjust the temperature, lighting, and other smart home devices connected in your home.

Get started with the Assistant on your Sonos speaker or soundbar

If you already own a Sonos One or Sonos Beam, it’s easy to set up the Assistant. Your speaker or soundbar will automatically receive a software update introducing the Google Assistant as a voice assistant option. To add the Google Assistant to your Sonos speakers:

  1. Go to "Voice Services" in the Sonos app under Settings
  2. Select Google Assistant
  3. Follow the guided setup process

We’re starting in the U.S. and will then expand support to the UK, Germany, Canada, Australia, France, the Netherlands and more over the coming months.

To showcase Sonos’ sound experience paired with the smart control of the Google Assistant, we invite you to attend an immersive, multi-sensory experience in New York, June 7-9. Featuring new music from The National and Holly Herndon, alongside tracks curated by the iconic Beggars Group labels (Rough Trade, 4AD, XL, Matador and Young Turks), the experience explores how sound works, how it layers into music, and how music sparks emotion. Learn more and RSVP.

Your Sonos system, now paired with the helpfulness of the Google Assistant, gives you more choices than ever to enjoy and control your music and entertainment.


Adding the Assistant to Bluetooth devices gets easier for device makers

Posted by Tomer Amarilio, Product Manager, Google Assistant


Headphones were one of the first devices optimized for the Google Assistant. With just your voice, you can ask the Assistant to make calls to friends or skip to the next song when you’re commuting on the subway or biking around on the weekend, without always having to glance at your phone.

But as wireless Bluetooth devices like headphones and earbuds become more popular, we need to make it easier to have the same great Assistant experience across many headsets. We collaborated with Qualcomm to design a comprehensive, customizable development kit to provide all device makers with the building blocks needed to create a smart headset with the Google Assistant. The new Qualcomm Smart Headset Development Kit for the Google Assistant is powered by Qualcomm’s QCC5100-series Bluetooth audio chip and supports Google Fast Pair to make pairing Bluetooth accessories a hassle-free process.

To inspire device makers, we also built a Qualcomm Smart Headset Reference Design, which delivers high-quality audio and noise cancellation, and supports extended battery life and playback time. The reference design includes a push button to activate the Assistant and is just one example of what manufacturers can engineer.

Qualcomm Smart Headset

How DIVA makes Google Assistant more accessible

My 21-year-old brother Giovanni loves to listen to music and movies. But because he was born with congenital cataracts, Down syndrome and West syndrome, he is non-verbal. This means he relies on our parents and friends to start or stop music or a movie.

Over the years, Giovanni has used everything from DVDs to tablets to YouTube to Chromecast to fill his entertainment needs. But as new voice-driven technologies started to emerge, they also came with a different set of challenges that required him to be able to use his voice or a touchscreen. That’s when I decided to find a way to let my brother control access to his music and movies on voice-driven devices without any help. It was a way for me to give him some independence and autonomy.

Working alongside my colleagues in the Milan Google office, I set up Project DIVA, which stands for DIVersely Assisted. The goal was to create a way to let people like Giovanni trigger commands to the Google Assistant without using their voice. We looked at many different scenarios and methodologies people could use to trigger commands, like pressing a big button with their chin or foot, or biting it. For several months we brainstormed different approaches and presented them at accessibility and tech events to get feedback.

We had a bunch of ideas on paper that looked promising. But in order to turn those ideas into something real, we took part in an Alphabet-wide accessibility innovation challenge and built a prototype which went on to win the competition. We identified that many assistive buttons available on the market come with a 3.5mm jack, which is the kind many people have on their wired headphones. For our prototype, we created a box to connect those buttons and convert the signal coming from the button to a command sent to the Google Assistant.

Project DIVA diagram

To move from a prototype to reality, we started working with the team behind Google Assistant Connect, and today we are announcing DIVA at Google I/O 2019.


The real test, however, was giving this to Giovanni to try out. When he touches the button with his hand, the signal is converted into a command sent to the Assistant. Now he can listen to music on the same devices and services our family and all his friends use, and his smile tells the best story.


Getting this to work for Giovanni was just the start for Project DIVA. We started with single-purpose buttons, but this could be extended to more flexible and configurable scenarios. Now, we are investigating attaching RFID tags to objects and associating a command to each tag. That way, a person might have a cartoon puppet trigger a cartoon on the TV, or a physical CD trigger the music on their speaker.


Learn more about the idea behind the DIVA project at our publication site, and learn how to build your own device at our technical site.


Actions on Google at I/O 2019: New tools for web, mobile, and smart home developers

Posted by Chris Turkstra, Director, Actions on Google

People are using the Assistant every day to get things done more easily, creating lots of opportunities for developers on this quickly growing platform. And we’ve heard from many of you who want easier ways to connect your content across the Assistant.

At I/O, we’re announcing new solutions for Actions on Google that were built specifically with you in mind. Whether you build for web, mobile, or smart home, these new tools will help make your content and services available to people who want to use their voice to get things done.

Enhance your presence in Search and the Assistant

Help people with their “how to” questions

Every day, people turn to the internet to ask “how to” questions, like how to tie a tie, how to fix a faucet, or how to install a dog door. At I/O, we’re introducing support for How-to markup that lets you power richer and more helpful results in Search and the Assistant.

Adding How-to markup to your pages enables them to appear as rich results on mobile Search and on Google Assistant Smart Displays. This is an incredibly lightweight way for web developers and creators to connect with millions of people, giving them helpful step-by-step instructions with video, images and text. You can start seeing How-to markup results on Search today, and your content will become available on Smart Displays in the coming months.
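
To make that markup concrete, here's a minimal sketch of schema.org HowTo structured data, modeled as a TypeScript object. The step names, text and image URLs are hypothetical; on a real page, the serialized JSON would be embedded in a script tag of type application/ld+json.

```typescript
// A minimal, hypothetical example of schema.org HowTo structured data.
// "HowTo" and "HowToStep" are real schema.org types; the names, text and
// URLs below are made up for illustration.
const howToMarkup = {
  "@context": "https://schema.org",
  "@type": "HowTo",
  name: "How to install a dog door",
  step: [
    {
      "@type": "HowToStep",
      name: "Measure your dog",
      text: "Measure your dog's height and width to choose the right door size.",
      image: "https://example.com/images/measure.jpg",
    },
    {
      "@type": "HowToStep",
      name: "Cut the opening",
      text: "Trace the template onto the door and cut along the outline.",
      image: "https://example.com/images/cut.jpg",
    },
  ],
};

// Serialize the object for embedding in the page's HTML.
const jsonLd = JSON.stringify(howToMarkup, null, 2);
```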

Here’s an example where DIY Network added markup to their existing content on the web to provide a more helpful, interactive result on both Google Search and the Assistant:

Mobile Search screenshot showing how to install a dog door
How-to markup result for how to install a dog door

For content creators who don’t maintain a website, we created a How-to Video Template: video creators upload a simple spreadsheet with titles, text and timestamps for their YouTube video, and we’ll handle the rest. This is a simple way to transform your existing how-to videos into interactive, step-by-step tutorials across Google Assistant Smart Displays and Android phones.
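
As a rough illustration of the kind of data the template asks for, here's a hypothetical set of rows modeled in TypeScript. The column names are assumptions based on the titles, text and timestamps described above, not the template's actual schema.

```typescript
// Hypothetical spreadsheet rows for a How-to Video Template.
interface HowToVideoStep {
  title: string;     // step heading shown alongside the video
  text: string;      // short instruction for the step
  timestamp: string; // where this step starts in the YouTube video
}

const compassTutorialSteps: HowToVideoStep[] = [
  { title: "Orient the map", text: "Line the map up with north.", timestamp: "0:32" },
  { title: "Set your bearing", text: "Rotate the bezel toward your target.", timestamp: "1:10" },
];
```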

Check out how REI is getting extra mileage out of their YouTube video:

Laptop and Home Hub displaying the How-to Video Template for the REI compass video

How-to Video Templates are in developer preview so you can start building today, and your content will become available on Android phones and Smart Displays in the coming months.

Easier engagement with your apps

Help people quickly get things done with App Actions

If you’re an app developer, people are turning to your apps every day to get things done. And we see people turn to the Assistant every day for a natural way to ask for help via voice. This offers an opportunity to use intents to create voice-based entry points from the Assistant to the right spot in your app.

Last year, we previewed App Actions, a simple mechanism for Android developers that uses intents from the Assistant to deep link to exactly the right spot in your app. At I/O, we are announcing the release of built-in intents for four new App Action categories: Health & Fitness, Finance and Banking, Ridesharing, and Food Ordering. Using these intents, you can integrate with the Assistant in no time.

If I wanted to track my run with Nike Run Club, I could just say “Hey Google, start my run in Nike Run Club” and the app automatically starts tracking my run. Or, let’s say I just finished dinner with my friend Chad and we're splitting the check. I can say "Hey Google, send $15 to Chad on PayPal" and the Assistant takes me right into PayPal. I log in, and all of my information is filled in. All I need to do is hit send.

Google Pixel showing App Actions with Nike Run Club

Each of these integrations was completed in less than a day with the addition of an Actions.xml file that handles the mapping of intents between your app and the Actions platform. You can start building with these new intents today and deploy to Assistant users on Android in the coming months. This is a huge opportunity to offer your fans an effortless way to engage more frequently with your apps.
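
For a sense of what that file contains, here's a hypothetical sketch of an Actions.xml intent mapping, wrapped in a TypeScript string purely for illustration. In an Android project the XML would live in its own resource file; the intent name, deep-link template and parameter names below are assumptions, not a real app's configuration.

```typescript
// Hypothetical sketch of an Actions.xml mapping from a built-in intent to a
// deep link, shown as a TypeScript string for illustration only.
const actionsXmlSketch = `
<actions>
  <!-- Map the built-in "start exercise" intent to a deep link in the app. -->
  <action intentName="actions.intent.START_EXERCISE">
    <fulfillment urlTemplate="example-fitness://workout{?exerciseType}">
      <parameter-mapping
          intentParameter="exercise.name"
          urlParameter="exerciseType" />
    </fulfillment>
  </action>
</actions>`;
```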

Build for devices in the home

Take advantage of Smart Displays’ interactive screens

Last year, we saw the introduction of the Smart Display as a new device category. The interactive visual surface opens up many new possibilities for developers.

Today, we’re introducing a developer preview of Interactive Canvas, which lets you create full-screen experiences that combine the power of voice, visuals and touch. Canvas works across Smart Displays and Android phones, and it uses open web technologies you’re likely already familiar with, like HTML, CSS and JavaScript.

Here’s an example of what you can build when you can leverage the full screen of a Smart Display:

Full screen of a Smart Display

Interactive Canvas is available for building games starting today, and we’ll be adding more categories soon. Visit the Actions Console to be one of the first to try it out.
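
To sketch how the pieces fit together, here's a minimal, hypothetical example of the web side of a Canvas experience. The ambient declaration stands in for the Interactive Canvas client library the page would load, and the score field and element IDs are made up for illustration.

```typescript
// Simplified ambient declaration standing in for the Interactive Canvas
// client library loaded by the page.
declare const interactiveCanvas: {
  ready(callbacks: { onUpdate(data: unknown): void }): void;
  sendTextQuery(query: string): Promise<string>;
};

interactiveCanvas.ready({
  // Called whenever the conversational fulfillment pushes new state.
  onUpdate(data) {
    const { score } = data as { score?: number };
    const el = document.getElementById("score");
    if (el && score !== undefined) {
      el.textContent = `Score: ${score}`;
    }
  },
});

// Touch input can feed back into the conversation as a text query.
document.getElementById("spin")?.addEventListener("click", () => {
  void interactiveCanvas.sendTextQuery("Spin the wheel");
});
```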

Enable smart home devices to communicate locally

There are now more than 30,000 connected devices that work with the Assistant across 3,500 brands, and today, we’re excited to announce a new suite of local technologies that are specifically designed to create an even better smart home.

Introducing a preview of the Local Home SDK, which enables you to run your smart home code locally on Google Home speakers and Nest Displays and use their radios to communicate locally with your smart devices. This reduces cloud hops and brings a new level of speed and reliability to the smart home. We’ve been working with some amazing partners including Philips, Wemo, TP-Link, and LIFX on testing this SDK and we’re excited to open it up for all developers next month.

Flowchart of Local Home SDK
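
Here's a minimal sketch of what a Local Home SDK app could look like. The ambient declaration below is a simplified stand-in for the SDK's real typings, and the handler bodies are placeholders rather than a working device integration.

```typescript
// Simplified stand-in for the Local Home SDK typings; the real ones ship
// with the SDK.
declare namespace smarthome {
  class App {
    constructor(version: string);
    onIdentify(handler: (request: unknown) => Promise<unknown>): this;
    onExecute(handler: (request: unknown) => Promise<unknown>): this;
    listen(): Promise<void>;
  }
}

// This code runs on the Google Home speaker or Nest Display itself, so
// device commands don't have to make a round trip through the cloud.
const app = new smarthome.App("1.0.0");

app
  .onIdentify(async (request) => {
    // Inspect local scan results (e.g. mDNS or UDP broadcast data) and
    // report which of your devices was discovered on the network.
    console.log("IDENTIFY request", request);
    return {}; // placeholder response
  })
  .onExecute(async (request) => {
    // Translate the EXECUTE intent into your device's local protocol and
    // send the command over the local network.
    console.log("EXECUTE request", request);
    return {}; // placeholder response
  })
  .listen()
  .then(() => console.log("Local Home app is ready"));
```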

Make setup more seamless

And, through the Local Home SDK, we’re making device setup more seamless, something we launched in partnership with GE smart lights this past October. So far, people have loved the ability to set up their lights in less than a minute in the Google Home app. We’re now scaling this to more partners, so go here if you’re interested.

Make your devices smart with Assistant Connect

Also, at CES earlier this year we previewed Google Assistant Connect which leverages the Local Home SDK. Assistant Connect enables smart home and appliance developers to easily add Assistant functionality into their devices at low cost. It does this by offloading a lot of work onto the Assistant to complete Actions, display content and respond to commands. We've been hard at work developing the platform along with the first products built on it by Anker, Leviton and Tile. We can't wait to show you more about Assistant Connect later this year.

New device types and traits

For those of you creating Actions for the smart home, we’re also releasing 16 new device types and three new device traits including LockUnlock, ArmDisarm, and Timer. Head over to our developer documentation for the full list of 38 device types and 18 device traits, and check out our sample project on GitHub to start building.
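
As a hedged illustration of how a device advertises one of these traits, here's a sketch of a smart home SYNC response modeled as a TypeScript object. The request and device IDs are made up; the type and trait identifiers follow the documented action.devices naming scheme.

```typescript
// Hypothetical SYNC response declaring a lock that supports the new
// LockUnlock trait. Your fulfillment returns a payload like this when the
// Assistant asks which devices the user has.
const syncResponse = {
  requestId: "ff36a3cc-ec34-11e6-b1a0-64510650abcf", // echoed from the request
  payload: {
    agentUserId: "user-123",
    devices: [
      {
        id: "front-door-lock",
        type: "action.devices.types.LOCK",
        traits: ["action.devices.traits.LockUnlock"],
        name: { name: "Front door lock" },
        willReportState: true,
      },
    ],
  },
};
```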

Get started with our new tools for all types of developers

Whether you’re looking to extend the reach of your content, drive more usage in your apps, or build custom Assistant-powered experiences, you now have more tools to do so.

If you want to learn more about how you can start building with these tools, check out our website to get started and our schedule so you can tune in to all of our developer talks that we’ll be hosting throughout the week.

We can’t wait to build together with you!