Tag Archives: Google Assistant

Meet the bilingual Google Assistant with new smart home devices

This summer, we’ve brought the Google Assistant to more devices across Europe and the rest of the world to help you get answers and get things done in more languages (most recently supporting Spanish, Swedish and Dutch).

At IFA 2018, we’re adding multilingual support, so that the Assistant will be able to understand and speak more than one language at a time. Additionally, we’ll be introducing new phones and a broad range of devices and appliances for the home that support the Assistant from our growing ecosystem of partners in Europe.

Talk to the Google Assistant in multiple languages

Family members in bilingual homes often switch back and forth between languages, and now the Assistant can keep up. With our advancements in speech recognition, you can now speak two languages interchangeably with the Assistant on smart speakers and phones, and the Assistant will respond in kind. This is a first-of-its-kind feature, available only on the Assistant, and part of our multi-year effort to make your conversations with the Assistant more natural.

If you’re looking for an answer in English, ask, “Hey Google, what’s the weather like today?” If you’re craving tunes from your favorite German hip hop band, just ask, “Hey Google, spiele die Fantastischen Vier.” Currently, the Assistant can understand any pair of languages from among English, German, French, Spanish, Italian, and Japanese. We’ll be expanding to more languages in the coming months.

Your bilingual Google Assistant

A fully connected home

Enjoying home entertainment
Listening to music is one of the most popular ways people use the Assistant. That’s why we built Google Home Max to offer high-fidelity, balanced sound. It’s now available in Germany, the U.K. and France, hitting store shelves starting today.

This week, we’re also announcing that the Assistant will be built into new voice-activated speakers, including Bang & Olufsen’s Beosound 1 and Beosound 2, Blaupunkt’s PVA 100, Harman Kardon’s HK Citation series, Kygo’s Speaker B9-800, Polaroid’s Sam and Buddy, and Marshall’s Acton II and Stanmore II. Expect these smart speakers and soundbars to roll out later this year in local European markets.

Getting things done in the kitchen
On the heels of introducing our first ever Smart Displays last month with Lenovo, we’re expanding our offerings with the upcoming launch of JBL’s Link View and LG XBOOM AI ThinQ WK9 in the coming weeks. With these new Smart Displays, you’ll have the perfect kitchen companion. You can use your voice and tap or swipe the screen to follow along with a recipe, control your smart home, watch live TV on YouTube TV, and make video calls with Google Duo. Smart Displays also come integrated with all your favorite Google products and services like Google Calendar, Google Maps, Google Photos and YouTube.

Controlling all connected devices in your home
The Assistant is also making your home even smarter. In just the past year, the number of home devices and appliances that work with the Assistant in Europe has tripled, with support from all the major local brands you’re familiar with.

Our partners will be releasing more devices that work with the Assistant throughout the home in the coming months, including:

  • Thermostats: tado° Smart Thermostat and Smart Radiator Thermostat, Homematic IP Radiator Thermostat
  • Security and Smart Home Hubs: Netatmo’s Smart Indoor and Outdoor Security Cameras, TP-Link’s Kasa Cam KC120 and Kasa Cam Outdoor KC200, Smanos K1 SmartHome DIY Security Kit, and Somfy’s TaHoma smart home hub
  • Lighting: FIBARO Switch, MEDION RGB LED bulb and light strip, and the Nanoleaf Light Panels
  • Appliances: Electrolux’s smart ovens, iRobot® Roomba® 980, 896 and 676 vacuums

Whether you speak German, French, English, Italian or Spanish, you’ll be able to set the temperature, lock the doors, dim the lights and more from a smart speaker or smartphone.

Smart Home

On the go with your phone and headphones

The Google Assistant is coming to more Android phones and headphones, helping you when you're on the go. Some of the latest flagship devices, including the LG G7 One, SHARP Simple Smartphone 4 and Vivo NEX S, now feature dedicated buttons to easily access the Assistant. In addition, the new Xperia XZ3 from Sony and the BlackBerry KEY2 LE also take advantage of shortcuts to trigger the Assistant.

And this week we're announcing that more headphones are on the way over the coming year, including the JBL Everest GA, LG Tone Platinum and Earin M-2. When you pair them with your phone, you can talk to the Assistant instantly with just a touch, whether you want to skip to the next track, hear your notifications, respond to your messages, or set reminders.

Phew, that was a lot of news. With lots of new devices and partners coming to Europe, the Google Assistant will be available to help you through every step of your day.


Protecting your home with the Google Assistant

Empty homes are more vulnerable to burglary, so it’s important to have a system in place that can monitor your home, alert you and deter crime while you’re away. That’s where the Google Assistant can step in and help you keep an eye on everything at home.

We worked with some of the most trusted home security brands to launch new devices that work with the Assistant.

Monitor your home 24/7. There are many security cameras and lights out there that work with the Assistant to help you keep an eye on your home while you’re away. With the new Arlo Security Lights, you can get instant alerts when motion is detected or pair the lights with Arlo security cameras. There are also several cameras, such as the Nest Cam, for the interior and exterior of your home that stream 24/7 and can be checked simply by asking the Google Assistant in the app. And with any Nest Cam model, you can also ask the Assistant to stream live feeds onto Chromecast-enabled televisions. If there’s an intruder, talk and listen through the camera to scare them off.

Lock from anywhere.
Smart locks allow you to lock and unlock your door from anywhere in the world, making it easy for you to monitor your doorstep while you’re away. Beginning tomorrow, we’ll launch a new integration with the Nest x Yale Lock. You can use the Assistant to check the status of your lock, remotely lock it, and even include it in a Routine. For example, you can lock the door automatically before going to bed by saying “Hey Google, goodnight.” Additionally, with the recent integrations of the Assistant with the August Smart Lock, Schlage® Sense Smart Deadbolt, and Sesame Lock by Candy House, you can share access with trusted friends and family and lock the door with your voice. You’ll also get an alert whenever someone locks or unlocks the door.

Keep your home secure.
Security systems aren’t new but “smart” security systems are. ADT Pulse, Honeywell’s new Smart Home Security solution and Nest Secure alarm system will let you know what’s happening at your home while you’re gone. If the alarm goes off, you’ll get an alert on your phone with information about what triggered the alarm. Silence the alarm through the apps and alert the police.

With these new security devices, you now have an easy way to protect your home with the Google Assistant.

Verifying your Google Assistant media action integrations on Android

Posted by Nevin Mital, Partner Developer Relations

The Media Controller Test (MCT) app is a powerful tool that allows you to test the intricacies of media playback on Android, and it's just gotten even more useful. Media experiences, including voice interactions via the Google Assistant on Android phones, cars, TVs, and headphones, are powered by the Android MediaSession APIs, and this tool will help you verify your integrations. We've now added a new verification testing framework that can be used to help automate your QA testing.

The MCT is meant to be used in conjunction with an app that implements media APIs, such as the Universal Android Music Player. The MCT surfaces information about the media app's MediaController, such as the PlaybackState and Metadata, and can be used to test inter-app media controls.

The Media Action Lifecycle can be complex to follow; even in a simple Play From Search request, there are many intermediate steps (simplified timeline depicted below) where something could go wrong. The MCT can be used to help highlight any inconsistencies in how your music app handles MediaController TransportControl requests.

Timeline of the interaction between the User, the Google Assistant, and the third party Android App for a Play From Search request.
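The lifecycle in that timeline can be sketched as a toy state machine in plain Java. To be clear, this is not the real Android `MediaSessionCompat` API (the class and method names here are illustrative only); it just mirrors the kind of state transitions a real `MediaSessionCompat.Callback#onPlayFromSearch` implementation should produce, and which a verification test can assert on:

```java
// Toy model of the Play From Search lifecycle the MCT verifies.
// NOT the Android API -- class and method names are illustrative.
class MediaLifecycleDemo {
    enum State { NONE, CONNECTING, PLAYING, ERROR }

    private State state = State.NONE;
    private String nowPlaying = null;

    // Analogous to MediaSessionCompat.Callback#onPlayFromSearch(query, extras)
    void onPlayFromSearch(String query) {
        if (query == null || query.isEmpty()) {
            // A real app might choose to play "anything" for an empty query;
            // this sketch treats it as a failure case.
            state = State.ERROR;
            return;
        }
        state = State.CONNECTING;   // intermediate step: resolving the search
        nowPlaying = query;         // pretend the search matched a track
        state = State.PLAYING;      // final state a verification test asserts on
    }

    State getState() { return state; }
    String getNowPlaying() { return nowPlaying; }
}
```

A verification test in this model boils down to sending the request and asserting that the session ended up in the expected state with the expected metadata, which is essentially what the MCT's one-click tests automate against your real MediaController.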

Previously, using the MCT required a lot of manual interaction and monitoring. The new verification testing framework offers one-click tests that you can run to ensure that your media app responds correctly to a playback request.

Running a verification test

To access the new verification tests in the MCT, click the Test button next to your desired media app.

MCT Screenshot of launch screen; contains a list of installed media apps, with an option to go to either the Control or Test view for each.

The next screen shows you detailed information about the MediaController, for example the PlaybackState, Metadata, and Queue. There are two buttons on the toolbar in the top right: the button on the left toggles between parsable and formatted logs, and the button on the right refreshes this view to display the most current information.

MCT Screenshot of the left screen in the Testing view for UAMP; contains information about the Media Controller's Playback State, Metadata, Repeat Mode, Shuffle Mode, and Queue.

By swiping to the left, you arrive at the verification tests view, where you can see a scrollable list of defined tests, a text field to enter a query for tests that require one, and a section to display the results of the test.

MCT Screenshot of the right screen in the Testing view for UAMP; contains a list of tests, a query text field, and a results display section.

As an example, to run the Play From Search Test, you can enter a search query into the text field then hit the Run Test button. Looks like the test succeeded!

MCT Screenshot of the right screen in the Testing view for UAMP; the Play From Search test was run with the query 'Memories' and ended successfully.

Below are examples of the Pause Test (left) and Seek To test (right).

MCT Screenshot of the right screen in the Testing view for UAMP; a Pause test was run successfully. MCT Screenshot of the right screen in the Testing view for UAMP; a Seek To test was run successfully.

Android TV

The MCT now also works on Android TV! For your media app to work with the Android TV version of the MCT, your media app must have a MediaBrowserService implementation. Please see here for more details on how to do this.

On launching the MCT on Android TV, you will see a list of installed media apps. Note that an app will only appear in this list if it implements the MediaBrowserService.

Android TV MCT Screenshot of the launch screen; contains a list of installed media apps that implement the MediaBrowserService.

Selecting an app will take you to the testing screen, which will display a list of verification tests on the right.

Android TV MCT Screenshot of the testing screen; contains a list of tests on the right side.

Running a test will populate the left side of the screen with selected MediaController information. For more details, please check the MCT logs in Logcat.

Android TV MCT Screenshot of the testing screen; the Pause test was run successfully and the left side of the screen now displays selected MediaController information.

Tests that require a query are marked with a keyboard icon. Clicking on one of these tests will open an input field for the query. Upon hitting Enter, the test will run.

Android TV MCT Screenshot of the testing screen; clicking on the Seek To test opened an input field for the query.

To make text input easier, you can also use the ADB command:

adb shell input text [query]

Note that '%s' will add a space between words. For example, the command adb shell input text hello%sworld will add the text "hello world" to the input field.

What's next

The MCT currently includes simple single-media-action tests for the following requests:

  • Play
  • Play From Search
  • Play From Media ID
  • Play From URI
  • Pause
  • Stop
  • Skip To Next
  • Skip To Previous
  • Skip To Queue Item
  • Seek To

For a technical deep dive on how the tests are structured and how to add more tests, visit the MCT GitHub Wiki. We'd love for you to submit pull requests with more tests that you think are useful to have and for any bug fixes. Please make sure to review the contributions process for more information.

Check out the latest updates on GitHub!

Five insights on voice technology

Over the last couple of years, something interesting happened: millions of people began having conversations with their speakers, cars, computers and phones. Voice technology is fundamentally changing the way we use our devices, often in ways we didn’t expect.

We’ve learned a lot about how we can better serve people’s needs with voice, helping them save time and get things done. Here are a few things we’ve learned since we introduced the Google Assistant nearly two years ago.

Voice is about action.

When people talk to their Google Assistant, they’re usually trying to get something done. Assistant queries are 40 times more likely to be action-oriented than Search, with people asking for things like “send a text message,” “turn off the lights,” or “turn on airplane mode.”


Why do we think this is happening? For many tasks, particularly while you’re on the go, it can be much easier to get things done through voice. I can say “turn on the lights and play some music,” without having to worry about which app I need to open. Even for basic things like creating a calendar invite, I don’t have to look down at my phone or interrupt what I’m doing, I can just say “create an appointment for noon on Saturday.” These seem like small things, and they are. But they illustrate what makes voice so unique—the technology allows me to complete a task in a way that feels natural. The more we can build these types of experiences, the closer we get to an ideal Assistant.

People expect conversations.

When people start using voice assistants, we often see very simple commands. But very quickly, expectations go up in terms of complex dialogue. We might see “weather Chicago” typed in Search, whereas with the Assistant we see much longer and more conversational queries like “what’s the weather today in Chicago at 3pm.” On average, Assistant queries are 200 times more conversational than Search.

We’ve seen that even simple commands can take all forms. For example, people ask the Google Assistant to set an alarm in more than 5,000 different ways, which means that we have to build the Assistant to understand this conversational complexity.  

simple commands take all forms

Screens change everything.

The world hasn’t completely shifted to voice, nor do we expect it to. Screens bring a completely new canvas for conversational AI, where we can bring together voice and touch in an intelligent way. So when you ask for a pasta dough recipe, you can get visuals of what the dough should look like while the Assistant reads you the steps along the way.  

With the launch of Smart Displays and our new visual experience for phones, we’ve evolved the Google Assistant to become much more dynamic, spanning voice, screens, tapping and typing. And we’re seeing people respond—in fact, nearly half of interactions with Assistant today include both voice and touch input.

Daily routines matter.

You can access the Assistant almost anywhere you are throughout the day—on the phone, in the car, or on a speaker in the living room. So it makes sense that when people use the Assistant, it’s largely driven by their environment and what they’re trying to accomplish in their daily routines.

Let’s take a look at some of the most popular ways we use the Google Assistant in our daily routines. In the morning, we’ll use our smart speakers to ask for the weather or listen to the news. During lunch and on the commute home, we’ll text and call our friends, or look for local restaurants. When we get home, we want to listen to music. And at the end of the day, we get ready for tomorrow with tasks like “set an alarm,” “set a reminder,” or “ask the Assistant to tell me about tomorrow’s meetings.” Where and how we use our Assistant varies throughout the day, but the consistency of the experience should stay the same.

google assistant daily habits

Voice is universal.

One of the most exciting things to witness about digital assistants is that even though the Assistant is a new technology, it’s incredibly easy to adopt. There’s no user manual needed, and people of all ages, across all types of devices, and in many different geographies can use the Assistant. Because of this, we’re finding that Google Assistant users defy the early adopter stereotype—there’s a huge uptick in seniors and families, and women are the fastest growing user segment for the Assistant.

Voice is also universal on a global scale. Over the past year, we've brought the Assistant to more countries and languages. In places where people are coming online for the first time, voice is taking the forefront as the primary way they interact with their devices. In India, for example, Google Assistant usage has tripled since the beginning of the year.

Of course voice technology is still relatively new and evolving. We’re just figuring out what works in this space. But it’s exciting to see how voice technology is making it easier for people to get things done, and we’re all learning together.


Hey Google, tell me something good

The news has always played an essential role in our lives, keeping us informed about the world and the issues we care about. These days we’re consuming more news than ever, and sometimes, it can feel like there are only problems out there. But the fact is, there is a plethora of “good news” happening, and we're not talking about unlikely animal friendships or random acts of kindness. Real people are making progress solving real issues—and hearing about those stories is a crucial part of a balanced media diet.

The Assistant is making this kind of news easier to find.

“Tell me something good” is a new experimental feature for Assistant users in the U.S. that delivers your daily dose of good news. Just say “Hey Google, tell me something good” to receive a brief news summary about people who are solving problems for our communities and our world.

This is good news like how Georgia State University coupled empathy with data to double its graduation rate and eliminate achievement gaps between white and black students, how backyard beekeepers in East Detroit are bringing back the dwindling bee population while boosting the local economy, and how Iceland curbed teen drinking with nightly curfews and coupons for kids to enroll in extracurricular activities.

Hey Google, Tell Me Something Good

Watch to learn more about "Tell me something good"

The stories come from a wide range of media outlets, curated and summarized by the Solutions Journalism Network. They’re a nonpartisan nonprofit dedicated to spreading the practice of solutions journalism, which highlights how problems are solvable and that doing better is possible. Solutions journalism empowers and energizes audiences, helping to combat negative news fatigue. It’s an important part of a balanced news diet, so we’re exploring how to incorporate more solutions journalism wherever you access Google News.

“Tell me something good” isn’t meant to be a magic solution. But it’s an experiment worth trying because it’s good info about good work that may bring some good to your day. Give it a go yourself on any Assistant-enabled device, including your phone, Smart Display, or Google Home.

Pump up the jams: New music streaming services now available on Google Home

Cue the music: You can now ask your Google Assistant on Google Home, Mini and Max to play some of your favorite songs with Pandora Premium and Deezer.

Both services are now available on the Google Assistant across supported devices like Google Home, Smart Displays and more. Pandora Premium subscribers can search and play their favorite songs, albums and playlists, just by using their voice. 

Plus, Deezer now allows you to stream music hands-free with access to more than 36 million HiFi tracks from around the world. So if you’ve got a Google Home Max, get ready to turn it all the way up.

To play music from Pandora or Deezer, link your accounts in the Google Home app. Then all you have to do is say, “Hey Google, play my Chill Vibes playlist on Deezer,” or “Hey Google, play my Chill playlist on Pandora.”

If you have a Google Home, you can try Pandora Premium free for 90 days. Deezer on Google Home is available for HiFi and Premium users in the U.S., Canada, Italy, Australia, U.K., France and Germany. Check the Google Home app on Android or visit store.google.com to see if your region is eligible for a special 90-day Deezer Premium trial offer.

With the addition of Pandora Premium and Deezer, you have even more choices when it comes to music streaming services. So, next time you’re throwing a party or hanging out with friends, we’ve got the DJ booth covered.


Hey Google, what’s the news?

Back at Google I/O, we launched the new Google News to help you keep up with the news that matters to you. Since then, millions of you have turned to Google News to follow the big stories of the day, subscribe to your favorite local and national publishers, and dig into topics and people you care about.


But there are moments in the day when you want to catch up on the news while your eyes or hands are busy. Maybe you’re listening to a podcast as you walk to work or catching up on what’s happening while driving to pick up the kids. We are beginning to bring the best of Google News to devices with the Google Assistant so that you can stay up to date wherever you are.


Last week, in the U.S., Lenovo launched the first of many Smart Displays with the Google Assistant. Smart Displays help you get more done with a glanceable touch screen and offer video or audio news briefings to catch you up on headlines, sports, politics, and more. You can choose your preferred news sources from hundreds of national and local broadcasters including CNBC, CNN, Cheddar and more. Just ask, “Hey Google, what’s the news?”

smart display

When you want to go deeper or learn more about a specific topic, ask the Assistant: “What’s the news on the women’s national soccer team?” or “What’s the latest on NASA?” The Google Assistant will find relevant videos from YouTube to play on your Smart Display, and on Assistant speakers like Google Home, it will read out excerpts from news articles from a growing list of publishers.

And whether you’re at home or on the go, the Assistant is there to help you stay informed. All these features are available today on Android phones and will soon be coming to Android Auto and Assistant-enabled headphones (including Google Pixel Buds).

Right now, these updates are coming to devices with the Google Assistant in the U.S. We plan to learn from the U.S. launches and then expand further, so stay tuned for more as we bring news on the Google Assistant to communities globally.


Designing for the Google Assistant on Smart Displays

Posted by Saba Zaidi, Senior Interaction Designer, Google Assistant

Earlier this year we announced Smart Displays, a new category of devices with the Google Assistant built in, that augment voice experiences with immersive visuals. These new, highly visual devices can make it easier to convey complex information, suggest Actions, support transactions, and express your brand. Starting today, Smart Displays are available for purchase in major US retailers, both in-store and online.

Interacting through voice is fast and easy, because speaking comes naturally to people, and language doesn't constrain people to predefined paths the way traditional visual interfaces do. However, in audio-only interfaces, it can be difficult to communicate detailed information like lists or tables, and nearly impossible to represent rich content like images, charts or a visual brand identity. Smart Displays allow you to create Actions for the Assistant that can respond to natural conversation, and also display information and represent your brand in an immersive, visual way.

Today we're announcing consumer availability of rich responses optimized for Smart Displays. With rich responses, you can use basic cards, lists, tables, carousels and suggestion chips, which give you an array of visual interactions for your Action, with more visual components coming soon. In addition, you can also create custom themes to more deeply customize your Action's look and feel.

If you've already built a voice-centric Action for the Google Assistant, not to worry, it'll work automatically on Smart Displays. But we highly recommend adding rich responses and custom themes to make your Action even more visually engaging and useful to your users on Smart Displays. Here are a few tips to get you started:

1. Consider using visual components instead of complex voice prompts

Smart Displays offer several visual formats for displaying information and facilitating user input. A carousel of images, a list or a table can help users scan information efficiently and then interact with a quick tap or swipe.

For example, consider a long, spoken prompt like: "Welcome to National Anthems! You can play the national anthems from 20 different countries, including the United States, Canada and the United Kingdom. Which would you like to hear?"

Instead of merely showing the transcript of that whole spoken prompt on the screen, a carousel of country flags makes it easy for users to scroll and tap the anthem they want to hear.

2. Use visual suggestions to streamline the conversation

Suggestion chips are a great way to surface recommendations, aid feature discovery and keep the conversation moving on Smart Displays.

In this example, suggestion chips can help users find the "surprise me" feature, find the most popular anthems, or filter anthems by region.
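As a rough sketch, a response like this pairs a spoken prompt with suggestion chips in the webhook payload. The fragment below follows the general shape of the Dialogflow fulfillment JSON for the Assistant (the chip titles come from the hypothetical anthems example; check the current documentation for exact field names):

```json
{
  "payload": {
    "google": {
      "expectUserResponse": true,
      "richResponse": {
        "items": [
          {
            "simpleResponse": {
              "textToSpeech": "Which anthem would you like to hear?"
            }
          }
        ],
        "suggestions": [
          { "title": "Surprise me" },
          { "title": "Most popular" },
          { "title": "By region" }
        ]
      }
    }
  }
}
```

On a Smart Display, the chips render as tappable buttons below the prompt, while voice-only surfaces simply speak the `textToSpeech` text.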

3. Express your brand with themes

You can take advantage of new custom themes to differentiate your experience and represent your brand's persona, choosing a custom voice, background image or color, font style, or the shape of your cards to match your branding.

For example, an Action like California Surf Report could be themed in a more immersive and customized way.

4. Check out our library of developer resources

We offer more tips on designing and building for Smart Displays and other visual devices in our conversation design site and in our talk from I/O about how to design Actions across devices.

Then head to our documentation to learn how to customize the visual appearance of your Actions with rich responses. You can also test and tinker with customizations for Smart Displays in the Actions Console simulator.

Don't forget that once you publish your first Action you can join our community program* and receive your exclusive Google Assistant t-shirt and up to $200 of monthly Google Cloud credit.

We can't wait to see—quite literally—what you build next! Thanks for being a part of our community, and as always, if you have ideas or requests that you'd like to share with our team, don't hesitate to join the conversation.


*Some countries are not eligible to participate in the developer community program; please review the terms and conditions.

Hey Google, what’s the latest news?

Since launching the Google Assistant in 2016, we have seen users ask questions about everything from weather to recipes and news. In order to fulfill news queries with results people can count on, we collaborated on a new schema.org structured data specification called speakable for eligible publishers to mark up sections of a news article that are most relevant to be read aloud by the Google Assistant.

When people ask the Google Assistant, "Hey Google, what's the latest news on NASA?", the Google Assistant responds with an excerpt from a news article and the name of the news organization. Then the Google Assistant asks if the user would like to hear another news article and also sends the relevant links to the user's mobile device.

As a news publisher, you can surface your content on the Google Assistant by implementing Speakable markup according to the developer documentation. This feature is now available for English language users in the US and we hope to launch in other languages and countries as soon as a sufficient number of publishers have implemented speakable. As this is a new feature, we are experimenting over time to refine the publisher and user experience.
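As a sketch, speakable markup is a SpeakableSpecification added to your article's JSON-LD structured data. The CSS selectors below are illustrative placeholders for whichever elements on your page hold the headline and summary; consult the developer documentation for the exact requirements:

```json
{
  "@context": "https://schema.org",
  "@type": "NewsArticle",
  "headline": "NASA announces new mission",
  "url": "https://example.com/nasa-article",
  "speakable": {
    "@type": "SpeakableSpecification",
    "cssSelector": [".headline", ".article-summary"]
  }
}
```

The `speakable` property can also use `xpath` expressions instead of `cssSelector` to point at the sections most suitable for text-to-speech.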

If you have any questions, ask us in the Webmaster Help Forum. We look forward to hearing from you!

How creating an Action can complement your Android app

Posted by Neto Marin - Actions on Google Developer Advocate

There are millions of apps in the Android ecosystem, so helping yours get discovered can require some investment. Your app needs to offer something that differentiates it from other similar apps to stand out to users.

Building a companion Action is a fast and simple way to increase your Android app's potential reach by creating a new entrypoint on devices covered by the Google Assistant. It lets you bring your services to users through voice, without them needing to install anything, and can bring people into your app when it can provide more value.

Your companion Action complements your Android app's experience by offering some of your services through the Google Assistant, which is available on more than 500 million devices including speakers, phones, cars, headphones, and more. Creating an Action provides a frictionless way for users to start engaging with your services wherever the Google Assistant is available.

Creating an Action for the Assistant will extend your brand presence, bringing your services to new devices and contexts as users interact with the Google Assistant.

Feature what your app does better

It is probably a mistake to try to rewrite all of your Android app as a conversational Action, since voice is a different modality with different constraints and usage patterns. Instead, you should start by selecting the most important or popular features in your app that translate well into a voice context and can be more easily accomplished there. Then, you can create your conversational experience to offer these features on Google Assistant devices. Check out the Conversation design site, which has several articles and guides about how to create a great voice UI.

Let's take a look at a hypothetical example. Imagine you have a mobile commerce app. Some features include searching for products, navigating to different categories, adding payment information, and checking out. You could build an Action for the Assistant with most of the same functionality, but we encourage you to look for what makes the most sense in a conversational experience.

In this case, your Action could focus on everything that a user would want to know after they've purchased a product through your Android app or web page. You could offer a quick way to get updates about a purchase's status (if your payment and purchase process moves through different states) and shipment information, or provide an interface for re-ordering a user's favorite products. Then, your users would be able to ask something like, "Hey Google, ask Voice Store about my last purchase."

Or, to reach users who have never made a purchase before, you could create an Action to offer exciting deals for common products. For example, you could create an Action that is invoked with, "Hey Google, ask Voice Store what are the deals on TVs today".

As you can see, starting with a "hero" use case for your Action is an exciting way to introduce conversational features that complement your Android app, and it will take less time than you think.

At Google I/O 2018, we presented a talk, "Integrating your Android apps with the Google Assistant" which contains more details and examples for developers.

Delivering users' purchases across surfaces

In-app purchases, subscriptions, and one-time products have proven successful for Android developers when it comes to monetization, allowing them to offer different kinds of digital goods and additional value to paying users, driving user conversion and making apps more profitable.

Google Play Billing offers a series of tools, APIs, and documentation to help developers manage the subscription life-cycle, build server-side validation, and much more. If you are new to in-app billing, check out the Google Play Billing Overview page.

Now, Android developers can expand where users can access these goods or upgraded experiences by offering them through Actions, as well. This expansion is accomplished by honoring the user's entitlements on Google Play across different surfaces and devices, reaching users when they can't (or don't want to) use an app, like while cooking or driving.

For non-Android platforms, you'll need to ask your users to link their accounts. You can then use your user's account history to identify what purchases they've made on other surfaces.

Check the Accessing Digital Purchases page for a step-by-step guide on how to enable access to the user's purchases and request and parse the purchase data.

What's next?

If you are not familiar with Actions on Google yet, start by checking out our overview page, which describes the platform in detail and tells you all you need to know to create your Actions for the Google Assistant.

Stay tuned for more posts about how to improve your Android app experience with Actions on Google.

Thanks for reading!