Tag Archives: assistant

Google Keep integration with Assistant available to all Google Workspace users

What’s changing 

If you’re using the Google Assistant with your Google Workspace device, you can set Google Keep as the default provider for your notes and lists. You can ask Assistant to create a new list, add or delete items for an existing list, or read back all the list items to you. 

To configure Keep as your provider, visit the notes and lists section of Assistant Settings and select Keep.



Getting started 

  • Admins: There is no admin control for this feature. Visit the Help Center to learn more about managing Keep in your organization.
  • End users: If Keep is enabled in your organization, you can change your note provider to Keep in the “notes and lists” section of the Assistant Settings. Visit the Help Center to learn more about creating or editing notes with Assistant.


Availability 

  • Available to all Google Workspace customers 

13 Things to know for Android developers at Google I/O!

Posted by Maru Ahues Bouza, Director of Android Developer Relations

Android I/O updates: Jetpack, Wear OS, etc 

There aren’t many platforms where you can build something and instantly reach billions of people around the world, not only on their phones—but their TVs, cars, tablets, watches, and more. Today, at Google I/O, we covered a number of ways Android helps you make the most of this opportunity, and how Modern Android Development brings as much commonality as possible, to make it faster and easier for you to create experiences tailored to all the different screens we use in our daily lives.

We’ve rounded up the top 13 things to know for Android developers—from Jetpack Compose to tablets to Wear OS and of course… Android 13! And stick around for Day 2 of Google I/O, when Android’s full track of 26 technical talks and 4 workshops drop. We’re also bringing back the Android fireside Q&A in another episode of #TheAndroidShow; tweet us your questions now using #AskAndroid, and we’ve assembled a team of experts to answer live on-air, May 12 at 12:30PM PT.


MODERN ANDROID DEVELOPMENT

#1: Jetpack Compose 1.2 Beta, with support for more advanced use cases

Android’s modern UI toolkit, Jetpack Compose, continues to bring the APIs you need to support more advanced use cases like downloadable fonts, LazyGrids, window insets, and nested scrolling interop, plus more tooling support with features like Live Edit, Recomposition Debugging, and Animation Preview. Check out the blog post for more details.

Jetpack Compose 1.2 Beta  
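
To give a feel for one of those use cases, here is a minimal sketch of a lazy grid built with LazyVerticalGrid (depending on your Compose version this API may still require an experimental opt-in, and names can vary slightly between releases):

import androidx.compose.foundation.lazy.grid.GridCells
import androidx.compose.foundation.lazy.grid.LazyVerticalGrid
import androidx.compose.foundation.lazy.grid.items
import androidx.compose.material.Text
import androidx.compose.runtime.Composable
import androidx.compose.ui.unit.dp

@Composable
fun LabelGrid(labels: List<String>) {
    // Fit as many 128dp-wide columns as the current window allows
    LazyVerticalGrid(columns = GridCells.Adaptive(minSize = 128.dp)) {
        items(labels) { label ->
            Text(text = label) // placeholder cell content
        }
    }
}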

#2: Android Studio: introducing Live Edit

Get more done faster with Android Studio Dolphin Beta and Electric Eel Canary! Android Studio Dolphin includes new features and improvements for Jetpack Compose and Wear OS development and an updated Logcat experience. Android Studio Electric Eel comes with integrations with the new Google Play SDK Index and Firebase Crashlytics. It also offers a new resizable emulator to test your app on large screens and the new Live Edit feature to immediately deploy code changes made within composable functions. Watch the What’s new in Android Development Tools session and read the Android Studio I/O blog post here.

#3: Baseline Profiles - speed up your app load time!

The speed of your app right after installation can make a big difference in user retention. To improve that experience, we created Baseline Profiles. Baseline Profiles allow apps and libraries to provide the Android runtime with metadata about code path usage, which it uses to prioritize ahead-of-time compilation. We've seen up to 30% faster app startup times from adding baseline profiles alone, no other code changes required! We’re already using baseline profiles within Jetpack: we’ve added them to popular libraries like Fragments and Compose to help provide a better end-user experience. Watch the What’s new in app performance talk, and read the Jetpack blog post here.
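
To sketch how a profile gets generated, here is a hypothetical Macrobenchmark test that records the code paths exercised during startup (the package name is a placeholder, and the exact BaselineProfileRule API names have evolved across library versions):

import androidx.benchmark.macro.junit4.BaselineProfileRule
import androidx.test.ext.junit.runners.AndroidJUnit4
import org.junit.Rule
import org.junit.Test
import org.junit.runner.RunWith

@RunWith(AndroidJUnit4::class)
class StartupBaselineProfile {
    @get:Rule
    val baselineRule = BaselineProfileRule()

    @Test
    fun generateStartupProfile() =
        // Records which classes and methods run during the block below,
        // producing a baseline profile you can ship with the app
        baselineRule.collectBaselineProfile(packageName = "com.example.app") {
            pressHome()
            startActivityAndWait() // exercise the critical startup journey
        }
}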

Modern Android Development 

BETTER TOGETHER

#4: Going big on Android tablets

Google is all in on tablets. Since last I/O we launched Android 12L, a release focused on large screen optimizations, and Android 13 includes all those improvements and more. We also announced the Pixel tablet, coming next year. With amazing new hardware, an updated operating system & Google apps, improved guidelines and libraries, and exciting changes to the Play Store, there has never been a better time to review your apps and get them ready for large screens and Android 13. That’s why at this year’s I/O we have four talks and a workshop to take you from design to implementation for large screens.


#5: Wear OS: Compose + more!

With the latest updates to Wear OS, you can rethink what is possible when developing for wearables. Jetpack Compose for Wear OS is now in beta, so you can create beautiful Wear OS apps with fewer lines of code. Health Services is also now in beta, bringing a ton of innovation to the health and fitness developer community. And last, but certainly not least, we announced the launch of the Google Pixel Watch, coming this fall, which brings together the best of Fitbit and Wear OS. You can learn more about all the most exciting updates for wearables by watching the Wear OS technical session and reading our Jetpack Compose for Wear OS announcement.

Compose for Wear OS 

#6: Introducing Health Connect

Health Connect is a new platform, built in close collaboration between Google and Samsung, that simplifies connectivity between apps so you can securely access and share user health and fitness data across apps and devices, making it easier to reach more users with less work. Today, we’re opening up access to Health Connect through Jetpack Health—read our announcement or watch the I/O session to find out more!

#7: Android for Cars & Android TV OS

Android for Cars and Android TV OS continue to grow in the US and abroad. As more users drive connected cars or tune in on their TVs, we’re introducing new features to make it even easier to develop apps for cars and TVs this year. Catch the “What’s new with Android for Cars” and “What's new with Google TV and Android TV” sessions on Day 2 (May 12th) at 9:00 AM PT to learn more.

#8: Add Voice Across Devices

We’re making it easier for users to access your apps via voice across devices with Google Assistant, by expanding developer access to the Shortcuts API for Android for Cars, with support for Wear OS apps coming later this year. We’re also making it easier to build those experiences with Smarter Custom Intents, enabling Assistant to better detect broader instances of user queries through ML, without the heavy lift of NLU training. Additionally, we’re introducing improvements that drive discovery to your apps via voice on mobile: first through Brandless Queries, which drive app usage even when the user hasn’t explicitly said your app’s name, and through App Install Suggestions, which appear if your app isn’t installed yet. These are automatically enabled for existing App Actions today.


AND THE LATEST FROM ANDROID, PLAY, AND MORE:

#9: What’s new in Play!

Get the latest updates from Google Play, including new ways Play can help you grow your business. Highlights include the ability to deep-link and create up to 50 custom listings; our LiveOps beta, which will allow more developers to submit content to be considered for featuring on the Play Store; and even more flexibility in selling subscriptions. Learn about these updates and more in our blog post.

#10: Google Play SDK Index

Evaluate if an SDK is right for your app with the new Google Play SDK Index. This new public portal lists over 100 of the most widely used commercial SDKs, along with information like which app permissions each SDK requests, statistics on the apps that use it, and which version of the SDK is most popular. Learn more in our blog post and watch the “What’s new in Google Play” and “What’s new in Android development tools” sessions.

#11: Privacy Sandbox on Android

Privacy Sandbox on Android provides a path for new advertising solutions to improve user privacy without putting access to free content and services at risk. We recently released the first Privacy Sandbox on Android Developer Preview so you can get an early look at the SDK Runtime and Topics API. You can conduct preliminary testing of these new technologies, evaluate how you might adopt them for your solutions, and share feedback with us.

#12: The new Google Wallet API

The new Google Wallet gives users fast and secure access to everyday essentials across Android and Wear OS. We’re enhancing the Google Wallet API, previously called the Google Pay Passes API, to support generic passes and to let you group and mix passes together, for example grouping an event ticket with a voucher. We’re also launching a new Android SDK which allows you to save passes directly from your app without a backend integration. To learn more, read the full blog post, watch the session, or read the docs at developers.google.com/wallet.
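
As a rough sketch of that SDK flow, saving a pre-built pass from an activity could look like this (the request code and the pass JSON are placeholders; see the Wallet docs for how the pass object is constructed):

import android.app.Activity
import com.google.android.gms.pay.Pay
import com.google.android.gms.pay.PayClient

// Arbitrary request code used to match the result in onActivityResult
const val ADD_TO_WALLET_REQUEST_CODE = 1000

fun savePassToWallet(activity: Activity, passJson: String) {
    val payClient: PayClient = Pay.getClient(activity)
    // Launches the "Add to Google Wallet" sheet for the given pass
    payClient.savePasses(passJson, activity, ADD_TO_WALLET_REQUEST_CODE)
}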

#13: And of course, Android 13!

The second Beta of Android 13 is available today! Get your apps ready for the latest features for privacy and security, like the new notification permission, the privacy-protecting photo picker, and improved permissions for pairing with nearby devices and accessing media files. Enhance your app with features like app-specific language support and themed app icons. Build with modern standards like HDR video and Bluetooth LE Audio. You can get started by enrolling your Pixel device here, or try Android 13 Beta on select phones, tablets, and foldables from our partners - visit developer.android.com/13 to learn more.
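
For instance, a minimal sketch of requesting the new notification permission from an activity might look like this (using the androidx.activity result APIs; how you handle the grant result is up to your app):

import android.Manifest
import android.os.Build
import androidx.activity.result.contract.ActivityResultContracts
import androidx.appcompat.app.AppCompatActivity

class MainActivity : AppCompatActivity() {
    // Register the launcher before the activity reaches the STARTED state
    private val notificationPermissionLauncher =
        registerForActivityResult(ActivityResultContracts.RequestPermission()) { granted ->
            // Gate posting notifications on the user's choice
        }

    fun askForNotificationPermission() {
        if (Build.VERSION.SDK_INT >= 33) { // Android 13 (Tiramisu) and above
            notificationPermissionLauncher.launch(Manifest.permission.POST_NOTIFICATIONS)
        }
    }
}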

That’s just a snapshot of some of the highlights for Android developers at this year’s Google I/O. Be sure to watch the What’s New in Android talk to get an overview of the full Android technical track at Google I/O, which includes 26 talks and 4 workshops. Enjoy!

Personalize user journeys by Pushing Dynamic Shortcuts to Assistant

Posted by Jessica Dene Earley-Cha, Developer Relations Engineer

Like many other people who use their smartphone to make their lives easier, I’m way more likely to use an app that adapts to my behavior and is customized to fit me. Android apps can already support some personalization, like listing common user journeys when you long-press an app icon. When I long-press my Audible app (an online audiobook and podcast service), it gives me a shortcut to the book I’m currently listening to; right now that is Daring Greatly by Brené Brown.

Now, imagine if these shortcuts could also be triggered by a voice command – and, when relevant to the user, show up in Google Assistant for easy use.

Wouldn't that be lovely?

Dynamic shortcuts on a mobile device

Well, now you can do that with App Actions by pushing dynamic shortcuts to the Google Assistant. Let’s go over what Shortcuts are, what happens when you push dynamic shortcuts to Google Assistant, and how to do just that!

Android Shortcuts

As an Android developer, you're most likely familiar with shortcuts. Shortcuts give your users the ability to jump into a specific part of your app. For cases where the destination in your app is based on individual user behavior, you can use a dynamic shortcut to jump to a specific thing the user was previously working with. For example, let’s consider a ToDo app, where users can create and maintain their ToDo lists. Since each item in the ToDo list is unique to each user, you can use Dynamic Shortcuts so that each user's shortcuts are based on the items in their ToDo list.

Below is a snippet of an Android dynamic shortcut for the fictional ToDo app.

val shortcut = ShortcutInfoCompat.Builder(context, task.id)
    .setShortLabel(task.title)
    .setLongLabel(task.title)
    .setIcon(IconCompat.createWithResource(context, R.drawable.icon_active_task))
    .setIntent(intent)
    .build()

// Push the shortcut so it can be surfaced to the user
ShortcutManagerCompat.pushDynamicShortcut(context, shortcut)

Dynamic Shortcuts for App Actions

If you're pushing dynamic shortcuts, it's a short hop to make those same shortcuts available for use by Google Assistant. You can do that by adding the Google Shortcuts Integration library and a few lines of code.
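
For reference, the dependency setup in your module's build.gradle looks roughly like this (artifact versions here are assumptions; check the App Actions documentation for current releases):

dependencies {
    // ShortcutManagerCompat ships in androidx.core
    implementation("androidx.core:core:1.6.0")
    // Google Shortcuts Integration library makes pushed shortcuts visible to Assistant
    implementation("androidx.core:core-google-shortcuts:1.0.0")
}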

To extend a dynamic shortcut to Google Assistant through App Actions, these two Jetpack modules need to be added, and the dynamic shortcut needs to include .addCapabilityBinding.

val shortcut = ShortcutInfoCompat.Builder(context, task.id)
    .setShortLabel(task.title)
    .setLongLabel(task.title)
    .setIcon(IconCompat.createWithResource(context, R.drawable.icon_active_task))
    // Bind the shortcut to a capability so Assistant can use it as fulfillment
    .addCapabilityBinding("actions.intent.GET_THING", "thing.name", listOf(task.title))
    .setIntent(intent)
    .build()

ShortcutManagerCompat.pushDynamicShortcut(context, shortcut)

The addCapabilityBinding method binds the dynamic shortcut to a capability, which is a declared way a user can launch your app to the requested section. If you don’t already have App Actions implemented, you’ll need to add capabilities to your shortcuts.xml file. Capabilities express the relevant features of an app, and each contains a Built-In Intent (BII). BIIs are a language model for a voice command that Assistant already understands; linking a BII to a shortcut allows Assistant to use the shortcut as the fulfillment for a matching command. In other words, by having capabilities, Assistant knows what to listen for and how to launch the app.

In the example above, addCapabilityBinding binds that dynamic shortcut to the actions.intent.GET_THING BII. When a user requests one of the items in their ToDo app, Assistant will process the request and trigger the capability with the GET_THING BII that is listed in your shortcuts.xml.

<shortcuts xmlns:android="http://schemas.android.com/apk/res/android">
  <capability android:name="actions.intent.GET_THING">
    <intent
        android:action="android.intent.action.VIEW"
        android:targetPackage="YOUR_UNIQUE_APPLICATION_ID"
        android:targetClass="YOUR_TARGET_CLASS">
      <!-- Eg. name = the ToDo item -->
      <parameter
          android:name="thing.name"
          android:key="name"/>
    </intent>
  </capability>
</shortcuts>

So in summary, the process to add dynamic shortcuts looks like this:

1. Configure App Actions by adding the two Jetpack modules (the ShortcutManagerCompat library and the Google Shortcuts Integration library). Then associate the shortcut with a Built-In Intent (BII) in your shortcuts.xml file. Finally, push the dynamic shortcut from your app.

2. Two major things happen when you push your dynamic shortcuts to Assistant:

  1. Users can open dynamic shortcuts through Google Assistant, fast-tracking users to your content
  2. During contextually relevant times, Assistant can proactively suggest your Android dynamic shortcuts to users, displaying them on Assistant-enabled surfaces.

Not too bad. I don’t know about you, but I like to test out new functionality in a small app first. You're in luck! We recently launched a codelab that walks you through this whole process.

Dynamic Shortcuts Codelab

Looking for more resources to help improve your understanding of App Actions? We have a new learning pathway that walks you through the product, including the dynamic shortcuts that you just read about. Let us know what you think!

Thanks for reading! To share your thoughts or questions, join us on Reddit at r/GoogleAssistantDev.

Follow @ActionsOnGoogle on Twitter for more of our team's updates, and tweet using #AppActions to share what you’re working on. Can’t wait to see what you build!

Assistant Developer Relations is hiring!

Posted by Mike Bifulco, Developer Relations Engineer

Every day, millions of users ask Google Assistant for help with the things that matter to them: managing a connected home, setting reminders and timers, adding to their shopping list, communicating with friends and family, and countless other imaginative uses. Developers use Assistant APIs and tools to add voice interactivity to their apps for everything from building games, to ordering food, to listening to the news, and much more.

The Google Assistant Developer Relations team works with our community and our engineering teams to help developers build, integrate, and innovate with voice-driven technology on the Assistant platform. We help developers build Conversational Actions, Smart Home hardware and tools, and App Actions integrations with Android. As we continue our mission to bring accessible voice technology to Android devices, smart speakers and screens, we’re excited to announce that we are hiring for several roles!

What Assistant DevRel does

In Developer Relations (DevRel), we wear many hats - our developer ecosystem stretches across several Google products, and we work with our community wherever we can. Our team consists of engineers, technical writers, and content producers who work to help developers build with Assistant, while providing active feedback and validation to the engineering teams to make Google Assistant even better. These are just some of the ways we do our work:

Google I/O and other conferences

Google I/O is Google’s annual developer conference, where Googlers from across the company share the latest product releases and insights from Google experts, along with hands-on learning. The Assistant DevRel team is heavily involved in I/O, writing, producing, and delivering a variety of content types, including keynotes, technical talks, hands-on workshops, codelabs, and technical demos. We also meet and talk to developers who are building cool things with Assistant.

We also participate in a variety of other conferences, and while most have been virtual for the past year or so, we’re looking forward to traveling to places near and far to deliver technical content to the global community.

Our team members contribute to creation and presentation of content at events like Google I/O.

Google Developers YouTube channel

One of the best ways to get our content out to the world is via YouTube. Members of our team make frequent appearances on the Google Developers channel, producing segments and episodes for The Developer Show, Assistant On Air, AoG Pro Tips, as well as tutorials on new features and developer tools.

Open Source Projects

Another exciting part of our work is the creation and maintenance of Open Source libraries used as samples, demos, and starter kits for devs working with Assistant. As a part of the team, you’ll contribute to projects in GitHub organizations including github.com/actions-on-google and github.com/actions-on-google-labs, as well as projects and libraries created outside of Google.

Developer Platform Tools

The Assistant DevRel team also helps build and maintain the Assistant Developer Platform - we contribute to the tools, policies and features which allow developers to distribute their Assistant apps to Android devices, smart screens and speakers. This engineering work is a truly unique opportunity to shape the future of a growing developer platform, and to support the future of voice-driven and multi-modal technology – all built from the ground up.

Open positions on our team

Our team is headquartered in Mountain View, California, US. If contributing to the next generation of Google Assistant excites you, read below about our openings to find out more.

Developer Relations Engineer
Location: Mountain View, CA, New York, NY, Seattle, WA, or Austin, TX

As a Developer Relations Engineer (or DRE), you’ll work to build developer tools, code samples, and demos for Google Assistant. You’ll work with our community to educate and support developers using our APIs to build their software. You will also be the 0th customer for new features on Assistant - testing, verifying, and giving active feedback to the PM, UX, and Engineering teams that make Assistant come to life. You’ll work with Google Developer Experts to build and scale content to be shared at conferences, events, and hackathons. DREs may also occasionally contribute to blog posts, help write and produce scripts for educational videos on YouTube, and speak at events like conferences, Google Developer Groups, and meetups. Candidates should have experience building native Android apps with Java or Kotlin - experience creating web applications with HTML, JavaScript, and CSS is a plus.

Sound interesting? Learn more and apply to be a Developer Relations Engineer.

Developer Relations Engineering Manager

Location: Mountain View, CA, New York, NY, Seattle, WA, or Austin, TX

Developer Relations Engineering Managers help coordinate and direct teams of engineers to build and update developer tools, APIs, reference documentation, and code samples. As an Engineering Manager, you’ll work with leadership across the company to prioritize new features, goals, and programs for developer relations within Assistant. You’ll manage a variety of roles, including Developer Relations Engineers, Program Managers, and Technical Writers. You’ll be asked to work across a variety of technologies, with a strong focus on building tools and libraries for Android.

Sound interesting? Learn more and apply to be a Developer Relations Engineering Manager.

Thanks for reading! To share your thoughts or questions, join us on Reddit at r/GoogleAssistantDev.

Follow @ActionsOnGoogle on Twitter for more of our team's updates, and tweet using #AoGDevs to share what you’re working on. Can’t wait to see what you build!

Everything Assistant at I/O

Posted by Mike Bifulco

Google I/O banner

We’re excited to host the first-ever virtual Google I/O conference this year, from May 18-20, 2021 – and everyone's invited! Developers around the world will join us for keynotes, technical sessions, codelabs, demos, meetups, workshops, and Ask Me Anything (AMA) sessions hosted by Googlers whose teams have been hard at work preparing new features, APIs, and tools for you to try out. We can’t wait for you to explore everything Google has to share. Given the sheer amount of content that will be shared during those 3 days, this guide is meant to help you find the sessions that matter most if you’re interested in building and integrating with Google Assistant.

With that in mind, here’s a rundown of everything Assistant at Google I/O 2021:

Keynote: What’s New in Google Assistant (register)

We’ll kick off news from Assistant with our keynote session, which will be livestreamed on May 19th at 9:45am PST. Expect to hear about what’s happened in Assistant over the past year, new product announcements, feature updates, and tooling changes.

Keynote: What’s New in Smart Home (register)

In celebration of Google Assistant's 5th birthday, we'll share our Smart Home journey and the things we’ve learned along the way. We'll also dive into product vision, new product announcements, and showcase great Assistant experiences built by our developer community. Catch the Smart Home keynote on May 19th at 4:15pm PST.

Technical Sessions

Technical sessions are 15-minute deep dives into new features, tools, and other announcements from product teams. These 4 sessions will be available on demand, so you can watch them any time after they officially launch during the event.

Driving a Successful Launch for Conversational Actions (register)

In this session, we’ll discuss marketing activities that will help users discover and engage with what you’ve built on Google Assistant. Learn some of the basics of putting together a marketing team, a go-to-market plan, and some recommended activities for promoting engagement with your Conversational Actions.

How to Voicify Your Android App (register)

In this session, you’ll learn how to implement voice capabilities in your Android App. Get users into your app with a voice command using App Actions.

Android Shortcuts for Assistant (register)

Now that you've added a layer of voice interaction to your Android app, learn what's new with Android Shortcuts and how they can be extended to the Google Assistant.

Refreshing Widgets (register)

Widgets in Android 12 are coming with a fresh new look and feel. Come to this session to learn how you can make the most of what’s coming to Widgets, while also making them more useful and discoverable through integrations with Assistant and Assistant Auto.

Ask Me Anything (AMA)

AMAs are a great opportunity for you to have your questions fielded by Googlers. If you register for I/O, you’ll be able to pre-submit questions to any of these AMAs. Teams of Googlers will be answering audience questions live during I/O. All AMA sessions will be livestreamed at specific dates and times, so be sure to add them to your calendar.

App Actions: Ask Me Anything

May 19th, 10:15am PST (register)

This is the place to bring all of your burning questions about App Actions for Android. Our App Actions team will include Program Managers, Developer Advocates, and Engineers who are looking forward to answering your questions. Maybe you’re building an app which uses Custom Intents, or you’ve got questions about some of the new feature announcements from our Technical Sessions (see above!) - the team is looking forward to helping.

Games on Google Assistant: Ask Me Anything

May 19th, 11:00pm PST (register)

Join a panel of Googlers to ask your questions about building games with Google Assistant. Our team of product managers and game developers is here to help you - from designing and building games, to toolchain questions, to figuring out what types of games people are playing on their smart devices.

Workshops

This year, our workshops will be conducted online via livestream. Each workshop will be led by a Googler providing instruction alongside a team of Googler TAs, who will be there to answer your questions via live chat. Workshops will show you how to apply the things you learn at I/O by giving you hands-on experience with new tools and APIs. Each workshop has limited space for registrations, so be sure to sign up early if you’re interested.

Extend an Android app to Google Assistant with App Actions

May 19th, 11:00am PST (register)

Learn to develop App Actions using common built-in intents in this intermediate codelab, enabling users to open app features and search for in-app content with Google Assistant.

Debugging the Smart Home

May 19th, 11:30pm PST (register)

Improve your products' reliability and user experience with Google's new smart home quality tools in this intermediate codelab. Learn how to view, analyze, debug and fix issues with your smart home integrations.

Meetups

Women in Voice Meetup

May 20th, 4:00pm PST (register)

This meetup will be a chance for developers to share influential work by women in Voice AI and to discuss ways allies can help women in Voice to be more successful while building a more inclusive ecosystem.

Smart Home Developer Meetup

[Americas] May 18, 3:00pm PST (register)
[APAC] May 19th, 9:00pm PST (register)
[EMEA] May 20th, 6:00am PST (register)

This meetup will be a chance for developers interested in Smart Home to chat with the Smart Home partner engineering team about developing and debugging smart home integrations, share projects, or ask questions.

Register now

Registration for Google I/O 2021 is now open - and attending I/O 2021 is entirely free and open to all. We hope to see you there, and can’t wait to share what we’ve been working on with you. To register for the event, head over to the Google I/O registration page.

Policy changes and certification requirement updates for Smart Home Actions

Posted by Toni Klopfenstein, Developer Advocate

Illustration of 2 animated locks and phone with Actions on Google logo on screen

As more developers onboard to the Smart Home Actions platform, we have gathered feedback about the certification process for launching an Action. Today, we are pleased to announce we have updated our Actions policy to enable developers to more quickly develop their Actions, and to help streamline the certification and launch process for developers. These updates will also help to provide a consistent, cohesive experience for smart device users.

Device quality guidelines

Ensuring each device type meets quality benchmark metrics provides end users with reliable and timely responses from their smart devices. With these policy updates, minimum latency and reliability metrics have been added to each device type guide. To ensure consistent device control and timely updates to Home Graph, all cloud-controlled smart devices need to maintain a persistent connection through a hub or the device itself, and cannot rely on mobile devices or tablets.

Along with these quality benchmarks, we have also updated our guides with required and recommended traits for each device. By implementing these within an Action, developers can ensure their end users can trigger devices in a consistent manner and access the full range of device capabilities. To assist you in ensuring your Action is compliant with the updated policy, the Test Suite testing tool will now more clearly flag any device type or trait issues.

Safety and security

Smart home users care deeply about the safety and security of the devices integrated into their homes, so we have also updated our requirements for secondary user verification. This verification step must be implemented for any Action that can set a device in an unprotected state, such as unlocking a door, regardless of whether you are building a Conversational Action or a Smart Home Action. Once configured with a secondary verification method, developers can provide users a way to opt out of this flow. For any developer wishing to offer an opt-out selection to their customers, we have provided a warning message template to ensure users understand the security implications of turning this feature off.

For devices that may pose heightened safety risks, such as cooking appliances, we require UL certificates or similar certification forms to be provided along with the Test Suite results before an Action can be released to production.

Works With 'Hey Google' badge

These policy updates will also affect the use of the Works With 'Hey Google' badge. The badge will only be available for use on marketing materials for new Smart Home Direct Actions that have successfully integrated the referenced device types.

Any Conversational Actions currently using the badge will not be approved for use for any new marketing assets, including packaging/product refreshes. Any digital assets using the badge will need to be updated to remove the badge by the end of 2021.

Timeline

With the roll-out today, there will be a 1 month grace period for developers to update new integrations to match the new policy requirements. For Actions currently deployed to production, compliance will be evaluated when the Action is recertified. Once integrations have been certified and launched to production, Actions will need to be recertified annually, or any time new devices or device functionality is added to the Action. Notifications for recertification will be shared with the developer account associated with your Action in the console.

This policy grace period ends April 12, 2021.

Please review the updated policy, as well as our updated docs for launching your Smart Home Action. You can also check out our policy video for more information.

We want to hear from you, so continue sharing your feedback with us through the issue tracker, and engage with other smart home developers in the /r/GoogleAssistantDev community. Follow @ActionsOnGoogle on Twitter for more of our team's updates, and tweet using #AoGDevs to share what you’re working on. We can’t wait to see what you build!

Announcing New Smart Home App Discovery Features

Posted by Toni Klopfenstein, Developer Advocate

When a user connects a smart device to the Google Assistant via the Home app, the user must select the appropriate related Action from the list of all available Actions. The user then clicks through multiple screens to complete their device setup. Today, we're releasing two new features to improve this device discovery process and drive customer adoption of your Smart Home Action through the Google Home app. App Discovery and Deep Linking are two convenience features that help users find your Google Assistant-compatible smart devices quickly and onboard faster.

App Discovery enables users to quickly find your smart home Action thanks to suggestion chips within the Google Home app. You can implement this new feature through the Actions Console by creating a verified brand link between your Action, your website, and your mobile app. App Discovery doesn't require any coding work to implement, making this a development-light feature that provides great improvements to the user experience of device linking.

In addition to helping users discover your Action directly through suggestion chips, Deep Linking enables you to guide users to your account linking flow within the Google Home app in one step. These deep links are easily added to your mobile app or web content, guiding users to your smart home integration with a single tap.

Deep Linking and App Discovery can help you create a more streamlined onboarding experience for your users, driving increased engagement and user satisfaction, and can be implemented with minimal engineering work.

To implement App Discovery and Deep Linking for your Smart Home Action, check out the developer documents, or watch the video covering these new features.

You can also check out the smart home codelabs if you are just starting to build out your Action.

We want to hear from you, so continue sharing your feedback with us through the issue tracker, and engage with other smart home developers in the /r/GoogleAssistantDev community. Follow @ActionsOnGoogle on Twitter for more of our team's updates, and tweet using #AoGDevs to share what you’re working on. We can’t wait to see what you build!

Announcing New Smart Home SHED Types and Traits

Posted by Toni Klopfenstein, Developer Advocate

Back in April, we released the first set of Smart Home Entertainment Device (SHED) types, including TV, set-top box, and remote, as well as the traits AppSelector, InputSelector, MediaState, TransportControl, and Volume. We are excited to announce the release of new SHED types and traits. These new device types and traits complement the original set we released earlier this year, and help build out a more complete solution for smart home media and gaming devices. By implementing these types and traits on your entertainment devices, you can enable users to fully access device and media controls from any Assistant surface.

SHED Types and Traits

To expand the SHED options, we've released the following new device types for Smart Home:

  • Audio-video receiver
  • Streaming box
  • Streaming stick
  • Soundbar
  • Streaming soundbar
  • Speaker

We've also released the following new trait:

  • Channel

To ensure a consistent, high-quality experience for all end users, each of these device types requires your service to report activityState and playbackState to Google using the Report State API. This requirement improves portability between media devices and helps the Assistant better understand user intents for these devices. By implementing the complete set of recommended device traits, you can further improve the quality of your smart home Action and improve device targeting for media playback command fulfillment.
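
As an illustration, the state report for one of these devices carries a payload shaped roughly like the following (the request ID, agentUserId, and device ID are placeholders; see the Home Graph API reference for the exact schema):

{
  "requestId": "ff36a3cc-ec34-11e6-b1a0-64510650abcf",
  "agentUserId": "user-123",
  "payload": {
    "devices": {
      "states": {
        "media-device-1": {
          "activityState": "ACTIVE",
          "playbackState": "PLAYING"
        }
      }
    }
  }
}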

For more information on how to implement these new device features, check out the docs and samples. You can also join us at our "Hey Google" Smart Home Virtual Summit to learn more about these new features.

We want to hear from you, so continue sharing your feedback with us through the issue tracker, and engage with other smart home developers in the /r/GoogleAssistantDev community. Follow @ActionsOnGoogle on Twitter for more of our team's updates, and tweet using #AoGDevs to share what you’re working on. We can’t wait to see what you build!

Join the "Hey Google" Smart Home Virtual Summit

Posted by Toni Klopfenstein, Developer Relations

Over the past year, we've been focused on building new tools and features to support our smart home developer community. Though we weren't able to engage with you in person at Google I/O, we are pleased to announce the "Hey Google" Smart Home Virtual Summit on July 8th - an opportunity for us to come together and dive into the exciting new and upcoming features for smart home developers and users.

Join us in the keynote where Michele Turner, the Product Management director of the Smart Home Ecosystem, will share our recent smart home product initiatives and how developers can benefit from these capabilities. She will also introduce new tools that make it easier for you to develop with Google Assistant. We will also be hosting a partner panel, where you can hear from industry leaders on how they navigate the impact of COVID-19 and their thoughts on the state of the industry.

Registration is FREE! Head on over to the Summit website to register and check out the schedule. Events will be held during EMEA, APAC, and AMER friendly times. We hope to see you and your colleagues there!

Finding COVID-19 testing centers in Search, Maps, and Assistant in India

Many experts agree that widespread testing is a key tool in the fight against COVID-19. That's why we're working with the Indian Council of Medical Research (ICMR) and MyGov to help people find local COVID-19 testing centers on Google Search, Assistant, and Maps.


When making a coronavirus-related search (e.g. “coronavirus testing”), people will now see a ‘Testing’ tab on the results page providing a list of nearby testing labs along with key information and guidance needed before using their services. This includes government-mandated requirements such as:
  • Calling the national or state helplines before heading out to get tested
  • Carrying a doctor’s prescription (referral required)
  • Testing restrictions (tests are limited to certain patients)
  • Information about whether the lab is government- or private-run.




Search for ‘Covid 19 testing’ to see nearby authorized test labs, along with key recommendations (images are representational)

On Google Maps, when people search for keywords like “covid 19 testing” or “coronavirus testing” they will see a list of nearby testing labs, with a link to Google Search for the government-mandated requirements.

Search for ‘Covid 19 testing’ on Maps to see a list of nearby testing labs, with a link to Google Search for testing requirements (image representational)

While this experience is designed to help people find authorized testing centers near them, it's important to follow the recommended guidelines that help determine testing eligibility before visiting. Tapping the ‘Learn more’ link leads to authoritative information from the Ministry of Health and Family Welfare (MoHFW), Government of India.

So far we have integrated over 700 testing labs on Search, Assistant, and Maps spanning more than 300 cities, and we continue to work with ICMR as we surface more labs across the country. This experience is available in English and in eight Indian languages -- Hindi, Bengali, Telugu, Tamil, Malayalam, Kannada, Marathi, and Gujarati.

We hope these new experiences play a part in helping people as well as healthcare workers as we collectively work toward overcoming this pandemic.

Posted by Jayant Baliga, Product Manager, Google Maps