Tag Archives: Android Auto

Android for Cars: Bringing more apps to cars

Posted by Vivek Radhakrishnan – Technical Program Manager, and Seung Nam – Product Manager

With technology in cars becoming more capable, the opportunity to deliver safe and seamless connected experiences for drivers and passengers is greater than ever. Google remains committed to the automotive industry and is seeing momentum across Android Auto and cars powered by Android Automotive OS with Google built-in. We’re excited to share updates across our in-car experiences and introduce new programs and resources to make it easier for you to bring your apps to cars. Learn more below and in the Android for Cars Technical Session.

Momentum and updates

With over 200 million cars on the road compatible with Android Auto, and nearly 40 car models like the Nissan Rogue, Renault R5, Acura ZDX, and Ford Explorer offering Google built-in, the time to bring your apps to cars is now.

Over the last year, the ecosystem of apps available across these experiences has grown – thanks to you. New entertainment apps like Max, Peacock and Angry Birds are coming to select cars with Google built-in. On Android Auto, the Uber Driver app is now available, allowing drivers to accept rides and deliveries, and get turn-by-turn directions on a bigger screen.

Image showing Angry Birds on a Volvo EX90 car display
Angry Birds is coming to select cars with Google built-in, including Volvo EX90 (pictured).

We’re also pleased to share that Google Cast is coming to cars with Android Automotive OS, starting with Rivian with more to follow. This allows you to easily cast video content from your phone or tablet directly to the car while parked. If you don’t already offer casting in your app, this is a simple way for your content to reach new audiences in the car.

Coming soon - you can stream content from apps on your phone, like Pluto TV, to Rivian cars via Google Cast.
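
If your app doesn't support Cast yet, the sender-side setup is small. Below is a minimal sketch of the standard Cast SDK pattern; the receiver application ID is a placeholder you would register in the Google Cast developer console, and the class name is illustrative rather than taken from this announcement.

import android.content.Context
import com.google.android.gms.cast.framework.CastOptions
import com.google.android.gms.cast.framework.OptionsProvider
import com.google.android.gms.cast.framework.SessionProvider

// Declared in AndroidManifest.xml via the
// com.google.android.gms.cast.framework.OPTIONS_PROVIDER_CLASS_NAME meta-data entry.
class CastOptionsProvider : OptionsProvider {
    override fun getCastOptions(context: Context): CastOptions =
        CastOptions.Builder()
            .setReceiverApplicationId("YOUR_RECEIVER_APP_ID") // placeholder
            .build()

    override fun getAdditionalSessionProviders(context: Context): List<SessionProvider>? = null
}

From there, CastButtonFactory.setUpMediaRouteButton can wire a Cast button into your UI, and the framework handles device discovery and session management.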

New car app quality tiers

There are unique considerations when developing apps and experiences for cars including safety, numerous screen sizes, and more. Our priority is developing resources and tools that take these considerations into account and minimize the work needed for you to bring your apps to cars.

We’re introducing new quality tiers, inspired by those that exist for large screens, to streamline the process of bringing existing apps to cars by highlighting what makes for a great user experience in cars. Here are the tiers and what they encompass:

    • Tier 1: Car differentiated
      This tier represents the best of what’s possible in cars. Apps in this tier are specifically built to work across the variety of hardware in cars and can adapt their experience across driving and parked modes. They provide the best user experience, designed for the different screens in the car such as the center console, instrument cluster, and additional screens like the panoramic displays we see in many premium vehicles.
    • Tier 2: Car optimized
      Most apps available in cars today fall into this tier and provide a great experience on the car’s center stack display. These apps will have some car-specific engineering to include capabilities that can be used across driving or parked modes, depending on the app’s category.
    • Tier 3: Car ready
      Apps in this tier are large screen compatible and are enabled while the car is parked, with potentially no additional work. While these apps may not have car-specific features, users can experience the app just as they would on any large screen Android device.

To learn more about the quality tiers, see Android app quality for cars.

Car ready mobile apps program

Let’s dive deep into Tier 3 apps. In collaboration with car manufacturers, we’re introducing the Car ready mobile apps program to accelerate bringing mobile apps to cars with no additional work for developers.

As part of this program, Google will proactively review mobile apps that are already adaptive and large screen compatible to ensure safety and compatibility in cars. If the app qualifies, we will automatically opt it in for distribution on cars with Google built-in and make it available in Android Auto, without the need for new development or a new release to be created. This program will start with parked app categories like video, gaming and browsers with plans to expand to other app categories in the future.

The program will roll out in the coming months, but if you already offer a large screen compatible adaptive app and it falls into one of these categories, you can request a review to participate sooner. As this program rolls out, availability of your app will depend on platform compatibility.

To learn more about building qualified mobile apps, check out the technical session titled “Building Adaptive Android Apps”. You can find guidance on what to look out for at developer.google.com.

Animation showing AMC+ app on a phone, tablet and car display.
Apps optimized for large screens, like AMC+, may be able to come to cars with little to no development work.

New tools and emulators

To create high quality experiences in cars, we are also introducing some new tools that can help you along the way.

    • First, we have a new emulator for distant and panoramic displays, so developers can visualize and test for the growing sizes and number of screens in cars and make sure their apps adapt to the variety of displays.
    • We also have a new tool that addresses the wide range of screen shapes and user interfaces (UI) present in cars. Many new car displays have unique curves, insets, and angles that impact the UI, so this tool lets you change the emulator screen to match OEM screen designs. This helps ensure apps work well on real cars without needing to set up specific OEM emulators or bring in real cars for testing.
    • Lastly, we’re introducing an Android Automotive OS system image for Pixel Tablet. This will let you physically interact with your app as you would on a car screen. We are opening this up for early access partners for the purpose of development and testing today, and you can request to participate here.

To learn more about how to use these tools, check out the “Build and test a parked app for Android Automotive OS” codelab that will be published tomorrow.

More app categories for cars

As you consider bringing your app to cars, we put together a table to help you understand what app categories are currently open and accepting app submissions across both Android Auto and cars with Google built-in. We will continue to expand the type of apps that can be enabled in cars, so if your app isn’t in one of these categories, stay tuned for future opportunities!

Android for Cars Category Status

Start developing apps for cars today

To learn how to bring your apps to cars, check out the documentation on the Android for Cars developer site and the Android for Cars Technical Session. With all the opportunities across car screens, there has never been a better time to bring your apps and experiences to cars. Thanks for all the contributions to the Android ecosystem. See you on the road!

15 Things to know for Android developers at Google I/O

Posted by Matthew McCullough, Vice President, Product Management, Android Developer  

AI is unlocking experiences that were not even possible a few years ago, and we’ve been hard at work reimagining Android with AI at the core, to help enable you to build a whole new class of apps. At this year’s Google I/O, we’re covering how new tools like Gemini can help you build the next generation of apps on Android. Plus, we showcased a range of updates to our tools and services grounded in productivity, helping you make it faster and easier to build excellent experiences across form factors. Let’s dive in!

Powering the next generation of apps with AI

#1: AI in your tools, with Gemini in Android Studio

Gemini in Android Studio (formerly Studio Bot) is your coding companion for Android development, and thanks to your feedback since its preview at last year’s Google I/O, we’ve evolved our models, expanded to over 200 countries and territories, and brought it into the Gemini family of products. Earlier today, we previewed a number of new features coming soon, like Code suggestions, App Quality Insights that leverage Gemini, and a preview of the multi-modal inputs that are coming using Gemini 1.5 Pro. You can read more about the updates here, and make sure to check out What’s new in Android development tools.

#2: Building with Generative AI

Android provides the solution you need to build generative AI apps. You can use our most capable models in the cloud with the Gemini API in Google AI or Vertex AI for Firebase directly in your Android apps. For on-device, Gemini Nano is our most efficient model. We’re working closely with a few early adopters such as Patreon, Grammarly, and Adobe to ensure we’re creating the best APIs that unlock the most innovative experiences. For example, Adobe is experimenting with Gemini Nano to enhance the on-device experience of Acrobat AI Assistant, a tool that allows their users to summarize and interact with documents. Be sure to check out the Build your own generative AI powered Android app, Android on-device gen AI under the hood, and the What’s New in Android sessions to learn more!
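
As an illustration of the cloud path, here is a minimal sketch using the Google AI client SDK for Android; the model name is a placeholder, and a production app should protect its API key (for example via a backend or Vertex AI for Firebase) rather than embedding it.

import com.google.ai.client.generativeai.GenerativeModel

// Minimal sketch: generate a short summary with a cloud-hosted Gemini model.
suspend fun summarize(apiKey: String, document: String): String? {
    val model = GenerativeModel(
        modelName = "gemini-1.5-flash", // placeholder; pick the model that fits your use case
        apiKey = apiKey
    )
    val response = model.generateContent("Summarize in two sentences: $document")
    return response.text
}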

Moving image of Gemini Nano operating in Adobe

Excellent apps, across devices

#3: Think adaptive: apps on phones, foldables, tablets and more

Build and design apps that adapt beyond the phone, with the new Compose adaptive layout libraries built with Material guidance in beta. Add rich stylus and keyboard support to increase user productivity. Check out three of our key Android adaptive sessions at Google I/O: Designing adaptive apps, Building adaptive Android apps, and Increase user productivity with large screens and accessories.
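
Those sessions cover the new adaptive layout libraries in detail; as a rough sketch of the underlying idea (using the existing window size class API rather than the new libraries, with hypothetical SinglePaneScreen and TwoPaneScreen composables), an activity can branch its Compose UI on the window width:

import android.os.Bundle
import androidx.activity.ComponentActivity
import androidx.activity.compose.setContent
import androidx.compose.material3.windowsizeclass.ExperimentalMaterial3WindowSizeClassApi
import androidx.compose.material3.windowsizeclass.WindowWidthSizeClass
import androidx.compose.material3.windowsizeclass.calculateWindowSizeClass
import androidx.compose.runtime.Composable

class MainActivity : ComponentActivity() {
    @OptIn(ExperimentalMaterial3WindowSizeClassApi::class)
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContent {
            val sizeClass = calculateWindowSizeClass(this@MainActivity)
            when (sizeClass.widthSizeClass) {
                WindowWidthSizeClass.Compact -> SinglePaneScreen() // phones
                else -> TwoPaneScreen()                            // foldables, tablets, desktop windows
            }
        }
    }
}

@Composable fun SinglePaneScreen() { /* list only */ }
@Composable fun TwoPaneScreen() { /* list and detail side by side */ }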


#4: Enhance homescreens with Widgets and Jetpack Glance

Jetpack Glance 1.1 is now available as a release candidate and lets you build high quality widgets using your Compose skills. Check out our new canonical layouts, design guidance, and Figma updates to the Android UI kit. To learn more, check out our Improve the user experience of your Android app workshop and the Build Android widgets with Jetpack Glance technical session.
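
For reference, a minimal Glance widget looks roughly like the sketch below; the class names are hypothetical, and the receiver is declared in the manifest like any other app widget provider.

import android.content.Context
import androidx.glance.GlanceId
import androidx.glance.appwidget.GlanceAppWidget
import androidx.glance.appwidget.GlanceAppWidgetReceiver
import androidx.glance.text.Text

// A widget whose UI is written with Glance composables.
class HelloGlanceWidget : GlanceAppWidget() {
    override suspend fun provideGlance(context: Context, id: GlanceId) {
        provideContent {
            Text(text = "Hello from Glance 1.1")
        }
    }
}

// Registered in AndroidManifest.xml as the app widget receiver.
class HelloGlanceWidgetReceiver : GlanceAppWidgetReceiver() {
    override val glanceAppWidget: GlanceAppWidget = HelloGlanceWidget()
}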

#5-9: come back here tomorrow and Thursday!

We’ll continue to share more updates for Android Developers throughout Google I/O, so check back here tomorrow!

Developer Productivity

#10: Use Kotlin Multiplatform for sharing business logic

Kotlin Multiplatform (KMP) enables sharing Kotlin code across different platforms, and several of our Jetpack libraries, like DataStore and Room, have already been migrated to take advantage of KMP. We use Kotlin Multiplatform within Google and recommend using KMP for sharing business logic between platforms. Learn more about it here.
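
As a small sketch of what that sharing can look like (module layout and names here are hypothetical), business logic lives in commonMain while platform-specific pieces are supplied with expect/actual declarations:

// commonMain: shared business logic, usable from Android and iOS.
class TipCalculator {
    fun tipFor(amount: Double, percent: Int): Double = amount * percent / 100.0
}

// commonMain: declare a platform hook.
expect fun platformName(): String

// androidMain: provide the Android implementation.
actual fun platformName(): String = "Android ${android.os.Build.VERSION.SDK_INT}"

// iosMain: provide the iOS implementation.
actual fun platformName(): String = "iOS"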

#11: Compose: Shared Elements, performance improvements and more

The upcoming Compose June ‘24 release is packed with the features you’ve been asking for! Shared element transitions, lazy list item reordering animations, strong skipping mode, performance improvements, a new lazy flow layout and more. Read more about it in our blog.

#12: Android Studio: the latest preview, with Gemini and more

Android Studio Koala 🐨 Feature Drop (2024.1.2), available today in the canary channel, builds on top of IntelliJ 2024.1 and adds new innovative features unlocked by Gemini, such as insights for crashes in App Quality Insights, code transformations, and a Gemini API starter template to get you started quickly. Additionally, new features such as USB speed detection, a shortcut UI to control device settings, a new way to sign into Google services, an updated and speedier UI for profilers with a new task-centric approach, and a deep integration with the Google Play SDK Index are intended to make the development process extremely productive. Read more here.

And the latest from the world of Mobile

#13: Grow your business with the latest Google Play updates

Discover new ways to attract and engage users with enhanced custom store listings. Optimize revenue with expanded payment options. Reinforce trust through secure, high-quality experiences made easier with our latest SDK Console improvements. Learn about these updates and more, including our new vertical approach, in our blog.

#14: Simplify app compliance with Checks

Streamline your app's privacy compliance with Checks, Google's AI-powered compliance solution! Checks empowers developers to swiftly identify and resolve privacy issues, enabling you to launch apps faster and with confidence. Harness the power of automation with Checks' intelligent reports, saving you valuable time and resources. Get started now at checks.google.com.

#15: And of course, Android 15

…but for that, you’ll have to stay tuned tomorrow, when we’ve got a bit more up our sleeve!

What’s new with Android for Cars: I/O 2023

Posted by Jennifer Tsau, Product Management Lead and David Dandeneau, Engineering Lead

For more than a decade, Google has been committed to bringing safe and seamless connected experiences to cars. We’re continuing to see strong momentum and adoption across Android for Cars. Android Auto is supported by nearly every major car maker, and will be in nearly 200 million cars by the end of this year. And the number of cars powered by Android Automotive OS with Google built-in — which includes top brands like Chevrolet, Volvo, Polestar, Honda, Renault and more — is expected to nearly double by the end of this year.

With cars becoming more connected and equipped with immersive displays, there are more opportunities for developers to bring app experiences to cars. We’re excited to share updates and new ways for developers to reach more users in the car.


Apps designed for driving experiences

Helping drivers while on the road - whether they are navigating, listening to music, or checking the weather - is a top priority. We’re continuing to invest in tools and resources, including the Android for Cars App Library, to make it even easier for developers to build new apps or port existing Android apps over to cars.

New capabilities for navigation apps

Today, we announced Waze rolling out globally on the Google Play Store for all cars with Google built-in, expanding its availability beyond Android Auto. As a part of this launch, we created more templates in Android for Cars App Library to help speed up development time across a number of app categories, including navigation.

For navigation apps, it’s also now possible to integrate with the instrument cluster, providing turn-by-turn directions right in the driver's line of sight. And developers can also access car sensor data to surface helpful information like range, fuel level, and speed to provide more contextual assistance to drivers.

A car dashboard shows the Waze app open on the display panel
The Waze app is coming to all cars with Google built-in, including the first-ever Chevrolet Blazer EV launching this year.

Tools to easily port your media apps across Android for Cars

Media apps continue to be a top use case in the car, and it’s quicker than ever to bring your media apps to Android Auto and Android Automotive OS. Audible recently joined popular streaming audio apps like Deezer, Soundcloud, and Spotify to offer their apps across both Android Auto and cars with Google built-in. If you have a media app on mobile, port it over to reach new users in the car.

New app categories for driving experiences

The Android for Cars App Library now allows developers to bring new categories of apps to cars, including internet of things (IoT) and weather apps. The IoT category is available for all developers, while weather is in an early access program. In the weather category, The Weather Channel app will join other weather apps like Weather & Radar later this year.

We’re also working with messaging apps like Zoom, Microsoft Teams, and Webex by Cisco to allow you to join meetings by audio from your car display in the coming months.

A car display shows a Zoom meeting schedule next to a route in Google Maps.
Coming soon, join meetings by audio from your car display.

Apps designed for parked and passenger experiences

With screens expanding in size and more being added for passengers, there is growing demand for parked and passenger experiences in cars.

Video, gaming, and browsing in cars

Now, video and gaming app categories are available in the car, with an early access program for browsing apps coming soon. YouTube is now available for car makers to offer in cars with Google built-in. And drivers of cars with Google built-in will soon have access to popular titles like Beach Buggy Racing 2, Solitaire FRVR, and My Talking Tom Friends from publishers like Vector Unit, FRVR and Outfit7 Limited. Developers can now port their large screen optimized apps to cars to take advantage of this opportunity.

A car display shows a YouTube video of an animated character singing.
YouTube is coming to cars with Google built-in, like the Polestar 2.

More screens in cars allow for new experiences between drivers and passengers, including individual and shared entertainment experiences. We're excited to announce multi-screen support is coming to Android Automotive OS 14 — stay tuned for more updates.

A car with a panoramic front display and screens in headrests showing apps and video content.
Support for multiple screens is coming to Android Automotive OS 14.

Start developing apps for cars today

To learn how to bring your apps to cars, check out the technical session, codelab and documentation on the Android for Cars developer site. With all the opportunities across car screens, there has never been a better time to bring your apps and experiences to cars. Thanks for all the contributions to the Android ecosystem. See you on the road!

13 Things to know for Android developers at Google I/O!

Posted by Maru Ahues Bouza, Director of Android Developer Relations

Android I/O updates: Jetpack, Wear OS, and more

There aren’t many platforms where you can build something and instantly reach billions of people around the world, not only on their phones—but their TVs, cars, tablets, watches, and more. Today, at Google I/O, we covered a number of ways Android helps you make the most of this opportunity, and how Modern Android Development brings as much commonality as possible, to make it faster and easier for you to create experiences that tailor to all the different screens we use in our daily lives.

We’ve rounded up the top 13 things to know for Android developers—from Jetpack Compose to tablets to Wear OS and of course… Android 13! And stick around for Day 2 of Google I/O, when Android’s full track of 26 technical talks and 4 workshops drop. We’re also bringing back the Android fireside Q&A in another episode of #TheAndroidShow; tweet us your questions now using #AskAndroid, and we’ve assembled a team of experts to answer live on-air, May 12 at 12:30PM PT.


MODERN ANDROID DEVELOPMENT

#1: Jetpack Compose 1.2 Beta, with support for more advanced use cases

Android’s modern UI toolkit, Jetpack Compose, continues to bring the APIs you need to support more advanced use cases like downloadable fonts, lazy grids, window insets, and nested scrolling interop, along with more tooling support through features like Live Edit, recomposition debugging, and Animation Preview. Check out the blog post for more details.
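
As one small example of the new APIs, a lazy grid lays items out in as many columns as fit the available width; the composable below is a sketch with placeholder data.

import androidx.compose.foundation.layout.padding
import androidx.compose.foundation.lazy.grid.GridCells
import androidx.compose.foundation.lazy.grid.LazyVerticalGrid
import androidx.compose.foundation.lazy.grid.items
import androidx.compose.material.Text
import androidx.compose.runtime.Composable
import androidx.compose.ui.Modifier
import androidx.compose.ui.unit.dp

@Composable
fun LabelGrid(labels: List<String>) {
    // As many 128dp-wide columns as fit the current window.
    LazyVerticalGrid(columns = GridCells.Adaptive(minSize = 128.dp)) {
        items(labels) { label ->
            Text(text = label, modifier = Modifier.padding(8.dp))
        }
    }
}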

Jetpack Compose 1.2 Beta  

#2: Android Studio: introducing Live Edit

Get more done faster with Android Studio Dolphin Beta and Electric Eel Canary! Android Studio Dolphin includes new features and improvements for Jetpack Compose and Wear OS development and an updated Logcat experience. Android Studio Electric Eel comes with integrations with the new Google Play SDK Index and Firebase Crashlytics. It also offers a new resizable emulator to test your app on large screens and the new Live Edit feature to immediately deploy code changes made within composable functions. Watch the What’s new in Android Development Tools session and read the Android Studio I/O blog post here.

#3: Baseline Profiles - speed up your app load time!

The speed of your app right after installation can make a big difference on user retention. To improve that experience, we created Baseline Profiles. Baseline Profiles allow apps and libraries to provide the Android runtime with metadata about code path usage, which it uses to prioritize ahead-of-time compilation. We've seen up to 30% faster app startup times thanks to adding baseline profiles alone, no other code changes required! We’re already using baseline profiles within Jetpack: we’ve added baselines to popular libraries like Fragments and Compose – to help provide a better end-user experience. Watch the What’s new in app performance talk, and read the Jetpack blog post here.
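
If you want to generate a profile for your own app, the Macrobenchmark library can record one on a device. The sketch below shows the shape of such a test; the package name is a placeholder, and the exact rule method name has shifted across benchmark library versions, so treat this as an outline rather than exact API.

import androidx.benchmark.macro.junit4.BaselineProfileRule
import androidx.test.ext.junit.runners.AndroidJUnit4
import org.junit.Rule
import org.junit.Test
import org.junit.runner.RunWith

// Runs on a device from a macrobenchmark module and writes a baseline-prof.txt
// covering the code paths exercised in the block.
@RunWith(AndroidJUnit4::class)
class StartupBaselineProfile {
    @get:Rule
    val baselineProfileRule = BaselineProfileRule()

    @Test
    fun generate() = baselineProfileRule.collect(
        packageName = "com.example.app" // placeholder
    ) {
        pressHome()
        startActivityAndWait() // capture the critical startup journey
    }
}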

Modern Android Development 

BETTER TOGETHER

#4: Going big on Android tablets

Google is all in on tablets. Since last I/O we launched Android 12L, a release focused on large screen optimizations, and Android 13 includes all those improvements and more. We also announced the Pixel tablet, coming next year. With amazing new hardware, an updated operating system & Google apps, improved guidelines and libraries, and exciting changes to the Play store, there has never been a better time to review your apps and get them ready for large screens and Android 13. That’s why at this year’s I/O we have four talks and a workshop to take you from design to implementation for large screens.


#5: Wear OS: Compose + more!

With the latest updates to Wear OS, you can rethink what is possible when developing for wearables. Jetpack Compose for Wear OS is now in beta, so you can create beautiful Wear OS apps with fewer lines of code. Health Services is also now in beta, bringing a ton of innovation to the health and fitness developer community. And last, but certainly not least, we announced the launch of The Google Pixel Watch - coming this Fall - which brings together the best of Fitbit and Wear OS. You can learn more about all the most exciting updates for wearables by watching the Wear OS technical session and reading our Jetpack Compose for Wear OS announcement.

Compose for Wear OS 

#6: Introducing Health Connect

Health Connect is a new platform, built in close collaboration between Google and Samsung, that simplifies connectivity between apps, making it easier to reach more users with less work, so you can securely access and share user health and fitness data across apps and devices. Today, we’re opening up access to Health Connect through Jetpack Health—read our announcement or watch the I/O session to find out more!
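
To give a feel for the Jetpack Health surface, here is a rough sketch of reading step records with the Health Connect client; the API was in alpha at the time of this announcement, so names may have shifted, and the sketch assumes the steps read permission has already been granted.

import android.content.Context
import androidx.health.connect.client.HealthConnectClient
import androidx.health.connect.client.records.StepsRecord
import androidx.health.connect.client.request.ReadRecordsRequest
import androidx.health.connect.client.time.TimeRangeFilter
import java.time.Instant
import java.time.temporal.ChronoUnit

// Sums the step counts recorded over the last 24 hours.
suspend fun readStepsLastDay(context: Context): Long {
    val client = HealthConnectClient.getOrCreate(context)
    val response = client.readRecords(
        ReadRecordsRequest(
            recordType = StepsRecord::class,
            timeRangeFilter = TimeRangeFilter.between(
                Instant.now().minus(1, ChronoUnit.DAYS),
                Instant.now()
            )
        )
    )
    return response.records.sumOf { it.count }
}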

#7: Android for Cars & Android TV OS

Android for Cars and Android TV OS continue to grow in the US and abroad. As more users drive connected or tune-in, we’re introducing new features to make it even easier to develop apps for cars and TV this year. Catch the “What’s new with Android for Cars” and “What's new with Google TV and Android TV” sessions on Day 2 (May 12th) at 9:00 AM PT to learn more.

#8: Add Voice Across Devices

We’re making it easier for users to access your apps via voice across devices with Google Assistant, by expanding developer access to the Shortcuts API for Android for Cars, with support for Wear OS apps coming later this year. We’re also making it easier to build those experiences with Smarter Custom Intents, enabling Assistant to better detect broader instances of user queries through ML, without any heavy NLU training lift. Additionally, we’re introducing improvements that drive discovery to your apps via voice on mobile: first through Brandless Queries, which drive app usage even when the user hasn’t explicitly said your app’s name, and App Install Suggestions that appear if your app isn’t installed yet - these are automatically enabled for existing App Actions today.


AND THE LATEST FROM ANDROID, PLAY, AND MORE:

#9: What’s new in Play!

Get the latest updates from Google Play, including new ways Play can help you grow your business. Highlights include the ability to deep-link and create up to 50 custom listings; our LiveOps beta, which will allow more developers to submit content to be considered for featuring on the Play Store; and even more flexibility in selling subscriptions. Learn about these updates and more in our blog post.

#10: Google Play SDK Index

Evaluate if an SDK is right for your app with the new Google Play SDK index. This new public portal lists over 100 of the most widely used commercial SDKs and information like which app permissions the SDK requests, statistics on the apps that use them, and which version of the SDK is most popular. Learn more on our blog post and watch “What’s new in Google Play” and “What’s new in Android development tools” sessions.

#11: Privacy Sandbox on Android

Privacy Sandbox on Android provides a path for new advertising solutions to improve user privacy without putting access to free content and services at risk. We recently released the first Privacy Sandbox on Android Developer Preview so you can get an early look at the SDK Runtime and Topics API. You can conduct preliminary testing of these new technologies, evaluate how you might adopt them for your solutions, and share feedback with us.

#12: The new Google Wallet API

The new Google Wallet gives users fast and secure access to everyday essentials across Android and Wear OS. We’re enhancing the Google Wallet API (previously called the Google Pay Passes API) to support generic passes and grouping and mixing passes together, for example grouping an event ticket with a voucher. We’re also launching a new Android SDK which allows you to save passes directly from your app without a backend integration. To learn more, read the full blog post, watch the session, or read the docs at developers.google.com/wallet.

#13: And of course, Android 13!

The second Beta of Android 13 is available today! Get your apps ready for the latest features for privacy and security, like the new notification permission, the privacy-protecting photo picker, and improved permissions for pairing with nearby devices and accessing media files. Enhance your app with features like app-specific language support and themed app icons. Build with modern standards like HDR video and Bluetooth LE Audio. You can get started by enrolling your Pixel device here, or try Android 13 Beta on select phones, tablets, and foldables from our partners - visit developer.android.com/13 to learn more.
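
As one concrete example, the new photo picker can be invoked through the Activity Result APIs, letting users share specific photos without granting broad storage permissions. A minimal sketch (the class name is hypothetical, and the exact androidx.activity version providing this contract may differ from what was available at the time of this post):

import android.net.Uri
import android.os.Bundle
import android.util.Log
import androidx.activity.ComponentActivity
import androidx.activity.result.PickVisualMediaRequest
import androidx.activity.result.contract.ActivityResultContracts

class PickPhotoActivity : ComponentActivity() {
    // Launches the system photo picker; no storage permission required.
    private val pickMedia =
        registerForActivityResult(ActivityResultContracts.PickVisualMedia()) { uri: Uri? ->
            if (uri != null) Log.d("PhotoPicker", "Selected: $uri")
        }

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        pickMedia.launch(
            PickVisualMediaRequest(ActivityResultContracts.PickVisualMedia.ImageOnly)
        )
    }
}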

That’s just a snapshot of some of the highlights for Android developers at this year’s Google I/O. Be sure to watch the What’s New in Android talk to get the landscape on the full Android technical track at Google I/O, which includes 26 talks and 4 workshops. Enjoy!

Building apps for Android Automotive OS

Posted by Madan Ankapura, Product Manager

Today we’re announcing the availability of version 1.2 beta of the Car App Library, enabling app developers to start building their navigation, parking, and charging apps for Android Automotive OS.

Now, developers can begin building and testing apps for these categories using the Automotive OS emulator across both Android Automotive OS and Android Auto. For the entire list of changes in v1.2 beta, please see the release notes. To start building your app for the car, check out our updated developer documentation, car quality guidelines, and design guidelines.

As announced earlier, drivers of Polestar 2 and Volvo cars can now download charging (ChargePoint, PlugShare), parking (SpotHero, ParkWhiz), and navigation (Flitsmeister, Sygic) apps developed with the Car App Library by joining the Google Group and opting in to each app's beta on the Google Play Store with their Gmail account.

Six app icons

Car App Library apps on Android Automotive OS are automatically rendered to be consistent with the rest of the experience in each car, without additional work needed from developers. For example:

Side-by-side screenshots: Polestar 2 settings with labeled On/Off switches for PlugShare next to Volvo settings with sliding switches for PlugShare, and the Polestar 2 and Volvo sign-in screens for SpotHero.

Example of app customization on Android Automotive OS

Experience for yourself how your app will look within the different systems, by accessing the OEM emulator system images downloadable in Android Studio. You can begin developing your charging, parking and navigation apps for Android Automotive OS today, and we are working to enable you to publish your apps to the Google Play store in the coming months (stay tuned!).

Beyond navigation, rideshare drivers spend a lot of time in their vehicles and will benefit from safer interactions if their driver apps can be brought to the car's screen. We are working with Lyft and Kakao Mobility to bring their driver app experiences into the car in the coming months.

image of car screen with gps map and Lyft logo

We are also pleased to announce that we are expanding support to all points of interest apps. Beyond charging and parking, this category allows any app that helps users discover and search for interesting locations on a map, and optionally navigate to those points. We are partnering with MochiMochi, Fuelio, Prezzi Benzina, and NAVITIME JAPAN as our early access partners.

If you’re interested in joining our Early Access Program in the future, please fill out this interest form. You can get started with the Android for Cars App Library today, by visiting g.co/androidforcars.

2021 Assistant Recap

Posted by Jessica Dene Earley-Cha, Mike Bifulco and Toni Klopfenstein, Developer Relations Engineers for Google Assistant

We've reached the end of the year - and what a year it's been! Between all of our live (virtual) events including I/O, developer summits, meetups and more, there are a lot of highlights for App Actions, Smart Home Actions and Conversational Actions. Let's dive in and take a look.

App Actions

App Actions allows developers to extend their Android app to Google Assistant. App Actions integrates more cleanly with Android by using new Android platform features. With the introduction of the beta shortcuts.xml configuration resource, expanded Android platform features, and our latest Google Assistant plugin, App Actions is moving closer to the Android platform.

App Actions Benefits:

  • Display app information on Google surfaces. Provide Android widgets for Assistant to display, offering inline answers, simple confirmations and brief interactions to users without changing context.
  • Launch features from Assistant. Connect your app's capabilities to user queries that match predefined semantic patterns (BII).
  • Suggest voice shortcuts from Assistant. Use Assistant to proactively suggest tasks for users to discover or replay, in the right context.

Core Integration

Capabilities is a new Android framework API that allows you to declare the types of actions users can take to launch your app and jump directly to performing a specific task. Assistant provides the first available concrete implementation of the capabilities API. You can utilize capabilities by creating a shortcuts.xml resource and defining your capabilities. A capability specifies two things: how it's triggered and what to do when it's triggered. To add a capability, you’ll need to select a Built-In Intent (BII); BIIs are pre-built language models that provide all the natural language understanding needed to map the user's input to individual fields. When a BII is matched by the user’s request, your capability will trigger an Android Intent that delivers the understood BII fields to your app, so you can determine what to show in response.

To support a user query like “Hey Google, Find waterfall hikes on ExampleApp,” you can use the GET_THING BII. This BII supports queries that request an “item” and extracts the “item” from the user query as the parameter thing.name. The best use case for the GET_THING BII is to search for things in the app. Below is an example of a capability that uses the GET_THING BII:

<!-- This is a sample shortcuts.xml -->
<shortcuts xmlns:android="http://schemas.android.com/apk/res/android">
  <capability android:name="actions.intent.GET_THING">
    <intent
      android:action="android.intent.action.VIEW"
      android:targetPackage="YOUR_UNIQUE_APPLICATION_ID"
      android:targetClass="YOUR_TARGET_CLASS">
      <!-- Eg. name = "waterfall hikes" -->
      <parameter
        android:name="thing.name"
        android:key="name"/>
    </intent>
  </capability>
</shortcuts>
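
When this capability matches, Assistant starts the declared Android Intent with the BII field delivered as an intent extra under the android:key you chose. A minimal sketch of the receiving side (the activity name here is a hypothetical stand-in for YOUR_TARGET_CLASS) could look like:

import android.os.Bundle
import androidx.appcompat.app.AppCompatActivity

// Hypothetical activity standing in for YOUR_TARGET_CLASS in shortcuts.xml.
class SearchActivity : AppCompatActivity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // "name" matches the android:key of the thing.name parameter,
        // e.g. "waterfall hikes" for the query above.
        val query = intent.getStringExtra("name")
        if (query != null) {
            showResultsFor(query)
        }
    }

    private fun showResultsFor(query: String) {
        // App-specific search UI.
    }
}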

This framework integration is in the Beta release stage, and will eventually replace the original implementation of App Actions that uses actions.xml. If your app provides both the new shortcuts.xml and old actions.xml, the latter will be disregarded.

Learn how to add your first capability with this codelab.

Voice shortcuts

Google Assistant suggests relevant shortcuts to users during contextually relevant times. Users can see what shortcuts they have by saying “Hey Google, shortcuts.”

Shortcut for Google Assistant

You can use the Google Shortcuts Integration library, currently in beta, to push an unlimited number of dynamic shortcuts to Google to make your shortcuts visible to users as voice shortcuts. Assistant can suggest relevant shortcuts to users to help make it more convenient for the user to interact with your Android app.
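
For illustration, a rough sketch of pushing one shortcut; the shortcut ID, target activity, BII name, and parameter values are hypothetical placeholders, and the Google Shortcuts Integration library needs to be on the classpath for the shortcut to reach Assistant.

import android.content.Context
import android.content.Intent
import androidx.core.content.pm.ShortcutInfoCompat
import androidx.core.content.pm.ShortcutManagerCompat

class OrderActivity : android.app.Activity() // placeholder target for the shortcut

// Push a dynamic shortcut so Assistant can surface it as a voice shortcut.
fun pushOrderShortcut(context: Context) {
    val shortcut = ShortcutInfoCompat.Builder(context, "order-usual-coffee")
        .setShortLabel("Order my usual")
        .setLongLabel("Order my usual coffee")
        .setIntent(
            Intent(context, OrderActivity::class.java).setAction(Intent.ACTION_VIEW)
        )
        // Hypothetical binding to the ORDER_MENU_ITEM built-in intent.
        .addCapabilityBinding(
            "actions.intent.ORDER_MENU_ITEM",
            "menuItem.name",
            listOf("usual coffee")
        )
        .build()

    ShortcutManagerCompat.pushDynamicShortcut(context, shortcut)
}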

Learn how to push your dynamic shortcuts to Assistant with our dynamic shortcuts codelab.

Example of App using Dynamic Shortcuts CodeLab Tool

Simple Answers, Hands Free & Android Auto

During situations where users need a hands-free experience, like on Android Auto, Assistant can display widgets to provide simple answers, brief confirmations, and quick interactive experiences in response to a user’s inquiry. These widgets are displayed within the Assistant UI, and in order to implement a fully voice-forward interaction with your app, you can arrange for Assistant to speak a response with your widget, which is safe and natural for use in automobiles. A great re-engagement feature with widgets is that an “Add this widget” chip can be included too!


Re-engagement

Another re-engagement tool is the In-App Promo SDK, currently in beta, which lets you proactively suggest shortcuts in your app for actions that the user can repeat with a voice command to Assistant. The SDK allows you to check whether the shortcut you want to suggest already exists for that user and prompt the user to create the suggested shortcut.

New Tooling

To support testing Capabilities, the Google Assistant plugin for Android Studio was launched. It contains an updated App Action Test Tool that creates a preview of your App Action, so you can test an integration before publishing it to the Play store.

New App Actions resources

Learn more with new or updated content:


Smart Home Actions

A big focus of this year's Smart Home launches was new and updated tools. At events like I/O, Works With: SiLabs, and the Google Smart Home Developer Summit, we shared these new resources to help you quickly build a high quality smart home integration.

New Resources

To make implementing new features even easier for developers, we released many new tools to help you get your Smart Home Action up and running.

To help consumers discover Google-compatible smart home devices and associated routines, we released the smart home directory, accessible on the web and through the Google Home app.

We heard your requests for more ways to localize your integrations, so we added sample utterances in English (en-US), German (de-DE), and French (fr-FR) to several device guides. We also rolled out Chinese (zh-TW) as one of the supported languages for the overall platform. To make our documentation more accessible, we added a Japanese translation of our developer guides.

We also released several new device types and traits, along with new features to support your integrations, including proactive and follow-up responses, app discovery and deep linking.

Quality Improvements

For general onboarding, we've added three new codelabs to enable you to dive deeper into debugging and monitoring your projects. You can now walk through debugging smart home Actions, debugging local fulfillment Actions, and dig deeper into your log-based metrics for your Actions.

When you're actively developing your integration, the Google Home Playground can simulate a virtual home with configurable device types and traits. Here you can view the types and traits in Home Graph, modify device attributes, and share device configurations.

If you discover issues with your configuration, we've continued upgrading the monitoring and logging dashboards to show you detailed views of events with your integrations, as well as better guidance on how to handle errors and exceptions.

The WebRTC Validator Tool acts as a WebRTC peer to stream to or from, and generally emulates the WebRTC player on smart displays with Google Assistant. If you're specifically working with a smart camera, WebRTC is now supported on the CameraStream trait.

Local Home

To continue striving toward quality responses to user queries, we also added support for local queries and responses to the Local Home SDK. Additionally, to help users onboard new devices in their homes quickly and use Google Nest devices as local hubs, we launched BLE Seamless Setup.

Matter

The new Google Home IDE enables you to improve your development process by enabling in-IDE access to Google Assistant Simulator, Cloud Logging, and more for each of your projects. This plugin is available for VSCode.

Finally, as we get closer to the official launch of the Matter protocol, we're working hard to unify all of our smart home ecosystem tools together under a single name - Google Home. The Google Home Developer Center will enable you to quickly find resources for integrating your Matter-compatible smart devices and platforms with Nest, Android, Google Home app, and Google Assistant.

Conversational Actions

Way back in January of 2021, we rolled out an updated Actions for Families program, which provides guidelines for teams building Actions meant for kids. Conversational Actions which are approved for the Actions for Families program get a special badge in the Assistant Directory, which lets parents know that your Action is family-friendly.

During the What's New in Google Assistant keynote at Google I/O, Director of Product for the Google Assistant Developer Platform Rebecca Nathenson mentioned several coming updates and changes for Conversational Actions. This included the launch of a Developer Preview for a new client-side fulfillment model for Interactive Canvas. Client-side fulfillment changes the implementation strategy for Interactive Canvas apps, removing the need for a webhook relaying information between the Assistant NLU and their web application. This simplifies the infrastructure needed to deploy an action that uses Interactive Canvas. Since the release of this Developer Preview, we’ve been listening closely to developers to get feedback on client-side fulfillment.

Interactive Canvas Developer Tools

We also released Interactive Canvas Developer tools - a Chrome extension which can help dev teams mock and debug the web app side of Interactive Canvas apps and games. Best of all, it’s open source! You can install the dev tools from the Chrome Web Store, or compile them from source yourself on GitHub at actions-on-google/interactive-canvas-dev-tools.

Updates to SSML

Earlier this year we announced support for new SSML features in Conversational Actions. This expanded support lets you build more detailed and nuanced features using text to speech. We produced a short demonstration of SSML features on YouTube, and you can find more in our docs on SSML if you're ready to dive in and start building.

Updates to Transaction UX for Smart Displays

Also announced at I/O for Conversational Actions - we released an updated workflow for completing transactions on smart displays. The new transaction process lets users complete transactions from their smart screens, by confirming the CVC code from their chosen payment method, rather than using a phone to enter a CVC code. If you’d like to get an idea of what the new process looks like, check out our demo video showing new transaction features on smart devices.

Tips on Launching your Conversational Action

Our guide on driving a successful launch for Conversational Actions contains helpful information to help you think through strategies for putting together a marketing team and a go-to-market plan for releasing your Conversational Action.

Looking forward to 2022

We're looking forward to another exciting year in 2022. To stay connected, sign up for our new App Actions email series or Google Home newsletter, or for the general Assistant newsletter.

As always, you can also join us on Reddit or follow us on Twitter. Happy Holidays!

Bringing richer navigation, charging, parking apps to more Android Auto users

Posted by Madan Ankapura, Product Manager

Illustration of car interior with map, parking and gas symbols

Today, we are releasing the beta of Android for Cars App Library version 1.1. Android Auto apps that use features requiring Car App API level 2 or higher, such as map interactivity, vehicle hardware data, multiple-length text, and the long message and sign-in templates, can now run in cars with Android Auto 6.7+ (these features were previously limited to the Desktop Head Unit).

Two Android Auto GIF examples. Left GIF is 2GIS and right GIF is TomTom

With this announcement, we are also completing the transition to Jetpack and will no longer be accepting submissions built with the closed source library (com.google.android.libraries.car.app). If you haven’t already, we encourage you to migrate to the AndroidX library now.

For the entire list of changes in beta01, please see the release notes. To start building your app for the car, check out our updated developer documentation, car quality guidelines and design guidelines.

If you’re interested in joining our Early Access Program to get access to new features early in the future, please fill out this interest form. You can get started with the Android for Cars App Library today, by visiting g.co/androidforcars.

Accessing car hardware APIs in your app for cars

Posted by Madan Ankapura, Product Manager

Building on our effort to enable developers to create app experiences across navigation, parking, and charging apps via the Android for Cars App Library as part of Jetpack, today we’re announcing the availability of the CarHardwareManager APIs as part of version 1.1 alpha02 to get developer feedback.

CarHardwareManager can be used to query the vehicle’s hardware data, such as make and model, fuel level, and other sensor data. Currently, this feature is only available for Android Auto 6.7+ in the open-testing channel. Testing this in a desktop environment requires a new version of the Desktop Head Unit, which will be released separately. Stay tuned here for details on when the new version becomes available.
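
For illustration, here is a rough sketch of querying car hardware data with these APIs from inside a Car App Library app; which values are populated depends on the car and on the car hardware permissions your app holds.

import android.util.Log
import androidx.car.app.CarContext
import androidx.car.app.hardware.CarHardwareManager
import androidx.car.app.hardware.info.CarInfo
import androidx.core.content.ContextCompat

// Sketch: read static and live vehicle data from a Screen or Session.
fun observeCarHardware(carContext: CarContext) {
    val hardwareManager = carContext.getCarService(CarHardwareManager::class.java)
    val carInfo: CarInfo = hardwareManager.carInfo
    val executor = ContextCompat.getMainExecutor(carContext)

    // One-shot query for static data such as make, model, and year.
    carInfo.fetchModel(executor) { model ->
        Log.d("CarHardware", "Car: ${model.manufacturer.value} ${model.name.value} ${model.year.value}")
    }

    // Continuous updates for energy data such as fuel or battery level.
    carInfo.addEnergyLevelListener(executor) { energyLevel ->
        Log.d("CarHardware", "Fuel: ${energyLevel.fuelPercent.value}%")
    }
}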

For the entire list of changes in alpha02, please see the release notes. To start building your app for the car, check out our updated developer documentation, car quality guidelines and design guidelines. These library features are available for testing only with the Desktop Head Unit. We will announce when these features are available to run in cars in the future.

In addition, if you are a developer of a parking app, you can now integrate with Google Assistant to enable users to talk to Google to open their favorite parking app and find parking while driving.

If you’re interested in joining our Early Access Program in the future, please fill out this interest form. You can get started with the Android for Cars App Library today, by visiting g.co/androidforcars.

Improve your app mileage with Android for Cars App library

Posted by Madan Ankapura, Product Manager

In April, we announced our first version of the Android for Cars App Library as part of Jetpack, reaching a milestone that lets developers publish their navigation, parking, and charging apps on the Google Play Store.

Today, we’re announcing that version 1.1 is in alpha, which brings the following features to developers:

  • Sign-in template - Allows users to sign in to your app directly from the car screen while parked.
  • Long Message template - Allows you to show long messages like terms of service to users as part of the setup flow while parked.
  • Multiple-length text - Different car screen sizes may show different amounts of text. We added an API you can use to specify multiple variants of a text string in select templates to fit different screen sizes (see the sketch below this list).
  • Map Interactivity - You can now add capabilities such as zooming and panning to your navigation template.
Android for Cars App library
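
For the multiple-length text feature, the library's CarText type carries the variants, and the host picks the one that fits the car screen. A short sketch follows; the strings are placeholders, shown on a Row, one of the template components that accepts CarText.

import androidx.car.app.model.CarText
import androidx.car.app.model.Row

// Provide long and short variants of the same string; the host picks
// the one that fits the car screen.
val title = CarText.Builder("Turn right onto Mountain View Boulevard")
    .addVariant("Turn right onto Mtn View Blvd")
    .build()

val row = Row.Builder()
    .setTitle(title)
    .build()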

For the entire list of changes, please see the release notes. To start building your app for the car, check out our updated developer documentation, car quality guidelines and design guidelines.

These library features are available for testing only with the Desktop Head Unit. We will announce when these features are available to run in cars in the future.

If you’re interested in joining our Early Access Program in the future, please fill out this interest form. You can get started with the Android for Cars App Library today, by visiting g.co/androidforcars.

What’s new with Android for Cars

Posted by Mickey Kataria, Director of Product Management

For over a decade, Google has been committed to automotive, with a vision of creating a safe and seamless connected experience in every car. Developers like all of you are a crucial part of helping people stay connected while on the go. We’re seeing strong momentum across our in-car experiences, Android Auto and Android Automotive OS, and today, we’re excited to share the latest updates and opportunities to reach users in the car.

Check out our I/O session: What's new with Android for Cars

Android Auto

Android Auto, which allows users to connect their phone to their car display, now has over 100 million compatible cars on the road and is supported by nearly every major car manufacturer. Porsche is our newest partner and they will begin shipping Android Auto on new cars, starting this summer with the Porsche 911.

We’ve been working closely with car manufacturers to build an even better Android Auto experience by enabling wireless projection in more vehicles, extending availability to more countries, and continuing to launch new features, like integration into the instrument cluster. To see some of the newest Android Auto technology in the BMW iX, check out the video below.

Android Auto projecting to the cluster display in a BMW iX.

Android Automotive OS

Our newest in-car experience, Android Automotive OS with Google apps and services built-in, also has strong momentum. With this experience, the entire infotainment system is powered by Android and users can access Google Assistant, Google Maps, and more apps from Google Play directly from the car screen without relying on a phone. Cars from Polestar and Volvo, like the Polestar 2 and the Volvo XC40 Recharge, are already available to customers. And by the end of 2021, this experience will be available to order in more than 10 car models from Volvo, General Motors and Renault. You can get a sneak peek of this customized experience in the new GMC HUMMER EV below.

The all-electric GMC HUMMER EV infotainment features Android Automotive OS with Google built-in. Preproduction model shown. Actual production models may vary. Initial availability Fall 2021.

Developing new apps for cars

To support this growing ecosystem, we recently made the Android for Cars App Library available as part of Jetpack. It allows developers of navigation, EV charging and parking apps to bring their apps to Android Auto compatible cars. Many of these developers have already published their Android Auto apps to the Play Store and we’re now extending this library to also support Android Automotive OS, making it easy for you to build once and generate apps that are compatible with both platforms. We’re already working with Early Access Partners — including Parkwhiz, Plugshare, Sygic, ChargePoint, Flitsmeister, SpotHero and others — to bring apps in these categories to cars powered by Android Automotive OS.

Android for cars

PlugShare, an app for finding EV chargers, has used the Android for Cars App Library and Google Assistant App Actions to build for Android Auto.

We plan to expand to more app categories in the future, so if you’re interested in joining our Early Access Program, please fill out this interest form. You can also get started with the Android for Cars App Library today, by visiting g.co/androidforcars. Lastly, you can always get help from the developer community at Stack Overflow using the android-automotive and android-auto tags. We can’t wait to see what you build next!