Tag Archives: Android Auto

What’s new with Android for Cars: I/O 2023

Posted by Jennifer Tsau, Product Management Lead and David Dandeneau, Engineering Lead

For more than a decade, Google has been committed to bringing safe and seamless connected experiences to cars. We’re continuing to see strong momentum and adoption across Android for Cars. Android Auto is supported by nearly every major car maker, and will be in nearly 200 million cars by the end of this year. And the number of cars powered by Android Automotive OS with Google built-in — which includes top brands like Chevrolet, Volvo, Polestar, Honda, Renault and more — is expected to nearly double by the end of this year.

With cars becoming more connected and equipped with immersive displays, there are more opportunities for developers to bring app experiences to cars. We’re excited to share updates and new ways for developers to reach more users in the car.


Apps designed for driving experiences

Helping drivers while on the road - whether they are navigating, listening to music, or checking the weather - is a top priority. We’re continuing to invest in tools and resources, including the Android for Cars App Library, to make it even easier for developers to build new apps or port existing Android apps over to cars.

New capabilities for navigation apps

Today, we announced Waze rolling out globally on the Google Play Store for all cars with Google built-in, expanding its availability beyond Android Auto. As a part of this launch, we created more templates in Android for Cars App Library to help speed up development time across a number of app categories, including navigation.

For navigation apps, it’s also now possible to integrate with the instrument cluster, providing turn-by-turn directions right in the driver's line of sight. And developers can also access car sensor data to surface helpful information like range, fuel level, and speed to provide more contextual assistance to drivers.

A car dashboard shows the Waze app open on the display panel
The Waze app is coming to all cars with Google built-in, including the first-ever Chevrolet Blazer EV launching this year.
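
To give a rough idea of the car sensor access mentioned above, here is a minimal Kotlin sketch using the Car App Library's CarHardwareManager. The function name is illustrative, the corresponding car permissions still need to be declared in your manifest, and individual values may be unavailable on some vehicles.

import androidx.car.app.CarContext
import androidx.car.app.hardware.CarHardwareManager
import androidx.core.content.ContextCompat

// Illustrative sketch: observe energy level and speed from a Car App Library Screen.
fun observeEnergyAndSpeed(carContext: CarContext, onUpdate: () -> Unit) {
    val carInfo = carContext.getCarService(CarHardwareManager::class.java).carInfo
    val executor = ContextCompat.getMainExecutor(carContext)

    carInfo.addEnergyLevelListener(executor) { energy ->
        val rangeMeters = energy.rangeRemainingMeters.value // null when unavailable
        val fuelPercent = energy.fuelPercent.value
        onUpdate() // e.g. call Screen.invalidate() to re-render with the new values
    }
    carInfo.addSpeedListener(executor) { speed ->
        val metersPerSecond = speed.displaySpeedMetersPerSecond.value
        onUpdate()
    }
}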

Tools to easily port your media apps across Android for Cars

Media apps continue to be a top use case in the car, and it’s quicker than ever to bring your media apps to Android Auto and Android Automotive OS. Audible recently joined popular streaming audio apps like Deezer, Soundcloud, and Spotify to offer their apps across both Android Auto and cars with Google built-in. If you have a media app on mobile, port it over to reach new users in the car.
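If you're starting from scratch, the car's media UI is driven by a media browse service. The skeleton below is a minimal, illustrative Kotlin sketch (the class name and browse tree are placeholders), not a complete media app.

import android.os.Bundle
import android.support.v4.media.MediaBrowserCompat
import androidx.media.MediaBrowserServiceCompat

// Minimal media browse service skeleton; Android Auto and Android Automotive OS
// build the car's browse UI from the tree returned here.
class ExampleMusicService : MediaBrowserServiceCompat() {

    override fun onGetRoot(
        clientPackageName: String,
        clientUid: Int,
        rootHints: Bundle?
    ): BrowserRoot = BrowserRoot("root", null) // validate the calling package in a real app

    override fun onLoadChildren(
        parentId: String,
        result: Result<MutableList<MediaBrowserCompat.MediaItem>>
    ) {
        // Return the playable/browsable items for this node; empty here for brevity.
        result.sendResult(mutableListOf())
    }
}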

New app categories for driving experiences

The Android for Cars App Library now allows developers to bring new categories of apps to cars, including internet of things (IoT) and weather apps. The IoT category is available for all developers, while weather is in an early access program. In the weather category, The Weather Channel app will join other weather apps like Weather & Radar later this year.

We’re also working with messaging apps like Zoom, Microsoft Teams, and Webex by Cisco to allow you to join meetings by audio from your car display in the coming months.

A car display shows a Zoom meeting schedule next to a route in Google Maps.
Coming soon, join meetings by audio from your car display.

Apps designed for parked and passenger experiences

With screens expanding in size and more being added for passengers, there is growing demand for parked and passenger experiences in cars.

Video, gaming, and browsing in cars

Now, video and gaming app categories are available in the car, with an early access program for browsing apps coming soon. YouTube is now available for car makers to offer in cars with Google built-in. And drivers of cars with Google built-in will soon have access to popular titles like Beach Buggy Racing 2, Solitaire FRVR, and My Talking Tom Friends from publishers like Vector Unit, FRVR and Outfit7 Limited. Developers can now port their large screen optimized apps to cars to take advantage of this opportunity.

A car display shows a YouTube video of an animated character singing.
YouTube is coming to cars with Google built-in, like the Polestar 2.

More screens in cars allow for new experiences between drivers and passengers, including individual and shared entertainment experiences. We're excited to announce multi-screen support is coming to Android Automotive OS 14 — stay tuned for more updates.

A car with a panoramic front display and screens in headrests showing apps and video content.
Support for multiple screens is coming to Android Automotive OS 14.

Start developing apps for cars today

To learn how to bring your apps to cars, check out the technical session, codelab and documentation on the Android for Cars developer site. With all the opportunities across car screens, there has never been a better time to bring your apps and experiences to cars. Thanks for all the contributions to the Android ecosystem. See you on the road!

13 Things to know for Android developers at Google I/O!

Posted by Maru Ahues Bouza, Director of Android Developer Relations

Android I/O updates: Jetpack, Wear OS, etc 

There aren’t many platforms where you can build something and instantly reach billions of people around the world, not only on their phones, but on their TVs, cars, tablets, watches, and more. Today, at Google I/O, we covered a number of ways Android helps you make the most of this opportunity, and how Modern Android Development brings as much commonality as possible to make it faster and easier for you to create experiences tailored to all the different screens we use in our daily lives.

We’ve rounded up the top 13 things to know for Android developers—from Jetpack Compose to tablets to Wear OS and of course… Android 13! And stick around for Day 2 of Google I/O, when Android’s full track of 26 technical talks and 4 workshops drops. We’re also bringing back the Android fireside Q&A in another episode of #TheAndroidShow; tweet us your questions now using #AskAndroid, and we’ve assembled a team of experts to answer live on-air, May 12 at 12:30PM PT.


MODERN ANDROID DEVELOPMENT

#1: Jetpack Compose 1.2 Beta, with support for more advanced use cases

Android’s modern UI toolkit, Jetpack Compose, continues to bring the APIs you need to support more advanced use cases like downloadable fonts, lazy grids, window insets, and nested scrolling interop, along with more tooling support through features like Live Edit, recomposition debugging, and Animation Preview. Check out the blog post for more details.
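
For instance, here's a minimal sketch of the lazy grid support mentioned above, using Compose 1.2's LazyVerticalGrid; the composable and its data are placeholders.

import androidx.compose.foundation.layout.padding
import androidx.compose.foundation.lazy.grid.GridCells
import androidx.compose.foundation.lazy.grid.LazyVerticalGrid
import androidx.compose.material.Text
import androidx.compose.runtime.Composable
import androidx.compose.ui.Modifier
import androidx.compose.ui.unit.dp

// Illustrative two-column lazy grid using the Compose 1.2 APIs.
@Composable
fun LabelGrid(labels: List<String>) {
    LazyVerticalGrid(columns = GridCells.Fixed(2)) {
        items(labels.size) { index ->
            Text(text = labels[index], modifier = Modifier.padding(8.dp))
        }
    }
}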

Jetpack Compose 1.2 Beta  

#2: Android Studio: introducing Live Edit

Get more done faster with Android Studio Dolphin Beta and Electric Eel Canary! Android Studio Dolphin includes new features and improvements for Jetpack Compose and Wear OS development and an updated Logcat experience. Android Studio Electric Eel comes with integrations with the new Google Play SDK Index and Firebase Crashlytics. It also offers a new resizable emulator to test your app on large screens and the new Live Edit feature to immediately deploy code changes made within composable functions. Watch the What’s new in Android Development Tools session and read the Android Studio I/O blog post here.

#3: Baseline Profiles - speed up your app load time!

The speed of your app right after installation can make a big difference in user retention. To improve that experience, we created Baseline Profiles. Baseline Profiles allow apps and libraries to provide the Android runtime with metadata about code path usage, which it uses to prioritize ahead-of-time compilation. We've seen up to 30% faster app startup times thanks to adding baseline profiles alone, with no other code changes required! We’re already using baseline profiles within Jetpack: we’ve added them to popular libraries like Fragments and Compose to help provide a better end-user experience. Watch the What’s new in app performance talk, and read the Jetpack blog post here.
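
As a rough sketch of how a profile is generated with the Macrobenchmark library (the package name is hypothetical, and older library versions name the method collectBaselineProfile() instead of collect()):

import androidx.benchmark.macro.junit4.BaselineProfileRule
import androidx.test.ext.junit.runners.AndroidJUnit4
import org.junit.Rule
import org.junit.Test
import org.junit.runner.RunWith

// Illustrative baseline profile generator: records the classes and methods used
// during a cold start so they can be compiled ahead of time at install.
@RunWith(AndroidJUnit4::class)
class BaselineProfileGenerator {
    @get:Rule
    val rule = BaselineProfileRule()

    @Test
    fun generateStartupProfile() = rule.collect(packageName = "com.example.app") {
        pressHome()
        startActivityAndWait()
    }
}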

Modern Android Development 

BETTER TOGETHER

#4: Going big on Android tablets

Google is all in on tablets. Since last I/O we launched Android 12L, a release focused on large screen optimizations, and Android 13 includes all those improvements and more. We also announced the Pixel tablet, coming next year. With amazing new hardware, an updated operating system & Google apps, improved guidelines and libraries, and exciting changes to the Play store, there has never been a better time to review your apps and get them ready for large screens and Android 13. That’s why at this year’s I/O we have four talks and a workshop to take you from design to implementation for large screens.


#5: Wear OS: Compose + more!

With the latest updates to Wear OS, you can rethink what is possible when developing for wearables. Jetpack Compose for Wear OS is now in beta, so you can create beautiful Wear OS apps with fewer lines of code. Health Services is also now in beta, bringing a ton of innovation to the health and fitness developer community. And last, but certainly not least, we announced the launch of The Google Pixel Watch - coming this Fall - which brings together the best of Fitbit and Wear OS. You can learn more about all the most exciting updates for wearables by watching the Wear OS technical session and reading our Jetpack Compose for Wear OS announcement.
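
As a small taste of Compose for Wear OS, here's a minimal sketch using the androidx.wear.compose material components; the list content is a placeholder.

import androidx.compose.foundation.layout.fillMaxSize
import androidx.compose.runtime.Composable
import androidx.compose.ui.Modifier
import androidx.wear.compose.material.ScalingLazyColumn
import androidx.wear.compose.material.Text

// Illustrative watch UI: a scaling, curve-aware scrolling list of placeholder rows.
@Composable
fun WorkoutList() {
    ScalingLazyColumn(modifier = Modifier.fillMaxSize()) {
        items(5) { index ->
            Text(text = "Workout ${index + 1}")
        }
    }
}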

Compose for Wear OS 

#6: Introducing Health Connect

Health Connect is a new platform, built in close collaboration between Google and Samsung, that simplifies connectivity between apps so you can securely access and share user health and fitness data across apps and devices, making it easier to reach more users with less work. Today, we’re opening up access to Health Connect through Jetpack Health—read our announcement or watch the I/O session to find out more!
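
As a rough sketch of reading data through the client: the Jetpack Health APIs were in alpha at announcement and have evolved since, so this follows the androidx.health.connect client shape and assumes the steps read permission has already been granted.

import androidx.health.connect.client.HealthConnectClient
import androidx.health.connect.client.records.StepsRecord
import androidx.health.connect.client.request.ReadRecordsRequest
import androidx.health.connect.client.time.TimeRangeFilter
import java.time.Instant
import java.time.temporal.ChronoUnit

// Illustrative read: sum the step counts recorded over the last 24 hours.
suspend fun readRecentSteps(client: HealthConnectClient): Long {
    val now = Instant.now()
    val response = client.readRecords(
        ReadRecordsRequest(
            recordType = StepsRecord::class,
            timeRangeFilter = TimeRangeFilter.between(now.minus(1, ChronoUnit.DAYS), now)
        )
    )
    return response.records.sumOf { it.count }
}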

#7: Android for Cars & Android TV OS

Android for Cars and Android TV OS continue to grow in the US and abroad. As more users drive connected cars or tune in, we’re introducing new features to make it even easier to develop apps for cars and TV this year. Catch the “What’s new with Android for Cars” and “What's new with Google TV and Android TV” sessions on Day 2 (May 12th) at 9:00 AM PT to learn more.

#8: Add Voice Across Devices

We’re making it easier for users to access your apps via voice across devices with Google Assistant, by expanding developer access to the Shortcuts API for Android for Cars, with support for Wear OS apps coming later this year. We’re also making it easier to build those experiences with Smarter Custom Intents, enabling Assistant to better detect broader instances of user queries through ML, without the heavy lift of NLU training. Additionally, we’re introducing improvements that drive discovery to your apps via voice on mobile: first through Brandless Queries, which drive app usage even when the user hasn’t explicitly said your app’s name, and through App Install Suggestions, which appear if your app isn’t installed yet. These are automatically enabled for existing App Actions today.


AND THE LATEST FROM ANDROID, PLAY, AND MORE:

#9: What’s new in Play!

Get the latest updates from Google Play, including new ways Play can help you grow your business. Highlights include the ability to deep-link and create up to 50 custom listings; our LiveOps beta, which will allow more developers to submit content to be considered for featuring on the Play Store; and even more flexibility in selling subscriptions. Learn about these updates and more in our blog post.

#10: Google Play SDK Index

Evaluate if an SDK is right for your app with the new Google Play SDK Index. This new public portal lists over 100 of the most widely used commercial SDKs, along with information like which app permissions each SDK requests, statistics on the apps that use it, and which version of the SDK is most popular. Learn more in our blog post and watch the “What’s new in Google Play” and “What’s new in Android development tools” sessions.

#11: Privacy Sandbox on Android

Privacy Sandbox on Android provides a path for new advertising solutions to improve user privacy without putting access to free content and services at risk. We recently released the first Privacy Sandbox on Android Developer Preview so you can get an early look at the SDK Runtime and Topics API. You can conduct preliminary testing of these new technologies, evaluate how you might adopt them for your solutions, and share feedback with us.

#12: The new Google Wallet API

The new Google Wallet gives users fast and secure access to everyday essentials across Android and Wear OS. We’re enhancing the Google Wallet API (previously called the Google Pay Passes API) to support generic passes and to let you group and mix passes together, for example grouping an event ticket with a voucher. We’re also launching a new Android SDK that allows you to save passes directly from your app without a backend integration. To learn more, read the full blog post, watch the session, or read the docs at developers.google.com/wallet.
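
A minimal sketch of the new SDK's save flow follows; the pass JSON and request code are placeholders, and a real pass object has to be created and signed as described in the Wallet docs.

import android.app.Activity
import com.google.android.gms.pay.Pay
import com.google.android.gms.pay.PayClient

private const val ADD_TO_WALLET_REQUEST_CODE = 1000 // arbitrary request code

// Illustrative: launch the "Add to Google Wallet" flow for a generic pass.
fun savePass(activity: Activity, genericPassJson: String) {
    val payClient: PayClient = Pay.getClient(activity)
    payClient.savePasses(genericPassJson, activity, ADD_TO_WALLET_REQUEST_CODE)
}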

#13: And of course, Android 13!

The second Beta of Android 13 is available today! Get your apps ready for the latest features for privacy and security, like the new notification permission, the privacy-protecting photo picker, and improved permissions for pairing with nearby devices and accessing media files. Enhance your app with features like app-specific language support and themed app icons. Build with modern standards like HDR video and Bluetooth LE Audio. You can get started by enrolling your Pixel device here, or try Android 13 Beta on select phones, tablets, and foldables from our partners - visit developer.android.com/13 to learn more.
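
For example, requesting the new notification permission and launching the photo picker might look roughly like this; a sketch assuming androidx.activity 1.6+, with an illustrative activity.

import android.Manifest
import android.net.Uri
import android.os.Build
import android.os.Bundle
import androidx.activity.result.PickVisualMediaRequest
import androidx.activity.result.contract.ActivityResultContracts
import androidx.appcompat.app.AppCompatActivity

// Illustrative Android 13 integration: runtime notification permission + photo picker.
class MainActivity : AppCompatActivity() {

    private val requestNotifications =
        registerForActivityResult(ActivityResultContracts.RequestPermission()) { granted ->
            // Enable or skip notifications based on the user's choice.
        }

    private val pickPhoto =
        registerForActivityResult(ActivityResultContracts.PickVisualMedia()) { uri: Uri? ->
            // Display the selected photo, if any; no storage permission required.
        }

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        if (Build.VERSION.SDK_INT >= 33) {
            requestNotifications.launch(Manifest.permission.POST_NOTIFICATIONS)
        }
        pickPhoto.launch(PickVisualMediaRequest(ActivityResultContracts.PickVisualMedia.ImageOnly))
    }
}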

That’s just a snapshot of some of the highlights for Android developers at this year’s Google I/O. Be sure to watch the What’s New in Android talk to get the landscape on the full Android technical track at Google I/O, which includes 26 talks and 4 workshops. Enjoy!

Building apps for Android Automotive OS

Posted by Madan Ankapura, Product Manager

Today we’re announcing the availability of version 1.2 beta of the Car App Library, enabling app developers to start building their navigation, parking, and charging apps for Android Automotive OS.

Now, developers can begin building and testing apps in these categories, using the Android Automotive OS emulator, so they work across both Android Automotive OS and Android Auto. For the entire list of changes in v1.2 beta, please see the release notes. To start building your app for the car, check out our updated developer documentation, car quality guidelines, and design guidelines.
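
To illustrate the shape of such an app, here's a minimal, hypothetical Car App Library entry point in Kotlin; the service, screen, and strings are placeholders, and the same code targets both Android Auto and Android Automotive OS.

import android.content.Intent
import androidx.car.app.CarAppService
import androidx.car.app.CarContext
import androidx.car.app.Screen
import androidx.car.app.Session
import androidx.car.app.model.Action
import androidx.car.app.model.Pane
import androidx.car.app.model.PaneTemplate
import androidx.car.app.model.Row
import androidx.car.app.model.Template
import androidx.car.app.validation.HostValidator

// Minimal, illustrative entry point for a Car App Library app.
class ExampleCarAppService : CarAppService() {
    override fun createHostValidator(): HostValidator =
        HostValidator.ALLOW_ALL_HOSTS_VALIDATOR // use an allow-list validator in production

    override fun onCreateSession(): Session = object : Session() {
        override fun onCreateScreen(intent: Intent): Screen = MainScreen(carContext)
    }
}

class MainScreen(carContext: CarContext) : Screen(carContext) {
    override fun onGetTemplate(): Template =
        PaneTemplate.Builder(
            Pane.Builder()
                .addRow(Row.Builder().setTitle("Nearby charging stations").build())
                .build()
        )
            .setTitle("Example EV App")
            .setHeaderAction(Action.APP_ICON)
            .build()
}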

As announced earlier, drivers of Polestar 2 and Volvo cars can now download charging (ChargePoint, PlugShare), parking (SpotHero, ParkWhiz), and navigation (Flitsmeister, Sygic) apps developed with the Car App Library by joining the Google Group and opting in to each app's beta on the Google Play Store with their Gmail account.

Six app icons

Car App Library apps on Android Automotive OS are automatically rendered to be consistent with the rest of the experience within each car, without additional work needed from developers. For example:

  • Polestar 2 settings with labeled On/Off switches for PlugShare
  • Volvo settings with sliding switches for PlugShare
  • Polestar 2 sign-in screen for SpotHero
  • Volvo sign-in screen for SpotHero

Examples of app customization on Android Automotive OS

Experience for yourself how your app will look within the different systems by downloading the OEM emulator system images available in Android Studio. You can begin developing your charging, parking, and navigation apps for Android Automotive OS today, and we are working to enable you to publish your apps to the Google Play Store in the coming months (stay tuned!).

Beyond navigation, rideshare drivers spend a lot of time in their vehicles and will benefit from safer interactions if their driver apps can be brought to the car’s screen. We are working with Lyft and Kakao Mobility to bring their driver app experiences into the car in the coming months.

image of car screen with gps map and Lyft logo

We are also pleased to announce that we are expanding support to all Points of Interest apps. Beyond charging and parking, this category covers any app that helps users discover and search for interesting locations on a map, and optionally lets them navigate to those points. We are partnering with MochiMochi, Fuelio, Prezzi Benzina, and NAVITIME JAPAN as our early access partners.

If you’re interested in joining our Early Access Program in the future, please fill out this interest form. You can get started with the Android for Cars App Library today, by visiting g.co/androidforcars.

2021 Assistant Recap

Posted by Jessica Dene Earley-Cha, Mike Bifulco and Toni Klopfenstein, Developer Relations Engineers for Google Assistant

We've reached the end of the year - and what a year it's been! Between all of our live (virtual) events including I/O, developer summits, meetups and more, there are a lot of highlights for App Actions, Smart Home Actions and Conversational Actions. Let's dive in and take a look.

App Actions

App Actions allows developers to extend their Android app to Google Assistant. App Actions integrates more cleanly with Android using new Android platform features. With the introduction of the beta shortcuts.xml configuration resource, the expansion of existing Android features, and our latest Google Assistant plugin, App Actions is moving closer to the Android platform.

App Actions Benefits:

  • Display app information on Google surfaces. Provide Android widgets for Assistant to display, offering inline answers, simple confirmations and brief interactions to users without changing context.
  • Launch features from Assistant. Connect your app's capabilities to user queries that match predefined semantic patterns (BII).
  • Suggest voice shortcuts from Assistant. Use Assistant to proactively suggest tasks for users to discover or replay, in the right context.

Core Integration

Capabilities is a new Android framework API that allows you to declare the types of actions users can take to launch your app and jump directly to performing a specific task. Assistant provides the first available concrete implementation of the capabilities API. You can use capabilities by creating a shortcuts.xml resource and defining them there. Each capability specifies two things: how it's triggered and what to do when it's triggered. To add a capability, you’ll need to select a built-in intent (BII); BIIs are pre-built language models that provide all the natural language understanding needed to map the user's input to individual fields. When a BII is matched by the user’s request, your capability will trigger an Android intent that delivers the understood BII fields to your app, so you can determine what to show in response.

To support a user query like “Hey Google, Find waterfall hikes on ExampleApp,” you can use the GET_THING BII. This BII supports queries that request an “item” and extracts the “item” from the user query as the parameter thing.name. The best use case for the GET_THING BII is to search for things in the app. Below is an example of a capability that uses the GET_THING BII:

<!-- This is a sample shortcuts.xml -->
<shortcuts xmlns:android="http://schemas.android.com/apk/res/android">
  <capability android:name="actions.intent.GET_THING">
    <intent
      android:action="android.intent.action.VIEW"
      android:targetPackage="YOUR_UNIQUE_APPLICATION_ID"
      android:targetClass="YOUR_TARGET_CLASS">
      <!-- Eg. name = "waterfall hikes" -->
      <parameter
        android:name="thing.name"
        android:key="name"/>
    </intent>
  </capability>
</shortcuts>
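
On the app side, the target activity receives the Android intent and reads the mapped parameter from its extras. A minimal sketch, assuming a hypothetical SearchActivity:

import android.os.Bundle
import androidx.appcompat.app.AppCompatActivity

// Hypothetical activity handling the GET_THING capability above; the BII field
// mapped with android:key="name" arrives as an intent extra.
class SearchActivity : AppCompatActivity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        val query = intent.getStringExtra("name") // e.g. "waterfall hikes"
        if (!query.isNullOrEmpty()) {
            showSearchResults(query) // app-specific search, not part of App Actions
        }
    }

    private fun showSearchResults(query: String) { /* ... */ }
}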

This framework integration is in the Beta release stage, and will eventually replace the original implementation of App Actions that uses actions.xml. If your app provides both the new shortcuts.xml and old actions.xml, the latter will be disregarded.

Learn how to add your first capability with this codelab.

Voice shortcuts

Google Assistant suggests relevant shortcuts to users during contextually relevant times. Users can see what shortcuts they have by saying “Hey Google, shortcuts.”

Shortcut for Google Assistant

You can use the Google Shortcuts Integration library, currently in beta, to push an unlimited number of dynamic shortcuts to Google to make your shortcuts visible to users as voice shortcuts. Assistant can suggest relevant shortcuts to users to help make it more convenient for the user to interact with your Android app.

Learn how to push your dynamic shortcuts to Assistant with our dynamic shortcuts codelab.
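
As a rough sketch of what that push looks like, assuming the androidx.core:core-google-shortcuts artifact is on the classpath; the shortcut ID, activity, and values are illustrative.

import android.content.Context
import android.content.Intent
import androidx.core.content.pm.ShortcutInfoCompat
import androidx.core.content.pm.ShortcutManagerCompat

// Illustrative: donate a dynamic shortcut that Assistant can surface as a voice shortcut.
fun pushWaterfallHikesShortcut(context: Context) {
    val shortcut = ShortcutInfoCompat.Builder(context, "waterfall_hikes")
        .setShortLabel("Waterfall hikes")
        .setIntent(
            Intent(Intent.ACTION_VIEW)
                .setClassName(context, "com.example.app.SearchActivity") // hypothetical activity
                .putExtra("name", "waterfall hikes")
        )
        // Bind the shortcut to the GET_THING capability declared in shortcuts.xml.
        .addCapabilityBinding("actions.intent.GET_THING", "thing.name", listOf("waterfall hikes"))
        .build()

    ShortcutManagerCompat.pushDynamicShortcut(context, shortcut)
}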

Example of App using Dynamic Shortcuts CodeLab Tool

Simple Answers, Hands Free & Android Auto

In situations where users need a hands-free experience, like on Android Auto, Assistant can display widgets to provide simple answers, brief confirmations, and quick interactive experiences in response to a user’s inquiry. These widgets are displayed within the Assistant UI, and to implement a fully voice-forward interaction with your app, you can arrange for Assistant to speak a response along with your widget, which is safe and natural for use in automobiles. A great re-engagement feature of widgets is that an “Add this widget” chip can be included, too!


Re-engagement

Another re-engagement tool is the In-App Promo SDK, now in beta, which lets you proactively suggest shortcuts in your app for actions that the user can repeat with a voice command to Assistant. The SDK allows you to check whether the shortcut you want to suggest already exists for that user and to prompt the user to create the suggested shortcut.

New Tooling

To support testing Capabilities, the Google Assistant plugin for Android Studio was launched. It contains an updated App Action Test Tool that creates a preview of your App Action, so you can test an integration before publishing it to the Play store.

New App Actions resources

Learn more with new or updated content:


Smart Home Actions

A big focus of this year's Smart Home launches was new and updated tools. At events like I/O, Works With: SiLabs, and the Google Smart Home Developer Summit, we shared these new resources to help you quickly build a high-quality smart home integration.

New Resources

To make implementing new features even easier for developers, we released many new tools to help you get your Smart Home Action up and running.

To help consumers discover Google-compatible smart home devices and associated routines, we released the smart home directory, accessible on the web and through the Google Home app.

We heard your requests for more ways to localize your integrations, so we added sample utterances in English (en-US), German (de-DE), and French (fr-FR) to several device guides. We also rolled out Chinese (zh-TW) as one of the supported languages for the overall platform. To make our documentation more accessible, we added a Japanese translation of our developer guides.

We also released several new device types and traits, along with new features to support your integrations, including proactive and follow-up responses, app discovery and deep linking.

Quality Improvements

For general onboarding, we've added three new codelabs to enable you to dive deeper into debugging and monitoring your projects. You can now walk through debugging smart home Actions, debugging local fulfillment Actions, and dig deeper into your log-based metrics for your Actions.

When you're actively developing your integration, the Google Home Playground can simulate a virtual home with configurable device types and traits. Here you can view the types and traits in Home Graph, modify device attributes, and share device configurations.

If you discover issues with your configuration, we've continued upgrading the monitoring and logging dashboards to show you detailed views of events with your integrations, as well as better guidance on how to handle errors and exceptions.

The WebRTC Validator Tool acts as a WebRTC peer to stream to or from, and generally emulates the WebRTC player on smart displays with Google Assistant. If you're specifically working with a smart camera, WebRTC is now supported on the CameraStream trait.

Local Home

To continue striving toward quality responses to user queries, we added support for local queries and responses to the Local Home SDK. Additionally, to help users onboard new devices in their homes quickly and use Google Nest devices as local hubs, we launched BLE Seamless Setup.

Matter

The new Google Home IDE enables you to improve your development process by enabling in-IDE access to Google Assistant Simulator, Cloud Logging, and more for each of your projects. This plugin is available for VSCode.

Finally, as we get closer to the official launch of the Matter protocol, we're working hard to unify all of our smart home ecosystem tools together under a single name - Google Home. The Google Home Developer Center will enable you to quickly find resources for integrating your Matter-compatible smart devices and platforms with Nest, Android, Google Home app, and Google Assistant.

Conversational Actions

Back in January 2021, we rolled out an updated Actions for Families program, which provides guidelines for teams building Actions meant for kids. Conversational Actions approved for the Actions for Families program get a special badge in the Assistant directory, which lets parents know that your Action is family-friendly.

During the What's New in Google Assistant keynote at Google I/O, Director of Product for the Google Assistant Developer Platform Rebecca Nathenson mentioned several coming updates and changes for Conversational Actions. This included the launch of a Developer Preview for a new client-side fulfillment model for Interactive Canvas. Client-side fulfillment changes the implementation strategy for Interactive Canvas apps, removing the need for a webhook relaying information between the Assistant NLU and their web application. This simplifies the infrastructure needed to deploy an action that uses Interactive Canvas. Since the release of this Developer Preview, we’ve been listening closely to developers to get feedback on client-side fulfillment.

Interactive Canvas Developer Tools

We also released Interactive Canvas Developer tools - a Chrome extension which can help dev teams mock and debug the web app side of Interactive Canvas apps and games. Best of all, it’s open source! You can install the dev tools from the Chrome Web Store, or compile them from source yourself on GitHub at actions-on-google/interactive-canvas-dev-tools.

Updates to SSML

Earlier this year we announced support for new SSML features in Conversational Actions. This expanded support lets you build more detailed and nuanced features using text-to-speech. We produced a short demonstration of SSML features on YouTube, and you can find more in our docs on SSML if you’re ready to dive in and start building.

Updates to Transaction UX for Smart Displays

Also announced at I/O for Conversational Actions - we released an updated workflow for completing transactions on smart displays. The new transaction process lets users complete transactions from their smart screens, by confirming the CVC code from their chosen payment method, rather than using a phone to enter a CVC code. If you’d like to get an idea of what the new process looks like, check out our demo video showing new transaction features on smart devices.

Tips on Launching your Conversational Action

Our guide, Driving a successful launch for Conversational Actions, contains helpful information to help you think through strategies for putting together a marketing team and a go-to-market plan for releasing your Conversational Action.

Looking forward to 2022

We're looking forward to another exciting year in 2022. To stay connected, sign up for our new App Actions email series or Google Home newsletter, or for the general Assistant newsletter.

As always, you can also join us on Reddit or follow us on Twitter. Happy Holidays!

Bringing richer navigation, charging, parking apps to more Android Auto users

Posted by Madan Ankapura, Product Manager

Illustration of car interior with map, parking and gas symbols

Today, we are releasing the beta of Android for Cars App Library version 1.1. Android Auto apps using features that require Car App API level 2 or higher, like map interactivity, vehicle hardware data, multiple-length text, and the long message and sign-in templates, can now be used in cars with Android Auto 6.7+ (previously, these features were limited to the Desktop Head Unit only).

Two Android Auto GIF examples. Left GIF is 2GIS and right GIF is TomTom

With this announcement, we are also completing the transition to Jetpack and will no longer be accepting submissions built with the closed source library (com.google.android.libraries.car.app). If you haven’t already, we encourage you to migrate to the AndroidX library now.

For the entire list of changes in beta01, please see the release notes. To start building your app for the car, check out our updated developer documentation, car quality guidelines and design guidelines.

If you’re interested in joining our Early Access Program to get access to new features early in the future, please fill out this interest form. You can get started with the Android for Cars App Library today, by visiting g.co/androidforcars.

Accessing car hardware APIs in your app for cars

Posted by Madan Ankapura, Product Manager

Building on our effort to enable developers to create app experiences across navigation, parking, and charging apps via the Android for Cars App Library as part of Jetpack, today we’re announcing the availability of the CarHardwareManager APIs as part of version 1.1 alpha02, to get developer feedback.

CarHardwareManager can be used to query the vehicle’s hardware data, such as model and make, fuel levels and other sensors. Currently, this feature is only available for Android Auto 6.7+ in the open-testing channel. Testing this in a desktop environment requires a new version of the Desktop Head Unit which will be released separately. Stay tuned here for details on when the new version becomes available.
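
As an illustrative sketch (not the library's official sample), fetching static vehicle info from a Screen might look like the following; CarValue fields can report an unavailable status on some head units.

import androidx.car.app.CarContext
import androidx.car.app.hardware.CarHardwareManager
import androidx.core.content.ContextCompat

// Illustrative: read the vehicle's make, model, and year via CarHardwareManager.
fun fetchVehicleModel(carContext: CarContext) {
    val carInfo = carContext.getCarService(CarHardwareManager::class.java).carInfo
    carInfo.fetchModel(ContextCompat.getMainExecutor(carContext)) { model ->
        val make = model.manufacturer.value
        val name = model.name.value
        val year = model.year.value
        // Use the values, e.g. to tailor charging-connector or parking suggestions.
    }
}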

For the entire list of changes in alpha02, please see the release notes. To start building your app for the car, check out our updated developer documentation, car quality guidelines and design guidelines. These library features are available for testing only with the Desktop Head Unit. We will announce when these features are available to run in cars in the future.

In addition, if you are a developer of a parking app, you can now integrate with Google Assistant to enable users to talk to Google to open their favorite parking app and find parking while driving.

If you’re interested in joining our Early Access Program in the future, please fill out this interest form. You can get started with the Android for Cars App Library today, by visiting g.co/androidforcars.

Improve your app mileage with Android for Cars App library

Posted by Madan Ankapura, Product Manager

In April, we announced our first version of the Android for Cars App Library as part of Jetpack, reaching a milestone to let developers publish their navigation, parking, charging apps on the Google Play Store.

Today, we’re announcing that version 1.1 is in alpha, which brings the following features to developers:

  • Sign-in template - Allows users to sign in to your app directly from the car screen while parked.
  • Long Message template - Allows you to show long messages like terms of service to users as part of the setup flow while parked (see the sketch below).
  • Multiple-length text - Different car screen sizes may show different amounts of text. We added an API you can use to specify multiple variants of a text string in select templates to fit different screen sizes.
  • Map Interactivity - You can now add capabilities such as zooming and panning to your navigation template.
Android for Cars App library
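
As a rough sketch of the new long message template with a parked-only action (the screen, text, and action are illustrative):

import androidx.car.app.CarContext
import androidx.car.app.Screen
import androidx.car.app.model.Action
import androidx.car.app.model.LongMessageTemplate
import androidx.car.app.model.ParkedOnlyOnClickListener
import androidx.car.app.model.Template

// Illustrative terms-of-service screen shown as part of a setup flow while parked.
class TermsScreen(carContext: CarContext) : Screen(carContext) {
    override fun onGetTemplate(): Template =
        LongMessageTemplate.Builder("Long terms of service text goes here...")
            .setTitle("Terms of Service")
            .setHeaderAction(Action.BACK)
            .addAction(
                Action.Builder()
                    .setTitle("Accept")
                    .setOnClickListener(ParkedOnlyOnClickListener.create { screenManager.pop() })
                    .build()
            )
            .build()
}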

For the entire list of changes, please see the release notes. To start building your app for the car, check out our updated developer documentation, car quality guidelines and design guidelines.

These library features are available for testing only with the Desktop Head Unit. We will announce when these features are available to run in cars in the future.

If you’re interested in joining our Early Access Program in the future, please fill out this interest form. You can get started with the Android for Cars App Library today, by visiting g.co/androidforcars.

What’s new with Android for Cars

Posted by Mickey Kataria, Director of Product Management

For over a decade, Google has been committed to automotive, with a vision of creating a safe and seamless connected experience in every car. Developers like all of you are a crucial part of helping people stay connected while on the go. We’re seeing strong momentum across our in-car experiences, Android Auto and Android Automotive OS, and today, we’re excited to share the latest updates and opportunities to reach users in the car.

Check out our I/O session: What's new with Android for Cars

Android Auto

Android Auto, which allows users to connect their phone to their car display, now has over 100 million compatible cars on the road and is supported by nearly every major car manufacturer. Porsche is our newest partner and they will begin shipping Android Auto on new cars, starting this summer with the Porsche 911.

We’ve been working closely with car manufacturers to build an even better Android Auto experience by enabling wireless projection in more vehicles, extending availability to more countries, and continuing to launch new features, like integration into the instrument cluster. To see some of the newest Android Auto technology in the BMW iX, check out the video below.

Android Auto projecting to the cluster display in a BMW iX.

Android Automotive OS

Our newest in-car experience, Android Automotive OS with Google apps and services built-in, also has strong momentum. With this experience, the entire infotainment system is powered by Android and users can access Google Assistant, Google Maps, and more apps from Google Play directly from the car screen without relying on a phone. Cars from Polestar and Volvo, like the Polestar 2 and the Volvo XC40 Recharge, are already available to customers. And by the end of 2021, this experience will be available to order in more than 10 car models from Volvo, General Motors and Renault. You can get a sneak peek of this customized experience in the new GMC HUMMER EV below.

The all-electric GMC HUMMER EV infotainment features Android Automotive OS with Google built-in. Preproduction model shown. Actual production models may vary. Initial availability Fall 2021.

Developing new apps for cars

To support this growing ecosystem, we recently made the Android for Cars App Library available as part of Jetpack. It allows developers of navigation, EV charging and parking apps to bring their apps to Android Auto compatible cars. Many of these developers have already published their Android Auto apps to the Play Store and we’re now extending this library to also support Android Automotive OS, making it easy for you to build once and generate apps that are compatible with both platforms. We’re already working with Early Access Partners — including Parkwhiz, Plugshare, Sygic, ChargePoint, Flitsmeister, SpotHero and others — to bring apps in these categories to cars powered by Android Automotive OS.

Android for cars

PlugShare, an app for finding EV chargers, has used the Android for Cars App Library and Google Assistant App Actions to build for Android Auto.

We plan to expand to more app categories in the future, so if you’re interested in joining our Early Access Program, please fill out this interest form. You can also get started with the Android for Cars App Library today, by visiting g.co/androidforcars. Lastly, you can always get help from the developer community at Stack Overflow using the android-automotive and android-auto tags. We can’t wait to see what you build next!

Start Your Engines: Launch New Android Auto Apps to Production!

Posted by Eric Bahna, Product Manager

In March, we published the Android for Cars App Library as part of Jetpack and most developers have already migrated their implementations to it! In addition to fantastic partner adoption, drivers have been enthusiastic about the new apps and our quality metrics have been positive.

Partner apps running on the Jetpack library (clockwise from upper left): T map, Chargepoint, Sygic, PlugShare, AmiGO, 2GIS, A Better Route Planner, and Flitsmeister

Today, we’re thrilled to announce that you can publish your Android Auto navigation, parking, and charging apps to production! We’ve been hard at work stabilizing the library, Android Auto, and the publishing process to reach this milestone. Publishing to production enables drivers to use your Android Auto app on their car screen without needing to sign up for a beta program. Here’s how:

Thank you for your collaboration and feedback on the Android for Cars App Library 1.0! One of the most common requests from Android Auto users has been for more categories of apps. Our goal with the library is to enable you to easily bring your app to 500+ models of Android Auto-compatible vehicles while meeting our app quality guidelines. The library abstracts away the complexities of screen form factors and input modes so you can focus on what makes your app shine.

Enabling navigation, parking, and charging apps in production is both a big step and the start of a much longer journey. We’re excited to see what you build and look forward to working together to deliver awesome in-car experiences.

Android Auto Apps Powered by Jetpack

Posted by Eric Bahna, Product Manager

In January, we enabled the Google Play Store to accept open testing submissions of navigation, parking, and charging apps. It’s great to see many of you developing Android Auto apps and sending us feedback through the issue tracker. Thank you for helping us improve the platform so we deliver better in-car experiences together! Drivers have been sending positive feedback, too, as new apps launch to open testing, like Chargemap.

Chargemap in Android Auto

Today, we’ve reached the next milestone: the Android for Cars App Library is available in Jetpack as androidx.car.app 1.0.0-beta01! The move to Jetpack makes the library open source, gives you more visibility into our feature development, and provides API consistency with other Jetpack libraries. We’ve updated the developer guide and design guidelines to cover androidx.car.app. Test your app with Android Auto 6.1 or later, and then you can publish your app to open testing in the Google Play Store. androidx.car.app includes all the functionality of the closed source library (com.google.android.libraries.car.app), and then some! For example, we added a new GridTemplate, which is useful when users rely primarily on images to make their selections.

Examples of the new GridTemplate in androidx.car.app
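
As a rough idea of how the new template is assembled (the screen name, titles, and icon are illustrative, not taken from the screenshots above):

import androidx.car.app.CarContext
import androidx.car.app.Screen
import androidx.car.app.model.Action
import androidx.car.app.model.CarIcon
import androidx.car.app.model.GridItem
import androidx.car.app.model.GridTemplate
import androidx.car.app.model.ItemList
import androidx.car.app.model.Template

// Illustrative image-first grid built with the new GridTemplate.
class StationGridScreen(carContext: CarContext) : Screen(carContext) {
    override fun onGetTemplate(): Template {
        val item = GridItem.Builder()
            .setTitle("Downtown Garage")
            .setImage(CarIcon.APP_ICON) // typically a custom CarIcon built from an app drawable
            .setOnClickListener { /* push a details screen */ }
            .build()

        return GridTemplate.Builder()
            .setTitle("Nearby stations")
            .setHeaderAction(Action.BACK)
            .setSingleList(ItemList.Builder().addItem(item).build())
            .build()
    }
}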

On September 1, 2021, the closed source Android for Cars App Library (com.google.android.libraries.car.app) will no longer be available, and the Google Play Store will not accept submissions that use com.google.android.libraries.car.app. Our development focus from now on, including new features, is on androidx.car.app. We encourage you to migrate now, and we’ve created a migration guide that makes it easy. In our experience with early access partners, migration usually takes less than a day.

We’re working hard to stabilize androidx.car.app and prepare the Google Play Store for production submissions. Production submissions will require androidx.car.app and you can get your app ready by using it in open testing today.