
#WeArePlay | Meet app founders helping people around the world

Posted by Leticia Lago, Developer Marketing

There are millions of apps available on Google Play, created by thousands of founders across the world. Every app is unique and special in its own right, but they all have one thing in common - their purpose is to help. From helping motorhome enthusiasts find somewhere to camp, to helping small business owners manage their finances and waste pickers earn a reliable income - in this latest batch of #WeArePlay stories, we celebrate app founders who are helping people across the world in extraordinarily different ways.

First we begin with Cristian. Originally from Villa Rica in southern Chile, he made his family very proud by being the first to go to university. During his studies in Santiago, he learned about the local waste pickers – people who make an income by searching through trash cans and finding valuable materials to sell. Despite his mother’s wishes, he was so motivated to help them that he dropped out of university and dedicated all his time to creating an app. Reciclapp works by helping waste pickers connect with local businesses, so they can collect resellable materials directly from them. So far, the app has helped waste pickers across the city save time and guarantee a more reliable income. As Cristian has grown his company to a team of 12 and expanded into Mexico, his mother is now very proud of his bravery and success.

Next, Kennedy and Duke. When they were children, their father’s business sadly failed because managing his finances and tracking spending was too hard. Years later, after a successful career abroad in tech, Kennedy decided it was time to return to his homeland of Nigeria and build his own company. Inspired by his father’s struggle, he partnered with brother Duke and travelled across the country to interview other business owners about their financial struggles. Using this research, they created Kippa - the app simplifies bookkeeping to make sending invoices, storing receipts and setting up a bank account easy. It’s now used by over half a million businesses in Nigeria, as Kennedy mentions “without Google Play, we couldn't help as many business owners”.

To round up today, Gijs and Eefje. The couple adore renting campervans and travelling around to explore the natural beauty of Europe, but they always seemed to struggle with one thing - easily finding places to stay. Feeling like nothing out there could help them, they decided to give app development a go and create Campy. The app works as a digital camping encyclopaedia: helping like-minded campervan enthusiasts discover the perfect spots to set up camp, plan their trips and meet others who love the outdoors. A few years after Campy launched, Gijs and Eefje now have 2 little girls to bring on their big adventures, and are elated with the feedback they have received - “it never ceases to amaze me what a tiny app can do for so many people”.

Check out all the stories from around the world at g.co/play/weareplay and stay tuned for more coming soon.





Latest updates on Android’s custom ML stack

Posted by The Android ML Platform Team

The use of on-device ML in Android is growing faster than ever thanks to its unique benefits over server-based ML, such as offline availability, lower latency, improved privacy, and lower inference costs.

When building on-device ML based features, Android developers usually have a choice between two options: using a production-ready SDK that comes with pre-trained and optimized ML models, such as ML Kit, or, if they need more control, deploying their own custom ML models and features.

Today, we have some updates on Android’s custom ML stack - a set of essential APIs and services for deploying custom ML features on Android.


TensorFlow Lite in Google Play services is now Android’s official ML inference engine

We first announced TensorFlow Lite in Google Play services in Early Access Preview at Google I/O '21 as an alternative to standalone TensorFlow Lite. Since then, it has grown to serve billions of users every month via tens of thousands of apps. Last month we released the stable version of TensorFlow Lite in Google Play services and are excited to make it the official ML inference engine on Android.

Using TensorFlow Lite in Google Play services not only lets you save on binary size and benefit from performance improvements via automatic updates, but also ensures that you can easily integrate with future APIs and services from Android’s custom ML stack, as they will be built on top of our official inference engine.

If you are currently bundling TensorFlow Lite with your app, check out the documentation to migrate.
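
In practice, migrating is mostly a dependency swap plus a one-time runtime initialization. Below is a minimal sketch based on the migration guidance; the version numbers are placeholders and your exact set of artifacts (for example, the support or GPU artifacts) may differ.

// app/build.gradle.kts - illustrative only; use versions that match your project
dependencies {
    // Before: the standalone, bundled TensorFlow Lite runtime
    // implementation("org.tensorflow:tensorflow-lite:2.10.0")

    // After: the TensorFlow Lite runtime provided by Google Play services
    implementation("com.google.android.gms:play-services-tflite-java:16.0.0")
    // Optional helpers for working with tensors, images, and models
    implementation("com.google.android.gms:play-services-tflite-support:16.0.0")
}

At runtime, initialize the Play services runtime once (for example, with TfLite.initialize(context), which returns a Task you can await) before creating an interpreter through InterpreterApi; see the migration documentation for the exact calls that apply to your setup.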

TensorFlow Lite Delegates now distributed via Google Play services

Released a few years ago, the GPU and NNAPI delegates let you leverage the processing power of specialized hardware such as GPUs, DSPs, or NPUs. Both the GPU and NNAPI delegates are now distributed via Google Play services.
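
To illustrate the general pattern, here is a minimal sketch using the standalone GPU delegate API (which requires the tensorflow-lite-gpu dependency) with a bundled Interpreter. The Play services distribution follows the same idea but enables the delegate through its own initialization options, so treat this purely as an example and check the delegate documentation for your setup.

import org.tensorflow.lite.Interpreter
import org.tensorflow.lite.gpu.CompatibilityList
import org.tensorflow.lite.gpu.GpuDelegate
import java.nio.MappedByteBuffer

// Illustrative sketch: run the model on the GPU when the device supports it,
// and fall back to the default CPU path otherwise. `model` is assumed to be
// your .tflite model already mapped into memory. In real code, remember to
// close the delegate and interpreter when you are done with them.
fun createInterpreter(model: MappedByteBuffer): Interpreter {
    val options = Interpreter.Options()
    val compatList = CompatibilityList()
    if (compatList.isDelegateSupportedOnThisDevice) {
        options.addDelegate(GpuDelegate(compatList.bestOptionsForThisDevice))
    }
    return Interpreter(model, options)
}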

We are also aware that, for advanced use cases, some developers want to use custom delegates directly. We’re working with our hardware partners on expanding access to their custom delegates via Google Play services.

Acceleration Service will help you pick the best TensorFlow Lite Delegate for optimal performance at runtime

Identifying the best delegate for each user can be a complex task on Android due to hardware heterogeneity. To help you overcome this challenge, we are building a new API that allows you to safely optimize the hardware acceleration configuration at runtime for your TensorFlow Lite models.

We are currently accepting applications for early access to the Acceleration Service and aim for a public launch early next year.

We will keep investing in Android’s custom ML stack

We are committed to providing the essentials for high performance custom on-device ML on Android.

As a summary, Android’s custom ML stack currently includes:

  • TensorFlow Lite in Google Play services for high-performance on-device inference
  • TensorFlow Lite Delegates for accessing hardware acceleration

Soon, we will release an Acceleration Service, which will help pick the optimal delegate for you at runtime.

You can read about and stay up to date with Android’s custom ML stack at developer.android.com/ml.

New features and tools to help you showcase your Play Store listing

Posted by Allison Chang (Product Manager, Google Play), Weifang Sun (Product Manager, Chrome OS), Manuel Wang (Product Manager, Google Play Console), and Marcus Leal (Product Manager, Google Play)

Your Play Store listing is the best way to help prospective users understand the functionality and value of your app. The assets and information you provide - descriptions, images, and videos - are essential to users deciding what to download.

Tailoring your app's assets to each form factor is more important than ever, as users are increasingly investing in connected devices beyond their phones, such as tablets, smart watches, and TVs. In fact, the number of active non-mobile Android devices has grown almost 30% in the last year.

Today, we’re announcing new features that put more of your store listing assets front and center in Google Play. We'll also walk through some best practices to help you optimize your listing and generate meaningful installs for your app.
 

Changes on Large Screens

New Content-Forward Formats on Play Homepages

On large screens like tablets, foldables, and Chromebooks, we’re continuing to make improvements that will enable users to discover the best apps for their devices. As we showcased at I/O earlier this year, we’re redesigning the Play Store for large screens and using your screenshots, videos, and descriptions directly in Apps and Games Home.

[Image: Play Store homepage for large screens (2023)]

The goal of this content-forward approach is to better represent your app in the store and help users make install decisions.

We’ve published a set of content quality guidelines as best practices to showcase your app on large screens. Beginning early next year, apps with assets that follow these criteria will be able to take advantage of richer formats in Play. This won’t impact your app’s promotability, just the way your app is displayed in the Play Store.

Screenshot Support for ChromeOS

When users browse the Play Store on Chromebooks today, they see tablet or phone screenshots in the app’s store listing page. Since this does not always accurately portray the Chromebook experience, we’re now launching the ability to upload Chromebook-specific screenshots in Play Console.

[Image: Chromebook screenshots in Play Developer Console]

You can upload up to 8 Chromebook screenshots, which will be shown primarily on the Play Store for Chromebooks. These screenshots will appear on both your app listing page and Play homepages.

We recommend 16:9 landscape screenshots with dimensions between 1080px and 7680px.

To get started, visit the Main Store Listing section in Play Console.

Updates to Tablet Screenshot Guidelines

With the launch of ChromeOS screenshot support, we’re also updating our quality guidelines for tablets for consistency across large screens. While previously uploaded tablet screenshots will not be affected, this should simplify the process of generating new screenshots when you update your app.


Changes on Phones

Homepages for Other Devices

Last month, we introduced form-factor-specific homepages: a dedicated surface on phones for users who have additional non-mobile devices. These homepages improve the visibility of your app and store listing details by allowing users to browse for titles best suited for their smart watches, TVs, or cars - all from their phones.

[Image: Homepages for other devices]

Search Device Filters and Remote Install

Users can also filter results in search with a new device filter in Play. With the filter enabled, search results will only include titles that are compatible with the selected device.


[Image: Device search filters]
[Image: Remote install to other devices]

Store Listing Best Practices

Since these changes will make your store listing details much more prominent in Play, here are some ways to help you optimize your app assets:

Use device-specific screenshots that demonstrate the core app or game experience.

In Play Console, you can upload screenshots to show users how your app or game will look on different device types and highlight unique form factor features. When choosing screenshots, use imagery that conveys the primary user flows within your app. This will help users on all devices anticipate what the true app or game experience will be like for them.

Use device imagery with caution

Showing a physical device in your store listing may cause your screenshots and videos to become obsolete quickly or alienate some users. To save time maintaining your assets, use screenshots and videos of just the app or game experience.


Use high-quality images with the proper aspect ratio and resolution

Using high-quality images is essential to ensuring your screenshots look great on all screen sizes. Don’t include screenshots that are pixelated, stretched or compressed, or improperly rotated.

Avoid overloading assets with text

To make sure your screenshots and videos look great when featured on Play homepages, avoid using too much text. Since we may resize your assets to fit certain screen sizes, keeping text to a minimum prevents it from being cut off unintentionally.

If you need to use text, avoid any time-sensitive copy that needs to be updated frequently.

As we continue to test ways to feature your store listing information more prominently in Play, the quality of your assets remains as important as ever. We hope these features and tips empower you to showcase the best of your app on all device types. For more tips to help you get started, visit our content quality guidelines.

Bringing passkeys to Android & Chrome

Posted by Diego Zavala, Product Manager (Android), Christiaan Brand, Product Manager (Account Security), Ali Naddaf, Software Engineer (Identity Ecosystems), Ken Buchanan, Software Engineer (Chrome)

Explore passkeys on Android & Chrome starting today

Starting today, Google is bringing passkey support to both Android and Chrome.

Passkeys are a significantly safer replacement for passwords and other phishable authentication factors. They cannot be reused, don't leak in server breaches, and protect users from phishing attacks. Passkeys are built on industry standards and work across different operating systems and browser ecosystems, and can be used for both websites and apps.

Passkeys follow already familiar UX patterns, and build on the existing experience of password autofill. For end-users, using one is similar to using a saved password today, where they simply confirm with their existing device screen lock such as their fingerprint. Passkeys on users’ phones and computers are backed up and synced through the cloud to prevent lockouts in the case of device loss. Additionally, users can use passkeys stored on their phone to sign in to apps and websites on other nearby devices.

Today’s announcement is a major milestone in our work with passkeys, and enables two key capabilities:

  1. Users can create and use passkeys on Android devices, which are securely synced through the Google Password Manager.
  2. Developers can build passkey support on their sites for end-users using Chrome via the WebAuthn API, on Android and other supported platforms.

To try this today, developers can enroll in the Google Play Services beta and use Chrome Canary. Both features will be generally available on stable channels later this year.

Our next milestone in 2022 will be an API for native Android apps. Passkeys created through the web API will work seamlessly with apps affiliated with the same domain, and vice versa. The native API will give apps a unified way to let the user pick either a passkey or a saved password. Seamless, familiar UX for both passwords and passkeys helps users and developers gradually transition to passkeys.

Signing in to a website on an Android device with a passkey

For the end-user, creating a passkey requires just two steps: (1) confirm the passkey account information, and (2) present their fingerprint, face, or screen lock when prompted.

 

Signing in is just as simple: (1) The user selects the account they want to sign in to, and (2) presents their fingerprint, face, or screen lock when prompted.

 

Signing in to a website on a nearby computer with a passkey on an Android device

A passkey on a phone can also be used to sign in on a nearby device. For example, an Android user can now sign in to a passkey-enabled website using Safari on a Mac. Similarly, passkey support in Chrome means that a Chrome user, for example on Windows, can do the same using a passkey stored on their iOS device.

Since passkeys are built on industry standards, this works across different platforms and browsers - including Windows, macOS and iOS, and ChromeOS, with a uniform user experience.

We will continue to do our part for a passwordless future

We have worked with others in the industry, including Apple and Microsoft, and members of the FIDO Alliance and the W3C, to drive secure authentication standards for years. We have shipped support for the W3C WebAuthn and FIDO standards since their inception.

Today is another important milestone, but our work is not done. Google remains committed to a world where users can choose where their passwords, and now passkeys, are stored. Please stay tuned for more updates from us in the next year as we introduce changes to Android, enabling third party credential managers to support passkeys for their users.

Announcing an Experimental Preview of Jetpack Multiplatform Libraries

Posted by Márton Braun, Developer Relations Engineer

Since we announced Kotlin support for Android in 2017, developers have been excited about writing their Android apps using Kotlin. We’ve continuously expanded this support for the language over the years, going Kotlin-first with Jetpack libraries and documentation, and then further investing into Kotlin with Jetpack Compose. We’ve also seen the interest of the community in Kotlin’s multiplatform capabilities.

Kotlin Multiplatform Mobile from JetBrains is now in beta, and we have been experimenting with this technology to see how it can enable code sharing across platforms. As part of these experiments, we are now sharing a preview of Kotlin Multiplatform libraries in Jetpack.

The libraries available for multiplatform as part of this experimental preview are Collections and DataStore. These were chosen because they let us evaluate several important aspects of converting an existing library to multiplatform:

  • Collections is an example of a library written in the Java programming language that has no Android-specific dependencies, but implements Java collection APIs.
  • DataStore is written entirely in Kotlin, and it uses coroutines in both its implementation and APIs. It also depends on Java IO and Android platform APIs.

With this preview, we’re looking for your feedback about using these Jetpack libraries in multiplatform projects targeting Android and iOS applications. Keep in mind that these dev builds are experimental and should not be used in production. They are published outside the regular release cycle of these libraries, and they are not guaranteed to graduate to stable.

The libraries are available from Google’s Maven repository. To start using them, add the following dependencies to your Kotlin Multiplatform project:

val commonMain by getting {
  dependencies {
      implementation("androidx.collection:collection:1.3.0-dev01")

      // Lower-level APIs with support for custom serialization
      implementation("androidx.datastore:datastore-core-okio:1.1.0-dev01")
      // Higher-level APIs for storing values of basic types
      implementation("androidx.datastore:datastore-preferences-core:1.1.0-dev01")
  }
}

You can learn more about the available APIs by checking out our sample app, which uses DataStore on Android and iOS, or by browsing the preview API reference documentation for both libraries.
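
To give a flavor of what sharing these APIs looks like, here is a small, hedged sketch of code that could live in commonMain once the Collections dependency above is added. The types come from androidx.collection; exactly which of them are available in this experimental preview may vary, so treat this as an illustration rather than a guaranteed API surface.

// commonMain - illustrative only
import androidx.collection.ArraySet
import androidx.collection.LruCache

// A tiny cross-platform cache of recently viewed item titles, keyed by id.
class RecentItems(maxEntries: Int = 10) {
    private val titlesById = LruCache<Long, String>(maxEntries)
    private val pinnedIds = ArraySet<Long>()

    fun remember(id: Long, title: String) {
        titlesById.put(id, title)
    }

    fun titleFor(id: Long): String? = titlesById.get(id)

    fun pin(id: Long) {
        pinnedIds.add(id)
    }

    fun isPinned(id: Long): Boolean = pinnedIds.contains(id)
}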

To provide feedback about your experience with the multiplatform Jetpack libraries, or to show your interest in Kotlin Multiplatform, join the conversation in the Kotlinlang #multiplatform channel. You can also open bugs on the issue tracker for DataStore or for Collections.

*Java is a trademark or registered trademark of Oracle and/or its affiliates.

The new Google Pixel Watch is here – start building for Wear OS!

Posted by the Android Developers Team

If you caught yesterday's Made by Google event, then you saw the latest devices in the Pixel portfolio. Besides the Pixel 7 and Pixel 7 Pro phones, we wanted to showcase two of the latest form factors: the Google Pixel Tablet1 (Google's brand new tablet, coming in 2023), and the latest device powered with Wear OS by Google: the Google Pixel Watch! As consumers begin to preorder the watch, it's an especially great time to prepare your app so it looks great on all of the new watches that consumers will get their hands on over the holidays. Discover the latest updates to Wear OS, how apps like yours are upgrading their experiences, and how you can get started building a beautiful, efficient Wear OS app.

Here’s What’s New in Wear OS

The Google Pixel Watch is built on Wear OS and includes the latest updates to the platform, Wear OS 3.5. This version of Wear OS is also available on some of your other favorite Wear OS devices! The new Wear OS experience is designed to feel fluid and easy to navigate, bringing users the information they need with a tap, swipe, or voice command. With a refreshed UI and rich notifications, your users can see even more at a glance.

To take advantage of building on top of all of these new features, earlier this year we released Compose for Wear OS, our modern declarative UI toolkit designed to help you get your app running with fewer development hours - and fewer lines of code. It's built from the bottom up with Kotlin, and it moved to 1.0 earlier this year, meaning the API is stable and ready for you to get building. Here's what's in the 1.0 release (with a short illustrative sketch after the list):

  • Material: The Compose Material catalog for Wear OS already offers more components than are available with View-based layouts. The components follow material styling and also implement material theming, which allows you to customize the design for your brand.
  • Declarative: Compose for Wear OS leverages Modern Android Development and works seamlessly with other Jetpack libraries. Compose-based UIs in most cases result in less code and accelerate the development process as a whole. Read more.
  • Interoperable: If you have an existing Wear OS app with a large View-based codebase, it's possible to gradually adopt Compose for Wear OS by using the Compose Interoperability APIs rather than having to rewrite the whole codebase.
  • Handles different watch shapes: Compose for Wear OS extends the foundation of Compose, adding a DSL for all curved elements to make it easy to develop for all Wear OS device shapes: round, square, or rectangular with minimal code.
  • Performance: Each Compose for Wear OS library ships with its own baseline profiles that are automatically merged and distributed with your app’s APK and are compiled ahead of time on device. In most cases, this achieves app performance for production builds that is on-par with View-based apps. However, it’s important to know how to configure, develop, and test your app’s performance for the best results. Learn more.
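
To make this concrete, here is a minimal, hedged sketch of a Compose for Wear OS screen using a few components from the 1.0 Material catalog (Scaffold, TimeText, ScalingLazyColumn, and Chip). It is an illustration rather than a complete app, and the package locations of some components have shifted in later releases.

import androidx.compose.foundation.layout.fillMaxWidth
import androidx.compose.runtime.Composable
import androidx.compose.ui.Modifier
import androidx.wear.compose.material.Chip
import androidx.wear.compose.material.Scaffold
import androidx.wear.compose.material.ScalingLazyColumn
import androidx.wear.compose.material.Text
import androidx.wear.compose.material.TimeText

// Illustrative screen: a scrolling list of tappable chips with the time shown on top.
@Composable
fun TaskListScreen(tasks: List<String>, onTaskClick: (String) -> Unit) {
    Scaffold(timeText = { TimeText() }) {
        // ScalingLazyColumn scales and fades items toward the edges of round screens.
        ScalingLazyColumn {
            items(tasks.size) { index ->
                Chip(
                    modifier = Modifier.fillMaxWidth(),
                    label = { Text(tasks[index]) },
                    onClick = { onTaskClick(tasks[index]) }
                )
            }
        }
    }
}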

Another exciting update for Wear OS is the launch of the Tiles Material library to help you build tiles more quickly. The Tiles Material library includes pre-built Material components and layouts that embrace the latest Material Design for Wear OS. This easy-to-use library includes components for buttons, progress arcs, and more - saving you the time of building them from scratch. Plus, with the pre-built layouts, you can kickstart your tiles development knowing your layout follows the Material Design guidelines for how tiles should be formatted.

Finally, in the recently released Android Studio Dolphin, we added a range of Wear OS features to help get your apps, tiles, and watch faces ready for all of the Wear OS 3 devices. With an updated Wear OS Emulator Toolbar, an intuitive Pairing Assistant, and the new Direct Surface Launch feature to quickly test watch faces, tiles, and complications, it's now simpler and more efficient than ever to make great apps for Wear OS.

Get Inspired with New App Experiences

Apps like yours are already providing fantastic experiences for Wear OS, from Google apps to others like Spotify, Strava, Bitmoji, adidas Running, MyFitnessPal, and Calm. This year, Todoist, PeriodTracker, and Outdooractive all rebuilt their app with Compose - taking advantage of the tools and APIs that make building their app simpler and more efficient; in fact, Outdooractive found that using Compose for Wear OS cut development time by 30% for their team.

With the launch of the Google Pixel Watch, we are seeing fantastic new experiences from Google apps - using the new hardware features as another way to provide an exceptional user experience. Google Photos now allows you to set your favorite picture as your watch face on the Google Pixel Watch, which has 19 customizable watch faces, each with many personalization options. Google Pixel Watch users can also interact with their favorite apps through the Wear OS app or by leveraging the built-in Google Assistant integration. For example, with Google Home’s latest updates, users can easily control their smart home devices through the Wear OS app or by saying “Hey Google” to their watch to do everything from adjusting the thermostat to getting notifications from their Nest doorbell when a person or package is at the door2.

Health and fitness apps have a lot of opportunity with the latest Wear OS platform and hardware updates. Google Pixel Watch includes Fitbit’s amazing health and fitness features, including accurate heart rate tracking with on-device machine learning and deep optimization down to the processor level. Users can get insights into key metrics like breathing rate, heart rate variability, sleep quality and more right on their Google Pixel Watch. With this improved data, there are more opportunities for health and fitness apps to provide meaningful insights and experiences for their users.

The updates and improvements from Wear OS and the Google Pixel Watch make building differentiated app experiences more tangible. Apps are using those capabilities to excite and delight users and so can you.

Get started

The Google Pixel Watch is the latest addition to an already incredible Wear OS device ecosystem. From improved APIs and tools to exciting new hardware, there is no time like the present to get started on your Wear OS app. To begin developing with Compose for Wear OS, get started on our curated learning pathway for a step-by-step learning journey. Then, check out the documentation, including a quick start guide, and get hands-on experience with the Compose for Wear OS codelab!

Discover even more with the Wear OS session from Google I/O and hear the absolute latest and greatest from Wear OS by tuning into the keynote and technical sessions at the upcoming Android Developer Summit!

Want to learn more about all the Made by Google announcements? Check out the official blog here. Plus, get started with another exciting form factor coming to the Pixel ecosystem, the Google Pixel Tablet, by optimizing your app for tablets!

Disclaimers:

1. The Google Pixel Tablet has not been authorized as required by the rules of the Federal Communications Commission or other regulators. This device may not be sold or otherwise distributed until required legal authorizations have been obtained. 
2. Requires compatible smart home devices (sold separately).

Todoist adopted Compose for Wear OS and increased its growth rate by 50%

Posted by The Android Developers Team

Todoist is the world’s top task and time management app, empowering over 30 million people to organize, plan, and collaborate on projects big and small. As a company, Todoist is committed to creating more fulfilling ways for its users to work and live—which includes access to its app across all devices.

That’s why Todoist developers adopted Compose for Wear OS to completely rebuild its app for wearables. This new UI toolkit gives developers the same easy-to-use suite that Android has made available for other devices, allowing for efficient, manageable app development.

A familiar toolkit optimized for Wear OS

Developers at Todoist already had experience with Jetpack Compose for Android mobile, which allowed them to quickly familiarize themselves with Compose for Wear OS. “When the new Wear design language and Compose for Wear OS were announced, we were thrilled,” said Rastislav Vaško, head of Android for Todoist. “It gave us new motivation and an opportunity to invest in the future of the platform.”

As with Jetpack Compose for mobile, developers can integrate customizable components directly from the Compose for Wear OS toolkit, allowing them to write code and implement design requirements much faster than with the View-based layouts they used previously. With the available documentation and hands-on guidance from the Compose for Wear OS codelab, they were able to translate their prior toolkit knowledge to the wearable platform.

“Compose for Wear OS had almost everything we needed to create our layouts,” said Rastislav. “Swipe-dismiss, TimeText, and ScalingLazyList were all components that worked very well out of the box for us, while still allowing us to make a recognizable and distinct app.” For features that were not yet available in the toolkit, the Todoist team used Google’s Horologist - a group of open-source libraries that provide Wear OS developers with commonly needed features not yet available in the platform toolkit. From there, they used the Compose Layout Library to incorporate the fade away modifier that matched the native design guidelines.

Compose for Wear OS shifts development into overdrive

Compose for Wear OS simplifies UI development for Wear OS, letting engineers create complex screens that are both readable and maintainable because of its rich Kotlin syntax and modern declarative approach. This was a significant benefit for the production of the new Todoist application, enabling developers to achieve more in less time.

The central focus of the overhaul was to redesign all screens and interactions to conform with the latest Material Design for Wear OS. Using Compose for Wear OS, Todoist developers shifted away from WearableDrawerLayout in favor of a flatter app structure. This switch followed Material Design for Wear OS guidance and modernized the application’s layout.

Todoist developers designed each screen specifically for Wear OS devices, removing unnecessary elements that complicated the user experience.

“For wearables, we’re always thinking about what we can leave out, to keep only streamlined, focused, and quick interactions,” Rastislav said. Compose for Wear OS helped the Todoist team tremendously with both development and design, allowing them to introduce maintainable implementation while providing a consistent user experience.

"Since we rebuilt our app with Compose for Wear OS, Todoist’s growth rate of installations on Google Play increased by 50%." 

An elevated user and developer experience

The developers at Todoist rapidly and efficiently created a refreshed application for Wear OS using Jetpack Compose. The modern tooling; intuitive APIs; and host of resources, documentation, and samples made for a smooth design and development process that required less code and accelerated the delivery of a new, functional user experience.

Since the app was revamped, the growth rate for Todoist installs on Google Play has increased 50%, and the team has received positive feedback from internal teams and on social media.

The team at Todoist is looking forward to discovering what else Compose for Wear OS can do for its application. They saw the refresh as an investment in the future of wearables and are excited for the additional opportunities and feature offerings provided by devices running Wear OS 3.

Transform your app with Compose for Wear OS

Todoist completely rebuilt and redesigned its Wear OS application with Compose for Wear OS, improving both the user and developer experience.

Learn more about Jetpack Compose for Wear OS.

Prepare your Android Project for Android Gradle plugin 8.0 API changes

Posted by Wojtek Kaliciński, Boris Farber, Android Developer Relations Engineers, and Xavier Ducrohet, Android Studio Tech Lead

To improve build speed and provide stable APIs, the Transform APIs will be removed in Android Gradle plugin (AGP) version 8.0. Most use cases have replacement APIs which are available starting from AGP version 7.2. Read on for more details.

The Android developer community's top request has been to improve build speed while making sure Android Gradle plugin (AGP) has a solid, stable, and well supported API.

To improve build speed starting from AGP 7.2, we have stabilized the Artifacts API and updated the Instrumentation API. For common use cases, these APIs replace the Transform APIs, which cause longer build times and are gone in AGP 8.0.

This article walks you through transitioning off the Transform APIs, whether you're working on a Gradle plugin or an application.

Guidance for Gradle plugins

To improve build times, we split Transform's functionality into the following APIs that are optimized for common use cases:

  • The Instrumentation API lets you transform and analyze compiled app classes using ASM callbacks. For example, use this API to add custom traces to methods or classes for additional or custom logging.
  • The Artifacts API gives access to files or directories, whether temporary or final, that are produced by AGP during the build. Use this API to:
    • Add additional generated classes to the app, such as glue code for dependency injection.
    • Implement transformations based on whole program analysis, when all classes can be transformed together in a single task. This is only available starting from AGP 7.4.0-alpha06. The build.gradle.kts file in the “modifyProjectClasses” Gradle recipe shows how to do it.

For examples of how to use the replacement APIs, see the Transform API update note and our Gradle recipes.
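
As a rough, hedged sketch of the Instrumentation API, a plugin can register an ASM class visitor factory per variant along these lines. The factory and its filtering logic here are hypothetical stand-ins for real instrumentation; see the Gradle recipes for complete, supported samples.

import com.android.build.api.instrumentation.AsmClassVisitorFactory
import com.android.build.api.instrumentation.ClassContext
import com.android.build.api.instrumentation.ClassData
import com.android.build.api.instrumentation.FramesComputationMode
import com.android.build.api.instrumentation.InstrumentationParameters
import com.android.build.api.instrumentation.InstrumentationScope
import com.android.build.api.variant.ApplicationAndroidComponentsExtension
import org.gradle.api.Plugin
import org.gradle.api.Project
import org.objectweb.asm.ClassVisitor
import org.objectweb.asm.Opcodes

// Hypothetical factory: wraps each class in a visitor that just logs the class
// name as it is instrumented (a stand-in for real ASM transformations).
abstract class LoggingClassVisitorFactory :
    AsmClassVisitorFactory<InstrumentationParameters.None> {

    override fun createClassVisitor(
        classContext: ClassContext,
        nextClassVisitor: ClassVisitor
    ): ClassVisitor = object : ClassVisitor(Opcodes.ASM9, nextClassVisitor) {
        override fun visit(
            version: Int, access: Int, name: String?, signature: String?,
            superName: String?, interfaces: Array<out String>?
        ) {
            println("Instrumenting class: $name")
            super.visit(version, access, name, signature, superName, interfaces)
        }
    }

    // Only instrument the app's own classes, not its dependencies.
    override fun isInstrumentable(classData: ClassData): Boolean =
        classData.className.startsWith("com.example.")
}

// Hypothetical plugin that hooks the factory up for every app variant.
class LoggingPlugin : Plugin<Project> {
    override fun apply(project: Project) {
        val components =
            project.extensions.getByType(ApplicationAndroidComponentsExtension::class.java)
        components.onVariants { variant ->
            variant.instrumentation.transformClassesWith(
                LoggingClassVisitorFactory::class.java,
                InstrumentationScope.ALL
            ) { /* no parameters to configure */ }
            variant.instrumentation.setAsmFramesComputationMode(
                FramesComputationMode.COPY_FRAMES
            )
        }
    }
}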

Guidance for apps

Make sure that you update your plugins to be AGP 8.0 compliant before updating your app to AGP 8.0. If the relevant plugins are not compliant, please create a bug that includes a link to this post and send it to the plugin authors.

Several commonly used plugins have already migrated to use these new APIs, including the Hilt Gradle plugin.

Share your feedback

If your use case is not covered by any of the new APIs, please file a bug.

We encourage you to get started with making your plugins compatible with the new AGP APIs. Getting started now means that you have enough time to familiarize yourself with the APIs, share your feedback and then upgrade your dependencies and plugins.

Android Dev Summit ‘22: Coming to you, online and around the world!

Posted by Yasmine Evjen, Community Lead, Android Developer Relations

Android Dev Summit is back, and this year, we’re coming to you! Whether you’re tuning in online or - for the first time since 2019 - joining in person at locations around the world, we can’t wait to see you! It’s your opportunity to learn from the source about building excellent apps across devices.


Android Dev Summit ‘22 kicks off on October 24 with the keynote, your opportunity to hear directly from the Android team. We’ll cover the latest in Modern Android Development, innovations in our core platform, and how to take advantage of Android’s momentum across devices, including wearables and large screens. This technical keynote will be packed with demos, and it kicks off at 9AM PT on October 24, live on YouTube.

One of the most important parts of ADS is the deeply technical sessions - a huge part of what we look forward to each year. This year, we’ll be sharing the sessions live on YouTube in three tracks spread across three weeks:
  • Modern Android Development, live on Oct 24
  • Form Factors, live on Nov 9
  • Platform, live on Nov 14
ADS is a place where we get to connect directly with you - hearing what’s most important to you and how we can make it easier for you to build on Android. And there’s no better way to do that than connecting in-person. Since travel is still tough for many of you, we’re doing our best this year to come to you, with events popping up around the world. The first stop for ADS will be in the San Francisco Bay Area on October 24 (if you’re local, you can apply to come here). Next, ADS’22 moves to London on November 9 (apply here if you’re in London). The fun will continue in Asia in December with several roadshow stops (more details to come!).

If you’re not able to join us in person, we’d still love to connect! At the end of each of our session tracks, we’ll be hosting a live Q&A – #AskAndroid - for you to get your burning questions answered. Post your questions to Twitter or comment in the YouTube livestream using #AskAndroid, and tune in to see if we answer your question live.

Over the coming weeks, we’ll be dropping more info around ADS’22 on the website; check back when we release the full session track details and more, or sign up for updates through the Android Developer newsletter.

We can’t wait to see you soon!

Hi everyone, I'm Yasmine Evjen, the new Community Lead for Android Developer Relations. I'm so excited to connect with you all in person and virtually at #AndroidDevSummit, bringing two of my favorite things together: exciting new tech and the developers who bring it to life.

Outdooractive boosts user experience on wearable devices with 30% less development time using Compose for Wear OS

Posted by The Android Developers Team

Outdooractive, Europe’s largest outdoor platform, provides trail maps and information to a global community of more than 12 million nature enthusiasts. As a platform focused on helping users plan and navigate their outdoor adventures, Outdooractive has long viewed wearable devices like smart watches as essential to the growth of their app. Users value wearables as navigation tools and activity trackers, so when Google reached out with Android’s new UI toolkit, Compose for Wear OS, Outdooractive’s developers jumped on the opportunity to improve their app for this growing market.

The application overhaul quickly showed the benefits of Compose for Wear OS. It cut development time by an estimated 30% for Outdooractive’s developers, accelerating their ability to create streamlined user interfaces. “What would have taken us days now takes us hours,” said Liam Hargreaves, the Senior Project Manager of Outdooractive.

Having a modern code base and increasing the development speed helped make the UI code more intuitive for the developers to read and write, allowing for faster prototyping in the design phase and more fluid collaboration. This helped the developers create a more convenient experience for users.

Using new tools to create an improved user experience

Outdooractive’s app strives to deliver accurate information in real time to the users’ wearable devices, including turn-by-turn navigation, trail conditions, and weather updates.

“Our app has a relatively complex set of interactions,” said Liam. “Each of these needs to be kept simple, quick, easy to access, and clearly presented — all whilst our customer could be standing on a hillside or in a storm or wearing winter hiking gear and gloves!”

New features in Compose for Wear OS helped the Outdooractive developers create a higher quality app experience for users on the go. The Chip component significantly improved the process for creating lists and allowed developers to use pre-built design elements, saving the team days of work. ScalingLazyColumn also optimized the creation of scrolling screens without the need for RecyclerView or ScrollView.

The developers were also excited by how easy it was to use the AnimatedVisibility component because it allowed them to animate functions that they previously didn’t have time for. The team especially appreciated how Compose for Wear OS made it much easier to present different UI states to the user by communicating “loading” or “error” conditions more clearly.

"Compose makes the UI code more intuitive to write and read, allowing us to prototype faster in the design phase and also collaborate better on the code." 


Experimentation without the overhead

Since implementing Compose for Wear OS, Outdooractive’s users have spent more time on wearable devices doing things they normally would have done on their phones, such as navigating hiking routes — a key UI goal that Compose for Wear OS helped the developers to achieve.

“We see wearables as a critical part of our product and market strategy — and the reaction from our users is extremely positive,” Liam said.

Outdooractive’s developers used another Wear OS capability called Health Services to implement fitness tracking features, such as heart rate monitoring, into the app by leveraging on-device sensors to offer an experience unique to wearable devices. Health Services on Wear OS automates the configuration of all health- and fitness-related sensors; collects their data; and calculates metrics such as heart rate, distance traveled, and pace, making it easy for developers to implement sophisticated app features while also maximizing battery life. With Health Services and Compose for Wear OS, Outdooractive’s developers plan to further expand the app’s offerings that are made possible by body sensors.

Outdooractive’s streamlined process shows just how easy Compose for Wear OS makes application development because it gives developers the flexibility to experiment with different layouts without increasing development overhead.

Liam had clear advice for other developers considering using Compose for Wear OS: “Fully embrace it.”

Boost your wearable app’s capabilities

Jetpack Compose for Wear OS helps build more engaging user experiences for wearable devices.

To get a first look, watch the Wear OS talk from Google I/O or try the Compose for Wear OS codelab to start learning now.