Google I/O 2023: What’s new in Jetpack

Posted by Amanda Alexander, Product Manager, Android

Android Jetpack is a key pillar of Modern Android Development. It is a suite of over 100 libraries, tools, and guidance to help developers follow best practices, reduce boilerplate code, and write code that works consistently across Android versions and devices, so that you can focus on building unique features for your app. The majority of apps on Google Play rely on Jetpack; in fact, over 90% of the top 1,000 apps use Jetpack.

Below we’ll cover highlights of recent updates in three major areas of Jetpack:

  • Architecture Libraries and Guidance
  • Performance Optimization of Applications
  • User Interface Libraries and Guidance

And then conclude with some additional key updates.


1. Architecture Libraries and Guidance

App architecture libraries and components ensure that apps are robust, testable, and maintainable.

Data Persistence

Most applications need to persist local state - whether it be caching results, managing local lists of user-entered data, or powering data returned in the UI. Room is the recommended data persistence layer; it provides an abstraction over SQLite, offering improved usability and safety over using the platform APIs directly.

In Room, we have added many brand-new features, such as the Upsert operation, which inserts an entity when there is no uniqueness conflict or updates the entity if there is a conflict, and support for Kotlin value classes in KSP. These new features are available in Room 2.6-alpha; all library sources are written in Kotlin, and the library supports both the Java programming language and Kotlin code generation.
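To illustrate, here is a minimal sketch of the Upsert operation using a hypothetical Note entity and DAO; the names are illustrative only.

```kotlin
import androidx.room.Dao
import androidx.room.Entity
import androidx.room.PrimaryKey
import androidx.room.Upsert

// Hypothetical entity used only for illustration.
@Entity
data class Note(
    @PrimaryKey val id: Long,
    val content: String
)

@Dao
interface NoteDao {
    // Inserts the note, or updates the existing row if the primary key already exists.
    @Upsert
    suspend fun upsertNote(note: Note)
}
```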

Managing tasks with WorkManager

The WorkManager library makes it easy to schedule deferrable, asynchronous tasks that must run reliably, for instance uploading backups or analytics. These APIs let you create a task and hand it off to WorkManager to run when the work constraints are met.

Now, WorkManager allows you to update a WorkRequest after you have already enqueued it. This is often necessary in larger apps that frequently change constraints or need to update their workers on the fly. As of WorkManager 2.8.0, the updateWork() API lets you do this without manually canceling and enqueuing a new WorkRequest, which greatly simplifies the development process.
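As a rough sketch, updating an already-enqueued request looks something like the following; the UploadWorker class and the stored request ID are hypothetical.

```kotlin
import androidx.work.Constraints
import androidx.work.NetworkType
import androidx.work.OneTimeWorkRequestBuilder
import androidx.work.WorkManager

// Build a new request that reuses the ID of the request that is already enqueued.
val updatedRequest = OneTimeWorkRequestBuilder<UploadWorker>()
    .setId(existingRequestId) // ID of the previously enqueued WorkRequest
    .setConstraints(
        Constraints.Builder()
            .setRequiredNetworkType(NetworkType.UNMETERED) // tightened constraint
            .build()
    )
    .build()

// Replaces the enqueued work in place instead of cancel-and-re-enqueue.
WorkManager.getInstance(context).updateWork(updatedRequest)
```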

DataStore

The DataStore library is a robust data storage solution that addresses issues with SharedPreferences and provides a modern, coroutines-based API.

In DataStore 1.1 alpha we added a widely requested feature: multi-process support, which allows you to access the DataStore from multiple processes while providing data consistency guarantees between them. Additional features include a new storage interface that enables the underlying storage mechanism for DataStore to be swapped out (we have provided implementations for java.io and okio), and we have also added support for Kotlin Multiplatform.
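A minimal sketch of creating a multi-process-safe DataStore is shown below, assuming a hypothetical Settings type and SettingsSerializer; exact factory parameters may differ between alpha versions.

```kotlin
import java.io.File
import androidx.datastore.core.DataStore
import androidx.datastore.core.MultiProcessDataStoreFactory

// SettingsSerializer is a hypothetical implementation of
// androidx.datastore.core.Serializer<Settings>.
val settingsDataStore: DataStore<Settings> = MultiProcessDataStoreFactory.create(
    serializer = SettingsSerializer,
    produceFile = { File(context.filesDir, "settings.pb") }
)
```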

Lifecycle management

Lifecycle-aware components perform actions in response to a change in the lifecycle status of another component, such as activities and fragments. These components help you produce better-organized, often lighter-weight code that is easier to maintain.

We released a stable version of Lifecycle 2.6.0 that includes more Compose integration. We added a new extension method on Flow, collectAsStateWithLifecycle(), which collects from a flow and represents its latest value as Compose State in a lifecycle-aware manner. Additionally, a large number of classes have been converted to Kotlin while retaining binary compatibility with previous versions.
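As a brief sketch, collecting a flow in a lifecycle-aware way looks like this; MessageViewModel and its message StateFlow are hypothetical.

```kotlin
import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.compose.runtime.getValue
import androidx.lifecycle.compose.collectAsStateWithLifecycle

@Composable
fun MessageScreen(viewModel: MessageViewModel) {
    // Collects from the flow only while the lifecycle is at least STARTED,
    // exposing the latest value as Compose State.
    val message by viewModel.message.collectAsStateWithLifecycle()
    Text(text = message)
}
```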

Predictive Back Gesture

Moving image illustrating the predictive back gesture

In Android 13, we introduced a predictive back gesture for Android devices such as phones, large screens, and foldables. It is part of a multi-year release; when fully implemented, this feature will let users preview the destination or other result of a back gesture before fully completing it, allowing them to decide whether to continue or stay in the current view.

The Activity APIs for Predictive Back for Android are stable, and we have updated the best practices for using the supported system back callbacks: BackHandler (for Compose), OnBackPressedCallback, or OnBackInvokedCallback. We are excited to see Google apps adopt Predictive Back, including Play Store, Calendar, News, and TV!

In the Activity 1.8 alpha releases, the OnBackPressedCallback class now contains new Predictive Back progress callbacks for handling the back gesture starting, progress throughout the gesture, and the back gesture being canceled, in addition to the existing handleOnBackPressed() callback for when the back gesture is committed. We also added ComponentActivity.enableEdgeToEdge() to easily set up an edge-to-edge display in a backward-compatible manner.
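A minimal sketch of the new progress callbacks is shown below, registered from a hypothetical ComponentActivity.

```kotlin
import androidx.activity.BackEventCompat
import androidx.activity.OnBackPressedCallback

val callback = object : OnBackPressedCallback(true) {
    override fun handleOnBackStarted(backEvent: BackEventCompat) {
        // The back gesture has started: prepare any preview UI.
    }

    override fun handleOnBackProgressed(backEvent: BackEventCompat) {
        // Drive the preview animation with backEvent.progress (0f..1f).
    }

    override fun handleOnBackPressed() {
        // The gesture was committed: perform the back action.
    }

    override fun handleOnBackCancelled() {
        // The gesture was cancelled: reset any preview state.
    }
}
onBackPressedDispatcher.addCallback(this, callback)
```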

Activity updates for more consistent Photo Picker experience

The Android photo picker is a browsable interface that presents the user with their media library. In Activity 1.7.0, the Photo Picker activity contracts have been updated to contain an additional fallback that allows OEMs and system apps, such as Google Play services, to provide a consistent Photo Picker experience on a wider range of Android devices and API levels by implementing the fallback action. Read more in the Photo Picker Everywhere blog.
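For reference, here is a minimal sketch of the photo picker contract from inside a ComponentActivity; showSelectedImage is a hypothetical helper.

```kotlin
import android.net.Uri
import androidx.activity.result.PickVisualMediaRequest
import androidx.activity.result.contract.ActivityResultContracts

val pickMedia = registerForActivityResult(
    ActivityResultContracts.PickVisualMedia()
) { uri: Uri? ->
    // uri is null if the user closed the picker without choosing anything.
    uri?.let { showSelectedImage(it) }
}

// Launch the picker restricted to images; the contract falls back to an
// equivalent experience on devices without the system photo picker.
pickMedia.launch(
    PickVisualMediaRequest(ActivityResultContracts.PickVisualMedia.ImageOnly)
)
```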

Incremental Data Fetching

The Paging library allows you to load and display small chunks of data to improve network and system resource consumption. App data can be loaded gradually and gracefully within RecyclerViews or Compose lazy lists.

In Paging Compose 1.0.0-alpha19, there is support for all lazy layouts, including custom layouts provided by the Wear and TV libraries. To support more lazy layouts, Paging Compose now provides slightly lower-level extension methods on LazyPagingItems: itemKey and itemContentType. These APIs focus on helping you implement the key and contentType parameters to the standard items APIs that already exist for LazyColumn and LazyVerticalGrid, as well as their equivalents in APIs like HorizontalPager. While these changes do make the LazyColumn and LazyRow examples a few lines longer, they provide consistency across all lazy layouts.
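Here is a rough sketch of the new extension methods with a hypothetical NoteViewModel exposing a Flow<PagingData<Note>>.

```kotlin
import androidx.compose.foundation.lazy.LazyColumn
import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.paging.compose.collectAsLazyPagingItems
import androidx.paging.compose.itemContentType
import androidx.paging.compose.itemKey

@Composable
fun NoteList(viewModel: NoteViewModel) {
    val lazyPagingItems = viewModel.notes.collectAsLazyPagingItems()

    LazyColumn {
        items(
            count = lazyPagingItems.itemCount,
            key = lazyPagingItems.itemKey { it.id },
            contentType = lazyPagingItems.itemContentType { "note" }
        ) { index ->
            // Returns null while the item is still loading (placeholder).
            val note = lazyPagingItems[index]
            Text(text = note?.content ?: "Loading…")
        }
    }
}
```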


2. Performance Optimization of Applications

Using performance libraries allows you to build performant apps and identify optimizations to maintain high performance, resulting in better end-user experiences.

Improving Start-up Times

Baseline Profiles allow you to partially compile your app at install time to improve runtime and launch performance, and are getting big improvements in new tooling and libraries:

Jetpack provides a new Baseline Profile Gradle Plugin in alpha, which supports AGP 8.0+ and can easily be added to your project in Studio Hedgehog (now in canary). The plugin automates running profile generation tasks, pulling the resulting profiles from the device, and integrating them into your build, either periodically or as part of your release process.

The plugin also allows you to easily automate the new Dex Layout Optimization feature in AGP 8.1, which lets you define BaselineProfileRule tests that collect classes used during startup, and move them to the primary dex file in a multidex app to increase locality. In a large app, this can improve cold startup time by 30% on top of Baseline Profiles!
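For illustration, a minimal Baseline Profile generator test might look like the following; the package name is hypothetical, and exact rule method names can vary across Macrobenchmark versions.

```kotlin
import androidx.benchmark.macro.junit4.BaselineProfileRule
import androidx.test.ext.junit.runners.AndroidJUnit4
import org.junit.Rule
import org.junit.Test
import org.junit.runner.RunWith

@RunWith(AndroidJUnit4::class)
class BaselineProfileGenerator {
    @get:Rule
    val baselineProfileRule = BaselineProfileRule()

    @Test
    fun generateStartupProfile() = baselineProfileRule.collect(
        packageName = "com.example.app" // hypothetical application ID
    ) {
        // The journeys exercised here are recorded into the Baseline Profile.
        pressHome()
        startActivityAndWait()
    }
}
```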

Macrobenchmark 1.2 has shipped a lot of new features in alpha, such as Power metrics and Custom trace metrics, generation of Baseline Profiles without root on Android 13, and recompilation without clearing app data on Android 14.

You can read everything in depth in the blog "What's new in Android Performance".


3. User Interface Libraries and Guidance

Several changes have been made to our UI libraries to provide better support for large-screen compatibility, foldables, and emojis.

Jetpack Compose

Jetpack Compose, Android’s modern toolkit for building native UI, recently had its May 2023 release, which includes new features for text and layouts, continued performance improvements, enhanced tooling support, increased support for large screens, and updated guidance. You can read more in the What’s New in Jetpack Compose I/O blog.

Glance

The Glance library, now in 1.0-beta, lets you develop app widgets optimized for Android phone, tablet, and foldable home screens using Jetpack Compose. The library gives you the latest Android widget improvements out of the box, using Kotlin and Compose.
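As a rough sketch of the 1.0-beta API surface (names may differ in earlier alphas), a simple widget looks like this:

```kotlin
import android.content.Context
import androidx.glance.GlanceId
import androidx.glance.appwidget.GlanceAppWidget
import androidx.glance.appwidget.GlanceAppWidgetReceiver
import androidx.glance.appwidget.provideContent
import androidx.glance.text.Text

class HelloWidget : GlanceAppWidget() {
    override suspend fun provideGlance(context: Context, id: GlanceId) {
        provideContent {
            Text(text = "Hello from Glance")
        }
    }
}

// Registered in the manifest like a regular AppWidgetProvider.
class HelloWidgetReceiver : GlanceAppWidgetReceiver() {
    override val glanceAppWidget: GlanceAppWidget = HelloWidget()
}
```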

Compose for TV

With the alpha release of the TV library, you can now build experiences for Android TV using components optimized for the living room experience. Compose for TV unlocks all the benefits of Jetpack Compose for your TV apps, allowing you to build apps with less code, easier maintenance and a modern Material 3 look straight out of the box. See the Compose for TV blog for details.

Material 3 for Compose

Material Design 3 is the next evolution of Material Design, enabling you to build expressive, spirited, and personal apps. It is the recommended design system for Android apps, and the 1.1 stable release brings exciting new features such as bottom sheets, date and time pickers, search bars, and tooltips, along with more motion and interaction support. Read more in the release blog.
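For example, a minimal sketch of the new Material 3 bottom sheet (state handling is omitted for brevity):

```kotlin
import androidx.compose.foundation.layout.padding
import androidx.compose.material3.ModalBottomSheet
import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.compose.ui.Modifier
import androidx.compose.ui.unit.dp

@Composable
fun DetailsSheet(onDismiss: () -> Unit) {
    // Dismissed by swiping down, tapping the scrim, or pressing back.
    ModalBottomSheet(onDismissRequest = onDismiss) {
        Text(
            text = "Sheet content goes here",
            modifier = Modifier.padding(16.dp)
        )
    }
}
```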

Understanding Window State

The new WindowManager library helps developers adapt their apps to support multi-window environments and new device form factors by providing a common API surface with support back to API level 14.

In 1.1.0-beta01, new features and capabilities have been added to activity embedding and window layout that enable you to optimize your multi-activity apps for large screens. With the 1.1 release of Jetpack WindowManager, activity embedding APIs are no longer experimental and are recommended for multi-activity applications to provide improved large screen layouts. Check out the What’s new in WindowManager 1.1.0-beta01 blog for details and migration steps.


Other key updates

Kotlin Multiplatform

We continue to experiment with using Kotlin Multiplatform to share business logic between Android and iOS. Collections 1.3.0-alpha03 and DataStore 1.1.0-alpha02 have been updated so you can now use these libraries in KMM projects. If you are using Kotlin Multiplatform in your app, we would like your feedback!

This was a look at all the changes in Jetpack over the past few months to help you build apps more productively. For more details on each Jetpack library, check out the AndroidX release notes, quickly find relevant libraries with the API picker and watch the Google I/O talks for additional highlights.

Java is a trademark or registered trademark of Oracle and/or its affiliates.

Android Studio @ I/O ‘23: Announcing Studio Bot, an AI-powered coding assistant

Posted by Adarsh Fernando, Senior Product Manager, Android Studio

We first announced Android Studio at I/O 2013 with a promise to deliver a best-in-class integrated development environment (IDE) focused on Android app developers. 10 years later, this commitment to developer productivity still drives the team to deliver new tools and solutions that help teams around the world to create amazing app experiences for their users. And with Google's push to unlock the power of AI to help you throughout your day, Android Studio Hedgehog introduces a key breakthrough: an AI-powered conversational experience designed to make you more productive.

In addition to accelerating coding productivity, this latest version of the IDE provides better tools when you develop for multiple form factors, and helps you improve app quality with new insights, debugging, and testing solutions. All these improvements add to the many updates we’ve included in Android Studio Giraffe, which is now in the Beta channel and helps make it easier to configure your builds with Kotlin DSL support, improve sync times with new data and guidance, target the latest Android SDK version with the new Android SDK Upgrade Assistant, and more.

To see highlights of the new features in action including Studio Bot, watch the What’s new in Android Developer Tools session from Google I/O 2023.

What’s new in Android Development Tools - with Studio Bot Demo

Jump right in and download Android Studio Hedgehog, or learn more about the most exciting new features below.

Coding productivity

Introducing Android Studio Bot

At the heart of our mission is to accelerate your ability to write high-quality code for Android. In this release we are excited to introduce Studio Bot, an AI-powered conversational experience that leverages Codey, Google's foundation model for coding and a descendant of PaLM 2, to help you generate code for your app and make you more productive. You can also ask questions to learn more about Android development or get help fixing errors in your existing code, all without ever having to leave Android Studio. Studio Bot is in its very early days, and we’re training it to become even better at answering your questions and helping you learn best practices. We encourage you to try it out for yourself and help it improve by sharing your feedback directly with Studio Bot.

Privacy is top of mind, and what is unique about this integration is that you don’t need to send your source code to Google to use Studio Bot: only the chat dialogue between you and Studio Bot is shared. Much like our work on other AI projects, we stick to a set of principles that hold us accountable. We’re taking a measured approach to our rollout; for this initial launch, Studio Bot is only available to Android developers in the US. You can read more here.

Studio Bot

Live Edit

Live Edit helps keep you in the flow by minimizing interruptions when you make updates to your Compose UI and validates those changes on a running device. You can use it in manual mode to control when the running app should be updated or in automatic mode to update the running app as you make code changes. Live Edit is available in Android Studio Giraffe Beta, with the Hedgehog release providing additional improvements in error handling and reporting.

Moving image showing live edit with Compose
Live Edit with Compose

Build productivity

Kotlin DSL and Version Catalogs

A number of updates help you leverage more modern syntax and conventions when configuring your build. Kotlin is the recommended language for developing Android apps. Now, with official support for Kotlin DSL in your Gradle build scripts, it’s also the preferred way to configure your build, because Kotlin is more readable and offers better compile-time checking and IDE support. We’ve also added experimental support for TOML-based Gradle Version Catalogs, a feature that lets you manage dependencies in one central location and share them across modules or projects. Android Studio now makes it easier to configure version catalogs through editor suggestions and integrations with the Project Structure dialog and the New Project Wizard.
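As a small sketch, consuming a catalog entry from a Kotlin DSL build script looks like the following; the libs.versions.toml entry shown in the comment is hypothetical.

```kotlin
// build.gradle.kts (module)
// Assumes a hypothetical entry in gradle/libs.versions.toml:
//   [libraries]
//   androidx-core-ktx = { group = "androidx.core", name = "core-ktx", version = "1.10.1" }
dependencies {
    // Gradle generates the type-safe accessor from the catalog entry name.
    implementation(libs.androidx.core.ktx)
}
```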

Screengrab showing Kotlin DSL and Version Catalogs in the New Project Wizard
Kotlin DSL and Version Catalogs in the New Project Wizard

Per-app language preferences

Typically, multilingual users set their system language to one language—such as English—but they want to select other languages for specific apps, such as Dutch, Chinese, or Hindi. Android 13 introduced support for per-app language preferences, and now Android Gradle plugin 8.1 and higher can configure your app to support it automatically. Learn more.

Download impact during Sync

When using Android Gradle Plugin 7.3 or higher, the Build > Sync tool window now includes a summary of time spent downloading dependencies and a detailed view of downloads per repository, so you can easily determine whether unexpected downloads are impacting build performance. Additionally, it can help you identify inefficiencies in how you configure your repositories. Learn more.

Screengrab of Build Analyzer showing impact of downloads during build
Build Analyzer showing impact of downloads during build

New Android SDK Upgrade Assistant

Android Studio Giraffe introduces the Android SDK Upgrade Assistant, a new tool that helps you upgrade the targetSdkVersion, which is the API level that your app targets. Instead of having to navigate every API change with an Android SDK release, the Android SDK Upgrade Assistant guides you through upgrading targetSdkVersion level by level by creating a customized filter of API changes that are relevant to your app. For each migration step, it highlights the major breaking changes and how to address them, helping you take advantage of what the latest versions of Android have to offer much more quickly. To open the Android SDK Upgrade Assistant, go to Tools > Android SDK Upgrade Assistant. In the Assistant panel, select the API level that you want to upgrade to, and follow the guidance.

Screengrab of the Android SDK Upgrade Assistant
Upgrade more quickly with the Android SDK Upgrade Assistant

Developing for form factors

Google Pixel Fold and Tablet Virtual Devices

Although these devices won’t launch until later this year, you can start preparing your app to take full advantage of the expanded screen sizes and functionality of these devices by creating virtual devices using new Google Pixel Fold and Google Pixel Tablet device profiles in Android Studio Hedgehog. To start, open Device Manager and select Create Device.

Screengrab of Pixel Tablet running on the Android Emulator
Pixel Tablet running on the Android Emulator

Emulator Support for Wear OS 4 Developer Preview

Wear OS 4 is the next generation OS for Wear. Based on Android 13, it officially launches in the fall and has a great selection of new features and optimizations. We’re giving you a preview of all the new platform features with the new Wear OS 4 emulator. We recommend you try it with Android Studio Hedgehog and test that your Wear OS app works as intended with the latest platform updates. The Wear OS 4 emulator will give you a faster and smoother transition to Wear OS 4, and help you make apps ready in time for the official Wear OS 4 release on real devices. Check out the Wear 4 Preview site for how to get started with the new Wear OS 4 emulator.

Watch Face Format support in Wear OS 4 Emulator

Together with Samsung, we’re excited to announce the launch of the Watch Face Format, a new way to build watch faces for Wear OS. The Watch Face Format is a declarative XML format, meaning there will be no code in your watch face APK. The platform takes care of the logic needed to render the watch face so you no longer have to worry about code optimizations or battery performance. Use watch face creation tools such as Watch Face Studio to design watch faces, or you can manually or dynamically edit the watch face format to build watch faces directly. You can test the new Watch Face Format on the Wear OS 4 emulator.

Moving image of Watch Face Format Watchface on Wear 4 Emulator
Watch Face Format Watchface on Wear 4 Emulator

Device Mirroring for local devices

Whether you use a direct USB connection or ADB over Wi-Fi, Device Mirroring lets you see and interact with your local physical devices directly within the Android Studio Running Devices window. This feature lets you focus on how you develop and test your app all in one place. With the Hedgehog release, we’re adding more functionality, including the ability to mirror Wear OS devices and simulate folding actions on foldable devices directly from the IDE.

Screengrab showing device mirroring with the Pixel Fold
Device Mirroring with the Pixel Fold

Android Device Streaming

We know sometimes it’s critical for you to see and test how your apps work on physical hardware to ensure that your users have the best experience. However, accessing the latest flagship devices isn’t always easy. Building on top of Device Mirroring for local devices, we’re introducing device streaming of remote physical Google Pixel devices, such as the Pixel Fold and Pixel Tablet, directly within Android Studio. Device streaming will let you deploy your app to these remote devices and interact with them, all without having to leave the IDE. If you’re interested in getting early access later this year, enroll now.

Espresso Device API

Automated testing of your app using Espresso APIs helps you catch potential issues early, before they reach users. However, testing your app across configuration changes, such as rotating or folding a device, has always been a challenge. Espresso Device API is now available to help you write tests that perform synchronous configuration changes when testing on Android virtual devices running API level 24 and higher. You can also set up test filters to ensure that tests that require certain device features, such as a folding action, only run on devices that support them. Learn more.
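A minimal sketch of such a test is shown below; the entry point and import paths are approximations based on the alpha releases, and the view ID is hypothetical.

```kotlin
import androidx.test.espresso.Espresso.onView
import androidx.test.espresso.assertion.ViewAssertions.matches
import androidx.test.espresso.device.EspressoDevice.Companion.onDevice
import androidx.test.espresso.device.action.ScreenOrientation
import androidx.test.espresso.matcher.ViewMatchers.isDisplayed
import androidx.test.espresso.matcher.ViewMatchers.withId
import org.junit.Test

class OrientationTest {
    @Test
    fun recipeList_isDisplayedInLandscape() {
        // Synchronously rotates the test device before the assertion runs.
        onDevice().setScreenOrientation(ScreenOrientation.LANDSCAPE)

        onView(withId(R.id.recipe_list)) // hypothetical view ID
            .check(matches(isDisplayed()))
    }
}
```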

Example of test code for synchronous device configuration changes using the Espresso Device API
Synchronous device configuration changes using the Espresso Device API

Improve your app quality

App Quality Insights with Android vitals

App Quality Insights launched in Android Studio Electric Eel to provide access to Firebase Crashlytics issue reports directly from the IDE. The integration lets you navigate between your stack trace and code with a click, use filters to see only the most important issues, and see report details to help you reproduce issues. In Android Studio Hedgehog, you can now view important crash reports from Android vitals, powered by Google Play. Android vitals reports also include useful insights, such as notes from SDK providers so that you can quickly diagnose and resolve crashes related to SDKs your app might be using.

Screengrab showing Android vitals crash reports in the App Quality Insights window
Android vitals crash reports in the App Quality Insights window

App Quality Insights with improved code navigation

When you publish your app using the latest version of AGP 8.2, crash reports now attach minimal git commit hash data to help Android Studio navigate to your code when investigating Crashlytics crash reports in the IDE. Now, when you view a report that includes the necessary metadata, you can choose to either navigate to the line of code in your current git checkout, or view a diff between the checkout and the version of your codebase that generated the crash. To get started with the right dependencies, see the documentation.

Compose State information in Debugger

When parts of your Compose UI recompose unexpectedly, it can sometimes be difficult to understand why. Now, when setting a breakpoint on a Composable function, the debugger lists the parameters of the composable and their state, so you can more easily identify what changes might have caused the recomposition. For example, when you pause on a composable, the debugger can tell you exactly which parameters have “Changed” or have remained “Unchanged”, so you can more efficiently investigate the cause of the recomposition.

Screengrab showing Compose state information in the debugger
Compose state information in the debugger

New Power Profiler

We are excited to announce a brand new Power Profiler in Android Studio Hedgehog, which shows power consumption on Pixel 6 and higher devices running Android 10 and higher. Data is segmented by sub-system (such as Camera, GPS, and more). This data is made available when recording a System Trace via the Profiler and helps you visually correlate the power consumption of the device with the actions happening in your app. For example, you can A/B test multiple algorithms of your video calling app to optimize power consumed by the camera sensor.

Image of the new power profiler
The new Power Profiler

Device Explorer

The Device File Explorer in Giraffe has been renamed to Device Explorer and updated to include information about debuggable processes running on connected devices. In addition to the Files tab, which includes existing functionality that allows you to explore a device’s file hierarchy, the new Processes tab allows you to view a list of debuggable processes for the connected device. From there you can also select a process and perform a Kill process action (which runs am kill), a Force stop (which runs am force-stop), or attach the debugger to a selected process.

Image of the Processes tab in the Device Explorer window
Processes tab in the Device Explorer window

Compose animation preview

Compose Animation Preview in Android Studio Hedgehog now supports a number of additional Compose APIs: animate*AsState, CrossFade, rememberInfiniteTransition, and AnimatedContent (in addition to updateTransition and AnimatedVisibility). Compose Animation Preview also has new pickers that let you set non-enum or boolean states to debug your Compose animation using precise inputs. For all supported Compose Animation APIs, you can play, pause, scrub, control speed, and coordinate.

Moving image of Compose Animation preview
Compose Animation Preview

Embedded Layout Inspector

You can now run Layout Inspector directly embedded in the Running Device Window in Android Studio! Try out this feature today in Android Studio Hedgehog to conserve screen real estate and organize your UI debugging workflow in a single tool window. You can access common Layout Inspector features such as debugging the layout of your app by showing a view hierarchy and allowing you to inspect the properties of each view. Additionally, because the embedded Layout Inspector overlays on top of the existing device mirroring stream, overall performance when using the inspector is now much faster. To get started and understand known limitations, read the release notes.

Screengrab of embedded Layout Inspector
Embedded Layout Inspector

Firebase Test Lab support for Gradle Managed Devices

Gradle Managed Devices launched in Android Gradle Plugin (AGP) 7.3 to make it easier to use virtual devices when running automated tests in your continuous integration (CI) infrastructure by allowing Gradle to manage all aspects of device provisioning. All you need to do is use the AGP DSL to describe the devices you want Gradle to use. But sometimes you need to run your tests on physical Android devices. With AGP 8.2, we have expanded Gradle Managed Devices with the ability to target real physical (and virtual) devices running in Firebase Test Lab (FTL). This capability makes it easier than ever to scalably test across the large selection of FTL devices with only a few simple steps. Additionally, this version of AGP can also take advantage of FTL’s new Smart Sharding capabilities, which let you get test results back much more quickly by running multiple devices in parallel. To learn more and get started, read the release notes.

Image of gradle managed devices with support for Firebase Test Lab
Gradle Managed Devices with support for Firebase Test Lab

IntelliJ

IntelliJ Platform Update

Android Studio Hedgehog (2023.1) includes the IntelliJ 2023.1 platform release, which comes with IDE startup performance improvements, faster import of Maven projects, and a more streamlined commit process. Read the IntelliJ release notes here.

New UI

Along with the IntelliJ platform update comes further improvements to the New UI. In large part due to community feedback, there’s a new Compact Mode, which provides a more consolidated look and feel of the IDE, and an option to vertically split the tool window area and conveniently arrange the windows, just like in the old UI. We also improved the Android-specific UI by updating the main toolbar, tool windows, and new iconography. To use the New UI, enable it in Settings > Appearance & Behavior > New UI. For a full list of changes, see the IntelliJ New UI documentation.

Screengrab showing the new UI adopted from IntelliJ
The New UI adopted from IntelliJ

Summary

To recap, Android Studio Giraffe is available in the Beta channel. Android Studio Hedgehog is the latest version of the IDE and is available in the Canary channel, and includes all of these new enhancements and features:

Coding productivity

  • Android Studio Bot, a tightly integrated, AI-powered assistant in Android Studio designed to make you more productive.
  • (Beta) Live Edit, which helps keep you in the flow by minimizing interruptions when you make updates to your Compose UI and validates those changes on a running device.

Build productivity

  • (Beta) Kotlin DSL and Version Catalogs, which helps you take advantage of more modern syntax and conventions when configuring your build.
  • (Beta) Per-app language preferences, built-in support in AGP for automatically configuring per-app language preferences.
  • (Beta) Download impact in Build Analyzer, which provides a summary of time spent downloading dependencies and a detailed view of downloads per repository, so you can easily determine whether unexpected downloads are impacting build performance.
  • (Beta) New Android SDK Upgrade Assistant, which helps you upgrade the targetSdkVersion, which is the API level that your app targets, much more quickly.

Developing for form factors

  • Google Pixel Fold and Google Pixel Tablet Virtual Devices, which can help you start preparing your app to take full advantage of the expanded screen sizes and functionality of these devices before they are available in stores.
  • Wear OS 4 Developer Preview Emulator, which similarly provides you early access to test and optimize your app against the next generation of Wear OS by Google.
  • Watch Face Format support in Wear OS 4 Developer Preview Emulator, a new way to build watch faces for Wear OS.
  • Device Mirroring for local devices, which lets you see and interact with your local physical devices directly within Android Studio’s Running Devices window.
  • Android Device Streaming, which lets you deploy your app to and interact with remote physical Google Pixel devices, and which you can register for early access today!
  • Espresso Device API, which helps you write tests that perform synchronous configuration changes when testing on Android virtual devices running API level 24 and higher.

Improve your app quality

  • App Quality Insights: Android vitals, which now lets you view, filter, and navigate important crash reports from Android vitals, powered by Google Play.
  • App Quality Insights with improved code navigation, which lets you now choose to either navigate to the line of code in your current git checkout, or view a diff between the checkout and the version of your codebase that generated the crash.
  • Compose State information in Debugger, which lists the parameters of the composable and their state when paused on a breakpoint in a composable, so you can more easily identify what changes might have caused the recomposition.
  • New Power Profiler, which shows highly accurate power consumption from the device segmented by each sub-system.
  • (Beta) Device Explorer, which now includes information about debuggable processes running on connected devices and actions you can perform on them.
  • (Beta) Compose animation preview, now supports a number of additional Compose APIs and new pickers that let you set non-enum or boolean states to debug your Compose animation using precise inputs.
  • Embedded Layout Inspector, which runs Layout Inspector directly embedded in the Running Device Window in Android Studio, leading to a more seamless debugging experience and significant performance improvements.
  • Firebase Test Lab support for Gradle Managed Devices, which leverages GMD to help you seamlessly configure Firebase Test Lab devices for your automated testing, and now with additional support for smart sharding.

IntelliJ

  • IntelliJ Platform Update to the IntelliJ 2023.1 platform release, which includes a number of performance and quality of life improvements.
  • New UI update that allows Android Studio to adopt a number of improvements to IntelliJ’s modern design language.

See the Android Studio Preview release notes and the Android Emulator release notes for more details.


Download Android Studio Today!

You can download Android Studio Hedgehog Canary or Android Studio Giraffe Beta today to incorporate the new features into your workflow. You can install them side by side with a stable version of Android Studio by following these instructions. The Beta release is near stable-release quality, but bugs might still exist, and the Canary release contains leading-edge features. As always, we appreciate any feedback on things you like or features you would like to see. If you find a bug, please report the issue and also check out known issues. Remember to also follow us on Twitter, Medium, or YouTube for more Android development updates!

What’s new in Android Health

Posted by Sara Hamilton, Developer Relations Engineer

Health and fitness data is interconnected – sleep, nutrition, workouts and more all inform one another. For example, consider that your sleep impacts your recovery, which impacts your readiness to run your favorite 5k. Over time, your recovery and workout habits drive metrics like heart rate variability, resting heart rate, VO2Max and more! Often this data exists in silos, making it hard for users to get a holistic view of their health data.

We want to make it simple for people to use their favorite apps and devices to track their health by bringing this data together. They should have full control of what data they share, and when they share it. And, we want to make sure developers can enable this with less complexity and fewer lines of code.

This is why we’ve continued to improve our Android Health offerings, and why today at I/O 2023, we’re announcing key updates across both Health Connect and Health Services for app developers and users.

What is Android Health?

Android Health brings together two important platforms for developers to deliver robust health and fitness apps to users: Health Connect and Health Services.

Health Connect is an on-device data store that provides APIs for storing and sharing health and fitness data between Android apps. Before Health Connect, there was not a consistent way for developers to share data across Android apps. They had to integrate with many different APIs, each with a different set of data types and different permissions management frameworks.

Now, with Health Connect, there is less fragmentation. Health Connect provides a consistent set of 40+ data types and a single permissions management framework for users to control data permissions. This means that developers can share data with less effort, enabling people to access their health data in their favorite apps, and have more control over data permissions.
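As a brief sketch, reading step data through the Health Connect client might look like this, assuming read permission for StepsRecord has already been granted.

```kotlin
import android.content.Context
import androidx.health.connect.client.HealthConnectClient
import androidx.health.connect.client.records.StepsRecord
import androidx.health.connect.client.request.ReadRecordsRequest
import androidx.health.connect.client.time.TimeRangeFilter
import java.time.Instant

suspend fun readSteps(context: Context, start: Instant, end: Instant): Long {
    val client = HealthConnectClient.getOrCreate(context)
    val response = client.readRecords(
        ReadRecordsRequest(
            recordType = StepsRecord::class,
            timeRangeFilter = TimeRangeFilter.between(start, end)
        )
    )
    // Sum the step counts across all records returned for the time range.
    return response.records.sumOf { it.count }
}
```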

Screenshot of permissions via Health Connect

Health Services is our API surface for accessing sensor data on Wear OS devices in a power-efficient way. Before Health Services, developers had to work directly with low-level sensors, which required different configurations on different devices, and was not battery-efficient.

With Health Services, there is now a consistent API surface across all Wear OS 3+ devices, allowing developers to write code once and run it across all devices. And, the Health Services architecture means that developers get great power savings in the process, allowing people to track longer workouts.

Health Connect is coming to Android 14 with new features

Health Connect and Android 14 logos with an X between them to indicate collaboration

Health Connect is currently available for download as an app on the Play Store. We are excited to announce that starting with the release of Android 14 later this year, Health Connect will be a core part of Android and available on all Android mobile devices. Users will be able to access Health Connect directly from Settings on their device, helping to control how their health data is shared across apps.

Screenshot showing Health Connect available in the privacy settings of an Android device

Several new features will be shipped with the Health Connect Android 14 release. We’re adding a new exercise routes feature to allow users to share maps of their workouts through Health Connect. We’ve also made improvements to make it easier for people to log their menstrual cycles. And, Health Connect updates will be delivered through Google Play System Updates, which will allow new features to be updated often.

Health Services now supports more use cases with new API capabilities

We’ve released several exciting changes to Health Services this year to support more use cases. Our new Batching Modes feature allows developers to adjust the data delivery frequency of heart rate data to support home gym use cases. We’ve also added new API capabilities, like golf shot detection.

The new version of Wear OS arrives later this year. Wear OS 4 will be the most performant yet, delivering improved battery life for the next generation of Wear OS watches. We will be releasing additional Health Services updates with this change, including improved background body sensor permissions.

Our developer ecosystem is growing

There are over 50 apps already integrated with Health Connect and hundreds of apps with Health Services, including Peloton, Withings, Oura, and more. These apps are using Health Connect to incorporate new data and give people an interconnected health experience, without building out many new API integrations. Learn more about how these health and fitness apps are creating new experiences for users in areas like sleep, exercise, nutrition, and more in our I/O technical session.

We also have over 100 apps integrated with Health Services. Apps using Health Services are seeing higher engagement from users with Wear apps, and are giving their users longer battery life in the process. For example, Strava found that users with their Wear app did 25% more activities than those without.

Get started with Health Connect

We hope many more developers will join us in bringing unique experiences within Android Health to your users this year.

If you’d like to create a more interconnected health experience for your users, we encourage you to integrate with Health Connect. And if you are a Wear developer, make sure you are using Health Services to get the best battery performance and future proofing for all upcoming Wear OS devices.

Check out our Health Services documentation, Health Connect documentation, and code samples to get started!

To learn more, watch the I/O session:

Price in-app products with confidence by running price experiments in Play Console

Posted by Phalene Gowling, Product Manager, Google Play

At this year’s Google I/O, our “Boost your revenue with Play Commerce” session highlights the newest monetization tools that are deeply integrated into Google Play, with a focus on helping you optimize your pricing strategy. Pricing your products or content correctly is foundational to driving better user lifetime value and can result in reaching new buyers, improving conversion, and encouraging repeat orders. It can be the difference between a successful sale and pricing yourself out of one, or even undervaluing your products and missing out on key sales opportunities.

To help you price with confidence, we’re excited to announce price experiments for in-app products in Play Console, allowing you to test price points and optimize for local purchasing power at scale. Price experiments will launch in the coming weeks - so read on to get the details on the new tool and learn how you can prepare to take full advantage when it's live.

  • A/B test to find optimal local pricing that’s sensitive to the purchasing power of buyers in different markets. Adjusting your price to local markets has already been an industry-wide practice amongst developers, and at launch you will be able to test and manage your global prices, all within Play Console. An optimized price helps reach both new and existing buyers who may have previously been priced out of monetized experiences in apps and games. Additionally, an optimized price can help increase repeat purchases by buyers of their favorite products.
    Image of two mobile devices showing A/B price testing in Google Play Console
    Illustrative example only. A/B test price points with ease in Play Console
  • Experiment with statistical confidence: price experiments enable you to track how close you are to statistical significance via confidence interval tracking; for a quick summary, you can view the top of the analysis once enough data has been collected to determine a statistically significant result. To make the decision of whether to apply the ‘winning’ price easier, we’ve also included support for tracking key monetization metrics such as revenue uplift, revenue from new installers, buyer ratio, orders, and average revenue per paying user. This gives you a more detailed understanding of how buyers behave in each experiment arm per market, and can inspire further refinements toward a robust global monetization strategy.
  • Improve return on investment in user acquisition. Having a localized price and a better understanding of buyer behavior in each market allows you to optimize your user acquisition strategy, since you know how buyers will react to market-specific products or content. It can also inform which products you choose to feature on Google Play.

Set up price experiments in minutes in Play Console

Price experiments will be easy to run with the new dedicated section in Play Console under Monetize > Products > Price experiments. You’ll first need to determine the in-app products, markets, and the price points you’d like to test. The intuitive interface will also allow you to refine the experiment settings by audience, confidence level and sensitivity. And once your experiment has reached statistical significance, simply apply the winning price to your selected products within the tool to automatically populate your new default price point for your experiment markets and products. You also have the flexibility to stop any experiment before it reaches statistical significance if needed.

You’ll have full control of what and how you want to test, reducing any overhead of managing tests independently or with external tools – all without requiring any coding changes.

Learn how to run an effective experiment with Play Academy

Get Started

You can start preparing now by strategizing what type of price experiment you might want to run first. For a metric-driven source of inspiration, game developers can explore strategic guidance, which can identify country-specific opportunities for buyer conversion. Alternatively, start building expertise on running effective pricing experiments for in-app products by taking our new Play Academy course, in preparation for price experiments rolling out in the coming weeks.



Build smarter Android apps with on-device Machine Learning

Posted by Thomas Ezan, Developer Relations

In the past year, the Android team made significant improvements to on-device machine learning to help developers create smarter apps with more features to process images, sound, and text. In the Google I/O talk Build smarter Android apps with on-device Machine Learning, David Miro-Llopis, Product Manager on ML Kit, and Thomas Ezan, Android Developer Relations Engineer, review new Android APIs and solutions and showcase applications using on-device ML.

Running ML processes on-device enables low latency, increases data privacy, enables offline support, and can reduce your cloud bill. Applications such as Lens AR Translate or the document scanning feature available in Files in India benefit from the advantages of on-device ML.

To deploy ML features on Android, developers have two options:

  • ML Kit: which offers production-ready ML solutions to common user flows, via easy-to-use APIs.
  • Android’s custom ML stack: which is built on top of TensorFlow Lite and provides control over the inference process and the user experience.

ML Kit released new APIs and improved existing features

Over the last year, the ML Kit team worked on both improving existing APIs and launching new ones: face mesh and document scanner. ML Kit is launching a new document scanner API in Q3 2023 that will provide a consistent scanning experience across apps on Android. Developers will be able to use it with only a few lines of code, without needing the camera permission, and with low APK size impact (given that it will be distributed via Google Play services). In a similar fashion, the Google code scanner is now generally available and provides a consistent scanning experience across apps, without needing the camera permission, via Google Play services.
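For example, here is a minimal sketch of the Google code scanner, which runs through Google Play services and does not require the camera permission in your app:

```kotlin
import com.google.mlkit.vision.codescanner.GmsBarcodeScanning

val scanner = GmsBarcodeScanning.getClient(context)
scanner.startScan()
    .addOnSuccessListener { barcode ->
        // Raw value of the scanned code, e.g. a URL or product identifier.
        val value = barcode.rawValue
    }
    .addOnCanceledListener {
        // The user backed out of the scanning UI.
    }
    .addOnFailureListener { e ->
        // Handle the error, e.g. log it.
    }
```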

Image of a series of three photos of two girls smiling, illustrating the face mesh API

Additionally, ML Kit improved the performance of the following APIs: barcode detection (by 17%), text recognition, digital ink recognition, pose detection, translation, and smart reply. ML Kit also integrated some APIs into Google Play services so you don’t have to bundle the models with your application. Many developers are using ML Kit to easily integrate machine learning into their apps; for example, WPS uses ML Kit to translate text in 43 languages and save $65M a year.


Acceleration Service in Android’s custom ML stack is now in public beta

To support custom machine learning, the Android ML team is actively developing Android’s custom ML stack. Last year, TensorFlow Lite and GPU delegates were added to Google Play services, which lets developers use TensorFlow Lite without bundling it with their app and provides automatic updates. With improved inference performance, hardware acceleration can in turn also significantly improve the user experience of your ML-enabled Android app. This year, the team is also announcing Acceleration Service, a new API enabling developers to pick the optimal hardware acceleration configuration at runtime. It is now in public beta, and developers can learn more and get started here.

To learn more, watch the video:

What’s new with Android for Cars: I/O 2023

Posted by Jennifer Tsau, Product Management Lead and David Dandeneau, Engineering Lead

For more than a decade, Google has been committed to bringing safe and seamless connected experiences to cars. We’re continuing to see strong momentum and adoption across Android for Cars. Android Auto is supported by nearly every major car maker, and will be in nearly 200 million cars by the end of this year. And the number of cars powered by Android Automotive OS with Google built-in — which includes top brands like Chevrolet, Volvo, Polestar, Honda, Renault and more — is expected to nearly double by the end of this year.

With cars becoming more connected and equipped with immersive displays, there are more opportunities for developers to bring app experiences to cars. We’re excited to share updates and new ways for developers to reach more users in the car.


Apps designed for driving experiences

Helping drivers while on the road - whether they are navigating, listening to music, or checking the weather - is a top priority. We’re continuing to invest in tools and resources, including the Android for Cars App Library, to make it even easier for developers to build new apps or port existing Android apps over to cars.

New capabilities for navigation apps

Today, we announced Waze rolling out globally on the Google Play Store for all cars with Google built-in, expanding its availability beyond Android Auto. As a part of this launch, we created more templates in Android for Cars App Library to help speed up development time across a number of app categories, including navigation.

For navigation apps, it’s also now possible to integrate with the instrument cluster, providing turn-by-turn directions right in the driver's line of sight. And developers can also access car sensor data to surface helpful information like range, fuel level, and speed to provide more contextual assistance to drivers.

A car dashboard shows the Waze app open on the display panel
The Waze app is coming to all cars with Google built-in, including the first-ever Chevrolet Blazer EV launching this year.

Tools to easily port your media apps across Android for Cars

Media apps continue to be a top use case in the car, and it’s quicker than ever to bring your media apps to Android Auto and Android Automotive OS. Audible recently joined popular streaming audio apps like Deezer, Soundcloud, and Spotify to offer their apps across both Android Auto and cars with Google built-in. If you have a media app on mobile, port it over to reach new users in the car.

New app categories for driving experiences

The Android for Cars App Library now allows developers to bring new categories of apps to cars, including internet of things (IoT) and weather apps. The IoT category is available for all developers, while weather is in an early access program. In the weather category, The Weather Channel app will join other weather apps like Weather & Radar later this year.

We’re also working with messaging apps like Zoom, Microsoft Teams, and Webex by Cisco to allow you to join meetings by audio from your car display in the coming months.

A car display shows a Zoom meeting schedule next to a route in Google Maps.
Coming soon, join meetings by audio from your car display.

Apps designed for parked and passenger experiences

With screens expanding in size and more being added for passengers, there is growing demand for parked and passenger experiences in cars.

Video, gaming, and browsing in cars

Now, video and gaming app categories are available in the car, with an early access program for browsing apps coming soon. YouTube is now available for car makers to offer in cars with Google built-in. And drivers of cars with Google built-in will soon have access to popular titles like Beach Buggy Racing 2, Solitaire FRVR, and My Talking Tom Friends from publishers like Vector Unit, FRVR and Outfit7 Limited. Developers can now port their large screen optimized apps to cars to take advantage of this opportunity.

A car display shows a YouTube video of an animated character singing.
YouTube is coming to cars with Google built-in, like the Polestar 2.

More screens in cars allow for new experiences between drivers and passengers, including individual and shared entertainment experiences. We're excited to announce multi-screen support is coming to Android Automotive OS 14 — stay tuned for more updates.

A car with a panoramic front display and screens in headrests showing apps and video content.
Support for multiple screens is coming to Android Automotive OS 14.

Start developing apps for cars today

To learn how to bring your apps to cars, check out the technical session, codelab and documentation on the Android for Cars developer site. With all the opportunities across car screens, there has never been a better time to bring your apps and experiences to cars. Thanks for all the contributions to the Android ecosystem. See you on the road!

Android Studio Flamingo is stable

Posted by Steven Jenkins, Product Manager, Android Studio

Today, we are thrilled to announce the stable release of Android Studio Flamingo🦩: The official IDE for building Android apps!

This release includes improvements to help you build pixel-perfect UI with Live Edit, new features that assist with inspecting your app, IntelliJ updates, and more. Read on or watch the video to learn more about how Android Studio Flamingo🦩 can help supercharge your productivity and download the latest stable version today!

  

UI Tools

Jetpack Compose and Material 3 templates – Jetpack Compose is now recommended for new projects so the templates use Jetpack Compose and Material 3 by default.

Live Edit (Compose) experimental – Iteratively build an app using Compose by pushing code changes directly to an attached device or emulator. Push changes on file save or automatically and watch your UI update in real time. Live Edit is experimental and can be enabled in the Editor Settings. There are known limitations. Please send us your feedback so that we can continue to improve it. Learn more.

Moving image illustrating a live edit
Live edit

Themed app icon Preview support – You can now use the System UI Mode selector on the toolbar to switch wallpapers and see how your themed app icons react to the chosen wallpaper. (Note: requires apps targeting API level 33 or higher.)

Moving image illustrating preview of themed app icons across different wallpapers
Previewing Themed app icons across different wallpapers
Dynamic color Preview

Enable dynamic color in your app and use the new wallpaper attribute in an @Preview composable to switch wallpapers and see how your UI reacts to different wallpapers. (Note: you must use Compose 1.4.0 or higher.)
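A minimal sketch of the new wallpaper preview attribute is shown below; MyAppTheme and GreetingScreen are hypothetical, and MyAppTheme is assumed to opt into dynamic color.

```kotlin
import androidx.compose.runtime.Composable
import androidx.compose.ui.tooling.preview.Preview
import androidx.compose.ui.tooling.preview.Wallpapers

@Preview(wallpaper = Wallpapers.BLUE_DOMINATED_EXAMPLE)
@Composable
fun GreetingPreviewOnBlueWallpaper() {
    MyAppTheme(dynamicColor = true) {
        GreetingScreen()
    }
}
```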

Moving image illustrating dynamic color wallpaper in Compose Preview
Compose Preview: dynamic color wallpaper

Build

Build Analyzer task categorization – Build Analyzer now groups tasks by categories such as Manifest, Android Resources, Kotlin, Dexing and more. Categories are sorted by duration and can be expanded to display a list of the corresponding tasks for further analysis. This makes it easy to know which categories have the most impact on build time.

Image of Build Analyzer Task Categorization
Build Analyzer Task Categorization

One-click automated profileable build and run – When you are profiling your app, you want to avoid profiling a debuggable build. These are great during development, but the results can be skewed. Instead, you should profile a non-debuggable build because that is what your users will be running. This is now more convenient with one-click automated profileable build and run. Easily configure a profileable app and profile it with one click. You can still choose to profile your debuggable build by selecting Profile app with complete data. Read more on the blog.

Image illustrating One-click Automated Profileable Build and Run
One-click Automated Profileable Build and Run

Lint support for SDK extensions – SDK extensions leverage modular system components to add APIs to the public SDK for previously released API levels. Now, you can scan for and fix SDK extension issues with lint support. Android Studio automatically generates the correct version checks for APIs that are launched using SDK extensions.

Image showing Lint Support for SDK Extensions
Lint Support for SDK Extensions

Android Gradle Plugin 8.0.0 – Android Studio Flamingo ships with a new, major version of the Android Gradle plugin. The plugin brings many improvements, but also introduces a number of behavior changes and the Transform API removal. Please make sure to read about the required changes before you upgrade the AGP version in your projects.

Inspect

Updates to App Quality Insights – Discover, investigate, and reproduce issues reported by Crashlytics with App Quality Insights. You can filter by app version, Crashlytics signals, device type, or operating system version. In the latest update you can now close issues or add useful annotations in the Notes pane.

Image showing how you can annotate and close issues inside the notes pane
Annotate and close issues inside the notes pane

Network Inspector traffic interception – Network Inspector now shows all traffic data for the full timeline by default. Create and manage rules that help test how your app behaves when encountering different responses such as status codes, and response headers and bodies. The rules determine what responses to intercept and how to modify these responses before they reach the app. You can choose which rule to enable or disable by checking the Active box next to each rule. Rules are automatically saved every time you modify them.

Image showing Network Inspector Traffic Interception
Network Inspector Traffic Interception

Auto-connect to foreground process in Layout Inspector – Layout Inspector now automatically connects to the foreground process. You no longer have to click to attach it to your app.

IntelliJ

IntelliJ Platform Update – Android Studio Flamingo (2022.2.1) includes the IntelliJ 2022.2 platform release, which comes with IDE performance improvements, enhanced rendering performance on macOS thanks to the Metal API and more. It also improves the IDE performance when using Kotlin, which positively impacts code highlighting, completion, and find usages. Read the IntelliJ release notes here.

Summary

To recap, Android Studio Flamingo (2022.2.1) includes these new enhancements and features:

UI Tools
  • Live Edit (Compose) - Experimental
  • Themed app icon Preview support
  • Dynamic color Preview
  • Jetpack Compose and Material 3 Templates

Build
  • Build Analyzer Task Categorization
  • One-click Automated Profileable Build and Run
  • Lint Support for SDK Extensions
  • Breaking changes in Android Gradle Plugin 8.0

Inspect
  • Updates to App Quality Insights
  • Network Inspector Traffic Interception
  • Auto-connect to foreground process in Layout Inspector

IntelliJ
  • IntelliJ Platform 2022.2 Update

Check out the Android Studio release notes, Android Gradle plugin release notes, and the Android Emulator release notes for more details.

Download Studio Today!

Now is the time to download Android Studio Flamingo (2022.2.1) to incorporate the new features into your workflow. As always, we appreciate any feedback on things you like and issues or features you would like to see. If you find a bug, please file an issue and check out the known issues. Remember to also follow us on Twitter, Medium, or YouTube for more Android development updates!

Android 14 Beta 1

Posted by Dave Burke, VP of Engineering

Illustration of badge style Android 14 logo

Today we're releasing the first Beta of Android 14, building around our core themes of privacy, security, performance, developer productivity, and user customization, while continuing to improve the large-screen device experience on tablets, foldables, and more. We've been making steady progress refining the features and stability of Android 14, and it's time to open up the experience to both developers and early adopters.

Android delivers enhancements and new features year-round, and your feedback on the Android beta program plays a key role in helping Android continuously improve. The Android 14 developer site has lots more information about the beta, including downloads for Pixel and the release timeline. We’re looking forward to hearing what you think, and thank you in advance for your continued help in making Android a platform that works for everyone.


Working across form factors

Android 14 builds on the work done in past releases to support tablets and foldable form factors, and we've been building tools and resources to help polish your app experience, including design inspiration and development guides.


Smarter System UI

In the Android operating system, features are implemented by two separate yet equally important packages: the framework, which provides services, and the System UI, which gives the user control of those services. Each Android release brings new refinements to the system UI, and here are some that you might notice in Beta 1.


New back arrow

Image showing the back arrow indicating gesture navigation on a mobile device

The gesture navigation experience includes a more prominent back arrow while you interact with your app, to help improve back gesture understanding and usefulness. The back arrow also complements the user's wallpaper or device theme.



A superior system sharesheet

Screen image of custom sharesheet with direct share targets


In Android 14, apps can now add custom actions to the system sharesheets they invoke. Create your custom ChooserAction using ChooserAction.Builder and specify a list of your ChooserActions as the Intent.EXTRA_CHOOSER_CUSTOM_ACTIONS extra of the Intent created with Intent.createChooser().
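
As a rough sketch of what that can look like (the action label, icon, and broadcast action below are illustrative placeholders, not part of the platform release):

    import android.app.PendingIntent
    import android.content.Context
    import android.content.Intent
    import android.graphics.drawable.Icon
    import android.os.Build
    import android.service.chooser.ChooserAction
    import androidx.annotation.RequiresApi

    // Adds a hypothetical "Save for later" action, handled by a broadcast in your
    // own app, to the sharesheet shown for a plain-text share.
    @RequiresApi(Build.VERSION_CODES.UPSIDE_DOWN_CAKE)
    fun shareTextWithCustomAction(context: Context, text: String) {
        val shareIntent = Intent(Intent.ACTION_SEND).apply {
            type = "text/plain"
            putExtra(Intent.EXTRA_TEXT, text)
        }

        val saveAction = ChooserAction.Builder(
            Icon.createWithResource(context, android.R.drawable.ic_menu_save),
            "Save for later", // placeholder label; use a string resource in a real app
            PendingIntent.getBroadcast(
                context,
                /* requestCode = */ 0,
                Intent("com.example.SAVE_FOR_LATER").setPackage(context.packageName),
                PendingIntent.FLAG_IMMUTABLE
            )
        ).build()

        val chooser = Intent.createChooser(shareIntent, /* title = */ null).apply {
            putExtra(Intent.EXTRA_CHOOSER_CUSTOM_ACTIONS, arrayOf(saveAction))
        }
        context.startActivity(chooser)
    }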

In addition, the system now uses more app signals to determine the ranking of the direct share targets. You provide the signal by calling pushDynamicShortcut to report shortcut usage with the corresponding capability bindings.
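
A minimal sketch of reporting such a signal with ShortcutManagerCompat; the shortcut id, label, and capability name here are illustrative, not prescribed by the platform:

    import android.content.Context
    import android.content.Intent
    import androidx.core.content.pm.ShortcutInfoCompat
    import androidx.core.content.pm.ShortcutManagerCompat

    // Hypothetical example: report that the user shared to a specific conversation,
    // so the system can rank that conversation higher among direct share targets.
    fun reportConversationShortcutUsed(context: Context, conversationId: String) {
        val shortcut = ShortcutInfoCompat.Builder(context, /* id = */ conversationId)
            .setShortLabel("Chat $conversationId") // placeholder label
            .setIntent(
                Intent(Intent.ACTION_VIEW)
                    .setPackage(context.packageName)
                    .putExtra("conversation_id", conversationId)
            )
            // The capability binding ties this shortcut to a sharing-related action.
            .addCapabilityBinding("actions.intent.SEND_MESSAGE")
            .build()

        ShortcutManagerCompat.pushDynamicShortcut(context, shortcut)
    }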

More graphics capabilities

Android 14 adds new graphics features that you can use to make your app really stand out.


Paths are now queryable and interpolatable

Android's Path API is a powerful and flexible mechanism for creating and rendering vector graphics. Starting in Android 14, you can query paths to find out what's inside of them. The API updates include functionality to interpolate between paths whose structures match exactly, enabling morphing effects, and an AndroidX library provides backwards compatibility back to API 21. More details here.
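
For example, here is a minimal sketch of morphing between two paths with the new platform APIs, assuming both paths were built from the same sequence of operations (the AndroidX graphics-path library offers similar capabilities on older API levels):

    import android.graphics.Path
    import android.os.Build
    import androidx.annotation.RequiresApi

    // Returns a path partway between start and end, or null if their structures
    // don't match (interpolation requires the same operations in the same order).
    @RequiresApi(Build.VERSION_CODES.UPSIDE_DOWN_CAKE)
    fun morph(start: Path, end: Path, fraction: Float): Path? {
        if (!start.isInterpolatable(end)) return null
        return Path().also { result -> start.interpolate(end, fraction, result) }
    }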


Personalization

Per-app language preferences

Android 14 enhances per-app language preferences, allowing for dynamic customization of the set of languages displayed in the Android Settings per-app language list, and giving IMEs a way to know the UI language of the current app. Starting with Android Studio Giraffe Canary 7 and AGP 8.1.0-alpha07, you can configure your app to support per-app language preferences automatically. Based on your project resources, the Android Gradle plugin generates the LocaleConfig file and adds a reference to it in the generated manifest file, so you no longer have to create or update the file manually when your language support changes. See Automatic per-app language support for more information and leave feedback.
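
For reference, enabling the automatic behavior is roughly a one-line Gradle change plus a small resources.properties file declaring your default locale; the module layout and locale below are assumptions for illustration, so check the linked guide for the exact setup your AGP version expects.

    // app/build.gradle.kts (AGP 8.1.0-alpha07 or higher)
    android {
        androidResources {
            // Let AGP generate the LocaleConfig file from your project's resources.
            generateLocaleConfig = true
        }
    }

    // app/src/main/res/resources.properties (separate file, shown here as a comment):
    // unqualifiedResLocale=en-US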


Privacy

Limiting visibility to disability-focused accessibility services

Android 14 introduces the accessibilityDataSensitive attribute to allow apps to limit visibility of specified views only to accessibility services that claim to help users with disabilities. Play Protect ensures apps downloaded from the Play Store are truthful about these claims. TalkBack and other services that claim to help users with disabilities will not be affected by this attribute.

Apps can consider using accessibilityDataSensitive to:

  • Protect user data (such as personal details or plaintext passwords)
  • Prevent critical actions from being executed unintentionally (such as transferring money or checking out in a shopping app)

App compatibility

If you haven't yet tested your app for compatibility with Android 14, now is the time to do it! With Android 14 now in beta, we're opening up access to early-adopter users as well as developers. In the weeks ahead, you can expect more users to be trying your app on Android 14 and raising issues they find.

To test for compatibility, install your published app on a device or emulator running Android 14 Beta and work through all of the app’s flows. Review the behavior changes to focus your testing. After you’ve resolved any issues, publish an update as soon as possible.

Image of Android 14 preview and release timeline indicating we are on target with Beta release in April

It’s also a good time to start getting ready for your app to target Android 14 by testing with the app compatibility change toggles in Developer Options.

App compatibility toggles in Developer Options.

Get started with Android 14

Today's Beta release has everything you need to try the Android 14 features, test your apps, and give us feedback. For testing your app with tablets and foldables, the easiest way to get started is using the Android Emulator in a tablet or foldable configuration in the latest preview of the Android Studio SDK Manager. Now that we've entered the beta phase, you can enroll any supported Pixel device here to get this and future Android 14 Beta and feature drop Beta updates over-the-air. If you don’t have a Pixel device, you can use the 64-bit system images with the Android Emulator in Android Studio.

For the best development experience with Android 14, we recommend that you use the latest preview of Android Studio Giraffe (or more recent Giraffe+ versions). Once you’re set up, here are some of the things you should do:

  • Try the new features and APIs - your feedback is critical as we finalize the APIs. Report issues in our tracker on the feedback page.
  • Test your current app for compatibility - learn whether your app is affected by default behavior changes in Android 14. Install your app onto a device or emulator running Android 14 and extensively test it.
  • Test your app with opt-in changes - Android 14 has opt-in behavior changes that only affect your app when it’s targeting the new platform. It’s important to understand and assess these changes early. To make it easier to test, you can toggle the changes on and off individually.

We’ll update the preview and beta system images and SDK regularly throughout the Android 14 release cycle.

If you are already enrolled in the Android 13 QPR Beta program and your device is supported, Android 14 Beta 1 will be made available to you without taking any additional action.

For complete information on how to get the Beta, visit the Android 14 developer site.

Giving Users More Transparency and Control Over Account Data

Posted by Bethel Otuteye, Senior Director, Product Management, Android App Safety

Google Play has launched a number of recent initiatives to help developers build consumer trust by showcasing their apps' privacy and security practices in a way that is simple and easy to understand. Today, we’re building on this work with a new data deletion policy that aims to empower users with greater clarity and control over their in-app data.

For apps that enable app account creation, developers will soon need to provide an option to initiate account and data deletion from within the app and online. This web requirement, which you will link in your Data safety form, is especially important so that a user can request account and data deletion without having to reinstall an app.

While Play’s Data safety section already lets developers highlight their data deletion options, we know that users want an easier and more consistent way to request them. By creating a more intuitive experience with this policy, we hope to better educate our shared users on the data controls available to them and create greater trust in your apps and in Google Play more broadly.

As the new policy states, when you fulfill a request to delete an account, you must also delete the data associated with that account. The feature also gives developers a way to provide more choice: users who may not want to delete their account entirely can choose to delete other data only where applicable (such as activity history, images, or videos). For developers that need to retain certain data for legitimate reasons such as security, fraud prevention, or regulatory compliance, you must clearly disclose those data retention practices.

Moving image of accessing account deletion from a mobile device.
Note: Images are examples and subject to change

While we’re excited about the greater control this will give people over their data, we understand it will take time for developers to prepare, especially those without existing deletion functionality or a web presence, which is why we’re sharing information and resources today.

As a first step, we’re asking developers to submit answers to new Data deletion questions in your app’s Data Safety form by December 7. Early next year, Google Play users will begin to see these changes reflected in your app’s store listing, including the refreshed data deletion badge in the Data safety section and the new Data deletion area.

Developers who need more time can file for an extension in Play Console until May 31, 2024 to comply with the new policy.

For more information on data deletion and other policy changes announced today:

As always, thank you for your continued partnership in making Google Play a safe and trustworthy platform for everyone.

Mercari reduces 355K lines of code, a 69% difference, by rebuilding with Jetpack Compose

Posted by the Android team

In 2020, the Mercari team took on a big initiative to update its app’s technical infrastructure. At the time, its codebase was seven years old and hadn’t undergone any major architectural updates. This affected the team’s ability to develop new features and release timely app updates. To resolve this technical debt, Mercari launched what it called the GroundUP initiative—a complete rewrite of its application across platforms, including Android.

The goal was to create a completely modernized application with a scalable design. While retooling the app, Mercari developers turned to Jetpack Compose, Android’s modern declarative toolkit for creating native UI. During the evaluation, the team learned that rewriting in Jetpack Compose would help them clean up their codebase and gain more control over how the app looks.

A rewrite with less code

The Mercari team completely rewrote the architecture and tech stack for its Android app using Jetpack Compose. Mercari developers created a new design system and completely integrated it using Compose, enabling them to easily test and implement new features. Using this new design system, the Mercari team rewrote more than 130 UI screens for its marketplace and modernized the look and feel of many of their components.

With the help of the Jetpack Libraries, Mercari’s team eliminated all legacy code during the rewrite, drastically reducing its codebase and making it more manageable for developers. “Virtually, it’s the same app with way less code,” said Allan Conda, Android technology lead at Mercari. “The rewritten app has about 355,000 fewer lines of code, which is about 69% less than what it had before.”

Moving image showing lines of code that appear and disappear on the leftmost panel of the screen. The spacing between the boxes in the center panel changes, and the overall app view reflects these changes in the rightmost panel.

Interoperability with Views as an early adopter

When the Mercari team first began its GroundUP initiative, Jetpack Compose was only available in developer preview. They wanted the app written completely in Jetpack Compose due to its new declarative approach to creating UI. However, because it was still so new, they found themselves having to solve for unique edge cases using both toolkits.

For example, on Mercari’s listing form screens, users are prompted to input details about the merchandise they want to list, select photos from their device gallery, and rearrange those photos using a drag gesture. Gesture APIs weren’t available in Jetpack Compose at the time, so the team took advantage of Compose's AndroidView to seamlessly integrate Views that handled gestures on the listing form screen. This provided a stable, temporary solution for drag gestures until the feature became available in Jetpack Compose.
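
As an illustration of that kind of interop (DragReorderGalleryView below is a hypothetical View-based component, not Mercari's actual code):

    import android.content.Context
    import android.widget.FrameLayout
    import androidx.compose.runtime.Composable
    import androidx.compose.ui.Modifier
    import androidx.compose.ui.viewinterop.AndroidView

    // Hypothetical View that already implements drag-to-reorder for photos.
    class DragReorderGalleryView(context: Context) : FrameLayout(context) {
        fun submitPhotos(photoUris: List<String>) { /* reorderable gallery logic */ }
    }

    // Wrap the existing View in a composable so the rest of the screen stays in
    // Compose while drag gestures are handled by the View.
    @Composable
    fun PhotoGallery(photoUris: List<String>, modifier: Modifier = Modifier) {
        AndroidView(
            modifier = modifier,
            factory = { context -> DragReorderGalleryView(context) },
            update = { view -> view.submitPhotos(photoUris) }
        )
    }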

The Mercari team was impressed by how easy it was to switch between the two toolkits, and having the option to use Views alongside Compose gave them better control of edge cases like this. Compose now supports gesture APIs, and Mercari developers have since completely written and integrated the drag gesture component solely using Compose.

Jetpack Compose has matured a lot since Mercari’s initial adoption, and most Android developers no longer need to worry about having to interoperate with both toolkits as Android apps can now be written completely in Compose.

Improving and monitoring performance with Compose

Using Compose, the Mercari team automated baseline profile generation for every stable release of the app and found it very helpful. With the default Compose baseline profile, the home screen renders frames up to 2x faster than with no baseline profile at all, and providing a custom profile makes scrolling up to an additional 20% faster than the default baseline profile alone.

The team also wrote automated performance tests based on the app’s core scenarios with Android Macrobenchmark. “Using Android Macrobenchmark, we can automatically test start-up, scroll, and screen load times performance,” said Allan. “Currently, we have six core scenarios covered by these tests, like search results and browsing items.”
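
A cold start-up test of that kind might look roughly like the sketch below; the package name is a placeholder, and the test lives in a separate macrobenchmark module.

    import androidx.benchmark.macro.StartupMode
    import androidx.benchmark.macro.StartupTimingMetric
    import androidx.benchmark.macro.junit4.MacrobenchmarkRule
    import androidx.test.ext.junit.runners.AndroidJUnit4
    import org.junit.Rule
    import org.junit.Test
    import org.junit.runner.RunWith

    @RunWith(AndroidJUnit4::class)
    class StartupBenchmark {
        @get:Rule
        val benchmarkRule = MacrobenchmarkRule()

        @Test
        fun coldStartup() = benchmarkRule.measureRepeated(
            packageName = "com.example.app", // placeholder package name
            metrics = listOf(StartupTimingMetric()),
            iterations = 5,
            startupMode = StartupMode.COLD
        ) {
            // Measured block: launch the default activity and wait for the first frame.
            pressHome()
            startActivityAndWait()
        }
    }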

Additionally, Mercari developers integrated Firebase Performance Monitoring, a real-time app performance monitoring tool, with custom code to measure scrolling performance on Compose screens. With Firebase Performance Monitoring, the Mercari team detected a performance issue on its search results screen, and the Android Profiler helped pinpoint the cause: poor frame rates when scrolling search results.
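
Mercari's measurement code is its own; a generic sketch of a custom trace around a scroll session might look like this, with made-up trace and metric names:

    import com.google.firebase.perf.FirebasePerformance
    import com.google.firebase.perf.metrics.Trace

    // Hypothetical helper that wraps a scroll session in a custom trace and
    // records how many slow frames the caller observed during it.
    class ScrollPerformanceTracker {
        private var trace: Trace? = null

        fun onScrollStarted() {
            trace = FirebasePerformance.getInstance()
                .newTrace("search_results_scroll") // made-up trace name
                .apply { start() }
        }

        fun onScrollFinished(slowFrameCount: Long) {
            trace?.apply {
                putMetric("slow_frames", slowFrameCount) // made-up metric name
                stop()
            }
            trace = null
        }
    }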

The Mercari team solved this frame rate issue with guidance from Google’s Compose performance best practices and Compose stability documentation. By making the search results screen’s composables skippable and hoisting unused state out of them, Mercari developers significantly improved frame rates, reducing slow rendering instances by around 23.6%.
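
In general terms (not Mercari's actual code), that kind of fix means giving list-item composables only stable, immutable inputs so they can be skipped, and hoisting state they don't use up to their callers; a minimal skippability sketch:

    import androidx.compose.foundation.lazy.LazyColumn
    import androidx.compose.foundation.lazy.items
    import androidx.compose.material3.Text
    import androidx.compose.runtime.Composable
    import androidx.compose.runtime.Immutable

    // An immutable, stable input type lets Compose skip recomposing unchanged rows.
    @Immutable
    data class SearchResult(val id: String, val title: String)

    @Composable
    fun SearchResultRow(result: SearchResult) {
        // Depends only on its own stable parameter, so it can be skipped when
        // unrelated parts of the screen's state change.
        Text(text = result.title)
    }

    @Composable
    fun SearchResultsList(results: List<SearchResult>) {
        LazyColumn {
            items(results, key = { it.id }) { result ->
                SearchResultRow(result)
            }
        }
    }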

Headshot of Allan Conda, Android Tech Lead at Mercari, smiling, with quote text that reads 'Jetpack Compose helped us implement our Design System and rewrite 130+ screens and many of our components'

More opportunities with Jetpack Compose

With less code to maintain, it’s much easier for Mercari developers to test and implement features. “We have a ton of experiments we can finally conduct using our refreshed platforms. Our users can expect new features coming to the Mercari app at a faster rate,” said Allan.

Mercari’s developers are excited to further develop the app using the Animation APIs. With Compose, it’s much easier to animate components, which can lead to big improvements in the user experience of Android apps.

Get started

Optimize your UI development with Jetpack Compose.