Author Archives: Android Developers

Top 3 things to know in Modern Android Development at Google I/O ’23

Posted by Rebecca Franks, Android Developer Relations Engineer

Google I/O 2023 was filled with exciting updates and announcements. Modern Android Development (MAD) is all about making Android app development faster and easier, with libraries, tools, and guidance that speed up your flow and help you write safer, better code, so you can focus on building amazing experiences.

Here are our top three announcements from Google I/O 2023:

#1 Get your development questions answered with Studio Bot

One of the announcements we’re most excited about is Studio Bot, an experimental new AI-powered coding assistant, right in your IDE. You can ask it questions or use it to help fix errors, all without ever having to leave Android Studio or upload your source code.

Studio Bot is in its very early days and is currently available to developers in the US. Download the Android Studio canary build to try it out and help it improve.



#2 Jetpack Compose has improvements for flow layouts, new Material components, and more

Jetpack Compose continues to be a big focus area, making it easier to build rich UIs. The May 2023 release included many new layouts and improvements such as horizontal and vertical pagers, flow layouts and new Material 3 components such as date and time pickers and bottom sheets.

There have also been large performance improvements to the modifier system, with more updates still in the works. For text alone, this update resulted in an average 22% performance gain that can be seen in the latest alpha release, and these improvements apply across the board. To get these benefits in your app, all you have to do is update your Compose version!

You can now also use Jetpack Compose to build home screen widgets with the Glance library and TV apps with Compose for TV.

Read the “What’s new in Jetpack Compose” blog post for more information.


#3 Use Kotlin everywhere, throughout your app

Since the official announcement of Kotlin support for Android in 2017, we’ve continued to improve the experience of developing with Kotlin, and six years later we’re still investing heavily in the language.

Firstly, we are collaborating with JetBrains on the new K2 compiler, which is already showing significant improvements in compilation speed. We are actively working to integrate it into our tools, such as Android Studio, Android Lint, KSP, and Compose, and we’re leveraging Google’s large Kotlin codebases to verify the new compiler’s compatibility.

In addition, we now recommend using Kotlin for your build scripts and version catalogs. With Kotlin in your build and in your UI with Compose, you can now use Kotlin everywhere, throughout your app.
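As a rough sketch of what this recommendation looks like in practice (the version numbers, plugin IDs, and catalog entries below are illustrative, not prescriptive), a build script written in the Gradle Kotlin DSL can pull its dependency coordinates from a version catalog:

```kotlin
// build.gradle.kts — a minimal Kotlin DSL sketch. The `libs.compose.bom`
// accessor assumes a hypothetical gradle/libs.versions.toml entry such as:
//   [libraries]
//   compose-bom = { group = "androidx.compose", name = "compose-bom", version = "2023.05.01" }
plugins {
    id("com.android.application")
    kotlin("android")
}

dependencies {
    // The catalog keeps versions in one place; the BOM keeps Compose
    // artifact versions consistent with each other.
    implementation(platform(libs.compose.bom))
    implementation("androidx.compose.material3:material3")
}
```

With both the build script and the UI written in Kotlin, refactoring and IDE assistance work the same way across your whole project.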


For more information, check out the “What’s new in Kotlin” talk.

And that's our top 3 Modern Android Development announcements from Google I/O 2023. Check out this playlist for more.

Designing for Wear OS: Getting started with designing inclusive smartwatch apps

Posted by Matthew Pateman & Mallory Carroll (UX Research), and Josef Burnham (UX Design)

Smartwatches are becoming increasingly popular, with many people using them to stay connected, track their health, and control their devices. Watches enable people to get information at a glance and then take action. These quick and frequent interactions can help people get back to being present in their daily lives.

To help with the challenges of designing and building great watch experiences that work for all, we have created a series of videos. These videos cover a variety of topics starting with how to understand what people want from a smartwatch app. We cover how best to design for your target audience, and how to make the most of the watch’s form factor with a series of design principles. Lastly, we give you an introduction on how to approach product inclusion throughout the whole development lifecycle, and how this approach can help make your products better for all. If you’re interested in learning more, be sure to check out the videos below.


1. Introduction to UX Research & Product Inclusion on Wear OS

If you’re considering building a smartwatch app but don’t know how to begin, this video will help you get started. It shows how to uncover what people want from a smartwatch app, what a great Wear OS experience should look like, and how to ensure it addresses real needs of the people you are building for. Lastly, you’ll find out how to take an equity-focused approach when developing products, apps, and experiences.


2. Introduction to UX Design on Wear OS

Did you know that the average smartwatch interaction is approximately 5 seconds long? In this video you will learn how to design effective and engaging experiences for Wear OS. We’ll guide you on how to make the most out of these short watch interactions by covering key differences between mobile and smartwatch design, the importance of a glanceable user experience, and practical tips for designing for different Wear OS surfaces.


3. Introduction to Product Inclusion & Equity

We will introduce you to Product Inclusion and Equity, and how to approach it when designing for Wear OS. You will learn how to build for belonging and make products more accessible and usable by all.


4. Case Studies: Inclusion and Exclusion in Technology Design

Here you will see a series of case studies showing how product and design choices can have an impact at a personal, community, and systemic level. Designs can be affirming and inclusive, or harmful and exclusionary, to various people and communities. We’ll use some examples to highlight how important inclusion and equity considerations are when making product decisions.


5. Considerations for Community Co-Design

The last video in this series will give you an introduction into community co-design, a powerful approach that focuses on building solutions with, not for, historically marginalized communities. In community co-design, we engage with people based on identity, culture, community, and context. You’ll find out how to engage people and communities in a safe, respectful, and equity-centered way in product development.


Keep your eyes peeled for more updates from us as we continue to share and evolve our latest design thinking and practices, principles, and guidelines.

We also have many more resources to help get you started designing for Wear OS:

  • Find inspiring designs for different types of apps in our gallery
  • Interested in designing for multiple devices, from TVs to phones to tablets? Check out our design hub
  • Access developer documentation for Wear OS

WPS uses ML Kit to seamlessly translate 43 languages and net $65M in annual savings

Posted by the Android team

WPS is an office suite that lets users effortlessly view and edit all their documents, presentations, spreadsheets, and more. As a global product, WPS requires top-notch, reliable in-suite translation technology that doesn’t require users to leave the app. To ensure all its users can enjoy the full benefits of the suite and its content in their preferred language, WPS uses the translation API from ML Kit, Google's on-device and production-ready machine learning toolkit for Android development.

WPS users rely on text translation

Many WPS users rely on ML Kit’s translation tools when reading, writing, or viewing their documents. According to a WPS data sample on single-day usage, there were 6,762 daily active users using ML Kit to translate 17,808 pages across all 43 of its supported languages. Students, who represent 44% of WPS’s userbase, especially rely on translation technology in WPS. WPS helps students better learn to read and write foreign languages by providing them with instant, offline translations through ML Kit.

Moving image of text bubbles with 'hello' in different languages appear (Spanish, French, Korean, English, Greek, Chinese, Italian, Russian, Portuguese, Tamil)

ML Kit provides free, offline translations

When choosing a translation provider, the WPS team looked at a number of popular options. But the other services the company considered only supported cloud-based translation and couldn’t translate text for some complex languages. The WPS team wanted to ensure all of its users could benefit from text translation, regardless of language or network availability. WPS ultimately chose ML Kit because it can translate text offline and supports every language WPS serves.
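A minimal sketch of what on-device translation with ML Kit looks like in Kotlin (the language pair is illustrative, and `showTranslation`/`showError` are hypothetical app callbacks, not ML Kit APIs):

```kotlin
import com.google.mlkit.common.model.DownloadConditions
import com.google.mlkit.nl.translate.TranslateLanguage
import com.google.mlkit.nl.translate.Translation
import com.google.mlkit.nl.translate.TranslatorOptions

// Build a client for a specific language pair (English → Swahili here).
val options = TranslatorOptions.Builder()
    .setSourceLanguage(TranslateLanguage.ENGLISH)
    .setTargetLanguage(TranslateLanguage.SWAHILI)
    .build()
val translator = Translation.getClient(options)

// Download the model once (e.g. over Wi-Fi); afterwards translation
// runs fully offline on the device.
val conditions = DownloadConditions.Builder().requireWifi().build()
translator.downloadModelIfNeeded(conditions)
    .addOnSuccessListener {
        translator.translate("Hello")
            .addOnSuccessListener { translated -> showTranslation(translated) }
            .addOnFailureListener { e -> showError(e) }
    }
```

Because the model is cached locally after the first download, subsequent translations need no network connection at all.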

“WPS has many African users, among whom are speakers of Swahili and Tamil, which are complex languages that aren’t supported by other translation services,” said Zhou Qi, Android team leader at WPS. “We’re very happy to provide these users with the translation services they need through ML Kit.”

What’s more, the other translation services WPS considered were expensive. ML Kit is completely free to use, and WPS estimates it's saving roughly $65 million per year by choosing ML Kit over another, paid translation software development kit.

Optimizing WPS for ML Kit’s translation API

ML Kit not only provides powerful multilingual translation but also supports App Bundle and Dynamic Delivery, which gives users the option to download ML Kit's translation module on demand. Without App Bundle and Dynamic Delivery, users who don’t need translation would have had to download the module anyway, increasing the app’s initial download size.

“When a user downloads the WPS app, the basic module is downloaded by default. And when the user needs to use the translation feature, only then will it be downloaded. This reduces the initial download size and ensures users who don't need translation assistance won’t be bothered by downloading the module,” said Zhou.
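The on-demand flow Zhou describes can be sketched with Play Feature Delivery's split-install API (the "translation" module name is a hypothetical example):

```kotlin
import com.google.android.play.core.splitinstall.SplitInstallManagerFactory
import com.google.android.play.core.splitinstall.SplitInstallRequest

// Request the on-demand feature module only when the user first needs it,
// keeping it out of the app's initial download.
val splitInstallManager = SplitInstallManagerFactory.create(context)
val request = SplitInstallRequest.newBuilder()
    .addModule("translation") // hypothetical feature-module name
    .build()

splitInstallManager.startInstall(request)
    .addOnSuccessListener { sessionId -> /* module download started */ }
    .addOnFailureListener { e -> /* fall back or offer a retry */ }
```

Users who never touch the translation feature never pay its download cost.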

Quote card with headshot of Zhou Qi and text reads, “By using ML Kit’s free API, we provide our users with very useful functions, adding convenience to their daily lives and making document reading and processing more efficient.” — Zhou Qi, Android team leader, WPS

ML Kit’s resources made the process easy

During implementation, the WPS team frequently used ML Kit’s official guides to steer their development process. These resources allowed them to learn the ins and outs of the API and ensure any changes met all of its users’ needs. With the documentation and recommendations provided directly on the ML Kit site, WPS developers were able to quickly and easily integrate the new toolkit into their workflow.

“With the provided resources, we rarely had to search for help. The documentation was clear and concise. Plus, the API was straightforward and developer-friendly, which greatly reduced the learning curve,” said Zhou.

Streamlining UX with ML Kit

Before implementing ML Kit, WPS users had to open a separate application to translate their documents, creating a burdensome user experience. With ML Kit’s automatic language identification and instant translations, WPS now provides its users a streamlined way to translate text quickly, accurately, and without ever leaving the application, significantly improving platform UX.

Moving forward, WPS plans to expand its use of ML Kit, particularly with text recognition. WPS users continue to request the ability to process text on captured photos, so the company plans to use ML Kit to refine the app’s text recognition abilities as well.

Integrate machine learning into your workflow

Learn more about how ML Kit makes on-device machine learning easy.

14 Things to know for Android developers at Google I/O!

Posted by Matthew McCullough, Vice President, Product Management, Android Developer

Today, at Google I/O 2023, you saw how we are ushering in important breakthroughs in AI across all of Google. For Android developers, we see this technology helping you out in your flow, saving you time so you can focus on building engaging new experiences for your users. Time-saving tools are going to be even more important as your users ask you to support their experiences across an expanding portfolio of screens, large screens and wearables in particular. Across the Google and Developer Keynotes, Android showed you a number of ways to support you in this mission to help build great experiences for your users; read on for our 14 new things to know in the world of Android development (and yes, we also showed you the latest Beta for Android 14!).


BRINGING AI INTO YOUR WORKFLOW

#1: Leverage AI in your development with Studio Bot

As part of Google’s broader push to help unlock the power of AI to help you throughout your day, we introduced Studio Bot, an AI powered conversational experience within Android Studio that helps you generate code, fix coding errors, and be more productive. Studio Bot is in its very early days, and we’re training it to become even better at answering your questions and helping you learn best practices. We encourage you to read the Android Studio blog, download the latest version of Android Studio, and read the documentation to learn how you can get started.


#2: Generate Play Store Listings with AI

Starting today, when you draft a store listing in English, you’ll be able to use Google’s generative AI technology to help you get started. Just open our AI helper in Google Play Console, enter a couple of prompts, like an audience and a key theme, and it will generate a draft you can edit, discard, or use. Because you can always review, you’re in complete control of what you submit and publish on Google Play.

Moving image showing Google Play listings being generated with AI

BUILDING FOR A MULTI-DEVICE WORLD

#3: Going big on Android foldables & tablets

Google is all in on large screens, with two new Android devices coming from Pixel (the Pixel Fold and the Pixel Tablet) and 50+ Google apps optimized to look great on the Android large screen ecosystem, alongside a range of apps from developers around the world. It is a great time to invest, with improved tools and guidance like the new Pixel Fold and Pixel Tablet emulator configurations in Android Studio Hedgehog Canary 3, expanded Material Design updates, and inspiration for gaming and creativity apps. You can start optimizing for these and other large screen devices by reading the do’s and don’ts of optimizing your Android app for large screens and watching the Developing high quality apps for large screens and foldables session.


#4: Wear OS: Watch faces, Wear OS 4, & Tiles animations

Wear OS active devices have grown 5x since launching Wear OS 3, so there’s more reason than ever to build a great app experience for the wrist. To help you on your way, we announced the new Watch Face Format, a new declarative XML format built in partnership with Samsung to help you bring your great idea to the watch face market. We’re also releasing new APIs to bring rich animations to tiles and helping you get ready for the next generation of platform updates with the Wear OS 4 Developer Preview. Learn more about all the latest updates by checking out our blog, watching the session, and taking a look at the brand new Wear OS gallery.


#5: Android Health: An interconnected health experience across apps and devices

With 50+ apps in our Health Connect ecosystem and 100+ apps integrated with Health Services, we’re improving Android Health offerings so more developers can work together to bring unique health and fitness experiences to users. Health Connect is coming to Android 14 this fall, making it even easier for users to control how their health data is being shared across apps directly from Settings on their device. Read more about what we announced at I/O and check out our Health Services documentation, Health Connect documentation, and code samples to get started!

#6: Android for Cars: New apps & experiences

Our efforts in cars continue to grow: Android Auto will be available in 200 million cars this year and the number of cars with Google built-in will double in the same period. It’s easier than ever to port existing Android apps to cars and bring entirely new experiences to cars, like video and games. To get started, watch the What’s New with Android for Cars session and read the developer blog.

#7: Android TV: Compose for TV and more!

We continue our commitment to bringing the best of the app ecosystem to Android TV OS. Today, we’re announcing Compose for TV, the latest UI framework for developing beautiful and functional apps for Android TV OS. To learn more, read the blog post and check out the developer guides, design reference, our new codelab, and sample code. Also, please keep giving us feedback so we can continue shaping Compose for TV to fit your needs.

#8: Assistant: Simplified voice experiences across Android

Building Google Assistant integrations inside familiar Android development paths is even easier than before. With the new App Actions Test Library and the Google Assistant plugin for Android Studio (now also available for Wear and Auto), it is easier to code, to emulate your users’ experience and forecast their expectations, and to deploy App Actions integrations across primary and complementary Android devices. To get started, watch the What's new in Android development tools session and check out the developer documentation.


MODERN ANDROID DEVELOPMENT

#9: Build UI with Compose across screens

Jetpack Compose, our modern UI toolkit for Android development, has been steadily growing in the Android community: 24% of the top 1000 apps on Google Play are using Jetpack Compose, double the share from last year. We’re bringing Compose to even more surfaces with Compose for TV in alpha, and homescreen widgets with Glance, now in beta. Read more about what we announced at Google I/O, and get started with Compose for building UI across screens.


#10: Use Kotlin everywhere, throughout your app

The Kotlin programming language is at the core of our development platform, and we keep expanding the scale of Kotlin support for Android apps. We’re collaborating with JetBrains on the new K2 compiler, and are actively working to integrate it into our tools such as Android Studio, Android Lint, KSP, and Compose, leveraging Google’s large Kotlin codebases to verify the new compiler’s compatibility. We now recommend using the Kotlin DSL for build scripts. Watch the What’s new in Kotlin for Android talk to learn more.

#11: App Quality Insights now contain Android Vitals reports

Android Studio’s App Quality Insights enables you to access Firebase Crashlytics issue reports directly from the IDE, allowing you to navigate between stack trace and code with a click, use filters to see only the most important issues, and see report details to help you reproduce issues. In the latest release of Android Studio, you can now view important crash reports from Android Vitals, all without adding any additional SDKs or instrumentation to your app. Read more about Android Studio Hedgehog for updates on your favorite Android Studio features.


AND THE LATEST FROM ANDROID & PLAY

#12: What’s new in Play

Get the latest updates from Google Play, including new ways to drive audience growth and monetization. You can now create custom store listings for more user segments including inactive users, and soon for traffic from specific Google Ads campaigns. New listing groups also make it easier to create and maintain multiple listings. Optimize your monetization strategy with price experiments for in-app products and new subscription capabilities that allow you to offer multiple prices per billing period. Learn about these updates and more in our blog post.

#13: Design beautiful Android apps with the new Android UI Design Hub

To make it even easier to build compelling UI across form factors, check out the new Android UI Design Hub, a comprehensive resource for learning how to create user-friendly Android interfaces, with takeaways, examples, do’s and don’ts, Figma starter kits, UI code samples, and inspirational galleries.

#14: And of course, Android 14!

We just launched Android 14 Beta 2, bringing enhancements to the platform around camera and media, privacy and security, system UI, and developer productivity. Get excited about new features and changes including Health Connect, Ultra HDR for images, predictive back, and more. On the ML side, ML Kit is launching new APIs like face mesh and document scanner, and the Acceleration Service in Custom ML Stack is now in public beta so you can deliver more fluid, lower-latency user experiences. Learn more about Beta 2 and get started by downloading the beta onto a supported device or testing your app in the Emulator.

This was just a small peek of some of the new ways Android is here to help support you. Don’t forget to check out the Android track at Google I/O, including some of our favorite talks like how to Reduce reliance on passwords in Android apps with passkey support and Building for the future of Android. The new Activity embedding learning pathway is also now available to enable you to differentiate your apps on tablets, foldables, and ChromeOS devices. Whether you’re joining us online or in-person at one of the events around the world, we hope you have a great Google I/O - and we can’t wait to see the great experiences you build with the updates that are coming out today!

What’s new in Jetpack Compose

Posted by Jolanda Verhoef, Android Developer Relations Engineer

It has been almost two years since we launched the first stable version of Jetpack Compose, and since then, we’ve seen its adoption and feature set grow spectacularly. Whether you write an application for smartphones, foldables, tablets, ChromeOS devices, smartwatches, or TVs, Compose has you covered! We recommend using Compose for all new Wear OS, phone, and large-screen apps. With new tooling and library features, extended Material Design 3, large screen, and Wear OS support, and early releases of Compose for homescreen widgets and TV, this is an exciting time!

Compose in the community

In the last year, we’ve seen many companies investigating and choosing Compose to build new features and migrate screens in their production applications. 24% of the top 1000 apps on Google Play have already chosen to adopt Compose! For example, Dropbox engineers told us that they rewrote their search experience in Compose in just a few weeks, which was 40% less time than anticipated, and less than half the time it took the team to build the feature on iOS. They also shared that they were interested in adopting Compose “because of its first-class support for design systems and tooling support”. Our Google Drive team cut their development time nearly in half when using Compose combined with architecture improvements.

It’s great to see how these teams experience faster development cycles, and also feel their UI code is more testable. Inspired? Start by reading our guide How to Adopt Compose for your Team, which outlines how and where to start, and shows the areas of development where Compose can bring huge added value.


Library features & development

Since we released the first Compose Bill of Materials in October last year, we’ve been working on new features, bug fixes, performance improvements, and bringing Compose to everywhere you build UI: phones, tablets, foldables, watches, TV, and your home screen. You can find all changes in the May 2023 release and the latest alpha versions of the Compose libraries.

We’ve heard from you that performance is something you care about, and that it’s not always clear how to create performant Compose applications. We’re continuously improving the performance of Compose. For example, as of last October, we started migrating modifiers to a new and more efficient system, and we’re starting to see the results of that migration. For text alone, this work resulted in an average 22% performance gain that can be seen in the latest alpha release, and these improvements apply across the board. To get these benefits in your app, all you have to do is update your Compose version!
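When you depend on the Compose Bill of Materials, picking up these improvements is typically a one-line change (the BOM version below is the May 2023 release; substitute the latest):

```kotlin
// build.gradle.kts — bumping the Compose BOM picks up the performance work
// in every Compose artifact at once (BOM version shown is illustrative).
dependencies {
    implementation(platform("androidx.compose:compose-bom:2023.05.01"))
    // No versions on individual artifacts; the BOM supplies them.
    implementation("androidx.compose.ui:ui")
    implementation("androidx.compose.foundation:foundation")
}
```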

Text and TextField got many upgrades in the past months. Next to the performance improvements we already mentioned, Compose now supports the latest emoji version 🫶 and includes new text features such as outlining text, hyphenation support, and configuring line breaking behavior. Read more in the release notes of the compose-foundation and compose-ui libraries.

The new pager component allows you to horizontally or vertically flip through content, which is similar to ViewPager2 in Views. It allows deep customization options, making it possible to create visually stunning effects:

Moving image showing the HorizontalPager composable
Choose a song using the HorizontalPager composable. Learn how to implement this and other fancy effects in Rebecca Franks' blog post.
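A minimal pager looks something like the sketch below (API shape follows recent Compose Foundation releases; `Song` and `SongCard` are hypothetical app types, not library APIs):

```kotlin
import androidx.compose.foundation.pager.HorizontalPager
import androidx.compose.foundation.pager.rememberPagerState
import androidx.compose.runtime.Composable

@Composable
fun SongPager(songs: List<Song>) {
    // State tracks the current page and drives swipe animations.
    val pagerState = rememberPagerState(pageCount = { songs.size })
    HorizontalPager(state = pagerState) { page ->
        SongCard(songs[page]) // render one page of content
    }
}
```

Swap `HorizontalPager` for `VerticalPager` to flip through pages vertically instead.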

The new flow layouts FlowRow and FlowColumn make it easy to arrange content in a vertical or horizontal flow, much like lines of text in a paragraph. They also enable dynamic sizing using weights to distribute the items across the container.

Image of search filters in a real estate app created with flow layouts
Using flow layouts to show the search filters in a real estate app
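A filter row like the one above can be sketched with `FlowRow` (the chip component and data are illustrative; the flow layout APIs are experimental at the time of writing, hence the opt-in):

```kotlin
import androidx.compose.foundation.layout.Arrangement
import androidx.compose.foundation.layout.ExperimentalLayoutApi
import androidx.compose.foundation.layout.FlowRow
import androidx.compose.material3.AssistChip
import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.compose.ui.unit.dp

@OptIn(ExperimentalLayoutApi::class)
@Composable
fun FilterRow(filters: List<String>, onSelect: (String) -> Unit) {
    // Items wrap onto new lines automatically, like words in a paragraph.
    FlowRow(horizontalArrangement = Arrangement.spacedBy(8.dp)) {
        filters.forEach { filter ->
            AssistChip(onClick = { onSelect(filter) }, label = { Text(filter) })
        }
    }
}
```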

To learn more about the new features, performance improvements, and bug fixes, see the release notes of the latest stable and newest alpha release of the Compose libraries.

Tools

Developing your app using Jetpack Compose is much easier with the new and improved tools around it. We added tons of new features to Android Studio to improve your workflow and efficiency. Here are some highlights:

Android Studio Flamingo is the latest stable release, bringing you:

  • Project templates that use Compose and Material 3 by default, reflecting our recommended practices.
  • Material You dynamic colors in Compose previews to quickly see how your composable responds to differently colored wallpapers on a user's device.
  • Compose functions in system traces when you use the System Trace profiler to help you understand which Compose functions are being recomposed.

Android Studio Giraffe is the latest beta release, containing features such as:

  • Live Edit, allowing you to quickly iterate on your code on an emulator or physical device without rebuilding or redeploying your app.
  • Support for new animations APIs in Animation preview so you can debug any animations using animate*AsState, Crossfade, rememberInfiniteTransition, and AnimatedContent.
  • Compose Preview now supports live updates across multiple files; for example, if you make a change in your Theme.kt file, you can see all previews update automatically in your UI files.
  • Improved auto-complete behavior. For example, we now show icon previews when you’re adding Material icons, and we keep the @Composable annotation when running “Implement Members”.

Android Studio Hedgehog contains canary features such as:

  • Showing Compose state information in the debugger. While debugging your app, the debugger will tell you exactly which parameters have “Changed” or have remained “Unchanged”, so you can more efficiently investigate the cause of the recomposition.
  • You can try out the new Studio Bot, an experimental AI powered conversational experience in Android Studio to help you generate code, fix issues, and learn about best practices, including all things Compose. This is an early experiment, but we would love for you to give it a try!
  • Emulator support for the newly announced Pixel Fold and Tablet Virtual Devices, so that you can test your Compose app before these devices launch later this year.
  • A new Espresso Device API that lets you apply rotation changes, folds, and other synchronous configuration changes to your virtual devices under test.
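A test using the Espresso Device API might look like the following sketch (the test class and assertions are illustrative; the API is in early preview, so names may evolve):

```kotlin
import androidx.test.espresso.device.EspressoDevice.Companion.onDevice
import androidx.test.espresso.device.action.ScreenOrientation
import org.junit.Test

class FoldableLayoutTest {
    @Test
    fun layoutAdaptsToPostureAndRotation() {
        // Synchronously fold the virtual device into tabletop posture,
        // then rotate it; each call completes before the test continues.
        onDevice().setTabletopMode()
        onDevice().setScreenOrientation(ScreenOrientation.LANDSCAPE)
        // ...assert the UI adapted, e.g. with standard Espresso matchers.
    }
}
```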

We’re also actively working on visual linting and accessibility checks for previews so you can automatically audit your Compose UI and check for issues across different screen sizes, and on multipreview templates to help you quickly add common sets of previews.

Material 3

Material 3 is the recommended design system for Android apps, and the latest 1.1 stable release adds a lot of great new features. We added new components like bottom sheets, date and time pickers, search bars, tooltips, and others. We also graduated many of the core components to stable, added more motion and interaction support, and included edge-to-edge support in many components. Watch this video to learn how to implement Material You in your app:


Extending Compose to more surfaces

We want Compose to be the programming model for UI wherever you run Android. This means including first-class support for large screens such as foldables and tablets and publishing libraries that make it possible to use Compose to write your homescreen widgets, smartwatch apps, and TV applications.

Large screen support

We’ve continued our efforts to make development for large screens easy when you use Compose. The pager and flow layouts that we released are common patterns on large screen devices. In addition, we added a new Compose library that lets you observe the device’s window size class so you can easily build adaptive UI.
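Branching on the window size class looks roughly like this (the three layout composables are hypothetical placeholders for your own UI; the size-class API is marked experimental at the time of writing):

```kotlin
import android.app.Activity
import androidx.compose.material3.windowsizeclass.ExperimentalMaterial3WindowSizeClassApi
import androidx.compose.material3.windowsizeclass.WindowWidthSizeClass
import androidx.compose.material3.windowsizeclass.calculateWindowSizeClass
import androidx.compose.runtime.Composable

@OptIn(ExperimentalMaterial3WindowSizeClassApi::class)
@Composable
fun AdaptiveScreen(activity: Activity) {
    // Recomputed on configuration changes (rotation, unfolding, resizing).
    val windowSizeClass = calculateWindowSizeClass(activity)
    when (windowSizeClass.widthSizeClass) {
        WindowWidthSizeClass.Compact -> CompactLayout()  // most phones
        WindowWidthSizeClass.Medium -> MediumLayout()    // foldables, small tablets
        else -> ExpandedLayout()                         // large tablets, desktop windows
    }
}
```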

When attaching a mouse to an Android device, Compose now correctly changes the mouse cursor to a caret when you hover the cursor over text fields or selectable text. This helps the user to understand what elements on screen they can interact with.

Moving image of Compose adjusting the mouse cursor to a caret when the mouse is hovering over text field

Glance

Today we’re publishing the first beta version of the Jetpack Glance library! Glance lets you develop widgets optimized for Android phone, tablet, and foldable homescreens using Jetpack Compose. The library gives you the latest Android widget improvements out of the box, using Kotlin and Compose:

  • Glance simplifies the implementation of interactive widgets, so you can showcase your app’s top features, right on a user’s home screen.
  • Glance makes it easy to build responsive widgets that look great across form factors.
  • Glance enables faster UI Iteration with your designers, ensuring a high quality user experience.
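A bare-bones Glance widget can be sketched in two small classes (class names are illustrative; the receiver must also be declared in the app manifest, which is omitted here):

```kotlin
import android.content.Context
import androidx.glance.GlanceId
import androidx.glance.appwidget.GlanceAppWidget
import androidx.glance.appwidget.GlanceAppWidgetReceiver
import androidx.glance.appwidget.provideContent
import androidx.glance.text.Text

// The widget's UI is declared with Glance composables, mirroring
// how you write screens in Jetpack Compose.
class GreetingWidget : GlanceAppWidget() {
    override suspend fun provideGlance(context: Context, id: GlanceId) {
        provideContent {
            Text("Hello from Glance")
        }
    }
}

// The receiver wires the widget into the standard AppWidget framework.
class GreetingWidgetReceiver : GlanceAppWidgetReceiver() {
    override val glanceAppWidget: GlanceAppWidget = GreetingWidget()
}
```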

Wear OS

We launched Compose for Wear OS 1.1 stable last December, and we’re working hard on the new 1.2 release, which is currently in alpha. Here are some highlights of the continuous improvements and new features that we are bringing to your wrist:

  • The placeholder and placeholderShimmer modifiers add elegant loading animations that can be used on chips and cards while content is loading.
  • expandableItems makes it possible to fold long lists or long text, and only expand them to their full length upon user interaction.
  • Rotary input enhancements available in Horologist add intuitive snap and fling behaviors when a user is navigating lists with rotary input.
  • Android Studio now lets you preview multiple watch screen and text sizes while building a Compose app, using the new preview annotations we have added.
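The placeholder APIs mentioned above can be sketched like this (the chip content is illustrative, and the APIs are experimental at the time of writing, hence the opt-in):

```kotlin
import androidx.compose.runtime.Composable
import androidx.compose.ui.Modifier
import androidx.wear.compose.material.Chip
import androidx.wear.compose.material.ExperimentalWearMaterialApi
import androidx.wear.compose.material.Text
import androidx.wear.compose.material.placeholder
import androidx.wear.compose.material.placeholderShimmer
import androidx.wear.compose.material.rememberPlaceholderState

@OptIn(ExperimentalWearMaterialApi::class)
@Composable
fun ContactChip(name: String?, onClick: () -> Unit) {
    // The state switches the placeholder off once real content arrives.
    val placeholderState = rememberPlaceholderState { name != null }
    Chip(
        onClick = onClick,
        label = {
            Text(
                text = name.orEmpty(),
                modifier = Modifier.placeholder(placeholderState)
            )
        },
        // Shimmer plays over the whole chip while content is loading.
        modifier = Modifier.placeholderShimmer(placeholderState)
    )
}
```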

Compose for TV

You can now build pixel perfect living room experiences with the alpha release of Compose for TV! With the new AndroidX TV library, you can apply all of the benefits of Compose to the unique requirements for Android TV. We worked closely with the community to build an intuitive API with powerful capabilities. Engineers from SoundCloud shared with us that “thanks to Compose for TV, we are able to reuse components and move much faster than the old Leanback View APIs would have ever allowed us to.” And Plex shared that “TV focus and scrolling support on Compose has greatly improved our developer productivity and app performance.”

Compose for TV comes with a variety of components such as ImmersiveList and Carousel that are specifically optimized for the living room experience. With just a few lines of code, you can create great TV UIs.

Moving image of TVLazyGrid on a screen

```kotlin
TvLazyColumn {
    items(contentList) { content ->
        TvLazyRow {
            items(content) { cardItem ->
                Card(cardItem)
            }
        }
    }
}
```

Learn more about the release in this blog post, check out the “What’s new with TV and intro to Compose” talk, or see the TV documentation!

Compose support in other libraries

It’s great to see more and more internally and externally developed libraries add support for Compose. For example, loading pictures asynchronously can now be done with the GlideImage composable from the Glide library. And Google Maps released a library which makes it much easier to declaratively create your map implementations.

```kotlin
GoogleMap(
    // ...
) {
    Marker(
        state = MarkerState(position = LatLng(-34.0, 151.0)),
        title = "Marker in Sydney"
    )
    Marker(
        state = MarkerState(position = LatLng(35.66, 139.6)),
        title = "Marker in Tokyo"
    )
}
```

New and updated guidance

No matter where you are in your learning journey, we’ve got you covered! We added and revamped a lot of the guidance on Compose.

Happy Composing!

We hope you're as excited by these developments as we are! If you haven't started yet, it's time to learn Jetpack Compose and see how your team and development process can benefit from it. Get ready for improved velocity and productivity. Happy Composing!

Second Beta of Android 14

Posted by Dave Burke, VP of Engineering
Android 14 logo

Today, coinciding with Google I/O, we're releasing the second Beta of Android 14. Google I/O includes sessions covering many of Android 14's new features in detail, and Beta 2 includes enhancements around camera and media, privacy and security, system UI, and developer productivity. We're continuing to improve the large-screen device experience, and the Android 14 beta program is now available for the first time on select partner phones, tablets, and foldables.

Android delivers enhancements and new features year-round, and your feedback on the Android beta program plays a key role in helping Android continuously improve. The Android 14 developer site has lots more information about the beta, including downloads for Pixel and the release timeline. We’re looking forward to hearing what you think, and thank you in advance for your continued help in making Android a platform that works for everyone.


Now available on more devices

Image of Android 14 logo with text reads Beta available today along with logos for Google Pixel, iQOO, Lenovo, Nothing, OnePlus, OPPO, Realme, Tecno, vivo, and Xiaomi

The Android 14 beta is now available from partners including iQOO, Lenovo, Nothing, OnePlus, OPPO, Realme, Tecno, vivo, and Xiaomi.


Premium camera and media experiences

Android devices are known for premium cameras, and Android 13 added support for recording vivid high dynamic range (HDR) video supporting billions of colors, camera extensions that device manufacturers use to expose capabilities such as night mode and bokeh, stream use cases for optimized camera streams, and more. Android 14 builds on these capabilities.

Ultra HDR for images

Cropped image showing Ultra HDR support on an Android device
Android adds support for 10-bit high dynamic range (HDR) images, retaining more of the information from the sensor when taking a photo, enabling vibrant colors and greater contrast. The Ultra HDR format Android uses is fully backwards compatible with JPEG, allowing apps to seamlessly interoperate with HDR images, displaying them in standard dynamic range as needed. Rendering these images in the UI in HDR is done automatically by the framework when your app opts in to using HDR UI for its Activity Window, either through a Manifest entry or at runtime by calling Window.setColorMode.

You can also capture 10-bit compressed still images on supported devices. With more colors recovered from the sensor, editing in post can be more flexible. The Gainmap associated with Ultra HDR images can be used to render them using OpenGL or Vulkan.
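The HDR opt-in described above is a single call on the Activity's window; a minimal sketch:

```kotlin
import android.app.Activity
import android.content.pm.ActivityInfo
import android.os.Bundle

class GalleryActivity : Activity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // Opt this Activity's window into HDR rendering so the framework can
        // display Ultra HDR images at their full dynamic range.
        window.colorMode = ActivityInfo.COLOR_MODE_HDR
    }
}
```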


Zoom, Focus, Postview, and more in Camera Extensions

Android 14 upgrades and improves Camera Extensions, allowing apps to handle longer processing times, enabling improved images using compute-intensive algorithms like low-light photography on supported devices. This will give users an even more robust experience when using Camera Extension capabilities. Examples of these improvements include:

In-sensor zoom

When REQUEST_AVAILABLE_CAPABILITIES_STREAM_USE_CASE in CameraCharacteristics contains SCALER_AVAILABLE_STREAM_USE_CASES_CROPPED_RAW, your app can leverage advanced sensor capabilities: a CaptureRequest with a RAW target whose stream use case is set to CameraMetadata.SCALER_AVAILABLE_STREAM_USE_CASES_CROPPED_RAW gives a cropped RAW stream the same pixels as the full field of view. By implementing the request override controls, the updated camera can give users zoom control even before other camera controls are ready.

Lossless USB audio

Android 14 gains support for lossless audio formats for audiophile-level experiences over USB wired headsets. You can query a USB device for its preferred mixer attributes, register a listener for changes in preferred mixer attributes, and configure mixer attributes using a new AudioMixerAttributes class. It represents the format, such as channel mask, sample rate, and behavior of the audio mixer. The class allows for audio to be sent directly, without mixing, volume adjustment, or processing effects. We are working with our OEM partners to enable this feature in devices later this year.


More graphics capabilities

Android 14 adds advanced graphics features that can be used to take advantage of sophisticated GPU capabilities from within the Canvas layer.

Custom meshes with vertex and fragment shaders

Android has long supported drawing triangle meshes with custom shading, but the input mesh format has been limited to a few predefined attribute combinations. Android 14 adds support for custom meshes, which can be defined as triangles or triangle strips, and can, optionally, be indexed. These meshes are specified with custom attributes, vertex strides, varying, and vertex/fragment shaders written in AGSL. The vertex shader defines the varyings, such as position and color, while the fragment shader can optionally define the color for the pixel, typically by using the varyings created by the vertex shader. If color is provided by the fragment shader, it is then blended with the current Paint color using the blend mode selected when drawing the mesh. Uniforms can be passed into the fragment and vertex shaders for additional flexibility.


Hardware buffer renderer for Canvas

To assist in using Android's Canvas API to draw with hardware acceleration into a HardwareBuffer, Android 14 introduces HardwareBufferRenderer. It is particularly useful when your use case involves communication with the system compositor through SurfaceControl for low-latency drawing.


Privacy

Android 14 continues to focus on privacy, with new functionality that gives users more control and visibility over their data and how it's shared.


Health Connect

Image of two devices side-by-side showing Health Connect on the left and app permissions in Health Connect on the right

Health Connect is an on-device repository for user health and fitness data. It allows users to share data between their favorite apps, with a single place to control what data they want to share with these apps.

Health Connect is currently available to download as an app on the Google Play store. Starting with Android 14, Health Connect is part of the platform and receives updates via Google Play system updates without requiring a separate download. With this, Health Connect can be updated frequently, and your apps can rely on Health Connect being available on devices running Android 14+. Users can access Health Connect from the Settings in their device, with privacy controls integrated into the system settings.

We're launching support for exercise routes in Health Connect, allowing users to share a route of their workout which can be visualized on a map. A route is defined as a list of locations saved within a window of time, and your app can insert routes into exercise sessions, tying them together. To ensure that users have complete control over this sensitive data, users must allow sharing individual routes with other apps.

That's not all that's new! We have a separate blog post with more detail on Health Connect and more in What's new in Android Health.

Data Sharing Updates

Image of two screens side-by-side showing Data safety form and location sharing permission dialog

Users will see a new section in the location runtime permission dialog that highlights when an app shares location data with third parties, where they can get more information and control the app’s data access. This information is from the Data safety form of the Google Play Console. Other app stores will be able to provide a mechanism to pass along this information as well. We encourage you to review your apps’ location data sharing policies and make any applicable updates to your apps' data safety information to ensure that they are up to date. This change will be rolling out shortly.

In addition, users will get a periodic notification if any of their apps with the location permission change their data sharing practices to start sharing their data with 3rd parties.

The new location data sharing updates page will be accessible from within device settings.



Secure full screen Intent notifications

With Android 11 (API level 30), it was possible for any app to use Notification.Builder#sendFullScreenIntent to send full-screen intents while the phone is locked. You could auto-grant this on app install by declaring the USE_FULL_SCREEN_INTENT permission in the AndroidManifest.

Full-screen intent notifications are designed for extremely high-priority notifications demanding the user's immediate attention, such as an incoming phone call or user-configured alarm clock settings. Starting with Android 14, we are limiting the apps granted this permission on app install to those that provide calling and alarms only.

This permission remains enabled for apps installed on the phone before the user updates to Android 14. Users can turn this permission on and off.

You can use the new API NotificationManager.canUseFullScreenIntent to check if your app has the permission; if not, your app can use the new intent ACTION_MANAGE_APP_USE_FULL_SCREEN_INTENT to launch the settings page where users can grant it.
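A small sketch of that check-then-redirect flow (the helper function is hypothetical; the actual API name is NotificationManager.canUseFullScreenIntent, paired with the new settings action):

```kotlin
import android.app.Activity
import android.app.NotificationManager
import android.content.Intent
import android.net.Uri
import android.provider.Settings

// Hypothetical helper: if the permission is missing, send the user to the
// settings page where they can grant it.
fun ensureFullScreenIntentAccess(activity: Activity) {
    val nm = activity.getSystemService(NotificationManager::class.java)
    if (!nm.canUseFullScreenIntent()) {
        val intent = Intent(Settings.ACTION_MANAGE_APP_USE_FULL_SCREEN_INTENT)
            .setData(Uri.parse("package:${activity.packageName}"))
        activity.startActivity(intent)
    }
}
```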


System UI

Predictive Back

Material Component Search Animation

With the Android 14 Beta 2 release, we've added multiple improvements and new guidance to help developers create more seamless animations when moving between activities within an app.

With Android 14 Beta 2, all features of Predictive Back remain behind a developer option. See the developer guide to migrate your app to predictive back, as well as the developer guide to creating custom in-app transitions.


App compatibility

With Beta 2, we're just a step away from platform stability in June 2023, when we'll have the final Android 14 SDK and NDK APIs and final app-facing system behaviors. With more devices now running the Android 14 beta, you can expect more users to be trying your app on Android 14 in the weeks ahead and raising issues they find.

To test for compatibility, install your published app on a device or emulator running the Android 14 Beta and work through all of the app’s flows. Review behavior changes to focus your testing. After you’ve resolved any issues, publish an update as soon as possible.

image of timeline illustrates that we are in May and on track with Beta Releases for Android 14

It’s also a good time to start getting ready for your app to target Android 14, by testing with the app compatibility changes toggles in Developer Options.

App compatibility toggles in Developer Options

Get started with Android 14

Today's Beta 2 release has everything you need to try the Android 14 features, test your apps, and give us feedback. For testing your app with tablets and foldables, you can test with devices from our partners, but the easiest way to get started is using the 64-bit Android Emulator system images for the Pixel Tablet or Pixel Fold configurations found in the latest preview of the Android Studio SDK Manager. You can also enroll any supported Pixel device here to get this and future Android 14 Beta and feature drop Beta updates over-the-air.

For the best development experience with Android 14, we recommend that you use the latest release of Android Studio Hedgehog. Once you’re set up, here are some of the things you should do:

  • Try the new features and APIs – your feedback is critical as we finalize the APIs. Report issues in our tracker on the feedback page.
  • Test your current app for compatibility – learn whether your app is affected by default behavior changes in Android 14. Install your app onto a device or emulator running Android 14 and extensively test it.
  • Test your app with opt-in changes – Android 14 has opt-in behavior changes that only affect your app when it’s targeting the new platform. It’s important to understand and assess these changes early. To make it easier to test, you can toggle the changes on and off individually.

We’ll update the beta system images and SDK regularly throughout the Android 14 release cycle.

For complete information on how to get the Beta, visit the Android 14 developer site.

Google I/O 2023: What’s new in Jetpack

Posted by Amanda Alexander, Product Manager, Android

Android Jetpack is a key pillar of Modern Android Development. It is a suite of over 100 libraries, tools and guidance to help developers follow best practices, reduce boilerplate code, and write code that works consistently across Android versions and devices so that you can focus on building unique features for your app. The majority of apps on Google Play rely on Jetpack; in fact, over 90% of the top 1000 apps use Jetpack.

Below we’ll cover highlights of recent updates in three major areas of Jetpack:

  • Architecture Libraries and Guidance
  • Performance Optimization of Applications
  • User Interface Libraries and Guidance

And then conclude with some additional key updates.


1. Architecture Libraries and Guidance

App architecture libraries and components ensure that apps are robust, testable, and maintainable.

Data Persistence

Most applications need to persist local state - whether it be caching results, managing local lists of user-entered data, or powering data returned in the UI. Room is the recommended data persistence layer, providing an abstraction layer over SQLite that allows for increased usability and safety over the platform.

In Room, we have added many brand-new features, such as the Upsert operation, which inserts an entity when there is no uniqueness conflict or updates it if there is one, and support for Kotlin value classes when using KSP. These new features are available in Room 2.6-alpha, with all library sources written in Kotlin, and support both the Java programming language and Kotlin code generation.
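As a minimal sketch of the Upsert operation (the Song entity and DAO are hypothetical):

```kotlin
import androidx.room.Dao
import androidx.room.Entity
import androidx.room.PrimaryKey
import androidx.room.Upsert

@Entity
data class Song(
    @PrimaryKey val id: Long,
    val title: String,
    val playCount: Int
)

@Dao
interface SongDao {
    // Inserts the row when the primary key is new; updates it on conflict.
    @Upsert
    suspend fun upsert(song: Song)
}
```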

Managing tasks with WorkManager

The WorkManager library makes it easy to schedule deferrable, asynchronous tasks that must run reliably, such as uploading backups or analytics. These APIs let you create a task and hand it off to WorkManager to run when the work constraints are met.

Now, WorkManager allows you to update a WorkRequest after you have already enqueued it. This is often necessary in larger apps that frequently change constraints or need to update their workers on the fly. As of WorkManager 2.8.0, the updateWork() API is the means of doing this without having to go through the process of manually canceling and enqueuing a new WorkRequest. This greatly simplifies the development process.
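A sketch of updating an enqueued request in place (UploadWorker is a hypothetical Worker; the key detail is reusing the original request's id):

```kotlin
import java.util.UUID
import androidx.work.Constraints
import androidx.work.NetworkType
import androidx.work.OneTimeWorkRequestBuilder
import androidx.work.WorkManager

// Rebuilds the request with relaxed constraints and the same id, then asks
// WorkManager to swap it in without canceling and re-enqueuing.
fun relaxNetworkConstraint(workManager: WorkManager, originalId: UUID) {
    val updated = OneTimeWorkRequestBuilder<UploadWorker>()
        .setConstraints(
            Constraints.Builder()
                .setRequiredNetworkType(NetworkType.CONNECTED)
                .build()
        )
        .setId(originalId) // must match the already-enqueued request
        .build()
    workManager.updateWork(updated)
}
```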

DataStore

The DataStore library is a robust data storage solution that addresses issues with SharedPreferences and provides a modern coroutines based API.

In DataStore 1.1 alpha we added a widely requested feature: multi-process support, which allows you to access the DataStore from multiple processes while providing data consistency guarantees between them. Additional features include a new storage interface that enables the underlying storage mechanism for DataStore to be switched out (we have provided implementations for java.io and okio), and we have also added support for Kotlin Multiplatform.
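A rough sketch of the multi-process factory (Settings and SettingsSerializer are hypothetical; exact factory parameters may differ across the 1.1 alphas):

```kotlin
import android.content.Context
import androidx.datastore.core.DataStore
import androidx.datastore.core.MultiProcessDataStoreFactory
import java.io.File

// Both processes must construct the store against the same backing file.
fun createSettingsStore(context: Context): DataStore<Settings> =
    MultiProcessDataStoreFactory.create(
        serializer = SettingsSerializer, // hypothetical Serializer<Settings>
        produceFile = { File(context.filesDir, "settings.pb") }
    )
```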

Lifecycle management

Lifecycle-aware components perform actions in response to a change in the lifecycle status of another component, such as activities and fragments. These components help you produce better-organized, and often lighter-weight, code that is easier to maintain.

We released a stable version of Lifecycle 2.6.0 that includes more Compose integration. We added a new extension method on Flow, collectAsStateWithLifecycle(), that collects from flows and represents its latest value as Compose State in a lifecycle-aware manner. Additionally, a large number of classes are converted to Kotlin and still retain their binary compatibility with previous versions.
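A minimal sketch of the new extension (uiState stands in for a flow exposed by a ViewModel):

```kotlin
import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.compose.runtime.getValue
import androidx.lifecycle.compose.collectAsStateWithLifecycle
import kotlinx.coroutines.flow.StateFlow

@Composable
fun Greeting(uiState: StateFlow<String>) {
    // Collection is paused when the lifecycle drops below STARTED and
    // resumed when it comes back, avoiding wasted work in the background.
    val value by uiState.collectAsStateWithLifecycle()
    Text(text = value)
}
```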

Predictive Back Gesture

moving image illustrating predictive back texture

In Android 13, we introduced a predictive back gesture for Android devices such as phones, large screens, and foldables. It is part of a multi-year release; when fully implemented, this feature will let users preview the destination or other result of a back gesture before fully completing it, allowing them to decide whether to continue or stay in the current view.

The Activity APIs for Predictive Back for Android are stable and we have updated the best practices for using the supported system back callbacks: BackHandler (for Compose), OnBackPressedCallback, or OnBackInvokedCallback. We are excited to see Google apps adopt Predictive Back, including Play Store, Calendar, News, and TV!

In the Activity 1.8 alpha releases, the OnBackPressedCallback class now contains new Predictive Back progress callbacks for handling the back gesture starting, progress throughout the gesture, and the back gesture being canceled, in addition to the previous handleOnBackPressed() callback for when the back gesture is committed. We also added ComponentActivity.setUpEdgeToEdge() to easily set up the edge-to-edge display in a backward-compatible manner.
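A sketch of the new progress callbacks (from the Activity 1.8 alphas, so names may still shift):

```kotlin
import androidx.activity.BackEventCompat
import androidx.activity.OnBackPressedCallback

val callback = object : OnBackPressedCallback(true) {
    override fun handleOnBackStarted(backEvent: BackEventCompat) {
        // Gesture started: prepare the predictive animation.
    }

    override fun handleOnBackProgressed(backEvent: BackEventCompat) {
        // Drive the animation from backEvent.progress (0f..1f).
    }

    override fun handleOnBackPressed() {
        // Gesture committed: perform the actual back navigation.
    }

    override fun handleOnBackCancelled() {
        // Gesture canceled: reset any in-progress animation.
    }
}
```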

Activity updates for more consistent Photo Picker experience

The Android photo picker is a browsable interface that presents the user with their media library. In Activity 1.7.0, the Photo Picker activity contracts have been updated to contain an additional fallback that allows OEMs and system apps, such as Google Play services, to provide a consistent Photo Picker experience on a wider range of Android devices and API levels by implementing the fallback action. Read more in the Photo Picker Everywhere blog.

Incremental Data Fetching

The Paging library allows you to load and display small chunks of data to improve network and system resource consumption. App data can be loaded gradually and gracefully within RecyclerViews or Compose lazy lists.

In Paging Compose 1.0.0-alpha19, there is support for all lazy layouts, including custom layouts provided by the Wear and TV libraries. To support more lazy layouts, Paging Compose now provides slightly lower-level extension methods on LazyPagingItems: itemKey and itemContentType. These APIs focus on helping you implement the key and contentType parameters to the standard items APIs that already exist for LazyColumn and LazyVerticalGrid, as well as their equivalents in APIs like HorizontalPager. While these changes do make the LazyColumn and LazyRow examples a few lines longer, they provide consistency across all lazy layouts.
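A sketch of the two extensions with the standard items API (Article and ArticleRow-style rendering are hypothetical):

```kotlin
import androidx.compose.foundation.lazy.LazyColumn
import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.paging.compose.LazyPagingItems
import androidx.paging.compose.itemContentType
import androidx.paging.compose.itemKey

data class Article(val id: Long, val title: String)

@Composable
fun ArticleList(lazyPagingItems: LazyPagingItems<Article>) {
    LazyColumn {
        items(
            count = lazyPagingItems.itemCount,
            // itemKey/itemContentType fill in the standard key and
            // contentType parameters from the paged data.
            key = lazyPagingItems.itemKey { it.id },
            contentType = lazyPagingItems.itemContentType { "article" }
        ) { index ->
            // The item can be null while a placeholder page is loading.
            val article = lazyPagingItems[index]
            Text(text = article?.title ?: "Loading…")
        }
    }
}
```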


2. Performance Optimization of Applications

Using performance libraries allows you to build performant apps and identify optimizations to maintain high performance, resulting in better end-user experiences.

Improving Start-up Times

Baseline Profiles allow you to partially compile your app at install time to improve runtime and launch performance, and are getting big improvements in new tooling and libraries:

Jetpack provides a new Baseline Profile Gradle Plugin in alpha, which supports AGP 8.0+ and can be easily added to your project in Studio Hedgehog (now in canary). The plugin lets you automate generating profiles, pulling them from the device, and integrating them into your build, either periodically or as part of your release process.

The plugin also allows you to easily automate the new Dex Layout Optimization feature in AGP 8.1, which lets you define BaselineProfileRule tests that collect classes used during startup, and move them to the primary dex file in a multidex app to increase locality. In a large app, this can improve cold startup time by 30% on top of Baseline Profiles!
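As a hedged Gradle sketch of wiring up the plugin (the plugin id and option below come from the alpha releases and may change; check the current docs before copying):

```kotlin
// In the app module's build.gradle.kts (version omitted; use the current alpha).
plugins {
    id("androidx.baselineprofile")
}

baselineProfile {
    // Generate profiles on demand (e.g. in your release pipeline) rather
    // than on every build; option name from the alpha plugin.
    automaticGenerationDuringBuild = false
}
```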

Macrobenchmark 1.2 has shipped a lot of new features in alpha, such as Power metrics and Custom trace metrics, generation of Baseline Profiles without root on Android 13, and recompilation without clearing app data on Android 14.

You can read everything in depth in the blog "What's new in Android Performance".


3. User Interface Libraries and Guidance

Several changes have been made to our UI libraries to provide better support for large-screen compatibility, foldables, and emojis.

Jetpack Compose

Jetpack Compose, Android’s modern toolkit for building native UI, recently had its May 2023 release, which includes new features for text and layouts, continued performance improvements, enhanced tooling support, increased support for large screens, and updated guidance. See the What’s New in Jetpack Compose I/O blog to learn more.

Glance

The Glance library, now in 1.0-beta, lets you develop app widgets optimized for Android phone, tablet, and foldable homescreens using Jetpack Compose. The library gives you the latest Android widget improvements out of the box, using Kotlin and Compose.

Compose for TV

With the alpha release of the TV library, you can now build experiences for Android TV using components optimized for the living room experience. Compose for TV unlocks all the benefits of Jetpack Compose for your TV apps, allowing you to build apps with less code, easier maintenance and a modern Material 3 look straight out of the box. See the Compose for TV blog for details.

Material 3 for Compose

Material Design 3 is the next evolution of Material Design, enabling you to build expressive, spirited and personal apps. It is the recommended design system for Android apps, and the 1.1 stable release brings exciting new features such as bottom sheets, date and time pickers, search bars, and tooltips, and adds more motion and interaction support. Read more in the release blog.

Understanding Window State

The new WindowManager library helps developers adapt their apps to support multi-window environments and new device form factors by providing a common API surface with support back to API level 14.

In 1.1.0-beta01, new features and capabilities have been added to activity embedding and window layout that enables you to optimize your multi-activity apps for large screens. With the 1.1 release of Jetpack WindowManager, activity embedding APIs are no longer experimental and are recommended for multi-activity applications to provide improved large screen layouts. Check out the What’s new in WindowManager 1.1.0-beta01 blog for details and migration steps.


Other key updates

Kotlin Multiplatform

We continue to experiment with using Kotlin Multiplatform to share business logic between Android and iOS. Collections 1.3.0-alpha03 and DataStore 1.1.0-alpha02 have been updated so you can now use these libraries in KMM projects. If you are using Kotlin Multiplatform in your app, we would like your feedback!

This was a look at all the changes in Jetpack over the past few months to help you build apps more productively. For more details on each Jetpack library, check out the AndroidX release notes, quickly find relevant libraries with the API picker and watch the Google I/O talks for additional highlights.

Java is a trademark or registered trademark of Oracle and/or its affiliates.

What’s new in Android Health

Posted by Sara Hamilton, Developer Relations Engineer

Health and fitness data is interconnected – sleep, nutrition, workouts and more all inform one another. For example, consider that your sleep impacts your recovery, which impacts your readiness to run your favorite 5k. Over time, your recovery and workout habits drive metrics like heart rate variability, resting heart rate, VO2Max and more! Often this data exists in silos, making it hard for users to get a holistic view of their health data.

We want to make it simple for people to use their favorite apps and devices to track their health by bringing this data together. They should have full control of what data they share, and when they share it. And, we want to make sure developers can enable this with less complexity and fewer lines of code.

This is why we’ve continued to improve our Android Health offerings, and why today at I/O 2023, we’re announcing key updates across both Health Connect and Health Services for app developers and users.

What is Android Health?

Android Health brings together two important platforms for developers to deliver robust health and fitness apps to users: Health Connect and Health Services.

Health Connect is an on-device data store that provides APIs for storing and sharing health and fitness data between Android apps. Before Health Connect, there was not a consistent way for developers to share data across Android apps. They had to integrate with many different APIs, each with a different set of data types and different permissions management frameworks.

Now, with Health Connect, there is less fragmentation. Health Connect provides a consistent set of 40+ data types and a single permissions management framework for users to control data permissions. This means that developers can share data with less effort, enabling people to access their health data in their favorite apps, and have more control over data permissions.
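A rough sketch of reading one of those data types through the Jetpack client library (assumes the step-read permission has already been granted):

```kotlin
import androidx.health.connect.client.HealthConnectClient
import androidx.health.connect.client.records.StepsRecord
import androidx.health.connect.client.request.ReadRecordsRequest
import androidx.health.connect.client.time.TimeRangeFilter
import java.time.Instant

// Sums step counts recorded between two instants.
suspend fun readSteps(client: HealthConnectClient, start: Instant, end: Instant): Long {
    val response = client.readRecords(
        ReadRecordsRequest(
            recordType = StepsRecord::class,
            timeRangeFilter = TimeRangeFilter.between(start, end)
        )
    )
    return response.records.sumOf { it.count }
}
```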

Screenshot of permissions via Health Connect

Health Services is our API surface for accessing sensor data on Wear OS devices in a power-efficient way. Before Health Services, developers had to work directly with low-level sensors, which required different configurations on different devices, and was not battery-efficient.

With Health Services, there is now a consistent API surface across all Wear OS 3+ devices, allowing developers to write code once and run it across all devices. And, the Health Services architecture means that developers get great power savings in the process, allowing people to track longer workouts.

Health Connect is coming to Android 14 with new features

Health Connect and Android 14 logos with an X between them to indicate collaboration

Health Connect is currently available for download as an app on the Play Store. We are excited to announce that starting with the release of Android 14 later this year, Health Connect will be a core part of Android and available on all Android mobile devices. Users will be able to access Health Connect directly from Settings on their device, helping to control how their health data is shared across apps.

Screenshot showing Health Connect available in the privacy settings of an Android device

Several new features will be shipped with the Health Connect Android 14 release. We’re adding a new exercise routes feature to allow users to share maps of their workouts through Health Connect. We’ve also made improvements to make it easier for people to log their menstrual cycles. And, Health Connect updates will be delivered through Google Play System Updates, which will allow new features to be updated often.

Health Services now supports more use cases with new API capabilities

We’ve released several exciting changes to Health Services this year to support more use cases. Our new Batching Modes feature allows developers to adjust the data delivery frequency of heart rate data to support home gym use cases. We’ve also added new API capabilities, like golf shot detection.

The new version of Wear OS arrives later this year. Wear OS 4 will be the most performant yet, delivering improved battery life for the next generation of Wear OS watches. We will be releasing additional Health Services updates with this change, including improved background body sensor permissions.

Our developer ecosystem is growing

There are over 50 apps already integrated with Health Connect and hundreds of apps with Health Services, including Peloton, Withings, Oura, and more. These apps are using Health Connect to incorporate new data, to give people an interconnected health experience, without building out many new API integrations. Learn more about how these health and fitness apps are creating new experiences for users in areas like sleep, exercise, nutrition, and more in our I/O technical session.

We also have over 100 apps integrated with Health Services. Apps using Health Services are seeing higher engagement from users with Wear apps, and are giving their users longer battery life in the process. For example, Strava found that users with their Wear app did 25% more activities than those without.

Get started with Health Connect

We hope many more developers will join us in bringing unique experiences within Android Health to your users this year.

If you’d like to create a more interconnected health experience for your users, we encourage you to integrate with Health Connect. And if you are a Wear developer, make sure you are using Health Services to get the best battery performance and future proofing for all upcoming Wear OS devices.

Check out our Health Services documentation, Health Connect documentation, and code samples to get started!

To learn more, watch the I/O session:

I/O 2023: What’s new in Google Play

Posted by Alex Musil, Senior Director of Engineering and Product, Google Play

Over the past year, our teams have built exciting new features and made major changes to help you thrive with us. These updates have focused on:

  • Being the best partner to help you grow your audiences across the lifecycle of your business,
  • Being the best platform to help you effectively monetize your users at scale, and
  • Being the safest place to publish and distribute your hard work with Android.

Watch our video for more details, or keep reading to get the highlights.



More store listing enhancements designed to drive growth

Attracting users is the foundation of any app business, and it all starts with your store listing. These updates can help you craft better and more personalized content to drive more audience growth.

  • Last year, we gave every title the ability to create at least 50 custom store listings. Now, in addition to tailoring by country and pre-registration status, you can also customize your listing for inactive users, highlighting why they should give your app or game another chance.
  • Soon, we’ll launch custom store listings for Google Ads App campaign ad groups. These will allow you to serve custom listings to users coming from specific ads on AdMob and YouTube so you can create a more seamless user experience from Google Ads to Google Play.
  • All these new tools mean managing more listings, so we’re launching store listing groups to streamline the process. Now you can design for different audiences by simply creating a base listing, then overriding specific elements.
Image showing an example of a store listing group in Google Play
Create a base listing as your primary template and modify elements for different audiences with store listing groups.
  • To help you connect with people in their native language, we just launched new machine translation models for 10 languages from Google Translate in Play Console. It can translate your app and store listing in minutes, at no cost.
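Conceptually, a store listing group behaves like a base template plus per-audience overrides: fields you don't override are inherited from the base listing. A toy sketch of that merge model in Python (the field names are illustrative, not a Play Console API):

```python
def build_listing(base, overrides):
    """Merge per-audience overrides onto a base store listing (shallow merge)."""
    listing = dict(base)
    listing.update(overrides)
    return listing

base_listing = {
    "title": "Example Game",
    "short_description": "Battle friends in real time.",
    "feature_graphic": "default.png",
}

# Override only what differs for lapsed users; everything else is inherited.
inactive_users = build_listing(base_listing, {
    "short_description": "New seasons and rewards since you last played.",
})
```

The practical upshot is the same as in Play Console: maintain one base listing, and each audience-specific variant only carries the elements that differ.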

AI-powered features to highlight the best of your app

We’re bringing the benefits of AI to Google Play to make it easier for you and your users to get things done. From helping you showcase your app or game in the best possible light to helping users discover your title, these AI-powered features help you highlight the best of your app experience with ease.

  • Starting today, you can use Google’s generative AI technology to help you get started with store listings in English. This is an experimental feature to help you draft content with less effort. Just open our AI helper, enter a couple of prompts like audience and key theme, and it will generate a draft you can edit, discard, or use. You’re always in complete control of what you submit and publish.
Moving image of using Generative AI to create a custom store listing in Google Play
Draft an AI-generated store listing with just a few prompts
  • To help users learn from each other, at a glance, about what makes your app or game special, we’re launching review summaries powered by Google’s generative AI technology. We’re starting with an experiment in English and expanding later this year.
screenshot of user review summaries in Google Play on a mobile device
Review summaries highlight what users are saying about your app or game at a glance


New opportunities to boost user discovery

Google Play can also help you grow your audience by partnering with you to promote important events, new content, or exciting offers. Use Promotional content to let us know when these are happening so we can amplify your growth. Almost 25,000 apps and games already have access to Promotional content, and we’re rolling out to more titles later this year.

  • We’re launching multiple new, dedicated high-traffic surfaces to showcase your most exciting content, including Play notifications. Participating games are seeing a median 20% uplift in store-wide acquisitions and reacquisitions, driven by increases of over 60% from organic Explore traffic.
image of four mobile screens displays side-by-side showing new high-traffic surfaces
New Play surfaces showcase your most exciting content
  • To enhance how and where your Promotional content is viewed on Play, we’re updating our reporting so you can track and optimize your events’ direct performance. Check it out in Play Console under “Promotional content performance reports.”

To be eligible for these new growth opportunities, your app or game needs to be of high quality and deliver the great experiences your users expect. Because it’s so important, we’re sharing more insights into how we think about quality and improving our tooling to help you meet these goals.

  • Today, we launched a unified framework for app and game quality that explains how we evaluate quality across a number of dimensions for promotion and featuring. Learn more with this article and I/O session, “What great quality looks like on Play.”

More effective monetization features

We’re also rolling out new features that leverage Play’s reach, expertise, and technologies to help you more effectively generate revenue.

  • Soon, you’ll be able to run price experiments for in-app products right within Play Console. Experiment with different price points across markets and identify when you may be pricing yourself out of a sale or undervaluing your in-app products.
View of price experiments in Google Play Console
Find the right price point for your in-app products with our experiments tool in Play Console
  • Also coming soon is a new type of Promotional content called “featured products” that will allow you to sell your in-app items directly on Play. Feature specific in-app items in different countries or offer discounts to excite users and increase conversions.
moving image showing in-app items on the details page in Google Play
Feature in-app products on your store listing and nominate them for further promotion across Play surfaces
  • We’ve also made new updates to subscriptions to help you expand your reach, increase conversions, and improve retention. This year, we launched multiple prices per billing period so you can provide different auto-renewing and prepaid plan prices as desired, like giving “VIP” users recurring discounts.
  • Our commerce platform continues to evolve by improving access to buyers with new payment methods, exploring expanded billing options through our user choice billing pilot, and investing in secure purchase experiences that prevented over $2 billion in fraudulent and abusive transactions in 2022.

Learn more in our “Boost your revenue with Play Commerce” session.

Finally, we’re also working to increase the effectiveness of your marketing-to-sales funnel.

  • Last year, we launched a Play Console page dedicated to deep links. This page flags broken deep links and provides contextual guidance on how to fix them. Coming soon, we’ll make it easier for you to rationalize your web-to-app mapping with a convenient way to review your top website URLs alongside their deep link status. To help you validate your deep links, we're adding a simple way to compare your app to your web experience for a given URL, side-by-side.
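A deep link resolves when the incoming URL matches the scheme, host, and path declared in your app's intent filter. A minimal Python sketch of that matching rule (illustrative only; actual resolution is performed by the Android framework):

```python
from urllib.parse import urlparse

def matches_intent_filter(url, scheme, host, path_prefix=""):
    """Toy check mirroring how an intent filter matches a URL:
    scheme and host must match exactly; the path must start with pathPrefix."""
    parts = urlparse(url)
    return (parts.scheme == scheme
            and parts.netloc == host
            and parts.path.startswith(path_prefix))

# A filter declared with scheme="https", host="example.com", pathPrefix="/items"
matches_intent_filter("https://example.com/items/42", "https", "example.com", "/items")  # True
matches_intent_filter("https://example.com/blog/42", "https", "example.com", "/items")   # False
```

The side-by-side web-vs-app comparison in Play Console effectively runs this kind of check for your top URLs, flagging those that fall through to the browser instead of your app.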

Find out more in our deep links talk, “Optimize app experiences with deep linking.”


Enhanced privacy and security protection for developers and users

Protecting your users and your work is critical to a successful ecosystem, so we’ve continued to strengthen our platform-wide protections and roll out more tools to help you protect your apps.

  • Google Play Protect scans billions of apps each day across billions of Android devices to keep users safe from threats like malware and unwanted software. Last year we prevented 1.4 million policy-violating apps from entering Google Play.
  • The Play Integrity API lets you check that user actions and server requests come from unmodified versions of your app, running on genuine Android devices. We’re rolling out a new beta integration option that delivers Play Integrity API verdicts 10x faster. We also launched status.play.google.com so you can monitor Play Integrity API service status and be notified of any issues.
  • We’re also expanding access to Automatic integrity protection for apps and games, so anti-tamper and anti-piracy protection can be applied with one click, with no need to integrate an API or run a backend server. Developers who use these products see a reduction in unauthorized usage of 80% on average.
  • Finally, we are building new tools to help you steer users away from broken app versions with prompts to update. First, automatic update prompts for crashing apps are triggered if your app crashes in the foreground and a more stable version is available. And second, you can prompt users on specific app versions to update. No prior integration is required and it will be available to all apps built with app bundles in the coming months.
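Server-side, after you decode a Play Integrity token (via Google's decode endpoint), you receive a JSON verdict to evaluate before trusting the request. A minimal sketch of that check, using the standard verdict field names; the accept/reject policy shown is an illustrative assumption, not a Google recommendation:

```python
def is_trustworthy(verdict, expected_nonce):
    """Illustrative policy: require our nonce, a Play-recognized app,
    and a device that meets basic integrity. Field names follow the
    Play Integrity API verdict format."""
    request = verdict.get("requestDetails", {})
    app = verdict.get("appIntegrity", {})
    device = verdict.get("deviceIntegrity", {})
    return (
        request.get("nonce") == expected_nonce
        and app.get("appRecognitionVerdict") == "PLAY_RECOGNIZED"
        and "MEETS_DEVICE_INTEGRITY" in device.get("deviceRecognitionVerdict", [])
    )

sample = {
    "requestDetails": {"nonce": "abc123"},
    "appIntegrity": {"appRecognitionVerdict": "PLAY_RECOGNIZED"},
    "deviceIntegrity": {"deviceRecognitionVerdict": ["MEETS_DEVICE_INTEGRITY"]},
}
```

Which verdicts you require (and how you respond to failures) is a product decision; the key point is that the check happens on your server, against a nonce your server issued.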

We’re also continuing to improve Google Play and Play Console to help you provide safe, trustworthy experiences to users.

  • Last year, we launched the Data safety section to help explain what data your app may collect or share, and why. Since the launch, we’ve seen millions of users engaging with this feature every day, and it’s become an important way for users to evaluate an app’s safety before installing it.

    Now, we're enhancing this feature with new data deletion options both inside and outside an app, plus policy requirements to help you build trust and empower users with greater clarity and control. You can also give users the option to clean up their account by requesting deletion of specific data, such as activity history, images, and videos, rather than deleting their entire account.

  • The redesigned App content page makes outstanding tasks clearer, so you can quickly identify what you need to do to comply with our policies. And soon, you’ll see upcoming declaration requirements and deadlines, so you have more time to plan.

Finally, we rebuilt the Play Console App around modern developer needs. The new app is more customized, so you can tailor the homepage with the metrics you care about most, and integrates Inbox so you can stay up to date with key messages from Google Play. Join the open beta and let us know what you think.

We understand how exciting and challenging building and running a mobile business can be, and our teams are dedicated to building the tools and opportunities you need to succeed across your app lifecycle. Thank you for partnering with us, and please continue to share your feedback as we work together to build the future of Google Play.


Price in-app products with confidence by running price experiments in Play Console

Posted by Phalene Gowling, Product Manager, Google Play

At this year’s Google I/O, our “Boost your revenue with Play Commerce” session highlights the newest monetization tools that are deeply integrated into Google Play, with a focus on helping you optimize your pricing strategy. Pricing your products or content correctly is foundational to driving better user lifetime value and can result in reaching new buyers, improving conversion, and encouraging repeat orders. It can be the difference between a successful sale and pricing yourself out of one, or even undervaluing your products and missing out on key sales opportunities.

To help you price with confidence, we’re excited to announce price experiments for in-app products in Play Console, allowing you to test price points and optimize for local purchasing power at scale. Price experiments will launch in the coming weeks, so read on to get the details on the new tool and learn how you can prepare to take full advantage when it's live.

  • A/B test to find optimal local pricing that’s sensitive to the purchasing power of buyers in different markets. Adjusting prices for local markets is already an industry-wide practice among developers, and at launch you’ll be able to test and manage your global prices entirely within Play Console. An optimized price helps you reach both new and existing buyers who may previously have been priced out of monetized experiences in apps and games, and can increase repeat purchases of their favorite products.
Image of two mobile devices showing A/B price testing in Google Play Console
Illustrative example only. A/B test price points with ease in Play Console
  • Experiment with statistical confidence: price experiments let you track how close you are to statistical significance with confidence interval tracking, or, for a quick summary, view the analysis at the top of the page once enough data has been collected to determine a statistically significant result. To make the decision on whether to apply the ‘winning’ price easier, we’ve also included support for tracking key monetization metrics such as revenue uplift, revenue from new installers, buyer ratio, orders, and average revenue per paying user. This gives you a more detailed understanding of how buyers behave in each experiment arm per market, and can inspire further refinements toward a robust global monetization strategy.
  • Improve return on investment in user acquisition. A localized price and a better understanding of buyer behavior in each market allow you to optimize your user acquisition strategy, since you know how buyers will react to market-specific products or content. It can also inform which products you choose to feature on Google Play.
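Declaring a "winning" price comes down to a significance test on metrics like buyer ratio between the experiment arms. Play Console runs this analysis for you; conceptually it resembles a two-proportion z-test, sketched below with illustrative numbers:

```python
from math import sqrt, erf

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test comparing buyer conversion between two price arms.
    Returns (z, p_value). Conceptual sketch only; not Play Console's exact method."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Control price: 480 buyers out of 20,000 users; test price: 560 out of 20,000.
z, p = two_proportion_ztest(480, 20000, 560, 20000)
```

With these illustrative numbers the test arm's higher conversion is significant at the usual 5% level; with smaller samples the same uplift might not be, which is why the tool waits for enough data before calling a winner.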

Set up price experiments in minutes in Play Console

Price experiments will be easy to run with the new dedicated section in Play Console under Monetize > Products > Price experiments. You’ll first need to determine the in-app products, markets, and the price points you’d like to test. The intuitive interface will also allow you to refine the experiment settings by audience, confidence level and sensitivity. And once your experiment has reached statistical significance, simply apply the winning price to your selected products within the tool to automatically populate your new default price point for your experiment markets and products. You also have the flexibility to stop any experiment before it reaches statistical significance if needed.

You’ll have full control of what and how you want to test, reducing any overhead of managing tests independently or with external tools – all without requiring any coding changes.

Learn how to run an effective experiment with Play Academy

Get Started

You can start preparing now by strategizing what type of price experiment you might want to run first. For a metric-driven source of inspiration, game developers can explore strategic guidance, which can identify country-specific opportunities for buyer conversion. Alternatively, start building expertise on running effective pricing experiments for in-app products by taking our new Play Academy course, in preparation for price experiments rolling out in the coming weeks.