
Peacock built adaptively on Android to deliver great experiences across screens

Posted by Sa-ryong Kang and Miguel Montemayor - Developer Relations Engineers

Peacock is NBCUniversal’s streaming service app available in the US, offering culture-defining entertainment including live sports, exclusive original content, TV shows, and blockbuster movies. The app continues to evolve, becoming more than just a platform to watch content: it’s a hub of entertainment.

Today’s users consume entertainment on an increasingly wide array of device sizes and types, and are in particular moving toward mobile devices. Peacock has adopted Jetpack Compose to help it adapt to more screens and meet users where they are.

Disclaimer: Peacock is available in the US only. This video is viewable only by US viewers.

Adapting to more flexible form factors

The Peacock development team is focused on bringing the best experience to users, no matter what device they’re using or when they want to consume content. As users increasingly watch on mobile devices and large screens like foldables, the Peacock app needs to adapt to different screen sizes. As more devices are introduced, the team needed to explore new solutions that make the most of each unique display permutation.

The goal was to have the Peacock app adapt to these new displays while continually offering high-quality entertainment without interruptions, like the stream reloading or visual errors. Thinking ahead, the team also wanted to build a solution ready for Android XR, as the entertainment landscape shifts toward more immersive experiences.

quote card featuring a headshot of Diego Valente, Head of Mobile, Peacock & Global Streaming, reads 'Thinking adaptively isn't just about supporting tablets or large screens - it's about future proofing your app. Investing in adaptability helps you meet users' expectations of having seamless experiences across all their devices and sets you up for what's next.'

Building a future-proof experience with Jetpack Compose

In order to build a scalable solution that would help the Peacock app continue to evolve, the app was migrated to Jetpack Compose, Android’s toolkit for building scalable UI. One of the essential tools the team used was the WindowSizeClass API, which helps developers create and test UI layouts for different size ranges. The API lets the app seamlessly switch between preset layouts as it crosses established breakpoints for different window sizes.

The API was used in conjunction with Kotlin Coroutines and Flows to keep the UI state responsive as the window size changed. To test their work and fine-tune behavior on edge-case devices, Peacock used the Android Studio emulator to simulate a wide range of Android-based devices.
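
To illustrate the approach (a minimal sketch, not Peacock’s actual code; ExpandedPlayerLayout and CompactPlayerLayout are hypothetical composables), a top-level composable can branch on the current window size class and recompose automatically as the window resizes or unfolds:

@Composable
fun PlayerScreen() {
    // currentWindowAdaptiveInfo() is recomputed whenever the window changes,
    // e.g. when a foldable is unfolded or a freeform window is resized.
    val windowSizeClass = currentWindowAdaptiveInfo().windowSizeClass
    when (windowSizeClass.windowWidthSizeClass) {
        // Expanded width: show playback alongside metadata and recommendations.
        WindowWidthSizeClass.EXPANDED -> ExpandedPlayerLayout()
        // Compact and medium widths: stack the content vertically.
        else -> CompactPlayerLayout()
    }
}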

Jetpack Compose allowed the team to build adaptively, so now the Peacock app responds to a wide variety of screens while offering a seamless experience to Android users. “The app feels more native, more fluid, and more intuitive across all form factors,” said Diego Valente, Head of Mobile, Peacock and Global Streaming. “That means users can start watching on a smaller screen and continue instantly on a larger one when they unfold the device—no reloads, no friction. It just works.”

Preparing for immersive entertainment experiences

In building adaptive apps on Android, John Jelley, Senior Vice President, Product & UX, Peacock and Global Streaming, says Peacock has also laid the groundwork to quickly adapt to the Android XR platform: “Android XR builds on the same large screen principles, our investment here naturally extends to those emerging experiences with less developmental work.”

The team is excited about the prospect of features unlocked by Android XR, like Multiview for sports and TV, which enables users to watch multiple games or camera angles at once. By tailoring spatial windows to the user’s environment, the app could offer new ways for users to interact with contextual metadata like sports stats or actor information—all without ever interrupting their experience.

Build adaptive apps

Learn how to unlock your app's full potential on phones, tablets, foldables, and beyond.

Explore this announcement and all Google I/O 2025 updates on io.google starting May 22.


16 things to know for Android developers at Google I/O 2025

Posted by Matthew McCullough – VP of Product Management, Android Developer

Today at Google I/O, we announced the many ways we’re helping you build excellent, adaptive experiences, and helping you stay more productive through updates to our tooling that put AI at your fingertips and throughout your development lifecycle. Here’s a recap of 16 of our favorite announcements for Android developers; you can also see what was announced last week in The Android Show: I/O Edition. And stay tuned over the next two days as we dive into all of the topics in more detail!

Building AI into your Apps

1: Building intelligent apps with Generative AI

Generative AI enhances apps’ experiences by making them intelligent, personalized, and agentic. This year, we announced new ML Kit GenAI APIs using Gemini Nano for common on-device tasks like summarization, proofreading, rewriting, and image description. We also provided capabilities for developers to harness more powerful models, such as Gemini Pro, Gemini Flash, and Imagen, via Firebase AI Logic for more complex use cases like image generation and processing extensive data across modalities, including bringing AI to life in Android XR. And we released a new AI sample app, Androidify, that showcases how these APIs can transform your selfies into unique Android robots! To start building intelligent experiences with these new capabilities, explore the developer documentation and sample apps, and watch the overview session to choose the right solution for your app.
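
For a flavor of what calling a Gemini model through Firebase AI Logic looks like in Kotlin (a minimal sketch based on the Firebase documentation; verify the model name and API surface against the current firebase-ai release):

// Requires the com.google.firebase:firebase-ai dependency.
val model = Firebase.ai(backend = GenerativeBackend.googleAI())
    .generativeModel("gemini-2.5-flash")

suspend fun summarize(article: String): String? =
    model.generateContent("Summarize in two sentences: $article").text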

New experiences across devices

2: One app, every screen: think adaptive and unlock 500 million screens

Mobile Android apps form the foundation across phones, foldables, tablets, and ChromeOS, and this year we’re helping you bring them to cars and XR and expanding usage with desktop windowing and connected displays. This expansion means tapping into an ecosystem of 500 million devices – a significant opportunity to engage more users when you think adaptive, building a single mobile app that works across form factors. Resources, including the Compose Adaptive Layouts library and Jetpack Navigation updates, help make building these dynamic experiences easier than before. You can see how Peacock, NBCUniversal’s streaming service (available in the US), is building adaptively to meet users where they are.

Disclaimer: Peacock is available in the US only. This video is viewable only by US viewers.

3: Material 3 Expressive: design for intuition and emotion

The new Material 3 Expressive update provides tools to enhance your product's appeal by harnessing emotional UX, making it more engaging, intuitive, and desirable for users. Check out the I/O talk to learn more about expressive design and how it inspires emotion, clearly guides users toward their goals, and offers a flexible and personalized experience.

moving image of Material 3 Expressive demo

4: Smarter widgets, engaging live updates

Measure the return on investment of your widgets (available soon) and easily create personalized widget previews with Glance 1.2. Promoted Live Updates notify users of important ongoing notifications and come with a new Progress Style standardized template.


5: Enhanced Camera & Media: low light boost and battery savings

This year's I/O introduces several camera and media enhancements. These include a software low light boost for improved photography in dim lighting and native PCM offload, allowing the DSP to handle more audio playback processing, thus conserving user battery. Explore our detailed sessions on built-in effects within CameraX and Media3 for further information.

6: Build next-gen app experiences for Cars

We're launching expanded opportunities for developers to build in-car experiences, including new Gemini integrations, support for more app categories like Games and Video, and enhanced capabilities for media and communication apps via the Car App Library and new APIs. Alongside updated car app quality tiers and simplified distribution, we'll soon be providing improved testing tools like Android Automotive OS on Pixel Tablet and Firebase Test Lab access to help you bring your innovative apps to cars. Learn more from our technical session and blog post on new in-car app experiences.

7: Build for Android XR's expanding ecosystem with Developer Preview 2 of the SDK

We announced Android XR in December, and today at Google I/O we shared a number of updates coming to the platform, including Developer Preview 2 of the Android XR SDK plus an expanding ecosystem of devices: in addition to the first Android XR headset, Samsung’s Project Moohan, you’ll also see more devices, including a new portable Android XR device from our partners at XREAL. There’s lots more to cover for Android XR: watch the Compose and AI on Android XR session and the Building differentiated apps for Android XR with 3D content session, and learn more about building for Android XR.

product image of XREAL’s Project Aura against a nebulous black background
XREAL’s Project Aura

8: Express yourself on Wear OS: meet Material Expressive on Wear OS 6

This year we are launching Wear OS 6: the most powerful and expressive version of Wear OS. Wear OS 6 features Material 3 Expressive, a new UI design with personalized visuals and motion for user creativity, coming to Wear, Android, and Google apps later this year. Developers gain access to Material 3 Expressive on Wear OS by utilizing new Jetpack libraries: Wear Compose Material 3, which provides components for apps, and Wear ProtoLayout Material 3, which provides components and layouts for tiles. Get started with Material 3 libraries and other updates on Wear.

moving image displays examples of Material 3 Expressive on Wear OS experiences
Some examples of Material 3 Expressive on Wear OS experiences

9: Engage users on Google TV with excellent TV apps

You can leverage more resources within Compose's core and Material libraries with the stable release of Compose for TV, empowering you to build excellent adaptive UIs across your apps. We're also thrilled to share exciting platform updates and developer tools designed to boost app engagement, including bringing Gemini capabilities to TV in the fall, opening enrollment for our Video Discovery API, and more.

Developer productivity

10: Build beautiful apps faster with Jetpack Compose

Compose is our big bet for UI development. The latest stable BOM release provides the features, performance, stability, and libraries that you need to build beautiful adaptive apps faster, so you can focus on what makes your app valuable to users.

moving image of compose adaptive layouts updates in the Google Play app
Compose Adaptive Layouts Updates in the Google Play app

11: Kotlin Multiplatform: new Shared Template lets you build across platforms, easily

Kotlin Multiplatform (KMP) enables teams to reach new audiences across Android and iOS with less development time. We’ve released a new Android Studio KMP shared module template, updated Jetpack libraries and new codelabs (Getting started with Kotlin Multiplatform and Migrating your Room database to KMP) to help developers who are looking to get started with KMP. Shared module templates make it easier for developers to craft, maintain, and own the business logic. Read more on what's new in Android's Kotlin Multiplatform.
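
At the heart of a shared module is Kotlin’s expect/actual mechanism: common code declares what it needs and each platform supplies its own implementation. A toy example (illustrative names, not from the template):

// commonMain: shared business logic with an expected platform hook.
expect fun platformName(): String

fun greeting(): String = "Hello from ${platformName()}"

// androidMain: Android supplies the actual implementation.
actual fun platformName(): String = "Android ${android.os.Build.VERSION.SDK_INT}"

// iosMain: iOS supplies its own via Kotlin/Native interop.
actual fun platformName(): String = platform.UIKit.UIDevice.currentDevice.systemName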

12: Gemini in Android Studio: AI Agents to help you work

Gemini in Android Studio is the AI-powered coding companion that makes Android developers more productive at every stage of the dev lifecycle. In March, we introduced Image to Code to bridge the gap between UX teams and software engineers by intelligently converting design mockups into working Compose UI code. And today, we previewed new agentic AI experiences, Journeys for Android Studio and Version Upgrade Agent. These innovations make it easier to build and test code. You can read more about these updates in What’s new in Android development tools.

13: Android Studio: smarter with Gemini

In this latest release, we're empowering devs with AI-driven tools like Gemini in Android Studio, streamlining UI creation, making testing easier, and ensuring apps are future-proofed in our ever-evolving Android ecosystem. These innovations accelerate development cycles, improve app quality, and help you stay ahead in a dynamic mobile landscape. To take advantage, upgrade to the latest Studio release. You can read more about these innovations in What’s new in Android development tools.

moving image of Gemini in Android Studio Agentic Experiences including Journeys and Version Upgrade

And the latest on driving business growth

14: What’s new in Google Play

Get ready for exciting updates from Play designed to boost your discovery, engagement and revenue! Learn how we’re continuing to become a content-rich destination with enhanced personalization and fresh ways to showcase your apps and content. Plus, explore powerful new subscription features designed to streamline checkout and reduce churn. Read I/O 2025: What's new in Google Play to learn more.

a moving image of three mobile devices displaying how content is displayed on the Play Store

15: Start migrating to Play Games Services v2 today

Play Games Services (PGS) connects over 2 billion gamer profiles on Play, powering cross-device gameplay, personalized gaming content and rewards for your players throughout the gaming journey. We are moving PGS v1 features to v2 with more advanced features and an easier integration path. Learn more about the migration timeline and new features.

16: And of course, Android 16

We unpacked some of the latest features coming to users in Android 16, which we’ve been previewing with you for the last few months. If you haven’t already, make sure to test your apps with the latest Beta of Android 16. Android 16 includes Live Updates, professional media and camera features, desktop windowing and connected displays, major accessibility enhancements and much more.

Check out all of the Android and Play content at Google I/O

This was just a preview of some of the cool updates for Android developers at Google I/O, but stay tuned to Google I/O over the next two days as we dive into a range of Android developer topics in more detail. You can check out the What’s New in Android and the full Android track of sessions, and whether you’re joining in person or around the world, we can’t wait to engage with you!

Explore this announcement and all Google I/O 2025 updates on io.google starting May 22.


Updates to the Android XR SDK: Introducing Developer Preview 2

Posted by Matthew McCullough – VP of Product Management, Android Developer

Since launching the Android XR SDK Developer Preview alongside Samsung, Qualcomm, and Unity last year, we’ve been blown away by all of the excitement we’ve been hearing from the broader Android community. Whether it's through coding live-streams or local Google Developer Group talks, it's been an outstanding experience participating in the community to build the future of XR together, and we're just getting started.

Today we’re excited to share an update to the Android XR SDK: Developer Preview 2, packed with new features and improvements to help you develop helpful and delightful immersive experiences with familiar Android APIs, tools and open standards created for XR.

At Google I/O, we have two technical sessions related to Android XR. The first, Building differentiated apps for Android XR with 3D content, covers many features present in Jetpack SceneCore and ARCore for Jetpack XR. The second, The future is now, with Compose and AI on Android XR, covers creating XR-differentiated UI and our vision for the intersection of XR and cutting-edge AI capabilities.

Android XR sessions at Google I/O 2025
Building differentiated apps for Android XR with 3D content and The future is now, with Compose and AI on Android XR

What’s new in Developer Preview 2

Since the release of Developer Preview 1, we’ve been focused on making the APIs easier to use and adding new immersive Android XR features. Your feedback has helped us shape the development of the tools, SDKs, and the platform itself.

With the Jetpack XR SDK, you can now play back 180° and 360° videos, which can be stereoscopic by encoding with the MV-HEVC specification or by encoding view-frames adjacently. The MV-HEVC standard is optimized and designed for stereoscopic video, allowing your app to efficiently play back immersive videos at great quality. Apps built with Jetpack Compose for XR can use the SpatialExternalSurface composable to render media, including stereoscopic videos.
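
As a rough sketch of stereoscopic playback (composable and parameter names follow the developer preview documentation and may change; the exoPlayer instance is assumed to be created elsewhere):

Subspace {
    SpatialExternalSurface(
        modifier = SubspaceModifier.width(1200.dp).height(676.dp),
        // SideBySide for adjacently encoded view-frames; use Mono for 2D video.
        stereoMode = StereoMode.SideBySide,
    ) {
        // Route the Media3 ExoPlayer output to the spatial surface.
        onSurfaceCreated { surface -> exoPlayer.setVideoSurface(surface) }
    }
}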

Using Jetpack Compose for XR, you can now also define layouts that adapt to different XR display configurations. For example, use a SubspaceModifier to specify the size of a Subspace as a percentage of the device’s recommended viewing size, so a panel effortlessly fills the space it's positioned in.

Material Design for XR now supports more component overrides for TopAppBar, AlertDialog, and ListDetailPaneScaffold, helping your large-screen enabled apps that use Material Design effortlessly adapt to the new world of XR.

An app adapts to XR using Material Design for XR with the new component overrides
An app adapts to XR using Material Design for XR with the new component overrides

In ARCore for Jetpack XR, you can now track hands after requesting the appropriate permissions. Hands are a collection of 26 posed hand joints that can be used to detect hand gestures and bring a whole new level of interaction to your Android XR apps:

moving image demonstrates how hands bring a natural input method to your Android XR experience.
Hands bring a natural input method to your Android XR experience.

For more guidance on developing apps for Android XR, check out our Android XR Fundamentals codelab, the updates to our Hello Android XR sample project, and a new version of JetStream with Android XR support.

The Android XR Emulator has also received stability updates and support for AMD GPUs, and it is now fully integrated within the Android Studio UI.

the Android XR Emulator in Android Studio
The Android XR Emulator is now integrated in Android Studio

Developers using Unity have already successfully created and ported existing games and apps to Android XR. Today, you can upgrade to the Pre-Release version 2 of the Unity OpenXR: Android XR package! This update adds many performance improvements such as support for Dynamic Refresh Rate, which optimizes your app’s performance and power consumption. Shaders made with Shader Graph now support SpaceWarp, making it easier to use SpaceWarp to reduce compute load on the device. Hand meshes are now exposed with occlusion, which enables realistic hand visualization.

Check out Unity’s improved Mixed Reality template for Android XR, which now includes support for occlusion and persistent anchors.

We recently launched Android XR Samples for Unity, which demonstrate capabilities on the Android XR platform such as hand tracking, plane tracking, face tracking, and passthrough.

moving image of Google’s open-source Unity samples demonstrating platform features and showing how they’re implemented
Google’s open-source Unity samples demonstrate platform features and show how they’re implemented

Firebase AI Logic for Unity is now in public preview! This makes it easy for you to integrate gen AI into your apps, enabling the creation of AI-powered experiences with Gemini and Android XR. Firebase AI Logic fully supports Gemini's capabilities, including multimodal input and output, and bi-directional streaming for immersive conversational interfaces. Built with production readiness in mind, Firebase AI Logic is integrated with core Firebase services like App Check, Remote Config, and Cloud Storage for enhanced security, configurability, and data management. Learn more about this on the Firebase blog or go straight to the Gemini API using Vertex AI in Firebase SDK documentation to get started.

Continuing to build the future together

Our commitment to open standards continues with the glTF Interactivity specification, developed in collaboration with the Khronos Group, which will be supported in glTF models rendered by Jetpack XR later this year. Models using the glTF Interactivity specification are self-contained interactive assets that can have many pre-programmed behaviors, like rotating objects on a button press or changing the color of a material over time.

Android XR will be available first on Samsung’s Project Moohan, launching later this year. Soon after, our partners at XREAL will release the next Android XR device. Codenamed Project Aura, it’s a portable and tethered device that gives users access to their favorite Android apps, including those that have been built for XR. It will launch as a developer edition, specifically for you to begin creating and experimenting. The best news? With the familiar tools you use to build Android apps today, you can build for these devices too.

product image of XREAL’s Project Aura against a nebulous black background
XREAL’s Project Aura

The Google Play Store is also getting ready for Android XR. It will list supported 2D Android apps on the Android XR Play Store when it launches later this year. If you are working on an Android XR differentiated app, you can get it ready for the big launch and be one of the first differentiated apps on the Android XR Play Store.

And we know many of you are excited for the future of Android XR on glasses. We are shaping the developer experience now and will share more details on how you can participate later this year.

To get started creating and developing for Android XR, check out developer.android.com/develop/xr where you will find all of the tools, libraries, and resources you need to work with the Android XR SDK. In particular, try out our samples and codelabs.

We welcome your feedback, suggestions, and ideas as you’re helping shape Android XR. Your passion, expertise, and bold ideas are vital as we continue to develop Android XR together. We look forward to seeing your XR-differentiated apps when Android XR devices launch later this year!

Explore this announcement and all Google I/O 2025 updates on io.google starting May 22.


Google I/O 2025: Build adaptive Android apps that shine across form factors

Posted by Fahd Imtiaz – Product Manager, Android Developer

If your app isn’t built to adapt, you’re missing out on the opportunity to reach a giant swath of users across 500 million devices! At Google I/O this year, we are exploring how adaptive development isn’t just a good idea, but essential to building apps that shine across the expanding Android device ecosystem. This is your guide to meeting users wherever they are, with experiences that are perfectly tailored to their needs.

The advantage of building adaptive

In today's multi-device world, users expect their favorite applications to work flawlessly and intuitively, whether they're on a smartphone, tablet, or Chromebook. This expectation for seamless experiences isn't just about convenience; it's an important factor for user engagement and retention.

For example, users of entertainment apps (including Prime Video, Netflix, and Hulu) who watch on both phone and tablet spend almost 200% more time in-app (nearly 3x the engagement) than phone-only users in the US*.

Peacock, NBCUniversal’s streaming service, has seen users move between mobile and large screens, and building adaptively enables a single build to work across these form factors.

“This allows Peacock to have more time to innovate faster and deliver more value to its customers.”
– Diego Valente, Head of Mobile, Peacock and Global Streaming

Adaptive Android development offers the strategic solution, enabling apps to perform effectively across an expanding array of devices and contexts through intelligent design choices that emphasize code reuse and scalability. With Android's continuous growth into new form factors and upcoming enhancements such as desktop windowing and connected displays in Android 16, an app's ability to seamlessly adapt to different screen sizes is becoming increasingly crucial for retaining users and staying competitive.

Beyond direct user benefits, designing adaptively also translates to increased visibility. The Google Play Store actively helps promote developers whose apps excel on different form factors. If your application delivers a great experience on tablets or is excellent on ChromeOS, users on those devices will have an easier time discovering your app. This creates a win-win situation: better quality apps for users and a broader audience for you.

examples of form factors across small phones, tablets, laptops, and auto

Latest in adaptive Android development from Google I/O

To help you more effectively build compelling adaptive experiences, we shared several key updates at I/O this year.

Build for the expanding Android device ecosystem

Your mobile apps can now reach users beyond phones on over 500 million active devices, including foldables, tablets, Chromebooks, and even compatible cars, with minimal changes. Android 16 introduces significant advancements in desktop windowing for a true desktop-like experience on large screens and when devices are connected to external displays. And, Android XR is opening a new dimension, allowing your existing mobile apps to be available in immersive virtual environments.

The mindset shift to Adaptive

With the expanding Android device ecosystem, adaptive app development is a fundamental strategy. It's about how the same mobile app runs well across phones, foldables, tablets, Chromebooks, connected displays, XR, and cars, laying a strong foundation for future devices and differentiating for specific form factors. You don't need to rebuild your app for each form factor; but rather make small, iterative changes, as needed, when needed. Embracing this adaptive mindset today isn't just about keeping pace; it's about leading the charge in delivering exceptional user experiences across the entire Android ecosystem.

examples of form factors including vr headset

Leverage powerful tools and libraries to build adaptive apps:

    • Compose Adaptive Layouts library: This library makes adaptive development easier by allowing your app code to fit into canonical layout patterns like list-detail and supporting pane that automatically reflow as your app is resized, flipped, or folded (see the sketch after this list). In the 1.1 release, we introduced pane expansion, allowing users to resize panes. The Socialite demo app showcased how one codebase using this library can adapt across six form factors. New adaptation strategies like "Levitate" (elevating a pane, e.g., into a dialog or bottom sheet) and "Reflow" (reorganizing panes on the same level) were also announced in 1.2 (alpha). For XR, component overrides can automatically spatialize UI elements.

    • Jetpack Navigation 3 (Alpha): This new navigation library simplifies defining user journeys across screens with less boilerplate code, especially for multi-pane layouts in Compose. It helps handle scenarios where list and detail panes might be separate destinations on smaller screens but shown together on larger ones. Check out the new Jetpack Navigation library in alpha.

    • Jetpack Compose input enhancements: Compose's layered architecture, strong input support, and single location for layout logic simplify creating adaptive UIs. Upcoming in Compose 1.9 are right-click context menus and enhanced trackpad/mouse functionality.

    • Window Size Classes: Use window size classes for top-level layout decisions. AndroidX.window 1.5 introduces two new width size classes – "large" (1200dp to 1600dp) and "extra-large" (1600dp and larger) – providing more granular breakpoints for large screens. This helps in deciding when to expand navigation rails or show three panes of content. Support for these new breakpoints was also announced in the Compose adaptive layouts library 1.2 alpha, along with design guidance.

    • Compose previews: Get quick feedback by visualizing your layouts across a wide variety of screen sizes and aspect ratios. You can also specify different devices by name to preview your UI on their respective sizes and with their inset values.

    • Testing adaptive layouts: Validating your adaptive layouts is crucial and Android Studio offers various tools for testing – including previews for different sizes and aspect ratios, a resizable emulator to test across different screen sizes with a single AVD, screenshot tests, and instrumental behavior tests. And with Journeys with Gemini in Android Studio, you can define tests using natural language for even more robust testing across different window sizes.
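
To make the list-detail pattern concrete, here is a pared-down sketch using the Compose adaptive layouts library (ContactList and ContactDetail are hypothetical composables; the API shape follows the 1.1 release, so double-check against the version you use):

@Composable
fun ContactsScreen() {
    val navigator = rememberListDetailPaneScaffoldNavigator<String>()
    val scope = rememberCoroutineScope()

    ListDetailPaneScaffold(
        directive = navigator.scaffoldDirective,
        value = navigator.scaffoldValue,
        listPane = {
            AnimatedPane {
                // On large windows both panes are shown side by side; on small
                // windows this call navigates forward to the detail pane.
                ContactList(onContactClick = { id ->
                    scope.launch {
                        navigator.navigateTo(ListDetailPaneScaffoldRole.Detail, id)
                    }
                })
            }
        },
        detailPane = {
            AnimatedPane {
                ContactDetail(contactId = navigator.currentDestination?.contentKey)
            }
        },
    )
}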

Ensuring app availability across devices

Avoid unnecessarily declaring required features (like specific cameras or GPS) in your manifest, as this can prevent your app from appearing in the Play Store on devices that lack those specific hardware components but could otherwise run your app perfectly.

Handling different input methods

Remember to handle various input methods like touch, keyboard, and mouse, especially with Chromebook detachables and connected displays.

Prepare for orientation and resizability API changes in Android 16

Beginning in Android 16, for apps targeting SDK 36, manifest and runtime restrictions on orientation, resizability, and aspect ratio will be ignored on displays that are at least 600dp in both dimensions. To meet user expectations, your apps will need layouts that work for both portrait and landscape windows, and support resizing at runtime. There's a temporary opt-out manifest flag at both the application and activity level to delay these changes until targetSdk 37, and these changes currently do not apply to apps categorized as "Games". Learn more about these API changes.

Adaptive considerations for games

Games need to be adaptive too, and Unity 6 will add enhanced support for configuration handling, including APIs for screenshots, aspect ratio, and density. Success stories like Asphalt Legends Unite show significant user retention increases on foldables after implementing adaptive features.

examples of form factors including vr headset

Start building adaptive today

Now is the time to elevate your Android apps, making them intuitively responsive across form factors. With the latest tools and updates we’re introducing, you have the power to build experiences that seamlessly flow across all devices, from foldables to cars and beyond. Implementing these strategies will allow you to expand your reach and delight users across the Android ecosystem.

Get inspired by the “Adaptive Android development makes your app shine across devices” talk, and explore all the resources you’ll need to start your journey at developer.android.com/adaptive-apps!

Explore this announcement and all Google I/O 2025 updates on io.google starting May 22.


*Source: internal Google data

Androidify: Building powerful AI-driven experiences with Jetpack Compose, Gemini and CameraX

Posted by Rebecca Franks – Developer Relations Engineer

The Android bot is a beloved mascot for Android users and developers, and previous versions of the bot builder have been very popular, so we decided that this year we’d rebuild the bot maker from the ground up using the latest technology backed by Gemini. Today we are releasing a new open source app, Androidify, for learning how to build powerful AI-driven experiences on Android using the latest technologies such as Jetpack Compose, Gemini through Firebase, CameraX, and Navigation 3.

a moving image of various droid bots dancing individually

Androidify app demo

Here’s an example of the app running on a device, showing how it converts a photo into an Android bot that represents my likeness:

moving image showing the conversion of an image of a woman in a pink dress holding an umbrella into a 3D image of a droid bot wearing a pink dress holding an umbrella

Under the hood

The app combines a variety of different Google technologies, such as:

    • Gemini API - through Firebase AI Logic SDK, for accessing the underlying Imagen and Gemini models.
    • Jetpack Compose - for building the UI with delightful animations and making the app adapt to different screen sizes.
    • Navigation 3 - the latest navigation library for building up Navigation graphs with Compose.
    • CameraX Compose and Media3 Compose - for building up a custom camera with custom UI controls (rear camera support, zoom support, tap-to-focus) and playing the promotional video.

This sample app is currently using a standard Imagen model, but we've been working on a fine-tuned model that's trained specifically on all of the pieces that make the Android bot cute and fun; we'll share that version later this year. In the meantime, don't be surprised if the sample app puts out some interesting looking examples!

How does the Androidify app work?

The app leverages our best practices for Architecture, Testing, and UI to showcase a real world, modern AI application on device.

Flow chart describing Androidify app flow
Androidify app flow chart detailing how the app works with AI

AI in Androidify with Gemini and ML Kit

The Androidify app uses the Gemini models in a multitude of ways to enrich the app experience, all powered by the Firebase AI Logic SDK. The app uses Gemini 2.5 Flash and Imagen 3 under the hood:

    • Image validation: We ensure that the captured image contains sufficient information, such as a clearly focused person, and assess it for safety. This feature uses the multimodal capabilities of the Gemini API, giving it a prompt and an image at the same time:

// Multimodal request: send the validation prompt and the captured image together.
val response = generativeModel.generateContent(
    content {
        text(prompt)
        image(image)
    },
)

    • Text prompt validation: If the user opts for text input instead of image, we use Gemini 2.5 Flash to ensure the text contains a sufficiently descriptive prompt to generate a bot.

    • Image captioning: Once we’re sure the image has enough information, we use Gemini 2.5 Flash to perform image captioning. We ask Gemini to be as descriptive as possible, focusing on the clothing and its colors.

    • “Help me write” feature: Similar to an “I’m feeling lucky” type feature, “Help me write” uses Gemini 2.5 Flash to create a random description of the clothing and hairstyle of a bot.

    • Image generation from the generated prompt: As the final step, Imagen generates the image from the prompt and the selected skin tone of the bot.

The app also uses ML Kit pose detection to detect a person in the viewfinder, enabling the capture button when a person is detected and adding fun indicators around the content to signal detection.
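
A condensed sketch of that detection flow with ML Kit’s pose detection API (capture-button wiring simplified):

// Streaming-mode detector for live camera frames.
val detector = PoseDetection.getClient(
    PoseDetectorOptions.Builder()
        .setDetectorMode(PoseDetectorOptions.STREAM_MODE)
        .build(),
)

fun onFrame(image: InputImage, onPersonDetected: (Boolean) -> Unit) {
    detector.process(image)
        .addOnSuccessListener { pose ->
            // Enable the capture button only when pose landmarks are present.
            onPersonDetected(pose.allPoseLandmarks.isNotEmpty())
        }
        .addOnFailureListener { onPersonDetected(false) }
}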

Explore more detailed information about AI usage in Androidify.

Jetpack Compose

The user interface of Androidify is built using Jetpack Compose, the modern UI toolkit that simplifies and accelerates UI development on Android.

Delightful details with the UI

The app uses Material 3 Expressive, the latest alpha release that makes your apps more premium, desirable, and engaging. It provides delightful bits of UI out of the box, like new shapes and componentry, and uses MotionScheme variables wherever a motion spec is needed.

MaterialShapes are used in various locations. These are a preset list of shapes that allow for easy morphing between each other—for example, the cute cookie shape for the camera capture button:


Androidify app UI showing camera button
Camera button with a MaterialShapes.Cookie9Sided shape

Beyond using the standard Material components, Androidify also features custom composables and delightful transitions tailored to the specific needs of the app:

    • There are plenty of shared element transitions across the app—for example, a morphing shape shared element transition is performed between the “take a photo” button and the camera surface.

      moving example of expressive button shapes in slow motion

    • Custom enter transitions for the ResultsScreen with the usage of marquee modifiers.

      animated marquee example

    • Fun color splash animation as a transition between screens.

      moving image of a blue color splash transition between Androidify demo screens

    • Animating gradient buttons for the AI-powered actions.

      animated gradient button for AI powered actions example

To learn more about the unique details of the UI, read Androidify: Building delightful UIs with Compose.

Adapting to different devices

Androidify is designed to look great and function seamlessly across candy bar phones, foldables, and tablets. The general goal of developing adaptive apps is to avoid reimplementing the same app multiple times on each form factor by extracting out reusable composables, and leveraging APIs like WindowSizeClass to determine what kind of layout to display.

a collage of different adaptive layouts for the Androidify app across small and large screens
Various adaptive layouts in the app

For Androidify, we only needed to leverage the width window size class. Combining this with different layout mechanisms, we were able to reuse or extend the composables to cater to the multitude of different device sizes and capabilities.

    • Responsive layouts: The CreationScreen demonstrates adaptive design. It uses helper functions like isAtLeastMedium() to detect window size categories and adjust its layout accordingly. On larger windows, the image/prompt area and color picker might sit side-by-side in a Row, while on smaller windows, the color picker is accessed via a ModalBottomSheet. This pattern, called “supporting pane”, highlights the supporting dependencies between the main content and the color picker.

    • Foldable support: The app actively checks for foldable device features. The camera screen uses WindowInfoTracker to get FoldingFeature information to adapt to different features by optimizing the layout for tabletop posture.

    • Rear display: Support for devices with multiple displays is included via the RearCameraUseCase, allowing for the device camera preview to be shown on the external screen when the device is unfolded (so the main content is usually displayed on the internal screen).

Using window size classes, coupled with creating a custom @LargeScreensPreview annotation, helps achieve unique and useful UIs across the spectrum of device sizes and window sizes.
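
For illustration, the helper and annotation could look something like this (sketched from the description above; the sample’s actual implementation may differ):

// True when the window width is medium or larger.
@Composable
fun isAtLeastMedium(): Boolean =
    currentWindowAdaptiveInfo().windowSizeClass.windowWidthSizeClass != WindowWidthSizeClass.COMPACT

// Multipreview annotation covering common large-screen sizes.
@Preview(name = "Tablet", widthDp = 1280, heightDp = 800)
@Preview(name = "Desktop", widthDp = 1920, heightDp = 1080)
annotation class LargeScreensPreview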

CameraX and Media3 Compose

To allow users to base their bots on photos, Androidify integrates CameraX, the Jetpack library that makes camera app development easier.

The app uses a custom CameraLayout composable that supports the layout of the typical composables a camera preview screen would include—for example, zoom buttons, a capture button, and a flip camera button. This layout adapts to different device sizes and more advanced use cases, like tabletop mode and the rear-camera display. For the actual rendering of the camera preview, it uses the new CameraXViewfinder that is part of the camerax-compose artifact.
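
At its core, the preview rendering reduces to handing CameraX’s SurfaceRequest to the viewfinder composable (a simplified sketch; the ViewModel plumbing that produces the SurfaceRequest is omitted):

@Composable
fun CameraPreview(
    surfaceRequest: SurfaceRequest?, // produced by CameraX's Preview use case
    modifier: Modifier = Modifier,
) {
    surfaceRequest?.let { request ->
        CameraXViewfinder(surfaceRequest = request, modifier = modifier) // from camerax-compose
    }
}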

CameraLayout in Compose
CameraLayout composable that takes care of different device configurations, such as table top mode


The app also integrates with Media3 APIs to load an instructional video for showing how to get the best bot from a prompt or image. Using the new media3-ui-compose artifact, we can easily add a VideoPlayer into the app:

@Composable
private fun VideoPlayer(modifier: Modifier = Modifier) {
    val context = LocalContext.current
    var player by remember { mutableStateOf<Player?>(null) }
    LifecycleStartEffect(Unit) {
        player = ExoPlayer.Builder(context).build().apply {
            setMediaItem(MediaItem.fromUri(Constants.PROMO_VIDEO))
            repeatMode = Player.REPEAT_MODE_ONE
            prepare()
        }
        onStopOrDispose {
            player?.release()
            player = null
        }
    }
    Box(
        modifier
            .background(MaterialTheme.colorScheme.surfaceContainerLowest),
    ) {
        player?.let { currentPlayer ->
            PlayerSurface(currentPlayer, surfaceType = SURFACE_TYPE_TEXTURE_VIEW)
        }
    }
}

Using the new onLayoutRectChanged modifier, we also listen for whether the composable is completely visible or not, and play or pause the video based on this information:

var videoFullyOnScreen by remember { mutableStateOf(false) }     

LaunchedEffect(videoFullyOnScreen) {
     if (videoFullyOnScreen) currentPlayer.play() else currentPlayer.pause()
} 

// We add this to the player composable to determine whether the video is
// visible, and mutate the videoFullyOnScreen variable, which then toggles
// the player state.
Modifier.onVisibilityChanged(
    containerWidth = LocalView.current.width,
    containerHeight = LocalView.current.height,
) { fullyVisible -> videoFullyOnScreen = fullyVisible }

// A simple version of visibility changed detection
fun Modifier.onVisibilityChanged(
    containerWidth: Int,
    containerHeight: Int,
    onChanged: (visible: Boolean) -> Unit,
) = this then Modifier.onLayoutRectChanged(100, 0) { layoutBounds ->
    onChanged(
        layoutBounds.boundsInRoot.top > 0 &&
            layoutBounds.boundsInRoot.bottom < containerHeight &&
            layoutBounds.boundsInRoot.left > 0 &&
            layoutBounds.boundsInRoot.right < containerWidth,
    )
}

Additionally, using rememberPlayPauseButtonState, we add on a layer on top of the player to offer a play/pause button on the video itself:

val playPauseButtonState = rememberPlayPauseButtonState(currentPlayer)
OutlinedIconButton(
    onClick = playPauseButtonState::onClick,
    enabled = playPauseButtonState.isEnabled,
) {
    val icon =
        if (playPauseButtonState.showPlay) R.drawable.play else R.drawable.pause
    val contentDescription =
        if (playPauseButtonState.showPlay) R.string.play else R.string.pause
    Icon(
        painterResource(icon),
        stringResource(contentDescription),
    )
}

Check out the code for more details on how CameraX and Media3 were used in Androidify.

Navigation 3

Screen transitions are handled using the new Jetpack Navigation 3 library androidx.navigation3. The MainNavigation composable defines the different destinations (Home, Camera, Creation, About) and displays the content associated with each destination using NavDisplay. You get full control over your back stack, and navigating to and from destinations is as simple as adding and removing items from a list.

@Composable
fun MainNavigation() {
   val backStack = rememberMutableStateListOf<NavigationRoute>(Home)
   NavDisplay(
       backStack = backStack,
       onBack = { backStack.removeLastOrNull() },
       entryProvider = entryProvider {
           entry<Home> { entry ->
               HomeScreen(
                   onAboutClicked = {
                       backStack.add(About)
                   },
               )
           }
           entry<Camera> {
               CameraPreviewScreen(
                   onImageCaptured = { uri ->
                       backStack.add(Create(uri.toString()))
                   },
               )
           }
           // etc
       },
   )
}

Notably, Navigation 3 exposes a new composition local, LocalNavAnimatedContentScope, to easily integrate your shared element transitions without needing to keep track of the scope yourself. By default, Navigation 3 also integrates with predictive back, providing delightful back experiences when navigating between screens, as seen in this prior shared element transition:


Learn more about Jetpack Navigation 3, currently in alpha.

Learn more

By combining the declarative power of Jetpack Compose, the camera capabilities of CameraX, the intelligent features of Gemini, and thoughtful adaptive design, Androidify is a personalized avatar creation experience that feels right at home on any Android device. You can find the full code sample at github.com/android/androidify where you can see the app in action and be inspired to build your own AI-powered app experiences.

Explore this announcement and all Google I/O 2025 updates on io.google starting May 22.


What’s New in Jetpack Compose

Posted by Nick Butcher – Product Manager

At Google I/O 2025, we announced a host of features, performance, stability, libraries, and tools updates for Jetpack Compose, our recommended Android UI toolkit. With Compose you can build excellent apps that work across devices. Compose has matured a lot since it was first announced (at Google I/O 2019!), and we're now seeing 60% of the top 1,000 apps in the Play Store, such as MAX and Google Drive, use and love it.

New Features

Since I/O last year, Compose Bill of Materials (BOM) version 2025.05.01 has added new features such as:

    • Autofill support that lets users automatically insert previously entered personal information into text fields.
    • Auto-sizing text to smoothly adapt text size to a parent container size.
    • Visibility tracking for when you need high-performance information on a composable's position in its root container, screen, or window.
    • Animate bounds modifier for beautiful automatic animations of a Composable's position and size within a LookaheadScope.
    • Accessibility checks in tests that let you build a more accessible app UI through automated a11y testing.

LookaheadScope {
    Box(
        Modifier
            .animateBounds(this@LookaheadScope)
            .width(if(inRow) 100.dp else 150.dp)
            .background(..)
            .border(..)
    )
}
moving image of animate bounds modifier in action

For more details on these features, read What’s new in the Jetpack Compose April ’25 release and check out the talks from Google I/O.

If you’re looking to try out new Compose functionality, the alpha BOM offers new features that we're working on including:

    • Pausable Composition (see below)
    • Updates to LazyLayout prefetch
    • Context Menus
    • New modifiers: onFirstVisible, onVisibilityChanged, contentType
    • New Lint checks for frequently changing values and elements that should be remembered in composition

Please try out the alpha features and provide feedback to help shape the future of Compose.

Material Expressive

At Google I/O, we unveiled Material Expressive, Material Design’s latest evolution that helps you make your products even more engaging and easier to use. It's a comprehensive addition of new components, styles, motion and customization options that help you to build beautiful rich UIs. The Material3 library in the latest alpha BOM contains many of the new expressive components for you to try out.

moving image of material expressive design example

Learn more to start building with Material Expressive.

Adaptive layouts library

Developing adaptive apps across form factors including phones, foldables, tablets, desktop, cars and Android XR is now easier with the latest enhancements to the Compose adaptive layouts library. The stable 1.1 release adds support for predictive back gestures for smoother transitions and pane expansion for more flexible two pane layouts on larger screens. Furthermore, the 1.2 (alpha) release adds more flexibility for how panes are displayed, adding strategies for reflowing and levitating.

moving image of compose adaptive layouts updates in the Google Play app
Compose Adaptive Layouts Updates in the Google Play app

Learn more about building adaptive android apps with Compose.

Performance

With each release of Jetpack Compose, we continue to prioritize performance improvements. The latest stable release includes significant rewrites and improvements to multiple sub-systems including semantics, focus and text optimizations. Best of all these are available to you simply by upgrading your Compose dependency; no code changes required.

bar chart of internal benchmarks for performance run on a Pixel 3a device from January to May 2023 measured by jank rate
Internal benchmark, run on a Pixel 3a

We continue to work on further performance improvements, notable changes in the latest alpha BOM include:

    • Pausable Composition allows compositions to be paused, and their work split up over several frames.
    • Background text prefetch enables text layout caches to be pre-warmed on a background thread, enabling faster text layout.
    • LazyLayout prefetch improvements enabling lazy layouts to be smarter about how much content to prefetch, taking advantage of pausable composition.

Together these improvements eliminate nearly all jank in an internal benchmark.

Stability

We've heard from you that upgrading your Compose dependency can be challenging, encountering bugs or behaviour changes that prevent you from staying on the latest version. We've invested significantly in improving the stability of Compose, working closely with the many Google app teams building with Compose to detect and prevent issues before they even make it to a release.

Google apps develop against and release with snapshot builds of Compose; as such, Compose is tested against the hundreds of thousands of Google app tests and any Compose issues are immediately actioned by our team. We have recently invested in increasing the cadence of updating these snapshots and now update them daily from Compose tip-of-tree, which means we’re receiving feedback faster, and are able to resolve issues long before they reach a public release of the library.

Jetpack Compose also relies on @Experimental annotations to mark APIs that are subject to change. We heard your feedback that some APIs have remained experimental for a long time, reducing your confidence in the stability of Compose. We have invested in stabilizing experimental APIs to provide you a more solid API surface, and reduced the number of experimental APIs by 32% in the last year.

We have also heard that it can be hard to debug Compose crashes when your own code does not appear in the stack trace. In the latest alpha BOM, we have added a new opt-in feature to provide more diagnostic information. Note that this does not currently work with minified builds and comes at a performance cost, so we recommend only using this feature in debug builds.

class App : Application() {
   override fun onCreate() {
        // Enable only for debug flavor to avoid perf impact in release
        Composer.setDiagnosticStackTraceEnabled(BuildConfig.DEBUG)
   }
}

Libraries

We know that to build great apps, you need Compose integration in the libraries that interact with your app's UI.

A core library that powers any Compose app is Navigation. You told us that you often encountered limitations when managing state hoisting and directly manipulating the back stack with the current Compose Navigation solution. We went back to the drawing-board and completely reimagined how a navigation library should integrate with the Compose mental model. We're excited to introduce Navigation 3, a new artifact designed to empower you with greater control and simplify complex navigation flows.

We're also investing in Compose support for CameraX and Media3, making it easier to integrate camera capture and video playback into your UI with Compose idiomatic components.

@Composable
private fun VideoPlayer(
    player: Player?, // from media3
    modifier: Modifier = Modifier
) {
    Box(modifier) {
        PlayerSurface(player) // from media3-ui-compose
        player?.let {
            // custom play-pause button UI
            val playPauseButtonState = rememberPlayPauseButtonState(it) // from media3-ui-compose
            MyPlayPauseButton(playPauseButtonState, Modifier.align(BottomEnd).padding(16.dp))
        }
    }
}

To learn more, see the media3 Compose documentation and the CameraX samples.

Tools

We continue to improve the Android Studio tools for creating Compose UIs. The latest Narwhal canary includes:

    • Resizable Previews instantly show you how your Compose UI adapts to different window sizes.
    • Preview navigation improvements using clickable names and components.
    • Studio Labs 🧪: Compose preview generation with Gemini, to quickly generate a preview.
    • Studio Labs 🧪: Transform UI with Gemini, to change your UI with natural language directly from the preview.
    • Studio Labs 🧪: Image attachment in Gemini, to generate Compose code from images.

For more information read What's new in Android development tools.

moving image of resizable preview in Jetpack Compose
Resizable Preview

New Compose Lint checks

The Compose alpha BOM introduces two new annotations and associated lint checks to help you to write correct and performant Compose code. The @FrequentlyChangingValue annotation and FrequentlyChangedStateReadInComposition lint check warns in situations where function calls or property reads in composition might cause frequent recompositions. For example, frequent recompositions might happen when reading scroll position values or animating values. The @RememberInComposition annotation and RememberInCompositionDetector lint check warns in situations where constructors, functions, and property getters are called directly inside composition (e.g. the TextFieldState constructor) without being remembered.
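
For example, the RememberInComposition check would flag the first assignment below (a contrived illustration):

@Composable
fun SearchBar() {
    val state = TextFieldState()                   // flagged: recreated on every recomposition
    val remembered = remember { TextFieldState() } // correct: survives recomposition
    BasicTextField(state = remembered)
}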

Happy Composing

We continue to invest in providing the features, performance, stability, libraries and tools that you need to build excellent apps. We value your input so please share feedback on our latest updates or what you'd like to see next.

Explore this announcement and all Google I/O 2025 updates on io.google starting May 22.


Androidify: Building delightful UIs with Compose

Posted by Rebecca Franks - Developer Relations Engineer

Androidify is a new sample app we built using the latest best practices for mobile apps. Previously, we covered all the different features of the app, from Gemini integration and CameraX functionality to adaptive layouts. In this post, we dive into the Jetpack Compose usage throughout the app, building upon our base knowledge of Compose to add delightful and expressive touches along the way!

Material 3 Expressive

Material 3 Expressive is an expansion of the Material 3 design system. It’s a set of new features, updated components, and design tactics for creating emotionally impactful UX.


It’s been released as part of the alpha version of the Material 3 artifact (androidx.compose.material3:material3:1.4.0-alpha10) and contains a wide range of new components you can use within your apps to build more personalized and delightful experiences. Learn more about Material 3 Expressive's component and theme updates for more engaging and user-friendly products.

Material Expressive Component updates

In addition to the new component updates, Material 3 Expressive introduces a new motion physics system that's encompassed in the Material theme.

In Androidify, we’ve utilized Material 3 Expressive in a few different ways across the app. For example, we’ve explicitly opted in to the new MaterialExpressiveTheme and chosen MotionScheme.expressive() (this is the default when using expressive) to add a bit of playfulness to the app:

@Composable
fun AndroidifyTheme(
    content: @Composable () -> Unit,
) {
    val colorScheme = LightColorScheme

    MaterialExpressiveTheme(
        colorScheme = colorScheme,
        typography = Typography,
        shapes = shapes,
        motionScheme = MotionScheme.expressive(),
        content = {
            SharedTransitionLayout {
                CompositionLocalProvider(LocalSharedTransitionScope provides this) {
                    content()
                }
            }
        },
    )
}

Some of the new componentry is used throughout the app, including the HorizontalFloatingToolbar for the Prompt type selection:

moving example of expressive button shapes in slow motion

The app also uses MaterialShapes in various locations, which are a preset list of shapes that allow for easy morphing between each other. For example, check out the cute cookie shape for the camera capture button:

Camera button with a MaterialShapes.Cookie9Sided shape

Animations

Wherever possible, the app leverages the Material 3 Expressive MotionScheme to obtain a themed motion token, creating a consistent motion feeling throughout the app. For example, the scale animation on the camera button press is powered by defaultSpatialSpec(), a specification used for animations that move something across a screen (such as x,y or rotation, scale animations):

val interactionSource = remember { MutableInteractionSource() }
val animationSpec = MaterialTheme.motionScheme.defaultSpatialSpec<Float>()
Spacer(
    modifier
        // ScaleIndicationNodeFactory is Androidify's custom indication that
        // scales the button on press, driven by the themed spatial spec.
        .indication(interactionSource, ScaleIndicationNodeFactory(animationSpec))
        .clip(MaterialShapes.Cookie9Sided.toShape())
        .size(size)
        .drawWithCache {
            // .. etc
        },
)

Camera button scale interaction
Camera button scale interaction

Shared element animations

The app uses shared element transitions between different screen states. Last year, we showcased how you can create shared elements in Jetpack Compose, and we’ve extended this in the Androidify sample to create a fun example. It combines the new Material 3 Expressive MaterialShapes, and performs a transition with a morphing shape animation:

moving example of expressive button shapes in slow motion

To do this, we created a custom Modifier that takes in the target and resting shapes for the sharedBounds transition:

@Composable
fun Modifier.sharedBoundsRevealWithShapeMorph(
    sharedContentState: SharedTransitionScope.SharedContentState,
    sharedTransitionScope: SharedTransitionScope = LocalSharedTransitionScope.current,
    animatedVisibilityScope: AnimatedVisibilityScope = LocalNavAnimatedContentScope.current,
    boundsTransform: BoundsTransform = MaterialTheme.motionScheme.sharedElementTransitionSpec,
    resizeMode: SharedTransitionScope.ResizeMode = SharedTransitionScope.ResizeMode.RemeasureToBounds,
    restingShape: RoundedPolygon = RoundedPolygon.rectangle().normalized(),
    targetShape: RoundedPolygon = RoundedPolygon.circle().normalized(),
)

Then, we apply a custom OverlayClip to provide the morphing shape, by tying into the AnimatedVisibilityScope provided by the LocalNavAnimatedContentScope:

// Drive the morph with the transition exposed by the navigation's
// AnimatedVisibilityScope; targetValueByState maps each visibility
// state to a progress value.
val animatedProgress =
    animatedVisibilityScope.transition.animateFloat(targetValueByState = targetValueByState)

// Morph interpolates between the resting and target polygons.
val morph = remember {
    Morph(restingShape, targetShape)
}
val morphClip = MorphOverlayClip(morph, { animatedProgress.value })

return this@sharedBoundsRevealWithShapeMorph
    .sharedBounds(
        sharedContentState = sharedContentState,
        animatedVisibilityScope = animatedVisibilityScope,
        boundsTransform = boundsTransform,
        resizeMode = resizeMode,
        clipInOverlayDuringTransition = morphClip,
        renderInOverlayDuringTransition = renderInOverlayDuringTransition,
    )

View the full code snippet for this Modifier on GitHub.
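
Applying the modifier then looks something like the following sketch (the shared-content key and the decorated composable are illustrative, and this must run inside a SharedTransitionScope so that rememberSharedContentState is available):

// Morph the camera button's cookie shape into a rectangle as it
// transitions to the next screen.
Box(
    modifier = Modifier.sharedBoundsRevealWithShapeMorph(
        sharedContentState = rememberSharedContentState(key = "camera_button"),
        restingShape = MaterialShapes.Cookie9Sided,
        targetShape = RoundedPolygon.rectangle().normalized(),
    ),
)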

Autosize text

With the latest release of Jetpack Compose 1.8, we added the ability to create text composables that automatically adjust the font size to fit the container’s available size with the new autoSize parameter:

BasicText(
    text,
    style = MaterialTheme.typography.titleLarge,
    autoSize = TextAutoSize.StepBased(maxFontSize = 220.sp),
)

This is used front and center for the “Customize your own Android Bot” text:

Text reads Customize your own Android Bot with an inline moving image
“Customize your own Android Bot” text with inline GIF

This text composable is interesting because it needed to have the fun dancing Android bot in the middle of the text. To do this, we use inline content via the inlineContent parameter and InlineTextContent, which lets us place a composable in the middle of the text itself:

@Composable
private fun DancingBotHeadlineText(modifier: Modifier = Modifier) {
   Box(modifier = modifier) {
       val animatedBot = "animatedBot"
       val text = buildAnnotatedString {
           append(stringResource(R.string.customize))
           // Attach "animatedBot" annotation on the placeholder
           appendInlineContent(animatedBot)
           append(stringResource(R.string.android_bot))
       }
       var placeHolderSize by remember {
           mutableStateOf(220.sp)
       }
       val inlineContent = mapOf(
           Pair(
               animatedBot,
               InlineTextContent(
                   Placeholder(
                       width = placeHolderSize,
                       height = placeHolderSize,
                       placeholderVerticalAlign = PlaceholderVerticalAlign.TextCenter,
                   ),
               ) {
                   DancingBot(
                       modifier = Modifier
                           .padding(top = 32.dp)
                           .fillMaxSize(),
                   )
               },
           ),
       )
       BasicText(
           text,
           modifier = Modifier
               .align(Alignment.Center)
               .padding(bottom = 64.dp, start = 16.dp, end = 16.dp),
           style = MaterialTheme.typography.titleLarge,
           autoSize = TextAutoSize.StepBased(maxFontSize = 220.sp),
           maxLines = 6,
           onTextLayout = { result ->
               placeHolderSize = result.layoutInput.style.fontSize * 3.5f
           },
           inlineContent = inlineContent,
       )
   }
}

Composable visibility with onLayoutRectChanged

With Compose 1.8, a new modifier, Modifier.onLayoutRectChanged, was added. This modifier is a more performant version of onGloballyPositioned, and includes features such as debouncing and throttling to make it performant inside lazy layouts.

In Androidify, we’ve used this modifier for the color splash animation. Attached to the “Let’s Go” button, it reports the position that the transition should start from:

var buttonBounds by remember {
   mutableStateOf<RelativeLayoutBounds?>(null)
}
var showColorSplash by remember {
   mutableStateOf(false)
}
Box(modifier = Modifier.fillMaxSize()) {
   PrimaryButton(
       buttonText = "Let's Go",
       modifier = Modifier
           .align(Alignment.BottomCenter)
           .onLayoutRectChanged(
               callback = { bounds ->
                   buttonBounds = bounds
               },
           ),
       onClick = {
           showColorSplash = true
       },
   )
}

We use these bounds as an indication of where to start the color splash animation from.
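
The reported RelativeLayoutBounds can then be reduced to a concrete origin for the splash. A sketch, assuming the bounds expose a boundsInRoot rect (use whichever coordinate space your animation draws in):

// Derive the splash origin from the button's reported bounds.
val splashOrigin: Offset? = buttonBounds?.let { bounds ->
    val center = bounds.boundsInRoot.center
    Offset(center.x.toFloat(), center.y.toFloat())
}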

moving image of a blue color splash transition between Androidify demo screens

Learn more delightful details

From fun marquee animations on the results screen, to animated gradient buttons for the AI-powered actions, to the path drawing animation for the loading screen, this app has many delightful touches for you to experience and learn from.

animated marquee example

animated gradient button for AI powered actions example

animated loading screen example

Check out the full codebase at github.com/android/androidify and learn more about the latest in Compose, from Material 3 Expressive and the new modifiers to auto-sizing text and, of course, a couple of delightful interactions!

Explore this announcement and all Google I/O 2025 updates on io.google starting May 22.

Android Studio Meerkat Feature Drop is stable

Posted by Adarsh Fernando, Group Product Manager

Today, we're excited to announce the stable release of Android Studio Meerkat Feature Drop (2024.3.2)!

This release brings a host of new features and improvements designed to boost your productivity and enhance your development workflow. With numerous enhancements, this latest release helps you build high-quality Android apps faster and more efficiently: streamlined Jetpack Compose previews, new Gemini capabilities, better Kotlin Multiplatform (KMP) integration, improved device management, and more.

Read on to learn about the key updates in Android Studio Meerkat Feature Drop, and download the latest stable version today to explore them yourself!

Developer Productivity Enhancements

Analyze Crash Reports with Gemini in Android Studio

Debugging production crashes can require you to spend significant time switching contexts between your crash reporting tool, such as Firebase Crashlytics or Android Vitals, and the IDE, where you investigate root causes. Now, when viewing reports in App Quality Insights (AQI), click the Insights tab. Gemini provides a summary of the crash, generates insights, and links to useful documentation. If you also provide Gemini with access to local code context, it can provide more accurate results, relevant next steps, and code suggestions. This helps you reduce the time spent diagnosing and resolving issues.

moving image of Gemini in the App Quality Insights tool window in Android Studio
Gemini helps you investigate, understand, and resolve crashes in your app much more quickly in the App Quality Insights tool window.

Generate Unit Test Scenarios with Gemini

Writing effective unit tests is crucial but can be time-consuming. Gemini now helps kickstart this process by generating relevant test scenarios. Right-click on a class in your editor and select Gemini > Generate Unit Test Scenarios. Gemini analyzes the code and suggests test cases with descriptive names, outlining what to test. While you still implement the specific test logic, this significantly speeds up the initial setup and ensures better test coverage by suggesting scenarios you might have missed.

moving image of generating unit test scenarios in Android Studio
Gemini helps you generate unit test scenarios for your app.

Gemini Prompt Library

No more retyping your most frequently used prompts for Gemini! The new Prompt Library lets you save prompts directly within Android Studio (Settings > Gemini > Prompt Library). Whether it's a specific code generation pattern, a refactoring instruction, or a debugging query you use often, save it once from the chat (right-click > Save prompt) and re-apply it instantly from the editor (right-click > Gemini > Prompt Library). Prompts that you save can also be shared and standardized across your team.

moving image of prompt library in Android Studio
The prompt library saves your frequently used Gemini prompts to make them easier to use.

You have the option to store prompts at the IDE level or the Project level:

    • IDE level prompts are private and can be used across multiple projects.
    • Project level prompts can be shared across teams working on the same project (if the .idea folder is added to VCS).

Compose and UI Development

Themed Icon Support Preview

Ensure your app's branding looks great with Android’s themed icons. Android Studio now lets you preview how your existing launcher icon adapts to the monochromatic theming algorithm directly within the IDE. This quick visual check helps you identify potential contrast issues or undesirable shapes early in the workflow, even before you provide a dedicated monochromatic drawable. This allows for faster iteration on your app's visual identity.

moving image of themed icon support in preview in Android Studio
Themed icon support in Preview helps you visually check how your existing launcher icon adapts to monochromatic theming.

Compose Preview Enhancements

Iterating on your Compose UI is now faster and better organized:

    • Enhanced Zoom: Navigate complex layouts more easily with smoother, more responsive zooming in your Compose previews.
    • Collapsible Groups: Tidy up your preview surface by collapsing groups of related composables under their @Preview annotation names, letting you focus on specific parts of the UI without clutter.
    • Grid Mode by Default: Grid mode is now the default for a clear overview. Gallery mode (for flipping through individual previews) is available via right-click, while List view has been removed to streamline the experience.
moving image of Compose previews in Android Studio
Compose previews render more smoothly and make it easier to hide previews you’re not focused on.

Build and Deploy

KMP Shared Module Integration

Android Studio now streamlines adding shared logic to your Android app with the new Kotlin Multiplatform Shared Module template. This provides a dedicated starting point within your Android project, making it easier to structure and build shared business logic for both Android and iOS directly from Android Studio.

Kotlin Multiplatform template in Android Studio
The new Kotlin Multiplatform module template makes it easier to add shared business logic to your existing app.

Updated UX for Adding Devices

Spend less time configuring test devices. The new Device Manager UX for adding virtual and remote devices makes it much easier to configure the devices you want from the Device Manager. To get started, click the ‘+’ action at the top of the window and select one of these options:

    • Create Virtual Device: New filters, recommendations, and creation flow guide you towards creating AVDs that are best suited for your intended purpose and your machine's performance.
    • Add Remote Devices: With Android Device Streaming, powered by Firebase, you can connect and debug your app with a variety of real physical devices. With a new catalog view and filters, it's now easier to locate and start using the device you need in just a few clicks.
moving image of configuring virtual devices in Android Studio
It’s now easier to configure virtual devices that are optimized for your workstation.

Google Play Deprecated SDK Warnings

Stay more informed about SDKs you publish with your app. Android Studio now displays warnings from the Google Play SDK Index when an SDK used in your app has been deprecated by its author. These warnings include information about suggested alternative SDKs, helping you proactively manage dependencies and avoid potential issues related to outdated or insecure libraries.

Google Play Deprecated SDK warnings in Android Studio
Play deprecated SDK warnings help you avoid potential issues related to outdated or insecure libraries.

Updated Build Menu and Actions

We've refined the Build menu for a more intuitive experience:

    • New 'Build run-configuration-name' Action: Builds the currently selected run configuration (e.g., :app or a specific test). This is now the default action for the toolbar button and Control/Command+F9.
    • Reordered Actions: The new build action is prioritized at the top, followed by Compile and Assemble actions.
    • Clearer Naming: "Rebuild Project" is now "Clean and Assemble Project with Tests". "Make Project" is renamed to "Assemble Project", and a new "Assemble Project with Tests" action is available.
Build menu in Android Studio
The Build menu includes behavior and naming changes to simplify and streamline the experience.

Standardized Config Directories

Switching between Stable, Beta, and Canary versions of Android Studio is now smoother. Configuration directories are standardized, removing the "Preview" suffix for non-stable builds. We've also added the micro version (e.g., AndroidStudio2024.3.2) to the path, allowing different feature drops to run side-by-side without conflicts. This simplifies managing your IDE settings, especially if you work with multiple Android Studio installations.

IntelliJ platform update

Android Studio Meerkat Feature Drop (2024.3.2) includes the IntelliJ 2024.3 platform release, which has many new features such as a feature-complete K2 mode, more reliable Java** and Kotlin code inspections, grammar checks during indexing, debugger improvements, speed and quality of life improvements to Terminal, and more.

For more information, read the full IntelliJ 2024.3 release notes.

Summary

Android Studio Meerkat Feature Drop (2024.3.2) delivers these key features and enhancements:

    • Developer Productivity:
        • Analyze Crash Reports with Gemini
        • Generate Unit Test Scenarios with Gemini
        • Gemini Prompt Library
    • Compose and UI:
        • Themed Icon Preview
        • Compose Preview Enhancements (Zoom, Collapsible Groups, View Modes)
    • Build and Deploy:
        • KMP Shared Module Template
        • Updated UX for Adding Devices
        • Google Play SDK Insights: Deprecated SDK Warnings
        • Updated Build Menu & Actions
        • Standardized Config Directories
    • IntelliJ Platform Update
        • Feature-complete K2 mode
        • Improved Kotlin and Java** inspection reliability
        • Debugger improvements
        • Speed and quality of life improvements in Terminal

Getting Started

Ready to elevate your Android development? Download Android Studio Meerkat Feature Drop and start using these powerful new features today!

As always, your feedback is crucial. Check known issues, report bugs, suggest improvements, and connect with the community on LinkedIn, Medium, YouTube, or X. Let's continue building amazing Android apps together!


**Java is a trademark or registered trademark of Oracle and/or its affiliates.

What’s new in the Jetpack Compose April ’25 release

Posted by Jolanda Verhoef – Developer Relations Engineer

Today, as part of the Compose April ‘25 Bill of Materials, we’re releasing version 1.8 of Jetpack Compose, Android's modern, native UI toolkit, used by many developers. This release contains new features like autofill, various text improvements, visibility tracking, and new ways to animate a composable's size and location. It also stabilizes many experimental APIs and fixes a number of bugs.

To use today’s release, upgrade your Compose BOM version to 2025.04.01:

implementation(platform("androidx.compose:compose-bom:2025.04.01"))

Note: If you are not using the Bill of Materials, make sure to upgrade Compose Foundation and Compose UI at the same time. Otherwise, autofill will not work correctly.
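
If you manage versions directly instead, keep the two artifacts in lockstep. A sketch of the equivalent declarations (the April ’25 BOM maps both to 1.8.0; verify against the official BOM mapping):

// build.gradle.kts, without the BOM: upgrade these together.
implementation("androidx.compose.foundation:foundation:1.8.0")
implementation("androidx.compose.ui:ui:1.8.0")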

Autofill

Autofill is a service that simplifies data entry. It enables users to fill out forms, login screens, and checkout processes without manually typing in every detail. Now, you can integrate this functionality into your Compose applications.

Setting up Autofill in your Compose text fields is straightforward:

      1. Set the contentType Semantics: Use Modifier.semantics and set the appropriate contentType for your text fields. For example:

TextField(
  state = rememberTextFieldState(),
  modifier = Modifier.semantics {
    contentType = ContentType.Username 
  }
)

      2. Handle saving credentials (for new or updated information):

          a. Implicitly through navigation: If a user navigates away from the page, commit will be called automatically - no code needed!

          b. Explicitly through a button: To trigger saving credentials when the user submits a form (by tapping a button, for instance), retrieve the local AutofillManager and call commit(), as sketched below.
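
A minimal sketch of option (b), assuming the LocalAutofillManager composition local that ships with Compose 1.8 (the button label is illustrative):

val autofillManager = LocalAutofillManager.current

Button(
    onClick = {
        // Commit the pending autofill data when the user submits the form.
        autofillManager?.commit()
    },
) {
    Text("Sign up")
}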

For full details on how to implement autofill in your application, see the Autofill in Compose documentation.

Text

When placing text inside a container, you can now use the autoSize parameter in BasicText to let the text size automatically adapt to the container size:

Box {
    BasicText(
        text = "Hello World",
        maxLines = 1,
        autoSize = TextAutoSize.StepBased()
    )
}
moving image of Hello World text inside a container

You can customize sizing by setting a minimum and/or maximum font size and define a step size. Compose Foundation 1.8 contains this new BasicText overload, with Material 1.4 to follow soon with an updated Text overload.
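
For example, a fully customized configuration might look like the following sketch (parameter names per the Foundation 1.8 API; verify against the release you use):

BasicText(
    text = "Hello World",
    maxLines = 1,
    autoSize = TextAutoSize.StepBased(
        minFontSize = 12.sp,  // lower bound when shrinking
        maxFontSize = 32.sp,  // upper bound when growing
        stepSize = 2.sp,      // granularity of the sizes tried in between
    ),
)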

Furthermore, Compose 1.8 enhances text overflow handling with new TextOverflow.StartEllipsis or TextOverflow.MiddleEllipsis options, which allow you to display ellipses at the beginning or middle of a text line.

val text = "This is a long text that will overflow"
Column(Modifier.width(200.dp)) {
  Text(text, maxLines = 1, overflow = TextOverflow.Ellipsis)
  Text(text, maxLines = 1, overflow = TextOverflow.StartEllipsis)
  Text(text, maxLines = 1, overflow = TextOverflow.MiddleEllipsis)
}
text overflow handling displaying ellipses at the beginning and middle of a text line

And finally, we're expanding support for HTML formatting in AnnotatedString, with the addition of bulleted lists:

Text(
  AnnotatedString.fromHtml(
    """
    <h1>HTML content</h1>
    <ul>
      <li>Hello,</li>
      <li>World</li>
    </ul>
    """.trimIndent()
  )
)
a bulleted list of two items

Visibility tracking

Compose UI 1.8 introduces a new modifier: onLayoutRectChanged. This API covers many of the use cases that the existing onGloballyPositioned modifier serves, but with much less overhead. The onLayoutRectChanged modifier can debounce and throttle the callback per what the use case demands, which helps with performance when it’s added onto an item in LazyColumn or LazyRow.
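
For instance, tracking item visibility in a lazy list could look like this sketch (FeedCard and feedItems are hypothetical, and the throttle and debounce values assume the 1.8 signature's millisecond parameters):

LazyColumn {
    items(feedItems) { item ->
        FeedCard(
            item = item,
            modifier = Modifier.onLayoutRectChanged(
                throttleMillis = 100,   // at most one callback per 100 ms while moving
                debounceMillis = 200,   // wait for layout to settle for 200 ms
            ) { bounds ->
                // React to the item's new on-screen rect, e.g. record an impression.
            },
        )
    }
}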

This new API unlocks features that depend on a composable's visibility on screen. Compose 1.9 will add higher-level abstractions to this low-level API to simplify common use cases.

Animate composable bounds

Last year we introduced shared element transitions, which smoothly animate content in your apps. The 1.8 Animation module graduates LookaheadScope to stable, brings numerous performance and stability improvements, and adds a new modifier, animateBounds. When used inside a LookaheadScope, this modifier automatically animates its composable's size and position on screen when those change:

// `expanded` is a Boolean state toggled elsewhere, for example on click.
LookaheadScope {
  Box(
    Modifier
      .width(if (expanded) 180.dp else 110.dp)
      .offset(x = if (expanded) 0.dp else 100.dp)
      .animateBounds(lookaheadScope = this@LookaheadScope)
      .background(Color.LightGray, shape = RoundedCornerShape(12.dp))
      .height(50.dp)
  ) {
    Text("Layout Content", Modifier.align(Alignment.Center))
  }
}
a moving image depicting animate composable bounds

Increased API stability

Jetpack Compose has utilized @Experimental annotations to mark APIs that are liable to change across releases, for features that require more than a library's alpha period to stabilize. We have heard your feedback that a number of features have been marked as experimental for some time with no changes, contributing to a sense of instability. We are actively looking at stabilizing existing experimental APIs—in the UI and Foundation modules, we have reduced the experimental APIs from 172 in the 1.7 release to 70 in the 1.8 release. We plan to continue this stabilization trend across modules in future releases.

Deprecation of contextual flow rows and columns

As part of the work to reduce experimental annotations, we identified APIs added in recent releases that are less than optimal solutions for their use cases. This has led to the decision to deprecate the experimental ContextualFlowRow and ContextualFlowColumn APIs, added in Foundation 1.7. If you need the deprecated functionality, our recommendation for now is to copy over the implementation and adapt it as needed, while we work on a plan for future components that can cover these functionalities better.

The related APIs FlowRow and FlowColumn are now stable; however, the new overflow parameter that was added in the last release is now deprecated.
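
As a refresher on the now-stable components, FlowRow lays children out horizontally and wraps onto new lines as space runs out. A sketch (tags is a hypothetical list of strings):

FlowRow(
    horizontalArrangement = Arrangement.spacedBy(8.dp),
    verticalArrangement = Arrangement.spacedBy(8.dp),
) {
    tags.forEach { tag ->
        SuggestionChip(
            onClick = { /* filter by this tag */ },
            label = { Text(tag) },
        )
    }
}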

Improvements and fixes for core features

In response to developer feedback, we have shipped some particularly in-demand features and bug fixes in our core libraries.

Get started!

We’re grateful for all of the bug reports and feature requests submitted to our issue tracker - they help us to improve Compose and build the APIs you need. Continue providing your feedback, and help us make Compose better.

Happy composing!