
Top 3 Updates with Compose across Form Factors at Google I/O ’24

Posted by Chris Arriola – Developer Relations Engineer

Google I/O 2024 was filled with lots of updates and announcements around helping you be more productive as a developer. Here are the top 3 announcements around Jetpack Compose and Form Factors from Google I/O 2024:

#1 New updates in Jetpack Compose

The June 2024 release of Jetpack Compose is packed with new features and improvements such as shared element transitions, lazy list item animations, and performance improvements across the board.

With shared element transitions, you can create delightful continuity between screens in your app. This feature works together with Navigation Compose and predictive back so that transitions can happen as users navigate your app. Another highly requested feature—lazy list item animations—is also now supported for lazy lists giving it the ability to animate inserts, deletions, and reordering of items.

Jetpack Compose also continues to improve runtime performance with every release. Our benchmarks show a 17% faster time to first pixel in our Jetsnack Compose sample. Additionally, strong skipping mode graduated from experimental to production-ready status, further improving the performance of Compose apps. Simply update your app to take advantage of these benefits.

Read What’s new in Jetpack Compose at I/O ‘24 for more information.


#2 Scaling across screens with new Compose APIs and Tools

During Google I/O, we announced new tools and APIs to make it easier to build across screens with Compose. The new Material 3 adaptive library introduces new APIs that allow you to implement common adaptive scenarios such as list-detail and supporting pane. These APIs allow your app to display one or two panes depending on the size available to your app.
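To give a sense of what this looks like in code, here is a minimal sketch of a list-detail screen, assuming the androidx.compose.material3.adaptive adaptive-layout and adaptive-navigation artifacts; exact class and parameter names may differ between releases, so treat it as illustrative rather than definitive.

import androidx.compose.foundation.clickable
import androidx.compose.foundation.layout.fillMaxWidth
import androidx.compose.foundation.layout.padding
import androidx.compose.foundation.lazy.LazyColumn
import androidx.compose.foundation.lazy.items
import androidx.compose.material3.Text
import androidx.compose.material3.adaptive.ExperimentalMaterial3AdaptiveApi
import androidx.compose.material3.adaptive.layout.AnimatedPane
import androidx.compose.material3.adaptive.layout.ListDetailPaneScaffold
import androidx.compose.material3.adaptive.layout.ListDetailPaneScaffoldRole
import androidx.compose.material3.adaptive.navigation.rememberListDetailPaneScaffoldNavigator
import androidx.compose.runtime.Composable
import androidx.compose.ui.Modifier
import androidx.compose.ui.unit.dp

@OptIn(ExperimentalMaterial3AdaptiveApi::class)
@Composable
fun ContactsScreen(contacts: List<String>) {
    // The navigator decides whether one or two panes fit the current window size.
    val navigator = rememberListDetailPaneScaffoldNavigator<String>()

    ListDetailPaneScaffold(
        directive = navigator.scaffoldDirective,
        value = navigator.scaffoldValue,
        listPane = {
            AnimatedPane {
                LazyColumn {
                    items(contacts) { contact ->
                        Text(
                            text = contact,
                            modifier = Modifier
                                .fillMaxWidth()
                                .clickable {
                                    // On small windows this replaces the list;
                                    // on large windows it fills the second pane.
                                    navigator.navigateTo(ListDetailPaneScaffoldRole.Detail, contact)
                                }
                                .padding(16.dp)
                        )
                    }
                }
            }
        },
        detailPane = {
            AnimatedPane {
                Text(navigator.currentDestination?.content ?: "Select a contact")
            }
        }
    )
}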

Watch Building UI with the Material 3 adaptive library and Building adaptive Android apps to learn more. If you prefer to read, you can check out About adaptive layouts in our documentation.

We also announced that Compose for TV 1.0.0 is now available in beta. The latest updates to Compose for TV include better performance, input support, and a whole range of improved components that look great out of the box. New in this release, we’ve added lists, navigation, chips, and settings screens. We’ve also added a new TV Material Catalog app and updated the developer tools in Android Studio to include a new project wizard to get a running start with Compose for TV.

Finally, Compose for Wear OS has added features such as SwipeToReveal, an expandableItem, and a range of WearPreview supporting annotations. During Google I/O 2024, Compose for Wear OS graduated visual improvements and fixes from beta to stable. Learn more about all the updates to Wear OS by checking out the technical session.

Check out case studies from SoundCloud and Adidas to see how they’re using Compose to build their apps, and learn more about all the updates for Compose across screens here!


#3 Glance 1.1

Jetpack Glance is Android’s modern recommended framework for building widgets. The latest version, Glance 1.1, is now stable. Glance is built on top of Jetpack Compose allowing you to use the same declarative syntax that you’re used to when building widgets.
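As a quick illustration of that Compose-style syntax, here is a minimal sketch of a Glance widget, assuming the androidx.glance:glance-appwidget 1.1 artifact; the widget and receiver names are placeholders.

import android.content.Context
import androidx.glance.GlanceId
import androidx.glance.appwidget.GlanceAppWidget
import androidx.glance.appwidget.GlanceAppWidgetReceiver
import androidx.glance.appwidget.provideContent
import androidx.glance.text.Text

// The widget UI, declared with Glance composables rather than RemoteViews.
class HelloWidget : GlanceAppWidget() {
    override suspend fun provideGlance(context: Context, id: GlanceId) {
        provideContent {
            Text(text = "Hello from Glance")
        }
    }
}

// The receiver you register in the manifest, pointing at the widget above.
class HelloWidgetReceiver : GlanceAppWidgetReceiver() {
    override val glanceAppWidget: GlanceAppWidget = HelloWidget()
}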

This release brings a new unit test library, error UIs, and new components. Additionally, we’ve released new Canonical Widget Layouts on GitHub to help you get started faster with a set of layouts that align with best practices, and we’ve published new design guidance on the UI design hub — check it out!

To learn more about using Glance, check out Build beautiful Android widgets with Jetpack Glance. Or if you want something more hands-on, check out the codelab Create a widget with Glance.


You can learn more about the latest updates to Compose and Form Factors by checking out the Compose Across Screens and the What’s new in Jetpack Compose at I/O ‘24 blog posts or watching the spotlight playlist!

A Developer’s Roadmap to Predictive Back (Views)

Posted by Ash Nohe and Tram Bui – Developer Relations Engineers

Before you read on, this topic is scoped to Views. Predictive Back with Compose is easier to implement and not included in this blog post. To learn how to implement Compose with Predictive Back, see the Add predictive back animations codelab and the I/O workshop Improve the user experience of your Android app.

This blog post aims to shed light on various dependencies and requirements to support predictive back animations in your views based app.

First, view the Predictive Back Requirements table to understand if a particular animation requires a manifest flag, a compileSDK version, additional libraries or hidden developer options to function.

Then, start your quest. Here are your milestones:

  1. Upgrade Kotlin milestone
  2. Back-to-home animation milestone
  3. Migrate all activities milestone
  4. Fragment milestone
  5. Material Components (Views) milestone
  6. [Optional] AndroidX transitions milestone
Milestones

Upgrade Kotlin milestone

The first milestone is to upgrade to Kotlin 1.8.0 or higher, which is required for other Predictive Back dependencies.


Back-to-home animation milestone

The back-to-home animation is the keystone predictive back animation.

To get this animation, add android:enableOnBackInvokedCallback="true" in your AndroidManifest.xml for your root activity if you are a multi-activity app (see per-activity opt-in) or at the application level if you are a single-activity app. After this, you’ll see both the back-to-home animation and a cross-task animation where applicable, which are visible to users in Android 15+ and behind a developer option in Android 13 and 14.

If you are intercepting back events in your root activity (e.g. MainActivity), you can continue to do so but you’ll need to use supported APIs and you won’t get the back-to-home animation. For this reason, we generally recommend you only intercept back events for UI logic; for example, to show a dialog asking the user to save before they quit.
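If you do need to intercept back in that root activity, a minimal sketch using the supported OnBackPressedCallback API looks like the following; the hasUnsavedChanges() and showSaveDialog() helpers are hypothetical.

import android.os.Bundle
import androidx.activity.OnBackPressedCallback
import androidx.appcompat.app.AppCompatActivity

class MainActivity : AppCompatActivity() {

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)

        // Only intercept back for UI logic, and only while it is actually needed,
        // so the predictive back-to-home animation still plays in the common case.
        val saveDialogCallback = object : OnBackPressedCallback(enabled = false) {
            override fun handleOnBackPressed() {
                showSaveDialog()
            }
        }
        onBackPressedDispatcher.addCallback(this, saveDialogCallback)

        // Flip the callback on when there is something to save, for example:
        // saveDialogCallback.isEnabled = hasUnsavedChanges()
    }

    private fun hasUnsavedChanges(): Boolean = false // hypothetical

    private fun showSaveDialog() { /* show a "save before quitting?" dialog */ }
}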

See the Add support for the predictive back gesture guide for more details.

Milestone grid

Migrate all activities milestone

If you are a multi-activity app, you’ll need to opt in and handle back events within those activities too to get a system-controlled cross-activity animation. Learn more about per-activity opt-in, available for devices running Android 14+. The cross-activity animation is visible to users in Android 15+ and behind a developer option in Android 13 and 14.

Custom cross-activity animations are also available with overrideActivityTransition.

Milestone grid

Fragment milestone

Next, you’ll want to focus on your fragment animations and transitions. This requires updating to AndroidX Fragment 1.7.0 and Transition 1.5.0 or later and using Animator or AndroidX Transitions. Assuming these requirements are met, your existing fragment animations and transitions will animate in step with the back gesture. You can also use material motion with fragments. Most material motions support predictive back as of Material Components 1.12.0-alpha02 or higher, including MaterialFadeThrough, MaterialSharedAxis, and MaterialFade.
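For example, a fragment that opts into a material motion pattern only needs to set its transitions; once the Fragment and Transition requirements above are met, the back gesture drives the animation for you. A minimal sketch, assuming Material Components 1.12.0-alpha02 or later:

import android.os.Bundle
import androidx.fragment.app.Fragment
import com.google.android.material.transition.MaterialSharedAxis

class DetailFragment : Fragment() {

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)

        // These transitions play in step with the predictive back gesture.
        enterTransition = MaterialSharedAxis(MaterialSharedAxis.X, /* forward = */ true)
        returnTransition = MaterialSharedAxis(MaterialSharedAxis.X, /* forward = */ false)
    }
}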

Don’t strive to make your fragment transitions look like the system’s cross-activity transition. We recommend this full screen surface transition instead.

Learn more about Fragments and Predictive Back.

Milestone grid

Material Components milestone

Finally, you’ll want to take advantage of the Material Component View animations available for Predictive Back. Learn more about available components.

Milestone grid

After this, you’ve completed your quest to support Predictive Back animations in your view based app.

[Optional] AndroidX Transitions milestone

If you’re up for more, you might also ensure your AndroidX transitions are supported with Predictive Back. Read more about AndroidX Transitions and the Predictive Back Progress APIs.

Milestone grid


Everything you need to know about Google TV and Android TV OS


Posted by Shobana Radhakrishnan – Senior Director of Engineering, Google TV, and Paul Lammertsma – Developer Relations Engineer

Over the past year, we’ve seen significant growth of Android TV OS, reaching 220 million monthly active devices with a 47% year-over-year increase. This incredible engagement would not be possible without our dedicated developer community. A massive thank you for your contributions.

Android 14 on TV

We’re bringing Android 14 to TV! The next generation of Android provides improvements in performance, sustainability, accessibility, and multitasking to help you build engaging apps for TVs.

  • Performance and sustainability — Android 14 for TV improves on previous OS versions so users get a snappier, more responsive TV experience. We’ve also added new energy modes to put users in control, helping to reduce a TV’s standby power consumption (see Energy saving image). Ensure your app integrates with MediaSession correctly to prevent content from continuing when input modes change or the panel switches off.
  • Accessibility — New features include color correction, enhanced text options, and improved navigation for users, which can all be toggled on or off using remote shortcuts. Review the accessibility best practices to make sure your app supports these features.
  • Multitasking — Picture-in-picture mode is now supported on qualified Android 14 TV models. To evaluate whether a device supports the feature, query PackageManager for the picture-in-picture feature flag:
    packageManager.hasSystemFeature(PackageManager.FEATURE_PICTURE_IN_PICTURE)


For additional details, consult the updated Android TV app quality guidelines and the Android 14 for TV release notes.

Compose for TV

Compose for TV is now available in 1.0.0-beta01. We’ve updated the developer tools in Android Studio to include a new project wizard to give you a running start with Compose for TV.

Here are just a few ways Compose makes it easier to build apps for TV:

  • Dedicated components for TV apps. Explore these components in our design guide or in practice by using our new TV Material Catalog app. Since the previous alpha release, we’ve added lists, navigation, chips, and settings screens.
  • Improved input support and performance. We’ve worked hard to address focus issues and ensure that the UI appears and animates smoothly.
  • Ease of implementation and extensive styling. Add components to your app and customize them with minimal code.
  • Broad form-factor support. Reuse business logic from your phone, tablet, or foldable app to render a TV UI with changes that can be as small as simply adding a ViewModel.

Beta01 makes two big changes from alpha10:

  • Several components have graduated from experimental.
  • The ImmersiveList composable has been removed from the androidx-tv-material package.

Carousel and chip components, such as FilterChip, are still experimental, so you’ll want to keep the @ExperimentalTvMaterial3Api annotation if you are using these components in your app. For all other components, you can now remove the @ExperimentalTvMaterial3Api annotation, since these APIs are now available in beta.

We heard your feedback that the wide variety of data types used to represent content made it difficult to design the component in a way that actually reduced code. If you are using the ImmersiveList composable from the alpha release, replace it with a custom implementation of an immersive list. While ImmersiveList is no longer part of Compose for TV, you can create an immersive list with just a few lines of code:

@Composable
fun SampleImmersiveList() {
    val selectedMovie = remember { mutableStateOf<Movie?>(null) }

    // Container
    Box(
        modifier = Modifier
            .fillMaxWidth()
            .height(400.dp)
    ) {
        // Background, tinted for the currently focused movie
        // (backgroundColor is a hypothetical property of the Movie class)
        Box(
            modifier = Modifier
                .fillMaxWidth()
                .aspectRatio(20f / 7)
                .background(selectedMovie.value?.backgroundColor ?: Color.Black)
        ) {}

        // Rows
        LazyRow(
            modifier = Modifier.align(Alignment.BottomEnd),
            ...
        ) {
            items(movies) { movie ->
                MyMovieCard(
                    modifier = Modifier
                        .onFocusChanged {
                            if (it.hasFocus) {
                                selectedMovie.value = movie
                            }
                        },
                    ...
                ) {}
            }
        }
    }
}

A complete snippet is available in the immersive list sample.

Also consult the comprehensive list of changes in the release notes to migrate any renamed or moved components.

Migrate from the Leanback UI toolkit

We recommend following our step-by-step migration guide to switch from Leanback to Compose for Android TV.

Resources

Whether you’re new to Compose or are in the process of migrating to Compose already, our large collection of resources is here to help you learn best practices for building TV UIs with the modern Android development toolkit, Jetpack Compose.

Engage with the active Android developer community on Stack Overflow for any bugs you encounter, or submit bugs through our public bug tracker.

Thank you for your continued support of Android TV OS. We can’t wait to see what you’ll do on Google TV with the Android 14 TV OS!

15 Things to know for Android developers at Google I/O

Posted by Matthew McCullough, Vice President, Product Management, Android Developer  

AI is unlocking experiences that were not even possible a few years ago, and we’ve been hard at work reimagining Android with AI at the core, to help enable you to build a whole new class of apps. At this year’s Google I/O, we’re covering how new tools like Gemini can power the next generation of apps on Android. Plus, we showcased a range of updates to our tools and services grounded in productivity, helping you make it faster and easier to build excellent experiences across form factors. Let’s dive in!

Powering the next generation of Apps with AI

#1: AI in your tools, with Gemini in Android Studio

Gemini in Android Studio (formerly Studio Bot) is your coding companion for Android development, and thanks to your feedback since its preview at last year’s Google I/O, we’ve evolved our models, expanded to over 200 countries and territories, and brought it into the Gemini family of products. Earlier today, we previewed a number of new features coming soon, like code suggestions, App Quality Insights that leverage Gemini, and a preview of the multimodal input support that is coming with Gemini 1.5 Pro. You can read more about the updates here, and make sure to check out What’s new in Android development tools.

#2: Building with Generative AI

Android provides the solution you need to build Generative AI apps. You can use our most capable models over the Cloud with the Gemini API in Google AI or Vertex AI for Firebase directly in your Android apps. For on-device, Gemini Nano is our most efficient model. We’re working closely with a few early adopters such as Patreon, Grammarly, and Adobe to ensure we’re creating the best APIs that unlock the most innovative experiences. For example, Adobe is experimenting with Gemini Nano to enhance the on-device experience of Acrobat AI Assistant, a tool that allows their users to summarize and interact with documents. Be sure to check out the Build your own generative AI powered Android app, Android on-device gen AI under the hood, and the What’s New in Android sessions to learn more!

Moving image of Gemini Nano operating in Adobe

Excellent apps, across devices

#3: Think adaptive: apps on phones, foldables, tablets and more

Build and design apps that adapt beyond the phone, with the new Compose adaptive layout libraries built with Material guidance in beta. Add rich stylus and keyboard support to increase user productivity. Check out three of our key Android adaptive sessions at Google I/O: Designing adaptive apps, Building adaptive Android apps, and Increase user productivity with large screens and accessories.


#4: Enhance homescreens with Widgets and Jetpack Glance

Jetpack Glance 1.1 is now available as a release candidate and lets you build high quality widgets using your Compose skills. Check out our new canonical layouts, design guidance, and Figma updates to the Android UI kit. To learn more, check out our Improve the user experience of your Android app workshop and the Build Android widgets with Jetpack Glance technical session.

#5-9: come back here tomorrow and Thursday!

We’ll continue to share more updates for Android Developers throughout Google I/O, so check back here tomorrow!

Developer Productivity

#10: Use Kotlin Multiplatform for sharing business logic

Kotlin Multiplatform (KMP) enables sharing Kotlin code across different platforms and several of our Jetpack libraries, like DataStore and Room, have already been migrated to take advantage of KMP. We use Kotlin Multiplatform within Google and recommend using KMP for sharing business logic between platforms. Learn more about it here.
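As a rough sketch of what sharing business logic looks like, here is the standard expect/actual pattern split across KMP source sets; the GreetingRepository name is hypothetical.

// commonMain — shared business logic, compiled for every target.
expect fun platformName(): String

class GreetingRepository {
    fun greeting(): String = "Hello from ${platformName()}"
}

// androidMain — the Android-specific implementation of the expect declaration.
actual fun platformName(): String = "Android ${android.os.Build.VERSION.SDK_INT}"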

#11: Compose: Shared Elements, performance improvements and more

The upcoming Compose June ‘24 release is packed with the features you’ve been asking for! Shared element transitions, lazy list item reordering animations, strong skipping mode, performance improvements, a new lazy flow layout and more. Read more about it in our blog.

#12: Android Studio: the latest preview, with Gemini and more

Android Studio Koala 🐨 Feature Drop (2024.1.2), available today in the canary channel, builds on top of IntelliJ 2024.1 and adds innovative new features unlocked by Gemini, such as insights for crashes in App Quality Insights, code transformations, and a Gemini API starter template to get you quickly started with Gemini. Additionally, new features such as USB speed detection, a shortcut UI to control device settings, a new way to sign in to Google services, an updated and speedier UI for profilers with a new task-centric approach, and a deep integration with the Google Play SDK Index are intended to make the development process extremely productive. Read more here.

And the latest from the world of Mobile

#13: Grow your business with the latest Google Play updates

Discover new ways to attract and engage users with enhanced custom store listings. Optimize revenue with expanded payment options. Reinforce trust through secure, high-quality experiences made easier with our latest SDK Console improvements. Learn about these updates and more, including our new vertical approach, in our blog.

#14: Simplify app compliance with Checks

Streamline your app's privacy compliance with Checks, Google's AI-powered compliance solution! Checks empowers developers to swiftly identify, address, and resolve privacy issues, enabling you to launch apps faster and with confidence. Harness the power of automation with Checks' intelligent reports, saving you valuable time and resources. Get started now at checks.google.com.

#15: And of course, Android 15

…but for that, you’ll have to stay tuned tomorrow, when we’ve got a bit more up our sleeve!

Google I/O 2024: What’s new in Android Development Tools

Posted by Mayank Jain – Product Manager, Android Studio

At Google I/O 2024, we announced an exciting new set of features and tools aimed at making Android development faster and easier. We also shared updates to Android Studio that will help you leverage AI and make it easier for you to build high quality apps for Android across the Android ecosystem.

You can check out the What’s new in Android Developer Tools session at Google I/O 2024 to see some of the new features in action or better yet, try them out yourself by downloading Android Studio Koala 🐨 Feature Drop in the preview release channel. Here’s a look at our announcements:

Leverage Gemini in Android Studio

Since launching AI features in Android Studio last year, we continue to evolve our underlying models, integrate your feedback, and expand availability to more countries and territories so that you can leverage AI in your workflow and become a more productive Android app developer. Using the built-in AI privacy controls, you can opt in to using the latest AI feature improvements that are tailored for your Android app project.

Code suggestions with Gemini in Android Studio

You can now provide custom prompts for Gemini in Android Studio to generate code suggestions. After you enable Gemini from the View > Tool Windows > Gemini tool window, right-click in the code editor and select Gemini > Transform selected code from the context menu to see the prompt field. You can then prompt Gemini to generate a code suggestion that either adds new code or transforms selected code. You can ask Gemini to simplify complex code by rewriting it, perform very specific code transformations such as “make this code idiomatic,” or generate new functions you describe. Android Studio then shows you Gemini’s code suggestion as a code diff, so that you can review and accept only the suggestions you want.

Code suggestions with Gemini in Android Studio

Gemini for recommendations on crash reports

App Quality Insights in Android Studio seamlessly incorporates both Firebase Crashlytics and Android Vitals data into Android Studio so you can access the most important app stability related information, without having to switch tools.

You can now use Gemini in Android Studio to analyze your crash reports, generate insights which are shown in the Gemini tool window, provide a crash summary, and when possible recommend next steps, including sample code and links to relevant documentation.

You can generate all of this information directly from the App Quality Insights tool window in Android Studio after you enable Gemini from View > Tool Windows > Gemini.

Gemini for recommendations on crash reports

Integrate Gemini API into your app with a starter template

Start prototyping with Gemini models in your apps with our new starter app template provided in Android Studio. In this app template, you can issue prompts directly to the Gemini API, add image sources as input, and display the responses on the screen. Additionally, use Google AI Studio to craft custom prompts for your app.
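For reference, here is a minimal sketch of a Gemini API call from Kotlin, assuming the com.google.ai.client.generativeai client SDK that the starter template builds on; the model name and the BuildConfig.apiKey field are placeholders you configure yourself.

import com.google.ai.client.generativeai.GenerativeModel

// Returns a plain-text answer from the Gemini API, or null if the model
// produced no text. Call from a coroutine (for example, viewModelScope.launch).
suspend fun askGemini(prompt: String): String? {
    val model = GenerativeModel(
        modelName = "gemini-1.5-flash",   // placeholder model name
        apiKey = BuildConfig.apiKey       // placeholder; supply your own key
    )
    return model.generateContent(prompt).text
}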

When you are ready to scale your AI features to production with Google Cloud infrastructure, you can also access the powerful capabilities of Gemini models through Vertex AI. This is Google’s fully-managed development platform designed for building and deploying generative AI. Whether you simply need world class inference capabilities, or want to build end-to-end AI workflows with Vertex, the Gemini API is a great solution.

Integrate Gemini API into your app with a starter template

Gemini 1.5 Pro coming to Android Studio

We previously announced that Gemini in Android Studio uses the Gemini 1.0 Pro model to help you by answering Android development questions, generating code, finding resources, or explaining best practices. In this preview stage of Gemini in Android Studio, we are offering Gemini 1.0 Pro at no cost to all users for now. Gemini 1.0 Pro is a versatile model, making it ideal to scale. However, we acknowledge that its quality of responses may be limited in some cases. Based on your feedback, we are committed to improving the quality for Android development, and excited to add more features using Gemini to make your developer experience even more productive.

Along this journey, the Gemini 1.5 Pro model will be coming to Android Studio later this year. Equipped with a Large Context Window, this model notably leads to higher quality responses, and unlocks use cases like multimodal input that you might have seen in the Google I/O 2024 sessions. Stay tuned for more updates on how you can access more capable models in Android Studio.

Productivity enhancements

Release Monitoring with Firebase

Today we announced the general availability of the Firebase Release Monitoring Dashboard. The Firebase Release Monitoring Dashboard is a single dashboard powered by Firebase Crashlytics to monitor your most recent production releases of your Android app. It updates in real time to give you a high-level view of the most important release metrics, like crash-free sessions, comparisons, and benchmarking based on your previous releases.

Android Device Streaming

Android Device Streaming, powered by Firebase, lets you securely connect to remote physical Android devices hosted in Google's data centers. It is a convenient way to test your app against physical units of some of the latest Android devices, including the Google Pixel 8 and 8 Pro, Pixel Fold, and more.

Starting today, Android Device Streaming now includes the following devices, in addition to the portfolio of 20+ device models already available:

    • Samsung Galaxy Fold5
    • Samsung Galaxy S23 Ultra
    • Google Pixel 8a

Additionally, if you’re new to Firebase, Android Studio automatically creates and sets up a no-cost Firebase project for you when you sign in to Koala Feature Drop to use Device Streaming. So, you can get to streaming the device you need much faster. Learn more about Android Device Streaming quotas, including promotional quota for the Firebase Blaze plan projects available for a limited time.

Connect to the latest physical Android devices in moments with Android Device Streaming, 
powered by Firebase

USB cable speed detection

Did you know that USB cable bandwidth varies from 480 Mbps (USB-2) to up to 40,000 Mbps (USB-4)? Android Studio Koala Feature Drop now makes it trivial to differentiate low performing USB cables from the high performing ones.

When you connect an Android device, Android Studio automatically detects the device and USB cable bandwidth and warns you if there’s a mismatch in USB bandwidth.

Note: USB cable speed detection requires an updated ADB found in Android SDK Platform Tools v34+, and is currently available for macOS and Linux.

USB cable speed detection.
Learn more about USB speeds here

A new way to sign in with Google in Android Studio

It’s now easier to sign in to multiple Google services with one authentication step. Whether you want to use Gemini in Android Studio, Firebase for Android Device Streaming, Google Play for Android Vitals reports, or all these useful services, the new sign in flow makes it easier to get up and running. If you’re new to Firebase and want to use Android Device Streaming, Android Studio automatically creates a project for you, so you can quickly start streaming a real physical Firebase device. With granular permissions scoping, you will always be in control of which services have access to your account. To get started, just click the profile avatar and sign in with your developer account.

A new way to sign in with Google in Android Studio

Device UI setting shortcut

Using the device UI setting shortcut, you can now effortlessly configure your devices to desired settings related to dark theme, font size, display size, app language, and more, all directly through the Running Devices window. You can now test and debug your UI seamlessly for any of the possible scenarios required by your use case.

Device UI settings shortcuts

Faster and improved Profiler with a task-centric approach

The internals of the Android Studio Profiler have been dramatically improved. Popular profiling tasks like capturing a system trace with profileable apps now start up to 60% faster.*

We’ve redesigned the profiler to make it easier to start the task you’re interested in, whether it’s profiling your app’s CPU, memory, or power usage. For example, initiating a system trace task to profile and improve your app’s startup time is integrated right in the UI as you open the profiler.

Faster and improved Profiler with a task-centric approach 
*Based on internal data, as tested in April 2024

Google Play SDK Index integration

Android Studio is integrated with the Google Play SDK Index to inform you when there are known policy or version issues with SDKs used by your app. This enables you to update those dependencies and avoid issues that could prevent you from publishing new versions of your app.

In the Android Studio Koala Feature Drop release, the integration has been expanded to also include warnings from the Google Play SDK Console. This gives you a complete view of any potential version or policy issues in your dependencies before submitting your app to the Google Play Console.

Notes from SDK authors are now also displayed directly in Android Studio to save you time.

A warning from the SDK Index with the corresponding SDK author note

Preview tiles for Wear OS apps

Android Studio now has preview support for Tiles. You can now iterate much more quickly when creating tiles, seeing what a tile looks like in different configurations without needing to run it on a device.

Tiles previews usage for Wear OS apps

Generate synthetic sensor data for testing on Wear OS apps

To help simulate real life scenarios you can now generate synthetic (fake) data for a Wear OS emulator for health related sensors such as heart rate, speed, steps, and more. You are now able to set up and perform testing for a multi-sport training session in minutes, end-to-end in Android Studio, without ever leaving your desk.

Generate synthetic sensor data for testing on Wear OS apps

Compose Glance widget previews

Android Studio Koala Feature Drop makes it easy to preview your Jetpack Compose Glance widgets (1.1.0-rc01) directly within the IDE. Catch potential UI issues and fine-tune your widget's appearance early in the development process. Learn more about how to get started.

Previews for Compose Glance widgets

Live Edit for Compose enabled by default

Live Edit for Compose can accelerate your Compose development experience by automatically deploying code changes to the running application on an emulator or physical device. Live Edit can help you see the effect of updates to UX elements—for example new composables, modifier updates, and animations—on the overall app experience. As you become more familiar with Live Edit you will find many creative ways it can help improve your development experience and productivity.

In Android Studio Koala Feature Drop, Live Edit is enabled by default in manual mode and has increased stability and more robust change detection, including support for import statements.

Compose Preview Screenshot Testing with Now in Android app

Compose preview screenshot testing plugin (alpha)

Host-side screenshot testing is an easy and powerful way to test UIs and prevent regressions. Today, the first alpha version of the Compose Preview Screenshot Testing plugin is available as a separate plugin, to be used together with AGP 8.5.0-beta01 or higher. Add your Compose Previews to the src/main/screenshotTest folder and run the task to generate a diff report after UI updates. The generated HTML test report lets you visually detect any changes to your app’s UI.
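As a rough sketch of the setup, assuming the com.android.compose.screenshot plugin id and the task names from the alpha documentation (these details may change while the plugin is in alpha):

// gradle.properties
//   android.experimental.enableScreenshotTest=true

// build.gradle.kts (module using Compose)
plugins {
    id("com.android.application")
    id("org.jetbrains.kotlin.android")
    id("com.android.compose.screenshot") version "0.0.1-alpha01" // assumed alpha version
}

// Put the @Preview composables you want covered in the screenshotTest folder
// mentioned above, then record references and validate after UI changes:
//   ./gradlew updateDebugScreenshotTest
//   ./gradlew validateDebugScreenshotTest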

This alpha version of the plugin is designed for rapid iteration and feedback. We plan to merge it back into AGP in the future, but for now, this separate plugin lets us experiment and improve the feature quickly. Learn more about how to get started.

IntelliJ Platform Update (2024.1)

Android Studio Koala Feature Drop includes the IntelliJ 2024.1 platform release, which comes with some very useful IDE improvements:

    • An overhauled terminal featuring both visual and functional enhancements to streamline command-line tasks. Learn more in this blog post.
    • A new feature called sticky lines in the editor simplifies working with large files and exploring new codebases. This feature keeps key structural elements, like the beginnings of classes or methods, pinned to the top of the editor as you scroll and provides an option to promptly navigate through the code by clicking on a pinned line.
    • Basic IDE functionalities like code highlighting and completion now work for Java and Kotlin during project indexing, which should enhance your startup experience.
    • You can now scale the IDE down to 90%, 80%, or 70%, giving you the flexibility to adjust the size of IDE elements both upward and downward.

Read the detailed IntelliJ release notes here.

To summarize

Android Studio Koala Feature Drop (2024.1.2) is now available in the Android Studio canary channel with

    • Gemini in Android Studio
        • Code suggestions with Gemini in Android Studio
        • Gemini for recommendations on crash reports
        • Gemini API starter app template to help integrate Gemini into your app (also available in Koala 2024.1.1)

    • Productivity enhancements
        • Release Monitoring with Firebase
        • Android Device Streaming
        • USB cable speed detection
        • A new way to sign in with Google in Android Studio
        • Device UI setting shortcut
        • Faster and improved Profiler with a task-centric approach
        • Google Play SDK Index integration
        • Preview tiles for Wear OS apps
        • Generate synthetic sensor data for testing on Wear OS apps
        • Compose Glance widget previews
        • Live Edit for Compose enabled by default
        • Compose preview screenshot testing plugin (alpha) - to be installed additionally

    • IntelliJ Platform Update (2024.1): also available in Koala 2024.1.1
        • An overhauled terminal
        • Sticky lines in the editor simplify working with large files
        • Code highlighting and completion now work during project indexing
        • Flexible IDE size adjustments

And last, a quick reminder that going forward, the initial Android Studio releases will carry a .1 version and introduce the updated IntelliJ platform version, while subsequent Feature Drops will move to a .2 version and focus on introducing Android-specific features that help you be more productive in Android app development.

How to get started

Ready to try the exciting new features in Android Studio?

You can download the canary version Android Studio Koala 🐨 Feature Drop (2024.1.2) today to incorporate these new features into your workflow or try the stable version Android Studio Jellyfish 🪼. You can also install them side by side by following these instructions.

As always, your feedback is important to us – check known issues, report bugs, suggest improvements, and be part of our vibrant community on LinkedIn, Medium, YouTube, or X. Let's build the future of Android apps together!

Jetpack Compose compiler moving to the Kotlin repository

Posted by Ben Trengrove - Developer Relations Engineer, Nick Butcher - Product Manager for Jetpack Compose

We are excited to announce that with the upcoming release of Kotlin 2.0, the Jetpack Compose compiler will move to the Kotlin repository. This means that a matching Compose compiler will release alongside each release of Kotlin. You will no longer have to wait for a matching Compose compiler release before upgrading the Kotlin version in your Compose app. The Compose team at Google will continue to be responsible for developing the compiler and will work closely with JetBrains, our co-founders of the Kotlin Foundation. The version of the Compose compiler now always matches the Kotlin version. The compiler version is therefore jumping to 2.0.0.

To simplify the set up of Compose, we are also releasing a new Compose Compiler Gradle plugin which lets you configure the Compose compiler with a type safe API. The Compose Compiler Gradle plugin’s versioning matches Kotlin’s, and it is available from Kotlin 2.0.0.

To migrate to the new plugin, add the Compose Compiler Gradle plugin dependency to the plugins section of your Gradle version catalog:

[versions]
kotlin = "2.0.0"

[plugins]
org-jetbrains-kotlin-android = { id = "org.jetbrains.kotlin.android", version.ref = "kotlin" }

# Add the Compose Compiler Gradle plugin, the version matches the Kotlin plugin
compose-compiler = { id = "org.jetbrains.kotlin.plugin.compose", version.ref = "kotlin" }

In your project’s root level Gradle file, add the plugin:

plugins {
   // Existing plugins 
   alias(libs.plugins.compose.compiler) apply false
}

Then in modules that use Compose, apply the plugin:

plugins {
   // Existing plugins
   alias(libs.plugins.compose.compiler)
}

The kotlinCompilerExtensionVersion is no longer required to be configured in composeOptions and can be removed.

composeOptions {
   kotlinCompilerExtensionVersion = libs.versions.compose.compiler.get()
}

If required, you can now add a top level section to the same Gradle file to configure options for the Compose compiler.

android { ... }

composeCompiler {
   enableStrongSkippingMode = true
}

You might currently be referencing the Compose compiler directly in your build setup, rather than using AGP to apply the Compose compiler plugin. If that is the case, note that the Maven artifacts will also change:

Old: androidx.compose.compiler:compiler
New: org.jetbrains.kotlin:kotlin-compose-compiler-plugin-embeddable

Old: androidx.compose.compiler:compiler-hosted
New: org.jetbrains.kotlin:kotlin-compose-compiler-plugin


For an example of this migration, see this pull request.

For more information on migrating to the new Compose compiler artifact, including instructions for non-version catalog setups, see our updated documentation.

Android Studio Iguana is stable

Posted by Neville Sicard-Gregory – Senior Product Manager, Android Studio

Today we are launching Android Studio Iguana 🦎 in the stable release channel to make it easier for you to create high quality apps. With features ranging from Version Control System support in App Quality Insights to new built-in support for creating Baseline Profiles for Jetpack Compose apps, this version should enhance your development workflow as you optimize your app. Download the latest version today!

Check out the list of new features in Android Studio Iguana below, organized by key developer flows.

Debugging

Version control system integration in App Quality Insights

When your release build is several commits behind your local source code, line numbers in Firebase Crashlytics crash reports can easily go stale, making it more difficult to accurately navigate from crash to code when using App Quality Insights. If you’re using git for your version control, there’s now a solution to this problem.

When you build your app using Android Gradle Plugin 8.3 or later and the latest version of the Crashlytics SDK, AGP includes git commit information as part of the build artifact that is published to the Play Store. When a crash occurs, Crashlytics attaches the git information to the report, and Android Studio Iguana uses this information to compare your local checkout with the exact code that caused the crash from your git history.

After you build your app using Android Gradle Plugin 8.3 or higher with the latest Crashlytics SDK, and publish it, new crash reports in the App Quality Insights window let you either navigate to the line of code in your current git checkout or view a diff report between the current checkout and the version of your app codebase that generated the crash report. Learn more.

app quality insights with version control system integration in Android Studio
App Quality Insights with Version Control System Integration

View Crashlytics crash variants in App Quality Insights

app quality insights in Android Studio
Crash variants in App Quality Insights

Today, when you select a Crashlytics issue in App Quality Insights, you see aggregated data from events that share identical points of failure in your code, but may have different root causes. To aid in your analysis of the root causes of a crash, Crashlytics now groups events that share very similar stack traces as issue variants. You can now view events in each variant of a crash report in App Quality Insights by selecting a variant from the dropdown. Alternatively, you can view aggregate information for all variants by selecting All.

Design

Jetpack Compose UI Check

To help developers build adaptive and accessible UI in Jetpack Compose, Iguana introduces a new UI Check mode in Compose Preview. This feature works similarly to the visual linting and accessibility check integrations for Views. Activate Compose UI Check mode to automatically audit your Compose UI and check for adaptive and accessibility issues across different screen sizes, such as text that's stretched on large screens or low color contrast. The mode highlights issues found in different preview configurations and lists them in the problems panel.

Try it out by clicking the UI Check icon in Compose Preview.

UI Check entry point in Compose Preview
UI Check entry point in Compose Preview

UI Check results of Reply App in Compose Preview
UI Check results of Reply App in Compose Preview

Progressive rendering for Compose Preview

Compose Previews in Android Studio Iguana now implement progressive rendering, allowing you to iterate on your designs with less loading time. This feature automatically lowers the detail of out-of-view previews to boost performance, meaning you can scroll through even the most complex layouts without lag.

moving image showing progressive rendering in Compose
Progressive Rendering in Compose

Develop

IntelliJ Platform Update

Android Studio Iguana includes the IntelliJ 2023.2 platform release, which has many new features such as support for GitLab, text search in Search Everywhere, color customization updates to the new UI and a host of new improvements. Learn more.

Testing

Baseline Profiles module wizard

Many times when you run an Android app for the first time on a device, the app can appear to have a slow start time because the operating system has to run just-in-time compilation. To improve this situation, you can create Baseline Profiles that help Android improve aspects like app start-up time, scrolling, and navigation speed in your apps. We are simplifying the process of setting up a Baseline Profile by offering a new Baseline Profile Generator template in the new module wizard (File > New > New Module). This template configures your project to support Baseline Profiles and employs the latest Baseline Profiles Gradle plugin, which simplifies setup by automating required tasks with a single Gradle command.
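For reference, here is a minimal sketch of what the generated module's build file applies, assuming the androidx.baselineprofile Gradle plugin that the wizard uses; the wizard writes the full configuration (target module, managed device, and so on) for you, and the versions below are illustrative.

// baselineprofile/build.gradle.kts (generated by the wizard)
plugins {
    id("com.android.test")
    id("org.jetbrains.kotlin.android")
    id("androidx.baselineprofile")
}

dependencies {
    implementation("androidx.test.ext:junit:1.1.5")
    implementation("androidx.benchmark:benchmark-macro-junit4:1.2.3")
}

// Regenerate the profile at any time with a single Gradle command:
//   ./gradlew :app:generateBaselineProfile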

Baseline Profile module wizard - Create New Module
Baseline Profile Generator

Furthermore, the template creates a run configuration that enables you to generate a Baseline Profile with a single click from the "Select Run/Debug Configuration" dropdown list.

Generate Baseline Profile drop-down menu
Generate Baseline Profile drop-down menu

Test against configuration changes with the Espresso Device API

Synchronous testing of window size changes using Espresso Device API
Synchronous testing of window size changes using Espresso Device API

Catch layout problems early and ensure your app delivers a seamless user experience across devices and orientations. The Espresso Device API simulates how your app reacts to configuration changes—such as screen rotation, device folding/unfolding, or window size changes—in a synchronous way on virtual devices. These APIs help you rigorously test and preemptively fix issues that frustrate users so you build more reliable Android apps with confidence. These APIs are built on top of new gRPC endpoints introduced in Android Emulator 34.2, which enables secure bidirectional data streaming and precise sensor simulation.
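A minimal sketch of such a test, assuming the androidx.test.espresso:espresso-device artifact; MyActivity and the asserted view are placeholders.

import androidx.test.espresso.Espresso.onView
import androidx.test.espresso.assertion.ViewAssertions.matches
import androidx.test.espresso.device.EspressoDevice.Companion.onDevice
import androidx.test.espresso.device.action.ScreenOrientation
import androidx.test.espresso.matcher.ViewMatchers.isDisplayed
import androidx.test.espresso.matcher.ViewMatchers.withId
import androidx.test.ext.junit.rules.ActivityScenarioRule
import androidx.test.ext.junit.runners.AndroidJUnit4
import org.junit.Rule
import org.junit.Test
import org.junit.runner.RunWith

@RunWith(AndroidJUnit4::class)
class RotationTest {

    @get:Rule
    val activityRule = ActivityScenarioRule(MyActivity::class.java)

    @Test
    fun header_staysVisible_acrossRotation() {
        // The rotation is applied synchronously; the next assertion only runs
        // once the configuration change has completed.
        onDevice().setScreenOrientation(ScreenOrientation.LANDSCAPE)
        onView(withId(R.id.header)).check(matches(isDisplayed()))

        onDevice().setScreenOrientation(ScreenOrientation.PORTRAIT)
        onView(withId(R.id.header)).check(matches(isDisplayed()))
    }
}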

Pixel 8 and Pixel 8 Pro devices in Android Emulator (34.2)

Test your app on the latest Google Pixel device configurations with the updated Android Virtual Device definitions in Android Studio. With Android Studio Iguana and the latest Android Emulator (34.2+), access the Pixel Fold, Pixel Tablet, Pixel 7a, Pixel 8, and Pixel 8 Pro. Validating your app on these virtual devices is a convenient way to ensure that your app reacts correctly to a variety of screen sizes and device types.

New Pixel Android Virtual Devices in the Android Emulator
New Pixel Android Virtual Devices in the Android Emulator.

Build

Support for Gradle Version Catalogs

Android Studio Iguana streamlines dependency management with its enhanced support for TOML-based Gradle Version Catalogs. You'll benefit from:

    • Centralized dependency management: Keep all your project's dependencies organized in a single file for easier editing and updating.
    • Time-saving features: Enjoy seamless code completion, smart navigation within your code, and the ability to quickly edit project dependencies through the convenient Project Structure dialog.
    • Increased efficiency: Say goodbye to scattered dependencies and manual version updates. Version catalogs give you a more manageable, efficient development workflow.

New projects will automatically use version catalogs for dependency management. If you have an existing project, consider making the switch to benefit from these workflow improvements. To learn how to update to Gradle version catalogs, see Migrate your build to version catalogs.

Additional SDK insights: policy issues

Android Studio Iguana now proactively alerts you to potential Google Play policy violations through integration with the Google Play SDK Index. Easily see Play policy issues right in your build files and Project Structure Dialog. This streamlines compliance, helping you avoid unexpected publishing delays or rejections on the Google Play Store.

Android Studio's project structure dialog showing a warning from the Google Play SDK Index
A warning from the Google Play SDK Index in Android Studio’s Project Structure dialog

Android Studio compileSdk version support

Using Android Studio to develop a project that has an unsupported compileSdk version can lead to unexpected errors because older versions of Android Studio may not handle the new Android SDK correctly. To avoid these issues, Android Studio Iguana now explicitly warns you if your project’s intended compileSdk is for a newer version that it does not officially support. If available, it also suggests moving to a version of Android Studio that supports the compileSdk used by your project. Keep in mind that upgrading Android Studio might also require that you upgrade AGP.

Summary

To recap, Android Studio Iguana 🦎 includes the following enhancements and features:

Debugging

    • Version control system integration in App Quality Insights
    • View Crashlytics crash variants in App Quality Insights

Design

    • Jetpack Compose UI Check
    • Progressive rendering for Compose Preview

Develop

    • IntelliJ platform update

Testing

    • Baseline Profiles module wizard
    • Test against configuration changes with the Espresso Device API
    • Pixel 8 and Pixel 8 Pro devices in Android Emulator (34.2)

Build

    • Support for Gradle Version Catalogs
    • Additional SDK insights: policy issues
    • Android Studio compileSdk version support

Download Android Studio Today

Download Android Studio Iguana 🦎 today and take advantage of the latest features to streamline your workflow and help you make better apps. Your feedback is essential – check known issues, report bugs, suggest improvements, and be part of our vibrant community on LinkedIn, Medium, YouTube, or X (formerly known as Twitter). Let's build the future of Android apps together!

Thank you for creating excellent apps, across devices in 2023!

Posted by Anirudh Dewani, Director of Android Developer Relations

Hello Android Developers,

As we approach the end of 2023, I wanted to take a moment to reflect on all that we've accomplished together as a community, and send a huge *thank you* for all of your work!

It's been an incredible year for Android, with many new features and improvements released as part of the platform as well as many new delightful app experiences crafted and delivered by you, all for the benefit of our users across the world. Here are just a few of the highlights:

    • The release of the feature-packed and highly performant Android 14, our most ambitious release to date.
    • The incredible momentum on large screens and Wear OS, fueled by the hardware innovations of device makers and by the great app experiences you build for users.
    • The growth of Compose, from a mobile developer toolkit to Compose Everywhere, helping you build excellent apps for mobile, tablets, Wear OS, and TV.
    • And the growth of the entire Android developer community around the world, and the millions of amazing apps you build for users!

I'm so proud of everything we've achieved together this year!

Your hard work and dedication continue to make Android the best mobile platform in the world, and I want to thank you for being a part of this community. Your contributions are invaluable, and I'm grateful for your continued support.

Thanks again for all that you do, and we can’t wait to see what you build next year!

Best,
Anirudh Dewani
Director, Android Developer Relations


What’s new in the Jetpack Compose August ’23 release

Posted by Ben Trengrove, Android Developer Relations Engineer

Today, as part of the Compose August ‘23 Bill of Materials, we’re releasing version 1.5 of Jetpack Compose, Android's modern, native UI toolkit that is used by apps such as Play Store, Dropbox, and Airbnb. This release largely focuses on performance improvements, as major parts of our modifier refactor we began in the October ‘22 release are now merged.

Performance

When we first released Compose 1.0 in 2021, we were focused on getting the API surface right to provide a solid foundation to build on. We wanted a powerful and expressive API that was easy to use and stable so that developers could confidently use it in production. As we continue to improve the API, performance is our top priority, and in the August ‘23 release, we have landed many performance improvements.

Modifier performance

Modifiers see large performance improvements, up to 80% improvement to composition time, in this release. The best part is that, thanks to our work getting the API surface right in the first release, most apps will see these benefits just by upgrading to the August ‘23 release.

We have a suite of benchmarks that are used to monitor for regressions and to inform our investments in improving performance. After the initial 1.0 release of Compose, we began focusing on where we could make improvements. The benchmarks showed that we were spending more time than anticipated materializing modifiers. Modifiers make up the vast majority of a composition tree and, as such, were the largest contributor to initial composition time in Compose. Refactoring modifiers to a more efficient design began under the hood in the October ‘22 release.

The October ‘22 release included new APIs and performance improvements in our lowest level module, Compose UI. Modifiers build on top of each other so we started migrating our low level modifiers in Compose Foundation in the next release, March ‘23. This included graphicsLayer, low level focus modifiers, padding, and offset. These low level modifiers are used by other highly utilized modifiers such as Clickable, and are also utilized by many framework Composables such as Text. Migrating modifiers in the March ‘23 release brought performance improvements to those components, but the real gains would come when we could migrate the higher level modifiers and composables themselves to the new modifier system.

In the August ‘23 release, we have begun migrating the Clickable modifier to the new modifier system, bringing substantial improvements to composition time, in some cases up to 80%. This is especially relevant in lazy lists that contain clickable elements such as buttons. Modifier.indication, used by Clickable, is still in the process of being migrated, so we anticipate further gains to come in future releases.

As part of this work, we identified a use case for composed modifiers that wasn’t covered in the original refactor and added a new API to create Modifier.Node elements that consume CompositionLocal instances.

We are now working on documentation to guide you through migrating your own modifiers to the new Modifier.Node API. To get started right away, you can reference the samples in our repository.
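Until that documentation lands, here is a minimal sketch of the shape of a Modifier.Node-based modifier that reads a CompositionLocal, assuming the CompositionLocalConsumerModifierNode and currentValueOf APIs referred to above; the modifier itself (a tinted overlay) is hypothetical.

import androidx.compose.material3.LocalContentColor
import androidx.compose.ui.Modifier
import androidx.compose.ui.graphics.drawscope.ContentDrawScope
import androidx.compose.ui.node.CompositionLocalConsumerModifierNode
import androidx.compose.ui.node.DrawModifierNode
import androidx.compose.ui.node.ModifierNodeElement
import androidx.compose.ui.node.currentValueOf

// The node does the work: it reads LocalContentColor without triggering
// recomposition and draws a translucent overlay in that color.
private class ContentTintNode : Modifier.Node(), CompositionLocalConsumerModifierNode, DrawModifierNode {
    override fun ContentDrawScope.draw() {
        val tint = currentValueOf(LocalContentColor)
        drawContent()
        drawRect(color = tint.copy(alpha = 0.1f))
    }
}

// The element is the cheap, comparable description that creates and updates the node.
private class ContentTintElement : ModifierNodeElement<ContentTintNode>() {
    override fun create() = ContentTintNode()
    override fun update(node: ContentTintNode) {}
    override fun equals(other: Any?) = other is ContentTintElement
    override fun hashCode() = "ContentTintElement".hashCode()
}

fun Modifier.contentTint(): Modifier = this then ContentTintElement()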

Learn more about the rationale behind the changes in the Compose Modifiers deep dive talk from Android Dev Summit ‘22.

Memory

This release includes a number of improvements in memory usage. We have taken a hard look at allocations happening across different Compose APIs and have reduced the total allocations in a number of areas, especially in the graphics stack and vector resource loading. This not only reduces the memory footprint of Compose, but also directly improves performance, as we spend less time allocating memory and reduce garbage collection.

In addition, we fixed a memory leak when using ComposeView, which will benefit all apps but especially those that use multi-activity architecture or large amounts of View/Compose interop.

Text

BasicText has moved to a new rendering system backed by the modifier work, which has brought an average gain of 22% in initial composition time and up to a 70% gain in one benchmark of complex layouts involving text.

A number of Text APIs have also been stabilized.

Improvements and fixes for core features

We have also shipped new features and improvements in our core APIs as well as stabilizing some APIs:

  • LazyStaggeredGrid is now stable.
  • Added asComposePaint API to replace toComposePaint as the returned object wraps the original android.graphics.Paint.
  • Added IntermediateMeasurePolicy to support lookahead in SubcomposeLayout.
  • Added onInterceptKeyBeforeSoftKeyboard modifier to intercept key events before the soft keyboard.

Get started!

We’re grateful for all of the bug reports and feature requests submitted to our issue tracker — they help us to improve Compose and build the APIs you need. Continue providing your feedback, and help us make Compose better!

Wondering what’s next? Check out our roadmap to see the features we’re currently thinking about and working on. We can’t wait to see what you build next!

Happy composing!

Compose for Wear OS and Tiles 1.2 libraries are now stable: check out new features!

Posted by Anna Bernbaum, Product Manager and Kseniia Shumelchyk, Android Developer Relations Engineer

We’re excited to announce that version 1.2 of the Compose for Wear OS and Wear Tiles libraries has reached the stable milestone. This makes it easier than ever to use these modern APIs to build beautiful and engaging apps for Wear OS.

We continue to evolve Android Jetpack libraries for Wear OS with new features and improvements to streamline development, including support for the latest Wear OS 4 release.

Many developers are already leveraging the powerful tools and intuitive APIs to create exceptional experiences for Wear OS. Partners like Peloton and Deezer were able to quickly build a watch experience and are seeing the impact on their feature-adoption and user engagement.

"The Wear OS app was our first usage of Compose in production, we really enjoyed how much more productive it made us.” 

– Stefan Haacker, a senior Android engineer at Peloton.

Compose for Wear OS and Wear Tiles complement one another. Use Wear Tiles to define the experience in your app’s tiles, and use Compose for Wear OS to build UIs across the more detailed screens in your app. Both sets of APIs offer material components and layouts that ensure your app experience on Wear OS is coherent and follows our best practices.

Now, let’s look into key features of version 1.2 of Jetpack libraries for Wear OS.

Compose for Wear OS 1.2 release

Compose for Wear OS version 1.2 contains new components and brings improvements to tooling, as well as the usability and accessibility of existing components:

Expandable Items

The new expandableItem, expandableItems and expandableButton components provide a simple way to fold and unfold content on demand. Use these components to hide detailed information on long pages or expanded sections by default. This design pattern allows users to focus on essential content and choose when to view the more detailed information.

This pattern enables apps to include high-density content while preserving the key principles of wearables – compactness and glanceability.


Moving images of expanding list and expanding text using the new component
Example of expanding list and expanding text using the new component

The component can be used for expanding lists within ScalingLazyColumn, so expandableButton collapses after the content in expandableItems is revealed in one smooth motion. Another use case is expanding the content of a single item, such as Text, that would otherwise contain too many lines to show all at once when the screen first loads.
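A minimal sketch of the list use case, assuming androidx.wear.compose:compose-foundation 1.2; the Chip contents are placeholders.

import androidx.compose.foundation.layout.fillMaxWidth
import androidx.compose.runtime.Composable
import androidx.compose.ui.Modifier
import androidx.wear.compose.foundation.expandableButton
import androidx.wear.compose.foundation.expandableItems
import androidx.wear.compose.foundation.lazy.ScalingLazyColumn
import androidx.wear.compose.foundation.rememberExpandableState
import androidx.wear.compose.material.Chip
import androidx.wear.compose.material.Text

@Composable
fun ExpandableListScreen(moreItems: List<String>) {
    val expandableState = rememberExpandableState()

    ScalingLazyColumn {
        item { Text("Always visible") }

        // Hidden until the user expands the section.
        expandableItems(expandableState, moreItems.size) { index ->
            Chip(
                onClick = { /* ... */ },
                label = { Text(moreItems[index]) },
                modifier = Modifier.fillMaxWidth()
            )
        }

        // Shows "Show more" while collapsed; goes away once expanded.
        expandableButton(expandableState) {
            Chip(
                onClick = { expandableState.expanded = true },
                label = { Text("Show more") }
            )
        }
    }
}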

Swipe to Reveal

A new experimental API has been added to support the SwipeToReveal pattern, as a way to add up to 2 secondary actions when the composable is swiped to the left. It also provides support for users to undo the secondary actions that they take. This component is intended for use cases where the existing ‘long press’ pattern is not ideal.


Moving images showing SwipeToReveal implementation with two actions (left) and single action with undo (right)
SwipeToReveal implementation with two actions (left) and single action with undo (right)

Note that this feature is distinct from swipe-to-dismiss, which is used to navigate back to the previous screen.

Compose Previews for Wear OS

In version 1.2 we’ve added device configurations to the set of Compose Preview annotations that you use when evaluating how a design looks and behaves on a variety of devices.

We added a number of custom Wear Preview annotations for different watch shapes and sizes: WearPreviewSmallRound, WearPreviewLargeRound, WearPreviewSquare. We’ve also added the WearPreviewDevices, WearPreviewFontScales annotations to check your app against multiple device configurations and types at once. Use these new annotations to instantly verify how your app’s layout behaves on a variety of Wear OS devices.

Image showing WearPreviewDevices and WearPreviewFontScales annotations used for Horologist VolumeScreen preview
WearPreviewDevices and WearPreviewFontScales annotations used for Horologist VolumeScreen preview

Wear Compose tooling is available within a separate dependency androidx.wear.compose.ui.tooling.preview that you’ll need to include in addition to general Compose dependencies.
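Once that dependency is added, using the annotations is a one-liner per preview; a minimal sketch, where GreetingScreen is a placeholder composable from your own app:

import androidx.compose.runtime.Composable
import androidx.wear.compose.ui.tooling.preview.WearPreviewDevices
import androidx.wear.compose.ui.tooling.preview.WearPreviewFontScales

@WearPreviewDevices
@WearPreviewFontScales
@Composable
fun GreetingScreenPreview() {
    GreetingScreen(name = "Ada") // placeholder composable
}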

UX and accessibility improvements

The 1.2 release also introduced numerous improvements for user experience and accessibility:

  • Reduce-motion setting is now supported. When the setting is switched on, it disables scaling and fading animations in ScalingLazyColumn, and turns off the shimmering effect and wipe-off motion on placeholders.
  • HierarchicalFocusCoordinator - a new experimental composable that enables marking sub-trees of the composition as focus enabled or focus disabled. Use this to control which element receives rotary scroll events, such as multiple ScalingLazyColumns in a HorizontalPager.
  • PickerGroup - a new composable designed to combine multiple pickers together. It handles focus between the pickers using the HierarchicalFocusCoordinator API and enables auto-centering of Picker items. It’s already integrated in prebuilt Date and Time pickers from Horologist: check out some examples.
  • Picker has a new userScrollEnabled parameter, which determines whether the picker is scrollable and disables scrolling when not focused.
  • The shimmer and wipe-off animations for placeholders now apply the wipe-off effect immediately when the content is ready.
  • Stepper has an additional parameter, enableRangeSemantics, that allows customization of semantics, such as disabling default range semantics when required.

Other changes

ScalingLazyColumn and associated classes have migrated from the material package to the foundation.lazy package, as a preparation for a new Material3 library. You can use this migration script to update your code seamlessly.

The Horologist library enhances the implementation of snap behavior to a ScalingLazyColumn, TimePicker and DatePicker when the user interacts with a rotary crown. The rotaryWithFling modifier was deprecated in favor of rotaryWithScroll which includes fling behavior by default. Check out rotaryWithScroll and rotaryWithSnap reference documentation for details.


Moving image of Snap and fling behavior for scrolling list
Snap and fling behavior for scrolling list

Tiles 1.2 release

Tiles are designed to give users fast, predictable access to the information and actions they rely on most. Version 1.2 of the Jetpack Tiles Library introduces support for platform data bindings and animations so you can provide even more responsive experiences to your users.

Moving image of Tiles carousel on Wear Os
Tiles carousel on Wear OS

Platform data bindings

Version 1.2 introduces support for dynamic expressions that link elements of your tile to platform data sources. If your tile uses platform data sources such as heart rate, step count, or time, your tile can be updated up to once per second.

Moving image of a tile using data binding
Examples of a tile using data binding

Animations

The new version of tiles also adds support for animations. You can use tween animations to create smooth transitions when part of your layout changes, and use transition animations to animate new or disappearing elements from the tile.

Moving images of animated tiles
Examples of animated tiles

Partial tile updates

We have also now enabled partial tile updates, meaning that we will only update the part of your tile that has been updated, not the entire layout. This allows you to update part of your tile, while an animation is playing in another part, without disrupting that animation.

Learn more

Get started with hands-on experience trying our codelab to create your first Tile and Compose for Wear OS codelab.

We’ve already updated our samples and Horologist libraries to work with the latest version of Jetpack libraries for Wear OS. Also make sure to check out the documentation for Tiles and Compose for Wear OS to learn more about best practices when building apps for wearables.

Provide feedback

We continue to evolve our APIs with the features you’ve been asking for. Please do continue providing us feedback on the issue tracker, and join the Kotlin Slack #compose-wear channel to connect with the Google team and developer community.

Start building for Wear OS now

Discover even more by taking a look at our developer site and reading the latest Wear OS announcements from Google I/O!