Tag Archives: Android

Google @ KotlinConf 2024: A Look Inside Multiplatform Development with KMP and more

Posted by Murat Yener – Developer Relations Engineer

Following our recent Google I/O announcement recommending Kotlin Multiplatform (KMP) for sharing business logic across mobile, web, server, and desktop platforms, and our move to use KMP in Google Workspace, KotlinConf 2024 was the next moment to share the highlights and connect with the Kotlin community.

Kotlin Multiplatform, developed by JetBrains, allows developers to build cross-platform apps by compiling Kotlin code into platform-native binaries while leveraging the full capabilities of a modern, memory-managed language. This approach has been a long-term investment for the Google Workspace team, enabling them to share the business logic between different platforms.

The Android team has been working to support KMP and recently released an alpha version of Room with KMP support. As of today, Annotations, Collections, and DataStore are stable with KMP support. We've also commonified the Lifecycle, ViewModel, and Paging libraries to allow integrations with non-Android platforms.
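
For readers new to KMP, the core idea is that shared business logic lives in a common source set, and the expect/actual mechanism fills in platform specifics. Here is a minimal sketch (illustrative only, not taken from any Google codebase):

// commonMain: shared business logic, compiled for every target
class GreetingService {
    fun greet(): String = "Hello from shared Kotlin on ${platformName()}!"
}

// Each platform supplies its own implementation of this declaration
expect fun platformName(): String

// androidMain
actual fun platformName(): String = "Android"

// iosMain
actual fun platformName(): String = "iOS"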

Keynotes and Technical Sessions

The conference kicked off with a keynote in which Google's Jeffrey van Gogh gave an overview of Google's contributions to the Kotlin ecosystem, delving into how Google leverages Kotlin Multiplatform (KMP) to streamline development across its own product portfolio. He highlighted the code-sharing and efficiency benefits that KMP brings to Google's projects, aligning with our recent recommendations for Android app development.

Our technical sessions at KotlinConf 2024 spanned a range of topics:

  • A Tale of Two Languages by John Pampuch offered an engaging comparison of Java and Kotlin's evolution, highlighting their symbiotic relationship and mutual influence.
  • The Android Jetpack team, represented by Elif Bilgin, Yigit Boyar, and Daniel Santiago Rivera, unveiled Enabling Kotlin Multiplatform Success: The Android Jetpack Journey. They provided insights into the current state of KMP in Jetpack, shared updates on KMP-enabled Jetpack libraries, and explored the migration process of a well-established Jetpack library to KMP.
  • Going Fast with Kotlin by Andrei Shikov shared valuable insights gained from optimizing Compose for Android. Andrei highlighted interesting performance nuances in Kotlin and the guardrails the Compose team established to ensure optimal performance.
  • Kotlin Multiplatform in Google Workspace by Jason Parachoniak discussed Google Workspace's ongoing migration from a Java-oriented multiplatform foundation to Kotlin Multiplatform, aligning with Google's broader adoption of KMP. Jason shared lessons learned and the current state of this ambitious transition.
  • Write Your Own Kotlin Lint Checks! by Tor Norbye, Android Studio Engineering Director, empowered developers to extend Android Lint, a static analysis tool used by millions, by creating their own checks. Despite the name, Lint isn't actually Android-specific; it's also used to analyze server-side Kotlin and Java code inside Google!

Community Engagement at KotlinConf

We are always looking into ways to be actively engaged with the Kotlin community. If you attended KotlinConf, we hope you got a chance to check out our booth, with opportunities to chat with our engineers, get your questions answered, and learn more about how you can leverage Kotlin and KMP.

Learn more about KMP

In addition, you can view updated docs and a new mobile sample on KMP. These resources should have what you need to start learning KMP, and if you have any feedback or come across any issues, please share them through this link.

Looking Ahead

We are excited about the future of Kotlin and are planning to add KMP support to more AndroidX libraries. We are looking forward to seeing how you will adopt and build the next generation of apps using KMP.

Thanks to KotlinConf organizers, speakers, attendees, and the entire Kotlin community for making this event happen and bringing Kotlin enthusiasts together.

Home APIs: Enabling all developers to build for the home

Posted by Matt Van Der Staay – Engineering Director, Google Home


This blog was originally posted on Google for Developers.

As the saying goes, “home is where the heart is.” It’s where we spend the most time; it’s your space to be comfortable, where you can truly relax, connect and make memories. Our homes have gotten more helpful with connected products, such as a smart door lock or Nest thermostat. Despite this momentum, it's still too hard to develop for the home.

We are changing all of that. Building on the foundation of Matter, we've re-envisioned Google Home as a platform for developers - all developers, not just those who build smart home devices. Google Home is the destination to create innovative experiences for the home.

Today, we’re announcing the Home APIs and Home runtime. With the Home APIs, app developers can access over 600M devices, Google’s hubs and Matter infrastructure, and an automation engine powered by Google intelligence - all available on both Android and iOS. Here are five things to know:

1. Any developer can now build an experience that works with Google Home.

The home offers a unique opportunity for developers to create seamless and deeper relationships with users, but developing for the smart home is harder than it needs to be. Building for the smart home means integrating with many device makers, operating hubs and Matter fabrics, and running automation engines driven by intelligent signals.

Whether you build an app specifically for smart home devices or an app that has nothing to do with the smart home - like a fitness app or delivery app - the Home APIs let you offer your customers delightful and differentiated experiences on both Android and iOS.

2. Access 600 million connected devices from your app

The new Device and Structure APIs let you access over 600M devices with a single integration. Control and manage the devices already connected to Google Home, such as Matter light bulbs or the Nest Learning Thermostat, whether at home, or on the go. You can build a complex app to manage any aspect of a smart home, or simply integrate with a smart device to solve pain points - like turning on the lights automatically before the food delivery driver arrives.

The Home APIs have been designed with privacy and security in mind, leveraging industry-standard best practices. Users are always in control and need to explicitly grant access to their structure and smart home devices before an app can access them. And they can easily revoke access at any time from the Google Home app. To ensure quality experiences, developers who adopt the Home APIs must pass certification before launching their app.

The Device and Structure APIs provide all of the foundational building blocks to create a smart home experience.

The new Commissioning API lets you set up Matter devices in your app, in the Home app, or directly with Fast Pair on Android, without the need to create a new Matter fabric, saving you time and resources.

The Commissioning API provides all of the customer experience to set up a Matter device.

3. Automate with Google’s unique intelligence about the home

As people add more devices to their home, it becomes challenging to make them all work in unison. Over the past year, we have added new signals and allowed those with advanced skills to script their home using generative AI. With the new Automation API, you can create and manage home automations in your app, using Google Home’s new automation engine and intelligent signals.

Automations can be triggered by device signals from the home such as occupancy events from motion sensors, mode changes from appliances, or media events from a smart TV. For example, Yale is using the Automation API to turn on the foyer lights when the front door is unlocked at night. Automations can also use Google’s intelligence signals like home and away, which fuses together signals from devices across the home to create a more accurate presence detection.

The Automations API provides all of the tools for creating and managing automations.

4. Expanding hubs for Google Home to the TV

A hub for Google Home is a device that enables remote access and local control of a user's Matter devices over Wi-Fi and Thread. The Home APIs use the network of hubs for Google Home to control Matter devices whether the user is home or away.

Later this year, we’re upgrading our hubs and introducing the Home runtime, so other devices, including Chromecast with Google TV, select panel TVs with Google TV running Android 14 or higher, and eligible LG TVs, will also become hubs for Google Home.

Home APIs make controlling lights and switches locally over a hub feel snappy. We are adopting these APIs in the Google Home app, and our early tests show device control operating up to three times faster than before. Developers using the Home APIs can see faster and more responsive local control in their apps as well.

5. Delightful new experiences from a diverse set of apps

We are working with a broad range of brands across lighting, security, automotive, energy, and entertainment to build seamless smart home experiences that help users get more out of the smart home.

Partners from every major smart home category are building on the Home APIs.

Here's how some of our first partners are using the Home APIs:

ADT’s new Trusted Neighbor will revolutionize the universal practice of “giving a trusted neighbor a key to your home,” enabling users to easily grant secure and temporary access to their homes for neighbors, friends or helpers.

ADT Trusted Neighbor Program

LG will enable millions of TVs to be hubs for Google Home, allowing seamless control of devices from any app built using Home APIs. You will also be able to use the ThinQ mobile app or the Home Hub on the LG TV to control devices.

Home APIs on LG TVs for Google Home

Eve Systems will bring their experience to Android for the first time and build helpful automations like lowering the blinds when the temperature drops at night.

Eve Systems using Home APIs

Google Pixel is bridging the digital and physical worlds so that bedtime mode can not only dim your screen, but can also automatically dim your bedroom lights, lower the shades and lock the front door.

Google Pixel using Home APIs

And this is just the beginning. With the Home APIs, a workout app could keep you cool while you are burning calories by turning on the fan before you begin working out. Or a vacation rental app could make sure that the lights are on and the temperature is just right when a guest arrives. With the Home APIs, now anyone can bridge digital experiences and physical devices.


Sign Up to Build with the Home APIs

Do you have a great idea or feature that you'd like to build into your app with the Home APIs? Tell us about it and join the waitlist for access to the Home APIs or Home runtime. We will expand access on a rolling basis and the first apps built on the Home APIs will come to the Play Store and App Store starting this fall. Learn more about what’s included in the Home APIs from our I/O session on the Google Home Developer Center.

Everything you need to know about Google TV and Android TV OS


Posted by Shobana Radhakrishnan – Senior Director of Engineering, Google TV, and Paul Lammertsma – Developer Relations Engineer

Over the past year, we’ve seen significant growth of Android TV OS, reaching 220 million monthly active devices with a 47% year-over-year increase. This incredible engagement would not be possible without our dedicated developer community. A massive thank you for your contributions.

Android 14 on TV

We’re bringing Android 14 to TV! The next generation of Android provides improvements in performance, sustainability, accessibility, and multitasking to help you build engaging apps for TVs.

  • Performance and sustainability — Android 14 for TV improves on previous OS versions so users get a snappier, more responsive TV experience. We’ve also added new energy modes to put users in control and help reduce a TV’s standby power consumption. Ensure your app integrates with MediaSession correctly, so content doesn’t keep playing when input modes change or the panel switches off (see the sketch after this list).
  • Accessibility — New features include color correction, enhanced text options, and improved navigation for users, which can all be toggled on or off using remote shortcuts. Review the accessibility best practices to make sure your app supports these features.
  • Multitasking — Picture-in-picture mode is now supported on qualified Android 14 TV models. To evaluate whether a device supports the feature, query PackageManager for the picture-in-picture feature flag:

    val supportsPip = context.packageManager
        .hasSystemFeature(PackageManager.FEATURE_PICTURE_IN_PICTURE)
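
As one way to follow the MediaSession guidance above, here is a minimal sketch using Jetpack Media3; MyPlaybackService is a hypothetical name, and your session setup may differ:

import androidx.media3.exoplayer.ExoPlayer
import androidx.media3.session.MediaSession
import androidx.media3.session.MediaSessionService

class MyPlaybackService : MediaSessionService() {
    private var mediaSession: MediaSession? = null

    override fun onCreate() {
        super.onCreate()
        // The session exposes the player's state to the system.
        val player = ExoPlayer.Builder(this).build()
        mediaSession = MediaSession.Builder(this, player).build()
    }

    override fun onGetSession(controllerInfo: MediaSession.ControllerInfo): MediaSession? =
        mediaSession

    override fun onDestroy() {
        // Release the player and session so playback doesn't linger
        // when input modes change or the panel switches off.
        mediaSession?.run {
            player.release()
            release()
            mediaSession = null
        }
        super.onDestroy()
    }
}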


For additional details, consult the updated Android TV app quality guidelines and the Android 14 for TV release notes.

Compose for TV

Compose for TV is now available in 1.0.0-beta01. We’ve updated the developer tools in Android Studio to include a new project wizard to give you a running start with Compose for TV.

Here are just a few ways Compose makes it easier to build apps for TV:

  • Dedicated components for TV apps. Explore these components in our design guide or in practice by using our new TV Material Catalog app. Since the previous alpha release, we’ve added lists, navigation, chips, and settings screens.
  • Improved input support and performance. We’ve worked hard to address focus issues and ensure that the UI appears and animates smoothly.
  • Ease of implementation and extensive styling. Add components to your app and customize them with minimal code.
  • Broad form-factor support. Reuse business logic from your phone, tablet, or foldable app to render a TV UI with changes that can be as small as simply adding a ViewModel.

Beta01 makes two big changes from alpha10:

  • Several components have graduated from experimental.
  • The ImmersiveList composable has been removed from the androidx-tv-material package.

Carousel and chip components, such as FilterChip, are still experimental, so you’ll want to keep the @ExperimentalTvMaterial3Api annotation if you are using these components in your app. For all other components, you can now remove the @ExperimentalTvMaterial3Api annotation, since these APIs are now available in beta.

We heard your feedback that the variety of data types used to represent content made it difficult to design a single component that would reduce code for everyone. If you are using the ImmersiveList composable from the alpha release, replace it with a custom implementation. While ImmersiveList is no longer part of Compose for TV, you can create an immersive list with just a few lines of code:

@Composable
fun SampleImmersiveList() {
    val selectedMovie = remember { mutableStateOf<Movie?>(null) }

    // Container
    Box(
        modifier = Modifier
            .fillMaxWidth()
            .height(400.dp)
    ) {
        // Background, driven by the currently focused movie
        // (assumes Movie exposes a background color; falls back while nothing is focused)
        Box(
            modifier = Modifier
                .fillMaxWidth()
                .aspectRatio(20f / 7)
                .background(selectedMovie.value?.background ?: Color.Black)
        ) {}

        // Rows
        LazyRow(
            modifier = Modifier.align(Alignment.BottomEnd),
            ...
        ) {
            items(movies) { movie ->
                MyMovieCard(
                    modifier = Modifier
                        .onFocusChanged {
                            // Track the focused movie to update the background
                            if (it.hasFocus) {
                                selectedMovie.value = movie
                            }
                        },
                    ...
                ) {}
            }
        }
    }
}

A complete snippet is available in the immersive list sample.

Also consult the comprehensive list of changes in the release notes to migrate any renamed or moved components.

Migrate from the Leanback UI toolkit

We recommend following our step-by-step migration guide to switch from Leanback to Compose for Android TV.

Resources

Whether you’re new to Compose or already in the process of migrating to it, our large collection of resources is here to help you learn best practices for building TV UIs with Jetpack Compose, the modern Android development toolkit:

Engage with the active Android developer community on Stack Overflow for any bugs you encounter, or submit them through our public bug tracker.

Thank you for your continued support of Android TV OS. We can’t wait to see what you’ll do on Google TV with the Android 14 TV OS!

Android for Cars: Bringing more apps to cars

Posted by Vivek Radhakrishnan – Technical Program Manager, and Seung Nam – Product Manager

With technology in cars becoming more capable, the opportunity to deliver safe and seamless connected experiences for drivers and passengers is greater than ever. Google remains committed to the automotive industry and is seeing momentum across Android Auto and cars powered by Android Automotive OS with Google built-in. We’re excited to share updates across our in-car experiences and introduce new programs and resources to make it easier for you to bring your apps to cars. Learn more below and in the Android for Cars Technical Session.

Momentum and updates

With over 200 million cars on the road compatible with Android Auto, and nearly 40 car models like the Nissan Rogue, Renault R5, Acura ZDX, and Ford Explorer offering Google built-in, the time to bring your apps to cars is now.

Over the last year, the ecosystem of apps available across these experiences has grown – thanks to you. New entertainment apps like Max, Peacock and Angry Birds are coming to select cars with Google built-in. On Android Auto, the Uber Driver app is now available, allowing drivers to accept rides and deliveries, and get turn-by-turn directions on a bigger screen.

Angry Birds is coming to select cars with Google built-in, including the Volvo EX90 (pictured).

We’re also pleased to share that Google Cast is coming to cars with Android Automotive OS, starting with Rivian with more to follow. This allows you to easily cast video content from your phone or tablet directly to the car while parked. If you don’t already offer casting in your app, this is a simple way for your content to reach new audiences in the car.

Coming soon - you can stream content from apps on your phone, like Pluto TV, to Rivian cars via Google Cast.

New car app quality tiers

There are unique considerations when developing apps and experiences for cars, including safety, numerous screen sizes, and more. Our priority is developing resources and tools that take these considerations into account and minimize the work needed for you to bring your apps to cars.

We’re introducing new quality tiers, inspired by those that exist for large screens, to streamline the process of bringing existing apps to cars by highlighting what makes for a great user experience in cars. Here are the tiers and what they encompass:

    • Tier 1: Car differentiated
      This tier represents the best of what’s possible in cars. Apps in this tier are specifically built to work across the variety of hardware in cars and can adapt their experience across driving and parked modes. They provide the best user experience designed for the different screens in the car like the center console, instrument cluster and additional screens - like panoramic displays that we see in many premium vehicles.
    • Tier 2: Car optimized
      Most apps available in cars today fall into this tier and provide a great experience on the car’s center stack display. These apps will have some car-specific engineering to include capabilities that can be used across driving or parked modes, depending on the app’s category.
    • Tier 3: Car ready
      Apps in this tier are large screen compatible and are enabled while the car is parked, with potentially no additional work. While these apps may not have car-specific features, users can experience the app just as they would on any large screen Android device.

To learn more about the quality tiers, see Android app quality for cars.

Car ready mobile apps program

Let’s dive deep into Tier 3 apps. In collaboration with car manufacturers, we’re introducing the Car ready mobile apps program to accelerate bringing mobile apps to cars with no additional work for developers.

As part of this program, Google will proactively review mobile apps that are already adaptive and large screen compatible to ensure safety and compatibility in cars. If the app qualifies, we will automatically opt it in for distribution on cars with Google built-in and make it available in Android Auto, without the need for new development or a new release. This program will start with parked app categories like video, gaming, and browsers, with plans to expand to other app categories in the future.

The program will roll out in the coming months, but if you already offer a large screen compatible adaptive app and it falls into one of these categories, you can request a review to participate sooner. As this program rolls out, availability of your app will depend on platform compatibility.

To learn more about building qualified mobile apps, check out the technical session titled “Building Adaptive Android Apps”. You can find guidance on what to look out for at developer.google.com.

Apps optimized for large screens, like AMC+, may be able to come to cars with little to no development work.

New tools and emulators

To create high quality experiences in cars, we are also introducing some new tools that can help you along the way.

    • First, we have a new emulator for distant and panoramic displays so developers can visualize and test for the growing sizes and number of screens in the car and make sure apps can adapt to the variety of displays for the best experience.
    • We also have a new tool that addresses the wide range of screen shapes and user interfaces (UI) present in cars. Many new car displays have unique curves, insets and angles that impact the UI, so we have an emulator that lets you change the emulator screen to match OEM screen designs. This will help ensure the apps work well on real cars without needing to set up specific OEM emulators or bringing in real cars for testing.
    • Lastly, we’re introducing an Android Automotive OS system image for Pixel Tablet. This will let you physically interact with your app as you would on a car screen. We are opening this up for early access partners for the purpose of development and testing today, and you can request to participate here.

To learn more about how to use these tools, check out the “Build and test a parked app for Android Automotive OS” codelab that will be published tomorrow.

More app categories for cars

As you consider bringing your app to cars, we put together a table to help you understand what app categories are currently open and accepting app submissions across both Android Auto and cars with Google built-in. We will continue to expand the type of apps that can be enabled in cars, so if your app isn’t in one of these categories, stay tuned for future opportunities!

Android for Cars Category Status

Start developing apps for cars today

To learn how to bring your apps to cars, check out the documentation on the Android for Cars developer site and the Android for Cars Technical Session. With all the opportunities across car screens, there has never been a better time to bring your apps and experiences to cars. Thanks for all the contributions to the Android ecosystem. See you on the road!

Scaling Across Screens with Jetpack Compose @ Google I/O ‘24

Posted by Maru Ahues Bouza, Product Management Director, Android Developer


The promise of Jetpack Compose has always been that a modern toolkit designed to build native UI can help you build better apps faster and easier. As more and more of you - 40% of the top 1k apps, in fact - use (and love) Compose, we’ve been working to extend the benefits you’re seeing on mobile to help you build across form factors as well. At Google I/O 2024, we announced many new updates for Compose that help you build across form factors, including Compose APIs to support adaptive layouts and new updates for Compose for TV and Wear OS. From foldables to wearables to TVs, Compose is delivering features built to make Android development faster and easier. Apps like yours are already using Compose to support more screens with less code.

When thinking about layouts - think adaptive

Yesterday, we announced a new set of Compose APIs for building adaptive layouts, using Material guidance. These APIs, now in Beta, provide new layouts and components that adapt as users expect when switching between small and large window sizes.

The libraries provide 3 new scaffolds that adapt to the different window sizes that users can place apps in on different types of devices, from phones to foldables to tablets and more.

3 new scaffolds that adapt to different window sizes

NavigationSuiteScaffold

NavigationSuiteScaffold makes it easier to build navigation UI by automatically complying with Material guidelines, providing your users with an optimal experience based on their window size.

Material guidelines recommend using a navigation bar at the bottom of compact-width windows, such as most phones, and a navigation rail on the side of medium-width and expanded-width windows. Previously, each app had to handle swapping between these components itself; now NavigationSuiteScaffold does this for you by switching between them when the window size changes. A minimal sketch follows the figure below.

Navigation bar
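
Here’s a minimal sketch of NavigationSuiteScaffold in use; the Home and Favorites destinations are placeholders:

import androidx.compose.material.icons.Icons
import androidx.compose.material.icons.filled.Favorite
import androidx.compose.material.icons.filled.Home
import androidx.compose.material3.Icon
import androidx.compose.material3.Text
import androidx.compose.material3.adaptive.navigationsuite.NavigationSuiteScaffold
import androidx.compose.runtime.*

@Composable
fun AppScaffold() {
    var selected by remember { mutableStateOf(0) }
    NavigationSuiteScaffold(
        navigationSuiteItems = {
            // Rendered as a bottom navigation bar on compact windows,
            // and as a navigation rail on medium and expanded windows
            item(
                selected = selected == 0,
                onClick = { selected = 0 },
                icon = { Icon(Icons.Default.Home, contentDescription = "Home") },
                label = { Text("Home") }
            )
            item(
                selected = selected == 1,
                onClick = { selected = 1 },
                icon = { Icon(Icons.Default.Favorite, contentDescription = "Favorites") },
                label = { Text("Favorites") }
            )
        }
    ) {
        // Content for the selected destination
    }
}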

ListDetailPaneScaffold & SupportingPaneScaffold

The new library also has ListDetailPaneScaffold and SupportingPaneScaffold, which help you implement canonical layouts that we recommend in many cases - list-detail and supporting pane.

On a phone, you usually organize your app flow through screens. For example, clicking on an item on your list screen brings you to the detail screen.

Detail screen

When adapting to different window sizes, it helps to think of your app in terms of panes rather than screens. For a compact window size class, such as a phone, you might display only one pane. For an expanded window size class, you might show two or more panes at the same time. ListDetailPaneScaffold and SupportingPaneScaffold help you build apps that easily switch between one- and two-pane layouts; a sketch follows the figure below.

Different screen layouts
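
As a minimal sketch of the list-detail pattern using the beta APIs (MyList and MyDetails are hypothetical composables, and exact signatures and packages may shift while the library is in beta):

import androidx.compose.material3.adaptive.layout.AnimatedPane
import androidx.compose.material3.adaptive.layout.ListDetailPaneScaffold
import androidx.compose.material3.adaptive.layout.ListDetailPaneScaffoldRole
import androidx.compose.material3.adaptive.navigation.rememberListDetailPaneScaffoldNavigator
import androidx.compose.runtime.Composable

@Composable
fun ItemsScreen() {
    // The navigator tracks which pane(s) are visible for the current window size
    val navigator = rememberListDetailPaneScaffoldNavigator<String>()

    ListDetailPaneScaffold(
        directive = navigator.scaffoldDirective,
        value = navigator.scaffoldValue,
        listPane = {
            AnimatedPane {
                MyList(onItemClick = { id ->
                    // One pane on compact windows; both panes side by side when expanded
                    navigator.navigateTo(ListDetailPaneScaffoldRole.Detail, id)
                })
            }
        },
        detailPane = {
            AnimatedPane {
                navigator.currentDestination?.content?.let { id -> MyDetails(id) }
            }
        }
    )
}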

You can learn more about all three of these APIs and how to get started with them in the “Building UI with the Material 3 adaptive library” and “Building adaptive Android apps” technical sessions.

“Integrating SupportingPaneScaffold was effortless and quick. It enabled us to seamlessly organize primary and secondary content on To-Dos. Depending on the window size class, the supporting pane adjusts the UI without any additional custom logic. Delighting our users regardless of what device they use is a key priority for SAP Mobile Start.”
- Software Engineer on SAP Mobile Start

Compose for Wear OS

In the past year, adoption of Compose for Wear OS has grown 200%, showcasing the ease with which Compose allows developers to build for the watch form factor.

Recently we’ve seen top apps such as WhatsApp, Gmail and Google Calendar built entirely using Compose for Wear OS, and it’s the recommended way for building user interfaces for Wear OS apps.

At this year’s Google I/O, Compose for Wear OS is graduating visual improvements and fixes from beta to stable.

In the past year, we’ve added features such as SwipeToReveal, to give users additional means for completing actions, an expandableItem, to enhance the use of the smaller screen and show additional information where needed, and a range of WearPreview supporting annotations, for ensuring your app works optimally across the range of device sizes and font scales.

Compose for Wear OS previews usage in Android Studio

You can get started with Compose for Wear OS by taking the codelab, and you can learn more about all the latest updates for Wear OS via the technical session.

Compose for Android TV

At Google I/O ‘24, we announced that Compose for TV 1.0.0 is now available in beta. Compose for TV is our recommended approach for building delightful UIs for Android TV OS. It brings all of the benefits of Jetpack Compose to your TV apps, making building beautiful and functional experiences in your app much faster and easier.

The latest updates to Compose for TV include better performance, input support, and a whole range of improved components that look great out of the box. New in this release, we’ve added lists, navigation, chips, and settings screens. We’ve also updated the developer tools in Android Studio to include a new project wizard to get a running start with Compose for TV.

The new TV Material Catalog app lets you explore components in Compose for TV with different themes and layouts, and our updated JetStream sample shows how it all fits together.

TV Material Catalog app in action

You can get started with Compose for TV by checking out the dedicated blog, the technical session or taking a look at the integration guides.

Jetpack Glance

Jetpack Glance 1.1.0 is now available in RC, bringing a new unit test library, Error UIs, and new components.

We have also released new Canonical Widget Layouts on GitHub, which are built on top of the Glance components, to allow you to get started faster with a set of layouts that align with best practices.

The first set of layouts is delivered as code samples with a matching Figma design kit in the Android UI Kit, and more layouts are coming later this year.

Lastly, we have new design guidance published on the UI design hub—check it out!
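
As a minimal sketch of what a Glance widget looks like (HelloWidget is a hypothetical name):

import android.content.Context
import androidx.glance.GlanceId
import androidx.glance.appwidget.GlanceAppWidget
import androidx.glance.appwidget.GlanceAppWidgetReceiver
import androidx.glance.appwidget.provideContent
import androidx.glance.text.Text

// The widget itself: provideGlance hosts composable Glance UI
class HelloWidget : GlanceAppWidget() {
    override suspend fun provideGlance(context: Context, id: GlanceId) {
        provideContent {
            Text("Hello from Glance")
        }
    }
}

// The receiver wires the widget into the AppWidget framework (declared in the manifest)
class HelloWidgetReceiver : GlanceAppWidgetReceiver() {
    override val glanceAppWidget: GlanceAppWidget = HelloWidget()
}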

A sample of Compose across screens: Jetcaster


We have updated Jetcaster—one of our Compose samples—to adapt across phone, foldable and tablet screens, and added support for TV, Wear and homescreen widgets with Glance. Jetcaster showcases how Compose helps you to build across a range of devices using a shared architecture in a single project.

See how you can extract elements such as your data layer and design system to promote reuse and consistency while delivering an experience tailored to different form factors. You can dive directly into the code on GitHub.

Get started with Compose across screens

With these updates to Compose to help you build for tablets, foldables, wearables and TVs, it is a great time to get started! These technical sessions are a great place to learn more about all the latest updates:

Learn more about how SoundCloud supported more screens using 45% less code with Jetpack Compose!

"Our mobile Compose skills transferred directly to Compose for other form factors, The concepts and most APIs are the same across form factors” - Vitus Ortner, Android engineer at SoundCloud

What’s new in Wear OS – I/O ’24

Posted by Kseniia Shumelchyk – Android Developer Relations Engineer, and Garan Jenkin – Android Developer Relations Engineer

Wear OS has seen incredible growth and advancements over the past year. With watch launches from Pixel, Samsung and more, Wear OS grew its user base by 40% in 2023 and has users in over 160 countries and regions. And Wear OS has expanded to more brands including OnePlus, OPPO and Xiaomi. This growth has been accompanied by heavy investments in performance and power optimization.

In this blog post, we’ll be highlighting some of the key updates we announced at Google I/O this year, so let’s dive in and explore the latest advancements in Wear OS and how you can make the most of the platform.

Wear OS 5 Developer Preview

We’re excited to be releasing the Developer Preview of Wear OS 5, the next version of Google’s smartwatch platform arriving later this year, based on Android 14. Central to our release of Wear OS 5 is continuing to enhance battery life.

Wear OS 5 brings performance improvements over Wear OS 4. Tracking your workout is now more efficient; for example, running a marathon consumes up to 20% less power on Wear OS 5 than on Wear OS 4.

Wear OS 5 brings battery improvements over Wear OS 4 for longer workout tracking

To help you develop power-efficient apps on Wear OS, we’ve released a new guide to conserve power and battery. Be sure to take a look!

Wear OS 5 is based on Android 14, which brings with it a number of developer-facing changes. Check out what’s changed and try the new Wear OS 5 emulator to test your app for compatibility with the new platform version.

Changes in Watch Faces development

Last year we introduced the Watch Face Format as part of Wear OS 4, and we’ve had a fantastic response, with 30% of watch faces in Google Play already using the format. It’s been great to see what you’ve all been able to create so far using the Watch Face Format!

Sample watch faces created with Watch Face Format

We’re excited to bring you the next iteration of the Watch Face Format with Wear OS 5.

Additionally, we’re announcing some changes to existing watch face development using the Jetpack Watch Face APIs. Starting from Wear OS 5, we are introducing restrictions on complications for watch faces built with AndroidX or the Wearable Support Library, applying to some data sources, as well as Google Play publishing limitations for new watch faces built with these libraries.

Check out the Watch Faces blog post for full details on the new features in Watch Face Format and changes to watch faces development options.

Tooling and library updates

Jetpack Compose for Wear OS

Adoption of Compose on Wear OS has grown 200% in the past year, highlighting the ease with which Compose allows developers to build for the watch form factor. Recently we’ve seen top apps such as WhatsApp, Gmail and Google Calendar built entirely using Compose for Wear OS, and it’s the recommended way for building user interfaces for Wear OS apps.

With the 1.3 release of Jetpack Compose for Wear OS, we’ve graduated a number of visual improvements and fixes from beta to stable.

In the past year, we’ve added features such as SwipeToReveal, to give users additional means for completing actions, an expandable item, to enhance the use of the smaller screen and show additional information where needed, and a range of WearPreview supporting annotations, for ensuring your app works optimally across the range of device sizes and font scales.

Compose for Wear OS previews usage in Android Studio

And at Google I/O 2024, we announced many new updates for Jetpack Compose that help you build across form factors, including Wear OS. Read more in this blog, and check out how SoundCloud supported more screens using 45% less code with Jetpack Compose.

Tiles and ProtoLayout

Wear OS tiles give users fast, predictable access to the information and actions they rely on most. Version 1.4 of the Jetpack Tiles library, currently in alpha, introduces preview support for Android Studio to help you quickly iterate on your Tile development while also helping you create optimal-looking tiles on a range of display sizes.

Previews can be seen starting in Android Studio Koala Feature Drop (Canary), with the following dependencies:

    • androidx.wear.tiles:tiles-tooling-preview:1.4.0-alpha02+
    • androidx.wear.tiles:tiles-tooling:1.4.0-alpha02+
    • androidx.wear:wear-tooling-preview:1.0.0+
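
For example, with these dependencies in place, a preview can be declared alongside your tile code: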
@Preview(device = WearDevices.SMALL_ROUND)
fun smallPreview(context: Context) = TilePreviewData(
    onTileRequest = { request ->
        TilePreviewHelper.singleTimelineEntryTileBuilder(
            buildMyTileLayout() // your tile's layout function
        ).build()
    }
)
Tiles previews usage in Android Studio

We’ve also introduced better means for your app to determine whether your tiles are in use, through the getActiveTilesAsync() method.

Within ProtoLayout’s stable version 1.1, as used by Tiles, we’ve introduced a number of changes, such as the following:

    • Gradient support in ArcLine.
    • Date-time formatting supports different time zones for dynamic data types.
    • Better text autosizing and ellipsizing options, and consistent font padding behavior.
    • Expandable spacers.
    • Improved accessibility for Clickable elements.

And from 1.2.0-alpha02, we’ve made it easier for your layouts to adjust appropriately for different display sizes by adding the setResponsiveContentInsetEnabled() method to PrimaryLayout, as well as updating it for EdgeContentLayout. To use this setter, update your code as follows:

PrimaryLayout.Builder(deviceParameters)
    .setResponsiveContentInsetEnabled(true)
    .setContent(
        // ...
    )
    .build()

Easier testing for fitness apps

Android Studio Koala Feature Drop (Canary) brings a new sensor panel to make it easier to test use of Health Services in your Wear OS app. The panel allows you to configure the capabilities of the device, set values of specific data types, and simulate events such as auto-pause and resume of exercises.

Sensor panel usage with Wear OS emulator in Android Studio

Check out this blog to learn more about tooling updates.

Larger Displays

With the momentum surrounding Wear OS, we’re seeing a wider variety of round screen sizes and resolutions, which provides more choices for the user.

We are releasing new guidelines on how to build responsive UIs for different watch display sizes, as well as updates to existing libraries to introduce adaptive layouts and components.

Check out the ComposeStarter sample for Wear OS on GitHub to see how to take advantage of these updates in your app. Furthermore, we’ve updated the sample to provide examples of using tools to evaluate your layouts, including:

    • Previews - demonstrating use of WearPreviewDevices to visualize your layouts on a full range of device sizes and font-scaling settings (see the sketch after this list).
    • Screenshot testing - helping you detect issues and regressions in your layouts on different-sized devices, with different font scales and locales, representative of real-world devices.
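
As a minimal sketch of the previews approach (GreetingScreen is a hypothetical composable):

import androidx.compose.runtime.Composable
import androidx.wear.compose.ui.tooling.preview.WearPreviewDevices
import androidx.wear.compose.ui.tooling.preview.WearPreviewFontScales

// Renders the preview across the full range of Wear device sizes and font scales
@WearPreviewDevices
@WearPreviewFontScales
@Composable
fun GreetingScreenPreview() {
    GreetingScreen(name = "preview") // GreetingScreen is hypothetical
}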

Start building for Wear OS now

There has never been a better time to start building for Wear OS! Be sure to check out the Building for the future of Wear OS technical session to learn more about all the latest updates!

To get started:

We’re looking forward to seeing the experiences that you build on Wear OS!