Category Archives: Android Developers Blog


Android Dev Summit ‘22: Here’s how to tune in!

Posted by Yasmine Evjen, Community Lead, Android Developer Relations

Android Dev Summit is about to kick off at 9AM PT on Monday October 24, so it’s time to tune in! You can watch the livestream on developers.android.com, on YouTube, or right below:

Whether you’re tuning in online or, for the first time since 2019, joining in person at locations around the world, it’s your opportunity to learn from the source about building excellent apps across devices. We just dropped information on the livestream agenda, technical talks, and speakers, so start planning your schedule!
 
Here’s what you can expect: we’re kicking things off at 9am PT with the Android Dev Summit keynote, where you’ll hear about the latest in Modern Android Development, innovations in our core platform, and how to take advantage of Android’s momentum across devices, including wearables and large screens. And right after the keynote, at 9:50 AM PT, we’ll be broadcasting live the first of three tracks: Modern Android Development (MAD)!
Modern Android Development Track @ Android Dev Summit, October 24, 2022, 9:00 AM PT
Agenda (all times PT):
  • 9:00 AM Keynote
  • 9:50 AM Custom Layouts and Graphics in Compose
  • 10:10 AM Making Apps Blazing Fast with Baseline Profiles
  • 10:30 AM State of the Art of Compose Tooling
  • 10:50 AM State Holders and State Production in the UI Layer
  • 11:10 AM 5 ways Compose Improves UI Testing
  • 11:15 AM 5 Android Studio Features You Don't Want to Miss
  • 11:30 AM Pre-recorded MAD Technical Talks
  • 12:20 PM Where to Hoist that State in Compose
  • 12:25 PM Material You in Compose Apps
  • 12:30 PM Compose Modifiers Deep Dive
  • 12:50 PM Practical Room Migrations
  • 12:55 PM Type Safe, Multi-Module Best Practices with Navigation
  • 1:00 PM What's New in Android Build
  • 1:20 PM From Views to Compose: Where Can I Start?
  • 1:25 PM Test at Scale with Gradle Managed Devices
  • 1:35 PM MAD #AskAndroid
Broadcast live on d.android.com/dev-summit & YouTube.
Then, ADS continues into November with two more tracks. First, on November 9, ADS travels to London, where we’ll broadcast all of the Form Factors talks; read the full list of talks here.
Form Factors Track @ Android Dev Summit November 9, 2022 
Sessions: Deep Dive into Wear OS App Architecture; Build Better UIs Across Form Factors with Android Studio; Designing for Large Screens: Canonical Layouts and Visual Hierarchy; Compose: Implementing Responsive UI for Large Screens; Creating Helpful Fitness Experiences with Health Services and Health Connect; The Key to Keyboard and Mouse Support across Tablets and ChromeOS; Your Camera App on Different Form Factors; Building Media Apps on Wear OS; Why and How to Optimize Your App for ChromeOS.
Broadcast live on d.android.com/dev-summit & YouTube.


Then, on November 14, we’ll broadcast our Platform track; you can check out the talks here.
Platform Track @ Android Dev Summit November 14, 2022 
Sessions: Migrate Your Apps to Android 13; Presenting a High-quality Media Experience for all Users; Improving Your Social Experience Quality with Android Camera; Building for a Multilingual World; Everything About Storage on Android; Migrate to Play Billing Library 5: More flexible subscriptions on Google Play; Designing a High Quality App with the Latest Android Features; Hardware Acceleration for ML on-device; Demystifying Attestation; Building Accessibility Support for Compose.
Broadcast live on d.android.com/dev-summit & YouTube.

Burning question? #AskAndroid to the rescue!

To cap off each of our livestreamed tracks, we’ll be hosting a live Q&A, #AskAndroid, for each track topic, so you can get your burning questions answered live by the team who built Android. Post your questions to Twitter or comment in the YouTube livestream using #AskAndroid, for a chance to have your questions answered on the livestream.

We’re so excited for this year’s Android Dev Summit, and we’re looking forward to connecting with you!

Material Design Components for Android 1.7.0

Posted by James Williams, Developer Relations Engineer

The latest release of Material Design Components (MDC), 1.7.0, brings updates to Material You styling, accessibility, and size coherence, as well as new minimum version requirements.

MDC 1.7.0 has new minimum version requirements:

  • Java 8 (1.8), previously Java 7 (1.7)
  • Android Gradle Plugin (AGP) 7.3.3, previously 4.0.0
  • Android Studio Chipmunk, version 2021.2.1, and
  • compileSdkVersion / targetSdkVersion 32

This is a fairly large jump in terms of the Gradle plugin version, so make sure your build file changes are in place before moving on to UI code. As always, our release notes contain the full details of what has been updated. There are a couple of standout updates we’d like to highlight.
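
If you manage your build with the Gradle Kotlin DSL (as in the Jetpack example later on this page), the module-level changes might look roughly like the sketch below. Treat it as illustrative rather than a drop-in change; the exact plugin setup and versions depend on your project.

android {
    compileSdk = 32

    defaultConfig {
        targetSdk = 32
    }

    compileOptions {
        // Java 8 language features are now the minimum for MDC 1.7.0.
        sourceCompatibility = JavaVersion.VERSION_1_8
        targetCompatibility = JavaVersion.VERSION_1_8
    }
}

dependencies {
    // Material Design Components 1.7.0
    implementation("com.google.android.material:material:1.7.0")
}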

MaterialSwitch component

The Switch component has undergone a visual refresh that increases contrast and accessibility. The MaterialSwitch class replaces the previous SwitchMaterial class.

It now differentiates the on and off states more clearly: the “on” thumb is larger, can contain an icon, and uses an on-state color, while the “off” thumb is smaller and has less contrast.

Much of the new component’s core API aligns with the obsolete SwitchMaterial class, so to get started you can simply replace the class references.
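
As a rough sketch of that swap, assuming a switch created in code (the same idea applies to XML layouts, where the SwitchMaterial tag becomes MaterialSwitch); the label and initial state below are hypothetical:

// Before (obsolete): import com.google.android.material.switchmaterial.SwitchMaterial
// After (new Material 3 switch):
import android.content.Context
import com.google.android.material.materialswitch.MaterialSwitch

fun createDarkModeSwitch(context: Context): MaterialSwitch =
    MaterialSwitch(context).apply {
        text = "Dark mode"   // hypothetical label
        isChecked = true
    }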

For more information on how the obsolete component stacks up against the new implementation, check the documentation on GitHub.

Shape Theming

A component’s shape is one way to express your brand. In addition to providing a custom MaterialShapeDrawable, there is also a means to more simply customize shape theming using rounded or cut corners.

Material 3 components have been updated to apply one of the seven styles ranging from None to Full. A component’s shape is defined by two properties: its Shape family, either rounded or cut, and its value, usually described in dp. Where a “none” style always results in a rectangular shape, the resulting shape for full depends on the shape family. Rounded returns a rectangle with fully rounded edges, while Cut returns a hexagonal shape.

You can set the shape family and value individually and arbitrarily on each edge, but there are set intervals and baseline values:

Shape style and baseline value:
  • None: 0dp
  • Extra Small: 4dp
  • Small: 8dp
  • Medium: 12dp
  • Large: 16dp
  • Extra Large: 28dp
  • Full: N/A


The Shape Theming card in the Catalog app allows you to see how different values affect rounded or cut corners.
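
Most apps pick these styles up from their theme, but for the programmatic MaterialShapeDrawable route mentioned above, a minimal sketch might look like this. The corner size is a hypothetical pixel value you would normally convert from dp (for example, 12dp for the Medium style):

import android.view.View
import com.google.android.material.shape.CornerFamily
import com.google.android.material.shape.MaterialShapeDrawable
import com.google.android.material.shape.ShapeAppearanceModel

// Builds a background with the given corner family (CornerFamily.ROUNDED or
// CornerFamily.CUT) applied to all corners of the view.
fun applyShape(view: View, cornerFamily: Int, cornerSizePx: Float) {
    val shape = ShapeAppearanceModel.builder()
        .setAllCorners(cornerFamily, cornerSizePx)
        .build()
    view.background = MaterialShapeDrawable(shape)
}

// Usage: applyShape(card, CornerFamily.ROUNDED, mediumCornerPx)
//        applyShape(card, CornerFamily.CUT, mediumCornerPx)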


What's next for MDC

We’re fast at work on the next major version of MDC. You can follow the progress, file bug reports and feature requests on GitHub. Also feel free to reach out to us on Twitter @materialdesign.

Get ready for Android Dev Summit ‘22: Check out the Technical Talks, Livestream Agenda, and Speakers!

Posted by Yasmine Evjen, Community Lead, Android Developer Relations

Modern Android Development Track @ Android Dev Summit, October 24, 2022 at 9:00 AM PT. Broadcast live on d.android.com/dev-summit & YouTube.

Android Dev Summit kicks off next week, Monday, October 24 at 9 AM PT, livestreamed on YouTube from the San Francisco Bay Area! Whether you’re tuning in online or, for the first time since 2019, joining in person at locations around the world, it’s your opportunity to learn from the source about building excellent apps across devices. We just dropped information on the livestream agenda, technical talks, and speakers, so start planning your schedule!

Here’s what you can expect: On October 24, we’re kicking things off at 9am PT with the Android Dev Summit keynote, where you’ll hear about the latest in Modern Android Development, innovations in our core platform, and how to take advantage of Android’s momentum across devices, including wearables and large screens.

Three Tracks across Three Days: MAD, Form Factors & Platform

Right after the keynote, at 9:50 AM PT, we’ll be broadcasting live the first of three tracks: Modern Android Development (MAD)! Check out the livestream agenda here. You’ll be able to watch technical talks such as: Custom Layouts and Graphics in Compose, State holders and state production in the UI layer, and Making apps blazing fast with Baseline Profiles. You can learn more about all of the MAD talks here.

Then, ADS continues into November, with two more tracks. First, on November 9, ADS travels to London where we’ll broadcast all of the Form Factors technical talks such as Build better UIs across form factors with Android Studio, Deep dive into Wear OS app architecture, and the Do's and Don'ts: Mindset for optimizing apps for large screens. Check out the Form Factors talks here.

Form Factors Track @ Android Dev Summit November 9, 2022 
Sessions: Deep Dive into Wear OS App Architecture; Build Better UIs Across Form Factors with Android Studio; Designing for Large Screens: Canonical Layouts and Visual Hierarchy; Compose: Implementing Responsive UI for Large Screens; Creating Helpful Fitness Experiences with Health Services and Health Connect; The Key to Keyboard and Mouse Support across Tablets and ChromeOS; Your Camera App on Different Form Factors; Building Media Apps on Wear OS; Why and How to Optimize Your App for ChromeOS.
Broadcast live on d.android.com/dev-summit & YouTube.

And then, on November 14, we’ll broadcast our Platform technical talks, where you’ll learn about the latest innovations and updates to the Android platform. You’ll be able to watch talks such as Android 13: Migrate your apps, Presenting a high-quality media experience for all users, and Migrating to Billing Library 5 and more flexible subscriptions on Google Play. Get a sneak peek at all the Platform talks here.
Platform Track @ Android Dev Summit November 14, 2022 
Sessions: Migrate Your Apps to Android 13; Presenting a High-quality Media Experience for all Users; Improving Your Social Experience Quality with Android Camera; Building for a Multilingual World; Everything About Storage on Android; Migrate to Play Billing Library 5: More flexible subscriptions on Google Play; Designing a High Quality App with the Latest Android Features; Hardware Acceleration for ML on-device; Demystifying Attestation; Building Accessibility Support for Compose.
Broadcast live on d.android.com/dev-summit & YouTube.

Burning question? #AskAndroid to the rescue!

To cap off each of our livestreamed tracks, we’ll be hosting a live Q&A, #AskAndroid, for each track topic, so you can get your burning questions answered live by the team who built Android. Post your questions to Twitter or comment in the YouTube livestream using #AskAndroid, for a chance to have your questions answered on the livestream.

We’re so excited for this year’s Android Dev Summit, and we’re looking forward to connecting with you!

#WeArePlay | Meet app founders helping people around the world

Posted by Leticia Lago, Developer Marketing

There are millions of apps available on Google Play, created by thousands of founders across the world. Each app is unique and special in its own right, but they all have one thing in common: their purpose is to help. From helping motorhome enthusiasts find somewhere to camp, to helping small business owners manage their finances, to helping waste pickers earn a reliable income, this latest batch of #WeArePlay stories celebrates app founders who are helping people across the world in extraordinarily different ways.

First we begin with Cristian. Originally from Villa Rica in southern Chile, he made his family very proud by being the first to go to university. During his studies in Santiago, he learned about the local waste pickers – people who make an income by searching through trash cans and finding valuable materials to sell. Despite his mother’s wishes, he was so motivated to help them that he dropped out of university and dedicated all his time to creating an app. Reciclapp works by helping waste pickers connect with local businesses, so they can collect resellable materials directly from them. So far, the app has helped waste pickers across the city save time and guarantee a more reliable income. As Cristian has grown his company to a team of 12 and expanded into Mexico, his mother is now very proud of his bravery and success.

Next, Kennedy and Duke. When they were children, their father’s business sadly failed because managing his finances and tracking spending was too hard. Years later, after a successful career abroad in tech, Kennedy decided it was time to return to his homeland of Nigeria and build his own company. Inspired by his father’s struggle, he partnered with his brother Duke and travelled across the country to interview other business owners about their financial struggles. Using this research, they created Kippa, an app that simplifies bookkeeping, making it easy to send invoices, store receipts, and set up a bank account. It’s now used by over half a million businesses in Nigeria. As Kennedy says, “without Google Play, we couldn't help as many business owners”.

To round up today, Gijs and Eefje. The couple adore renting campervans and travelling around to explore the natural beauty of Europe, but they always seemed to struggle with one thing - easily finding places to stay. Feeling like nothing out there could help them, they decided to give app development a go and create Campy. The app works as a digital camping encyclopaedia: helping like-minded campervan enthusiasts discover the perfect spots to set up camp, plan their trips and meet others who love the outdoors. A few years after Campy launched, Gijs and Eefje now have 2 little girls to bring on their big adventures, and are elated with the feedback they have received - “it never ceases to amaze me what a tiny app can do for so many people”.

Check out all the stories from around the world at g.co/play/weareplay and stay tuned for more coming soon.



Latest updates on Android’s custom ML stack

Posted by The Android ML Platform Team

The use of on-device ML in Android is growing faster than ever thanks to its unique benefits over server-based ML, such as offline availability, lower latency, improved privacy, and lower inference costs.

When building on-device ML features, Android developers usually have a choice between two options: using a production-ready SDK that comes with pre-trained and optimized ML models, such as ML Kit, or, if they need more control, deploying their own custom ML models and features.

Today, we have some updates on Android’s custom ML stack - a set of essential APIs and services for deploying custom ML features on Android.


TensorFlow Lite in Google Play services is now Android’s official ML inference engine

We first announced TensorFlow Lite in Google Play services in Early Access Preview at Google I/O '21 as an alternative to standalone TensorFlow Lite. Since then, it has grown to serve billions of users every month via tens of thousands of apps. Last month we released the stable version of TensorFlow Lite in Google Play services and are excited to make it the official ML inference engine on Android.

Using TensorFlow Lite in Google Play services will not only allow you to save on binary size and benefit from performance improvements via automatic updates but also ensure that you can easily integrate with future APIs and services from Android’s custom ML stack as they will be built on top of our official inference engine.

If you are currently bundling TensorFlow Lite with your app, check out the documentation to migrate.
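
As a rough sketch of what the migration involves, assuming the play-services-tflite-java artifact described in that documentation (check the migration guide for current artifact names and versions): you initialize the runtime from Google Play services once, then create interpreters that use it instead of a bundled runtime.

// build.gradle.kts (module), replacing the bundled org.tensorflow:tensorflow-lite dependency:
// implementation("com.google.android.gms:play-services-tflite-java:16.0.1")

import android.content.Context
import com.google.android.gms.tflite.java.TfLite
import org.tensorflow.lite.InterpreterApi
import org.tensorflow.lite.InterpreterApi.Options.TfLiteRuntime
import java.nio.ByteBuffer

// modelBuffer is a hypothetical ByteBuffer holding your .tflite model.
fun createInterpreter(context: Context, modelBuffer: ByteBuffer, onReady: (InterpreterApi) -> Unit) {
    TfLite.initialize(context).addOnSuccessListener {
        val options = InterpreterApi.Options()
            .setRuntime(TfLiteRuntime.FROM_SYSTEM_ONLY) // use the Play services runtime
        onReady(InterpreterApi.create(modelBuffer, options))
    }
}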

TensorFlow Lite Delegates now distributed via Google Play services

Released a few years ago, the GPU delegate and NNAPI delegate let you leverage the processing power of specialized hardware such as GPUs, DSPs, or NPUs. Both delegates are now distributed via Google Play services.

We are also aware that, for advanced use cases, some developers want to use custom delegates directly. We’re working with our hardware partners on expanding access to their custom delegates via Google Play services.
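
For example, with the GPU delegate distributed through Google Play services, an availability check might look roughly like the sketch below; the artifact (play-services-tflite-gpu) and class names are taken from the public documentation as we understand it, so verify them against the current docs.

import android.content.Context
import com.google.android.gms.tflite.gpu.support.TfLiteGpu

// Asks Play services whether the GPU delegate is supported on this device
// before enabling GPU acceleration for your interpreter.
fun checkGpuDelegate(context: Context, onResult: (Boolean) -> Unit) {
    TfLiteGpu.isGpuDelegateAvailable(context)
        .addOnSuccessListener { available -> onResult(available) }
}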

Acceleration Service will help you pick the best TensorFlow Lite Delegate for optimal performance at runtime

Identifying the best delegate for each user can be a complex task on Android due to hardware heterogeneity. To help you overcome this challenge, we are building a new API that allows you to safely optimize the hardware acceleration configuration at runtime for your TensorFlow Lite models.

We are currently accepting applications for early access to the Acceleration Service and aim for a public launch early next year.

We will keep investing in Android’s custom ML stack

We are committed to providing the essentials for high performance custom on-device ML on Android.

As a summary, Android’s custom ML stack currently includes:

  • TensorFlow Lite in Google Play Services for high performance on-device inference
  • TensorFlow Lite Delegates for accessing hardware acceleration

Soon, we will release an Acceleration Service, which will help pick the optimal delegate for you at runtime.

You can read about and stay up to date with Android’s custom ML stack at developer.android.com/ml.

New features and tools to help you showcase your Play Store listing

Posted by Allison Chang (Product Manager, Google Play), Weifang Sun (Product Manager, Chrome OS), Manuel Wang (Product Manager, Google Play Console), and Marcus Leal (Product Manager, Google Play)

Your Play Store listing is the best way to help prospective users understand the functionality and value of your app. The assets and information you provide, such as descriptions, images, and videos, are essential to users deciding what to download.

Tailoring your app's assets to each form factor is more important than ever, as users are increasingly investing in connected devices beyond their phones, such as tablets, smart watches, and TVs. In fact, the number of active non-mobile Android devices has grown almost 30% in the last year.

Today, we’re announcing new features that put more of your store listing assets front and center in Google Play. We'll also walk through some best practices to help you optimize your listing and generate meaningful installs for your app.
 

Changes on Large Screens

New Content-Forward Formats on Play Homepages

On large screens like tablets, foldables, and Chromebooks, we’re continuing to make improvements that will enable users to discover the best apps for their devices. As we showcased at I/O earlier this year, we’re redesigning the Play Store for large screens and using your screenshots, videos, and descriptions directly in Apps and Games Home.
Play Store homepage for large screens (2023)
The goal of this content-forward approach is to better represent your app in the store and help users make install decisions.

We’ve published a set of content quality guidelines as best practices to showcase your app on large screens. Beginning early next year, apps with assets that follow these criteria will be able to take advantage of richer formats in Play. This won’t impact your app’s promotability, just the way your app is displayed in the Play Store.

Screenshot Support for ChromeOS

When users browse the Play Store on Chromebooks today, they see tablet or phone screenshots in the app’s store listing page. Since this does not always accurately portray the Chromebook experience, we’re now launching the ability to upload Chromebook-specific screenshots in Play Console.
Chromebook screenshots in Play Developer Console

You can upload up to 8 Chromebook screenshots, which will be shown primarily on the Play Store for Chromebooks. These screenshots will appear on both your app listing page and Play homepages.

We recommend using 16:9 screenshots for landscape with dimensions of 1080-7680px.

To get started, visit the Main Store Listing section in Play Console.

Updates to Tablet Screenshot Guidelines

With the new launch of ChromeOS screenshot support, we’re also updating our quality guidelines for tablets for consistency across large screens. While previously uploaded tablet screenshots will not be affected, this should help simplify the process of generating new screenshots when you make updates to your app.


Changes on Phones

Homepages for Other Devices

Last month, we introduced form-factor-specific homepages. This is a dedicated surface on phones for users that have additional non-mobile devices. These homepages improve the visibility of your app and store listing details by allowing users to browse for titles best suited for their smart watches, TVs or cars - all from their phones.
Homepages for other devices

Search Device Filters and Remote Install

Users can also filter results in search with a new device filter in Play. With the filter enabled, search results will only include titles that are compatible with the selected device.


Device search filters
Remote install to other devices

Store Listing Best Practices

Since these changes will make your store listing details much more prominent in Play, here are some ways to help you optimize your app assets:

Use device-specific screenshots that demonstrate the core app or game experience.

In Play Console, you can upload screenshots to show users how your app or game will look on different device types and highlight unique form factor features. When choosing screenshots, use imagery that conveys the primary user flows within your app. This will help users on all devices anticipate what the true app or game experience will be like for them.

Use device imagery with caution

Showing a physical device in your store listing may cause your screenshots and videos to become obsolete quickly or alienate some users. To save time maintaining your assets, use screenshots and videos of just the app or game experience.


Use high-quality images with the proper aspect ratio and resolution

Using high quality images is essential to ensuring your screenshots look great on all screen sizes. Don’t include screenshots that are pixelated, stretched or compressed, or improperly rotated.

Avoid overloading assets with text

To make sure your screenshots and videos look great when featured on Play homepages, avoid using too much text. We may resize your assets to fit certain screen sizes, so keeping text to a minimum prevents it from being cut off unintentionally.

If you need to use text, avoid any time-sensitive copy that needs to be updated frequently.

As we continue to test ways to feature your store listing information more prominently in Play, the quality of your assets remains as important as ever. We hope these features and tips empower you to showcase the best of your app on all device types. For more tips like these to help you get started, visit our content quality guidelines.

Bringing passkeys to Android & Chrome

Posted by Diego Zavala, Product Manager (Android), Christiaan Brand, Product Manager (Account Security), Ali Naddaf, Software Engineer (Identity Ecosystems), Ken Buchanan, Software Engineer (Chrome)

Explore passkeys on Android & Chrome starting today

Starting today, Google is bringing passkey support to both Android and Chrome.

Passkeys are a significantly safer replacement for passwords and other phishable authentication factors. They cannot be reused, don't leak in server breaches, and protect users from phishing attacks. Passkeys are built on industry standards and work across different operating systems and browser ecosystems, and can be used for both websites and apps.

Passkeys follow already familiar UX patterns, and build on the existing experience of password autofill. For end-users, using one is similar to using a saved password today, where they simply confirm with their existing device screen lock such as their fingerprint. Passkeys on users’ phones and computers are backed up and synced through the cloud to prevent lockouts in the case of device loss. Additionally, users can use passkeys stored on their phone to sign in to apps and websites on other nearby devices.

Today’s announcement is a major milestone in our work with passkeys, and enables two key capabilities:

  1. Users can create and use passkeys on Android devices, which are securely synced through the Google Password Manager.
  2. Developers can build passkey support on their sites for end-users using Chrome via the WebAuthn API, on Android and other supported platforms.

To try this today, developers can enroll in the Google Play Services beta and use Chrome Canary. Both features will be generally available on stable channels later this year.

Our next milestone in 2022 will be an API for native Android apps. Passkeys created through the web API will work seamlessly with apps affiliated with the same domain, and vice versa. The native API will give apps a unified way to let the user pick either a passkey or a saved password. Seamless, familiar UX for both passwords and passkeys helps users and developers gradually transition to passkeys.

Signing in to a website on an Android device with a passkey

For the end-user, creating a passkey requires just two steps: (1) confirm the passkey account information, and (2) present their fingerprint, face, or screen lock when prompted.

 

Signing in is just as simple: (1) The user selects the account they want to sign in to, and (2) presents their fingerprint, face, or screen lock when prompted.

 

Signing in to a website on a nearby computer with a passkey on an Android device

A passkey on a phone can also be used to sign in on a nearby device. For example, an Android user can now sign in to a passkey-enabled website using Safari on a Mac. Similarly, passkey support in Chrome means that a Chrome user, for example on Windows, can do the same using a passkey stored on their iOS device.

Since passkeys are built on industry standards, this works across different platforms and browsers - including Windows, macOS and iOS, and ChromeOS, with a uniform user experience.

We will continue to do our part for a passwordless future

We have worked with others in the industry, including Apple and Microsoft, and members within the FIDO Alliance and the W3C to drive secure authentication standards for years. We have shipped support for W3C WebAuthn and FIDO standards since their inception.

Today is another important milestone, but our work is not done. Google remains committed to a world where users can choose where their passwords, and now passkeys, are stored. Please stay tuned for more updates from us in the next year as we introduce changes to Android, enabling third party credential managers to support passkeys for their users.

Announcing an Experimental Preview of Jetpack Multiplatform Libraries

Posted by Márton Braun, Developer Relations Engineer

Since we announced Kotlin support for Android in 2017, developers have been excited about writing their Android apps using Kotlin. We’ve continuously expanded this support for the language over the years, going Kotlin-first with Jetpack libraries and documentation, and then further investing into Kotlin with Jetpack Compose. We’ve also seen the interest of the community in Kotlin’s multiplatform capabilities.

Kotlin Multiplatform Mobile from JetBrains is now in beta, and we have been experimenting with this technology to see how it can enable code sharing across platforms. As part of these experiments, we are now sharing a preview of Kotlin Multiplatform libraries in Jetpack.

The libraries available for multiplatform as part of this experimental preview are Collections and DataStore. These were chosen as they evaluate several important aspects of converting an existing library to multiplatform:

  • Collections is an example of a library written in the Java programming language that has no Android-specific dependencies, but implements Java collection APIs.
  • DataStore is written entirely in Kotlin, and it uses coroutines in both its implementation and APIs. It also depends on Java IO and Android platform APIs.

With this preview, we’re looking for your feedback about using these Jetpack libraries in multiplatform projects targeting Android and iOS applications. Keep in mind that these dev builds are experimental and should not be used in production. They are published outside the regular release cycle of these libraries, and they are not guaranteed to graduate to stable.

The libraries are available from Google’s Maven repository. To start using them, add the following dependencies to your Kotlin Multiplatform project:

val commonMain by getting {
  dependencies {
      implementation("androidx.collection:collection:1.3.0-dev01")

      // Lower-level APIs with support for custom serialization
      implementation("androidx.datastore:datastore-core-okio:1.1.0-dev01")
      // Higher-level APIs for storing values of basic types
      implementation("androidx.datastore:datastore-preferences-core:1.1.0-dev01")
  }
}

You can learn more about the available APIs by checking out our sample app which uses DataStore on Android and iOS, or in the preview API reference documentation available for both libraries.
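
To give a feel for the shared API surface, here is a rough sketch of what common (commonMain) code using the preferences DataStore could look like with these dev builds; the factory call and the key name are illustrative, so check the preview API reference and the sample app for the exact shape of the API.

import androidx.datastore.core.DataStore
import androidx.datastore.preferences.core.PreferenceDataStoreFactory
import androidx.datastore.preferences.core.Preferences
import androidx.datastore.preferences.core.edit
import androidx.datastore.preferences.core.intPreferencesKey
import kotlinx.coroutines.flow.Flow
import kotlinx.coroutines.flow.map
import okio.Path.Companion.toPath

// producePath is supplied per platform (for example from Context.filesDir on Android
// or the documents directory on iOS); "counter" is a hypothetical preference key.
class CounterStore(producePath: () -> String) {
    private val dataStore: DataStore<Preferences> =
        PreferenceDataStoreFactory.createWithPath(produceFile = { producePath().toPath() })

    private val counterKey = intPreferencesKey("counter")

    val counter: Flow<Int> = dataStore.data.map { prefs -> prefs[counterKey] ?: 0 }

    suspend fun increment() {
        dataStore.edit { prefs -> prefs[counterKey] = (prefs[counterKey] ?: 0) + 1 }
    }
}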

To provide feedback about your experience with the multiplatform Jetpack libraries, or to show your interest in Kotlin Multiplatform, join the conversation in the Kotlinlang #multiplatform channel. You can also open bugs on the issue tracker for DataStore or for Collections.

*Java is a trademark or registered trademark of Oracle and/or its affiliates.

The new Google Pixel Watch is here – start building for Wear OS!

Posted by the Android Developers Team

If you caught yesterday's Made by Google event, then you saw the latest devices in the Pixel portfolio. Besides the Pixel 7 and Pixel 7 Pro phones, we wanted to showcase two of the latest form factors: the Google Pixel Tablet1 (Google's brand new tablet, coming in 2023), and the latest device powered with Wear OS by Google: the Google Pixel Watch! As consumers begin to preorder the watch, it's an especially great time to prepare your app so it looks great on all of the new watches that consumers will get their hands on over the holidays. Discover the latest updates to Wear OS, how apps like yours are upgrading their experiences, and how you can get started building a beautiful, efficient Wear OS app.

Here’s What’s New in Wear OS

The Google Pixel Watch is built on Wear OS and includes the latest updates to the platform, Wear OS 3.5. This version of Wear OS is also available on some of your other favorite Wear OS devices! The new Wear OS experience is designed to feel fluid and easy to navigate, bringing users the information they need with a tap, swipe, or voice command. With a refreshed UI and rich notifications, your users can see even more at a glance.

To take advantage of building on top of all of these new features, earlier this year we released Compose for Wear OS, our modern declarative UI toolkit designed to help you get your app running with fewer development hours - and fewer lines of code. It's built from the bottom up with Kotlin, and it moved to 1.0 earlier this year, meaning the API is stable and ready for you to get building. Here's what's in the 1.0 release:

  • Material: The Compose Material catalog for Wear OS already offers more components than are available with View-based layouts. The components follow material styling and also implement material theming, which allows you to customize the design for your brand.
  • Declarative: Compose for Wear OS leverages Modern Android Development and works seamlessly with other Jetpack libraries. Compose-based UIs in most cases result in less code and accelerate the development process as a whole; read more.
  • Interoperable: If you have an existing Wear OS app with a large View-based codebase, it's possible to gradually adopt Compose for Wear OS by using the Compose Interoperability APIs rather than having to rewrite the whole codebase.
  • Handles different watch shapes: Compose for Wear OS extends the foundation of Compose, adding a DSL for all curved elements to make it easy to develop for all Wear OS device shapes: round, square, or rectangular with minimal code.
  • Performance: Each Compose for Wear OS library ships with its own baseline profiles that are automatically merged and distributed with your app’s APK and are compiled ahead of time on device. In most cases, this achieves app performance for production builds that is on-par with View-based apps. However, it’s important to know how to configure, develop, and test your app’s performance for the best results. Learn more.
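
To make that concrete, a small screen built with the 1.0 APIs might look like the sketch below: a Scaffold with TimeText at the top and a ScalingLazyColumn of Chips that scale as they scroll. The task list and click handler are placeholder names.

import androidx.compose.foundation.layout.fillMaxSize
import androidx.compose.runtime.Composable
import androidx.compose.ui.Modifier
import androidx.wear.compose.material.Chip
import androidx.wear.compose.material.ChipDefaults
import androidx.wear.compose.material.Scaffold
import androidx.wear.compose.material.ScalingLazyColumn
import androidx.wear.compose.material.Text
import androidx.wear.compose.material.TimeText

@Composable
fun TaskListScreen(tasks: List<String>, onTaskClick: (String) -> Unit) {
    Scaffold(timeText = { TimeText() }) {
        ScalingLazyColumn(modifier = Modifier.fillMaxSize()) {
            items(tasks.size) { index ->
                Chip(
                    onClick = { onTaskClick(tasks[index]) },
                    label = { Text(tasks[index]) },
                    colors = ChipDefaults.primaryChipColors()
                )
            }
        }
    }
}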

Another exciting update for Wear OS is the launch of the Tiles Material library to help you build tiles more quickly. The Tiles Material library includes pre-built Material components and layouts that embrace the latest Material Design for Wear OS. This easy-to-use library includes components for buttons, progress arcs, and more, saving you the time of building them from scratch. Plus, with the pre-built layouts, you can kickstart your tile development knowing your layout follows Material Design guidelines on how tiles should be formatted.
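
If you want to try it, the Material components live alongside the core Tiles library; a sketch of the dependencies (the versions here are illustrative, so check the Tiles release notes for current ones):

dependencies {
    implementation("androidx.wear.tiles:tiles:1.1.0")
    // Pre-built Material components and layouts for tiles
    implementation("androidx.wear.tiles:tiles-material:1.1.0")
}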

Finally, in the recently released Android Studio Dolphin, we added a range of Wear OS features to help get your apps, tiles, and watch faces ready for all of the Wear OS 3 devices. With an updated Wear OS Emulator Toolbar, an intuitive Pairing Assistant, and the new Direct Surface Launch feature to quickly test watch faces, tiles, and complications, it's now simpler and more efficient than ever to make great apps for Wear OS.

Get Inspired with New App Experiences

Apps like yours are already providing fantastic experiences for Wear OS, from Google apps to others like Spotify, Strava, Bitmoji, adidas Running, MyFitnessPal, and Calm. This year, Todoist, PeriodTracker, and Outdooractive all rebuilt their apps with Compose, taking advantage of the tools and APIs that make building their apps simpler and more efficient; in fact, Outdooractive found that using Compose for Wear OS cut development time by 30% for their team.

With the launch of the Google Pixel Watch, we are seeing fantastic new experiences from Google apps, which use the new hardware features as another way to provide an exceptional user experience. Google Photos now allows you to set your favorite picture as your watch face on the Google Pixel Watch, which has 19 customizable watch faces, each with many personalization options. With Google Assistant built in, Google Pixel Watch users can interact with their favorite apps by using the Wear OS app or leveraging the built-in integration with Google Assistant. For example, with Google Home’s latest updates, users can easily control their smart home devices through the Wear OS app or by saying “Hey Google” to their watch to do everything from adjusting the thermostat to getting notifications from their Nest doorbell when a person or package is at the door2.

Health and fitness apps have a lot of opportunity with the latest Wear OS platform and hardware updates. Google Pixel Watch includes Fitbit’s amazing health and fitness features, including accurate heart rate tracking with on-device machine learning and deep optimization down to the processor level. Users can get insights into key metrics like breathing rate, heart rate variability, sleep quality and more right on their Google Pixel Watch. With this improved data, there are more opportunities for health and fitness apps to provide meaningful insights and experiences for their users.

The updates and improvements from Wear OS and the Google Pixel Watch make building differentiated app experiences more tangible. Apps are using those capabilities to excite and delight users and so can you.

Get started

The Google Pixel Watch is the latest addition to an already incredible Wear OS device ecosystem. From improved APIs and tools to exciting new hardware, there is no time like the present to get started on your Wear OS app. To begin developing with Compose for Wear OS, get started on our curated learning pathway for a step-by-step learning journey. Then, check out the documentation including a quick start guide and get hands on experience with the Compose for Wear OS codelab!

Discover even more with the Wear OS session from Google I/O and hear the absolute latest and greatest from Wear OS by tuning into the keynote and technical sessions at the upcoming Android Developer Summit!

Want to learn more about all the Made by Google announcements? Check out the official blog here. Plus, get started with another exciting form factor coming to the Pixel ecosystem, the Google Pixel Tablet, by optimizing your app for tablets!

Disclaimers:

1. The Google Pixel Tablet has not been authorized as required by the rules of the Federal Communications Commission or other regulators. This device may not be sold or otherwise distributed until required legal authorizations have been obtained. 
2. Requires compatible smart home devices (sold separately).

Todoist adopted Compose for Wear OS and increased its growth rate by 50%

Posted by The Android Developers Team

Todoist is the world’s top task and time management app, empowering over 30 million people to organize, plan, and collaborate on projects big and small. As a company, Todoist is committed to creating more fulfilling ways for its users to work and live—which includes access to its app across all devices.

That’s why Todoist developers adopted Compose for Wear OS to completely rebuild its app for wearables. This new UI toolkit gives developers the same easy-to-use suite that Android has made available for other devices, allowing for efficient, manageable app development.

A familiar toolkit optimized for Wear OS

Developers at Todoist already had experience with Jetpack Compose for Android mobile, which allowed them to quickly familiarize themselves with Compose for Wear OS. “When the new Wear design language and Compose for Wear OS were announced, we were thrilled,” said Rastislav Vaško, head of Android for Todoist. “It gave us new motivation and an opportunity to invest in the future of the platform.”

As with Jetpack Compose for mobile, developers can integrate customizable components directly from the Compose for Wear OS toolkit, allowing them to write code and implement design requirements much faster than with the View-based layouts they used previously. With the available documentation and hands-on guidance from the Compose for Wear OS codelab, they were able to translate their prior toolkit knowledge to the wearable platform.

“Compose for Wear OS had almost everything we needed to create our layouts,” said Rastislav. “Swipe-dismiss, TimeText, and ScalingLazyList were all components that worked very well out of the box for us, while still allowing us to make a recognizable and distinct app.” For features that were not yet available in the toolkit, the Todoist team used Google’s Horologist—a group of open-source libraries which provide Wear OS developers with features that are commonly required by developers but not yet available. From there, they used the Compose Layout Library to incorporate the fade away modifier that matched the native design guidelines.

Compose for Wear OS shifts development into overdrive

Compose for Wear OS simplifies UI development for Wear OS, letting engineers create complex screens that are both readable and maintainable because of its rich Kotlin syntax and modern declarative approach. This was a significant benefit for the production of the new Todoist application, enabling developers to achieve more in less time.

The central focus of the overhaul was to redesign all screens and interactions to conform with the latest Material Design for Wear OS. Using Compose for Wear OS, Todoist developers shifted away from WearableDrawerLayout in favor of a flatter app structure. This switch followed Material Design for Wear OS guidance and modernized the application’s layout.

Todoist developers designed each screen specifically for Wear OS devices, removing unnecessary elements that complicated the user experience.

“For wearables, we’re always thinking about what we can leave out, to keep only streamlined, focused, and quick interactions,” Rastislav said. Compose for Wear OS helped the Todoist team tremendously with both development and design, allowing them to introduce maintainable implementation while providing a consistent user experience.

"Since we rebuilt our app with Compose for Wear OS, Todoist’s growth rate of installations on Google Play increased by 50%." 

An elevated user and developer experience

The developers at Todoist rapidly and efficiently created a refreshed application for Wear OS using Jetpack Compose. The modern tooling; intuitive APIs; and host of resources, documentation, and samples made for a smooth design and development process that required less code and accelerated the delivery of a new, functional user experience.

Since the app was revamped, the growth rate for Todoist installs on Google Play has increased 50%, and the team has received positive feedback from internal teams and on social media.

The team at Todoist is looking forward to discovering what else Compose for Wear OS can do for its application. They saw the refresh as an investment in the future of wearables and are excited for the additional opportunities and feature offerings provided by devices running Wear OS 3.

Transform your app with Compose for Wear OS

Todoist completely rebuilt and redesigned its Wear OS application with Compose for Wear OS, improving both the user and developer experience.

Learn more about Jetpack Compose for Wear OS: