Tag Archives: wear

Introducing the Fused Orientation Provider API: Consistent device orientation for all

Posted by Geoffrey Boullanger – Senior Software Engineer, Shandor Dektor – Sensors Algorithms Engineer, Martin Frassl and Benjamin Joseph – Technical Leads and Managers

Device orientation, or attitude, is used as an input signal for many use cases: virtual or augmented reality, gesture detection, or compass and navigation – any time the app needs the orientation of a device in relation to its surroundings. We’ve heard from developers that orientation is challenging to get right, with frequent user complaints when orientation is incorrect. A maps app should show the correct direction to walk towards when a user is navigating to an exciting restaurant in a foreign city!

The Fused Orientation Provider (FOP) is a new API in Google Play services that provides high-quality, consistent device orientation by fusing signals from the accelerometer, gyroscope, and magnetometer.

The Android Rotation Vector already provides device orientation (and will continue to do so), but the new FOP offers more consistent behavior and higher performance across devices. We designed the FOP API to be similar to the Rotation Vector to make the transition as easy as possible for developers.

In particular, the Fused Orientation Provider:

    • Provides a unified implementation across devices: an API in Google Play services means that there is no implementation variance across different manufacturers. Algorithm updates can be rolled out quickly and independently of Android platform updates;
    • Directly incorporates local magnetic declination, if available;
    • Compensates for lower quality sensors and OEM implementations (e.g., gyro bias, sensor timing).

In certain cases, the FOP returns values piped through from the AOSP Rotation Vector, adapted to incorporate magnetic declination.

How to use the FOP API

Device orientation updates can be requested by creating and sending a DeviceOrientationRequest object, which defines some specifics of the request like the update period.

The FOP then outputs a stream of the device’s orientation estimates as quaternions. The orientation is referenced to geographic north. In cases where the local magnetic declination is not known (e.g., location is not available), the orientation will be relative to magnetic north.

In addition, the FOP provides the device’s heading and accuracy, which are derived from the orientation estimate. This is the same heading that is shown in Google Maps, which uses the FOP as well. We recently added changes to better cope with magnetic disturbances, to improve the reliability of the heading accuracy cone for Google Maps and FOP clients.

The update rate can be set by requesting a specific update period. The FOP does not guarantee a minimum or maximum update rate. For example, the update rate can be faster than requested if another app has a faster parallel request, or slower than requested if the device doesn’t support the requested rate.

For the full specification of the API, please consult the API documentation.

Example usage (Kotlin)

package ...

import android.content.Context
import com.google.android.gms.location.DeviceOrientation
import com.google.android.gms.location.DeviceOrientationListener
import com.google.android.gms.location.DeviceOrientationRequest
import com.google.android.gms.location.FusedOrientationProviderClient
import com.google.android.gms.location.LocationServices
import com.google.common.flogger.FluentLogger
import java.util.concurrent.Executors

class Example(context: Context) {
  private val logger: FluentLogger = FluentLogger.forEnclosingClass()

  // Get the FOP API client
  private val fusedOrientationProviderClient: FusedOrientationProviderClient =
    LocationServices.getFusedOrientationProviderClient(context)

  // Create an FOP listener
  private val listener: DeviceOrientationListener =
    DeviceOrientationListener { orientation: DeviceOrientation ->
      // Use the orientation object returned by the FOP, e.g.
      logger.atFinest().log("Device Orientation: %s deg", orientation.headingDegrees)
    }

  fun start() {
    // Create an FOP request
    val request =
      DeviceOrientationRequest.Builder(DeviceOrientationRequest.OUTPUT_PERIOD_DEFAULT).build()

    // Create (or re-use) an Executor or Looper, e.g.
    val executor = Executors.newSingleThreadExecutor()

    // Register the request and listener
    fusedOrientationProviderClient
      .requestOrientationUpdates(request, executor, listener)
      .addOnSuccessListener { logger.atInfo().log("FOP: Registration Success") }
      .addOnFailureListener { e: Exception? ->
        logger.atSevere().withCause(e).log("FOP: Registration Failure")
      }
  }

  fun stop() {
    // Unregister the listener
    fusedOrientationProviderClient.removeOrientationUpdates(listener)
  }
}

Technical background

The Android ecosystem has a wide variety of system implementations for sensors. To use the Fused Orientation Provider, a device must have an accelerometer, gyroscope, and magnetometer, and it should meet the criteria in the Android Compatibility Definition Document (CDD). It is preferable that the device vendor implements the high-fidelity sensor portion of the CDD.

Even though Android devices adhere to the Android CDD, the recommended sensor specifications are not tight enough to fully prevent orientation inaccuracies. Examples include magnetometer interference from internal sources, and delayed, inaccurate, or nonuniform sensor sampling. Furthermore, the environment around the device usually includes materials that distort the geomagnetic field, and user behavior can vary widely. To deal with this, the FOP performs a number of tasks in order to provide a robust and accurate orientation:

    • Synchronize sensors running on different clocks and delays;
    • Compensate for the hard iron offset (magnetometer bias);
    • Fuse accelerometer, gyroscope, and magnetometer measurements to determine the orientation of the device in the world (a simplified illustration of this step is sketched after this list);
    • Compensate for gyro drift (gyro bias) while moving;
    • Produce a realistic estimate of the compass heading accuracy.
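
To make the fusion step above more concrete, below is a deliberately simplified, illustrative sketch (not the FOP implementation): a complementary filter that blends a magnetometer-only heading, computed under the assumption that the device is held roughly flat, with integrated gyroscope yaw. The real provider additionally performs tilt compensation, clock synchronization, bias estimation, declination correction, and accuracy estimation.

import kotlin.math.PI
import kotlin.math.atan2

// Illustrative only, NOT the FOP implementation: blends a noisy but drift-free magnetometer
// heading with smooth but drifting gyroscope integration, assuming the device lies roughly flat.
class NaiveHeadingFilter(private val gyroWeight: Float = 0.98f) {
  private var headingRad = 0f
  private var initialized = false

  // mag: magnetometer sample in device coordinates (µT)
  // gyroZ: rotation rate around the device Z axis (rad/s)
  // dtSeconds: time since the previous sample
  // Returns the estimated heading in radians, 0 = magnetic north, increasing toward east.
  fun update(mag: FloatArray, gyroZ: Float, dtSeconds: Float): Float {
    // Heading observed by the magnetometer alone (flat-device approximation).
    val magHeading = atan2(-mag[0], mag[1])

    if (!initialized) {
      headingRad = magHeading
      initialized = true
      return headingRad
    }

    // Heading predicted by integrating the gyroscope; a positive (counterclockwise) rotation
    // about +Z decreases the clockwise heading.
    val gyroHeading = headingRad - gyroZ * dtSeconds

    // Complementary filter: trust the gyro in the short term, the magnetometer in the long term.
    headingRad = gyroWeight * gyroHeading + (1 - gyroWeight) * unwrap(magHeading, gyroHeading)
    return headingRad
  }

  // Shifts angle by multiples of 2π so it lies within ±π of reference before blending.
  private fun unwrap(angle: Float, reference: Float): Float {
    val twoPi = (2 * PI).toFloat()
    var a = angle
    while (a - reference > PI) a -= twoPi
    while (a - reference < -PI) a += twoPi
    return a
  }
}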

We have validated our algorithms on comprehensive test data to provide a high quality result on a wide variety of devices.

Availability and limitations

The Fused Orientation Provider is available on all devices running Google Play services on Android 5 (Lollipop) and above. Developers need to add the dependency com.google.android.gms:play-services-location:21.2.0 (or above) to access the new API.

Permissions

No permissions are required to use the FOP API. The output rate is limited to 200Hz on devices running API level 31 (Android S) or higher, unless the android.permission.HIGH_SAMPLING_RATE_SENSORS permission is declared in your AndroidManifest.xml.

Power consideration

Always request the longest update period (lowest frequency) that is sufficient for your use case. While more frequent FOP updates can be required for high-precision tasks (for example augmented reality), they come with a power cost. If you do not know which update period to use, we recommend starting with DeviceOrientationRequest.OUTPUT_PERIOD_DEFAULT, as it fits most client needs.
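
As a minimal sketch building on the example above, a client that only needs infrequent compass-style updates could request a longer period explicitly. One assumption here: the Builder argument is the output period in microseconds, matching the OUTPUT_PERIOD_DEFAULT constant used earlier (verify the unit in the API reference).

import com.google.android.gms.location.DeviceOrientationListener
import com.google.android.gms.location.DeviceOrientationRequest
import com.google.android.gms.location.FusedOrientationProviderClient
import java.util.concurrent.Executor
import java.util.concurrent.TimeUnit

// Sketch only: request roughly one orientation update every 500 ms instead of the default period.
// Assumption: the Builder takes the output period in microseconds, like OUTPUT_PERIOD_DEFAULT.
fun requestSlowOrientationUpdates(
  client: FusedOrientationProviderClient,
  executor: Executor,
  listener: DeviceOrientationListener
) {
  val slowRequest =
    DeviceOrientationRequest.Builder(TimeUnit.MILLISECONDS.toMicros(500)).build()

  // The FOP may still deliver updates faster or slower than requested, as noted above.
  client.requestOrientationUpdates(slowRequest, executor, listener)
}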

Foreground behavior

FOP updates are only available to apps running in the foreground.


Copyright 2023 Google LLC.
SPDX-License-Identifier: Apache-2.0

Designing for Wear OS: Getting started with designing inclusive smartwatch apps

Posted by Matthew Pateman & Mallory Carroll (UX Research), and Josef Burnham (UX Design)

Smartwatches are becoming increasingly popular, with many people using them to stay connected, track their health, and control their devices. Watches enable people to get information at a glance and then take action. These quick and frequent interactions can help people get back to being present in their daily lives.

To help with the challenges of designing and building great watch experiences that work for all, we have created a series of videos. These videos cover a variety of topics, starting with how to understand what people want from a smartwatch app. We cover how best to design for your target audience, and how to make the most of the watch’s form factor with a series of design principles. Lastly, we give you an introduction to approaching product inclusion throughout the whole development lifecycle, and how this approach can help make your products better for all. If you’re interested in learning more, be sure to check out the videos below.


1. Introduction to UX Research & Product Inclusion on Wear OS

If you’re considering building a smartwatch app but don’t know how to begin, this video will help you get started. It shows how to uncover what people want from a smartwatch app, what a great Wear OS experience should look like, and how to ensure it addresses real needs of the people you are building for. Lastly, you’ll find out how to take an equity-focused approach when developing products, apps, and experiences.


2. Introduction to UX Design on Wear OS

Did you know that the average smartwatch interaction is approximately 5 seconds long? In this video you will learn how to design effective and engaging experiences for Wear OS. We’ll guide you on how to make the most out of these short watch interactions by covering key differences between mobile and smartwatch design, the importance of a glanceable user experience, and practical tips for designing for different Wear OS surfaces.


3. Introduction to Product Inclusion & Equity

We will introduce you to Product Inclusion and Equity, and how to approach it when designing for Wear OS. You will learn how to build for belonging and make products more accessible and usable by all.


4. Case Studies: Inclusion and Exclusion in Technology Design

Here you will see a series of case studies showing how product and design choices can be impactful on a personal, community, and systemic level. Designs can be affirming and inclusive, or harmful and exclusionary, to various people and communities. We’ll use some examples to highlight how important inclusion and equity considerations are when making product decisions.


5. Considerations for Community Co-Design

The last video in this series will give you an introduction into community co-design, a powerful approach that focuses on building solutions with, not for, historically marginalized communities. In community co-design, we engage with people based on identity, culture, community, and context. You’ll find out how to engage people and communities in a safe, respectful, and equity-centered way in product development.


Keep your eyes peeled for more updates from us as we continue to share and evolve our latest design thinking and practices, principles, and guidelines.

We also have many more resources to help get you started designing for Wear OS:

  • Find inspiring designs for different types of apps in our gallery
  • Interested in designing for multiple devices, from TVs to mobiles to tablets? Check out our design hub
  • Access developer documentation for Wear OS

Watch the Wear OS updates at I/O 2023

Posted by Kseniia Shumelchyk, Android Developer Relations Engineer

As we continue to evolve the Wear OS platform, we're excited to share with you some of the newest features and improvements that have been added to help you create innovative and engaging experiences for your users.

Partners like Peloton and Todoist have been building exceptional experiences for Wear OS, and seeing the impact on their feature adoption and engagement. Hear directly from Peloton engineers about how they built a differentiated experience for the watch with Compose for Wear OS.


In this blog post, we’ll be highlighting some of the key updates we announced at Google I/O this year, so let’s dive in and explore the latest advancements in Wear OS!

Wear OS 4 Developer Preview

Today we’re releasing the first Developer Preview of Wear OS 4, the next version of Google’s smartwatch platform arriving later this year. It has enhancements to security, user customization, and power optimizations.

This preview introduces several new tools to help enhance your Wear OS app experience:

Watch Face Format

We are launching the Watch Face Format, a new way to create watch faces for Wear OS. The format makes it easier to create customizable and more power-efficient watch faces for Wear OS 4. Developed in partnership with Samsung, the Watch Face Format is a declarative XML format, so there is no executable code involved in creating a watch face and there will be no code embedded in your watch face APK. Read more.

Watch faces created using the new Format

Tiles

Wear OS tiles give users fast, predictable access to the information and actions they rely on most. Version 1.2 of the Jetpack Tiles library introduces support for platform data bindings, so if your tile uses platform data sources such as heart rate, step count, or time, your tile can be updated once per second.

The new version of tiles also adds support for animations. You can use tween animations to create smooth transitions on changes to part of your layout, and transition animations can animate new or disappearing elements from the tile.

Examples of animated Tiles

Get your app ready

Wear OS 4 is based on Android 13, which is several versions newer than the current Wear OS version, so your app will need to handle the system behavior changes that took effect in Android 12 and Android 13. We recommend you start by testing your app and releasing a compatible update first – as devices get upgraded to Wear OS 4, it’s a basic but critical level of quality that provides a good app experience for users.

Download the Wear OS 4 emulator in Android Studio Hedgehog to explore new features and test your app on Wear OS 4 Developer Preview.


Tooling and library updates

Wear OS support in Firebase Test Lab

Firebase Test Lab will support running tests for your standalone app on physical Google Pixel Watches in the next few weeks. You can run your automated tests on the Google Pixel Watch via Gradle Managed Devices, or use the Firebase Console to also run Robo tests. To learn more, check out available devices.

Wear OS support in the Pre-launch reports

Today we are also excited to announce Wear OS support in Google Play Pre-launch reports for standalone apps. The Pre-launch report helps to identify issues proactively before your app reaches users, so it’s an important tool to help you launch a high-quality app. It covers stability, accessibility, security and trust, and screenshot previews! At the moment the analysis runs on Wear emulators, and it is soon launching on Google Pixel Watches.

Emulator improvements

The Wear OS 4 emulator brings support for emulated Bluetooth, which lets you test more use cases, for example Bluetooth audio.

The new Wear OS 4 emulator doesn’t support 32-bit native code, so if your app uses native code, make sure that it includes both 32-bit and 64-bit native libraries. This will also prepare your app for upcoming 64-bit-only hardware.
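
If your app does ship native code, a minimal module-level build.gradle.kts sketch for packaging both 32-bit and 64-bit libraries could look like the following (adjust the ABI list to the architectures your native libraries are actually built for):

android {
    defaultConfig {
        ndk {
            // Include both 32-bit and 64-bit ABIs so the app runs on existing 32-bit watches
            // and on the 64-bit-only Wear OS 4 emulator.
            abiFilters += listOf("armeabi-v7a", "arm64-v8a", "x86", "x86_64")
        }
    }
}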

In Android Studio Hedgehog we also added capabilities for capturing screenshots and Logcat snapshots in the Wear OS emulator, so it is now much easier to generate screenshots for your app’s store listing.

Jetpack libraries

Since the latest stable Compose for Wear OS 1.1 release, we have continued to bring new features and improvements to the toolkit. Version 1.2 already has a number of alpha releases – check out the release notes to find out more.

Health Services version 1.0 has introduced a few new features in its latest beta releases. Most notably, it includes BatchingMode to deliver batched exercise data at a configured interval instead of the default interval, as well as an ExerciseTypeConfig API which enables updates during ongoing exercises, such as golfing. If you are interested in learning what's new in Android Health, check out this blog.


Start building for Wear OS now

Wear OS active devices have grown 5x since launching Wear OS 3, and it's the fastest growing smartwatch platform.

We’re excited to share our brand new Wear OS Gallery, where you can find even more guidance with proven design and development patterns for messaging, media, and health & fitness apps!

With the latest updates, you'll have even more tools at your disposal to create beautiful, high-quality wearable experiences.


Learn more

Get started building for Wear OS with hands-on experience! Try our Compose for Wear OS codelab, and check out the documentation and samples.

The new Wear OS quality requirements will come into effect on August 31, 2023, so consider them early when designing and developing your app.

We’re looking forward to seeing the experiences that you build!

Voice controlled workouts with Google Assistant

Posted by John Richardson, Partner Engineer

With tens of millions of installs of the adidas Running app, users every day turn to adidas as part of their health and fitness routine. Like many in the industry, adidas recognized that in this ever-evolving market, it's important to make it as easy as possible for users to achieve their fitness goals, and making their app available on Wear was a natural fit. adidas didn’t stop at bringing their running app to the watch, however. They also realized that in a situation such as a workout, the ability to engage with the application hands-free, or even eyes-free, further simplifies how users engage with the app.

Integrating Google Assistant

To enable hands-free control, adidas looked to Google Assistant and App Actions, which lets users control apps with their voice using built-in intents (BIIs). Users can perform specific tasks by voice, such as starting a run or a swim.

Integrating Health and Fitness BIIs was a simple addition that adidas’ staff Android developer made in their IDE by declaring <capability> tags in their shortcuts.xml file in order to create a consistent experience between the mobile app and a watch surface. It’s a process that looks like this:

  1. First, Assistant parses the user’s request using natural language understanding and identifies the appropriate BII. For example, START_EXERCISE to begin a workout.
  2. Second, Assistant will then fulfill the user’s intent by launching the application to a specified content or action. Besides START_EXERCISE, users can also stop (STOP_EXERCISE), pause (PAUSE_EXERCISE), or resume (RESUME_EXERCISE) their workouts. Haptic feedback or dings can also be added here to show whether a user request was successful or not.

With App Actions being built on Android, the development team was able to deploy quickly. And when partnered with the Health Services and Health Connect APIs, which respectively support real-time sensor data and health data, end users can have a cohesive and secure experience across mobile and Wear OS devices.

Moving image illustrating the adidas Running app launching via Google Assistant on a wearable device

“What’s exciting about Assistant and Wear is that the combination really helps our users reach their fitness goals. The ability for a user to leverage their voice to track their workout makes for a unique and very accessible experience,” says Robert Hellwagner, Director of Product Innovation for adidas Runtastic. “We are excited by the possibility of what can be done by enabling voice based interactions and experiences for our users through App Actions.”

Learn more

Enabling voice controls to unlock hands-free and eyes-free contexts is an easy way to create a more seamless app experience for your users. To bring natural, conversational interactions to your app, read our documentation today, explore how to build with one of our codelabs, or subscribe to our App Actions YouTube playlist for more information. You can also sign up to develop for Android Health Connect if you are interested in joining our Google Health and Fitness EAP. To jump right into how this integration was built, learn more about integrating Wear OS and App Actions.

Announcing Glance: Tiles for Wear OS made simple

Posted by Anna Bernbaum, Associate Product Manager


Last year we announced the Wear Tiles API. To complement that Java API, we are excited to announce that support for Wear OS Tiles has been added to Glance, a new framework built on top of Jetpack Compose designed to make it easier to build for surfaces outside your app on Android. We'd love to get your feedback on this alpha version.

Tiles provide Wear OS users easy access to the information and actions they need in order to get things done quickly. They are also one of the most used surfaces on Wear OS. Just one swipe away from the watch face, users can quickly access the most important information or actions from an app, like starting a timer or getting the latest weather forecast.




Let's see how we can create a Tile with Glance:


class HelloTileService : GlanceTileService() {
   @Composable
   override fun Content() {
       Text(text = "Hello Glance")
   }
}

The simple code above generates the Tile below.


“Hello Glance” Tile sample with Glance


Note: Using glance-wear-tiles requires `minSdkVersion` >= 26.



How it works

Glance creates “glanceable” experiences across Android surfaces using a base set of Composables. For Tiles on Wear OS, Glance translates these Composables into Tiles.


Diagram: Glance structure


Glance requires Compose to be enabled and depends on Runtime, Graphics, and Unit UI Compose layers, but it’s not directly interoperable with other existing Jetpack Compose UI elements, like Compose for Wear OS.

What’s in the Alpha

This initial release introduces the main APIs to build Wear OS Tiles.

We are working on bringing even more functionality with default theming, further Android Studio support, and more. Stay tuned for new releases.

Get started with Glance

For a quick start, take a look at the samples in the AndroidX repository. Glance works with the latest stable Android Studio; since Glance relies on the Compose Runtime, follow the steps in the Jetpack Compose docs to set it up first.

The Alpha version is your opportunity to influence the APIs, so please share your feedback and let us know your experience!

Happy Composing with Glance!

Develop watch faces with the stable Jetpack Watch Face library

Posted by Alex Vanyo, Developer Relations Engineer


Watch faces are one of the most visible ways that people express themselves on their smartwatches, and they’re one of the best ways to display your brand to your users.

Watch Face Studio from Samsung is a great tool for creating watch faces without writing any code. For developers who want more fine-tuned control, we've recently launched the Jetpack Watch Face library written from the ground up in Kotlin.

The stable release of the Jetpack Watch Face library includes all functionality from the Wearable Support Library and many new features that make it easier to support customization on the smartwatch and on the system companion app on mobile, including:

  • Watch face styling which persists across both the watch and phone (with no need for your own database or companion app).
  • Support for a WYSIWYG watch face configuration UI on the phone.
  • Smaller, separate libraries (that only include what you need).
  • Battery improvements through encouraging good battery usage patterns out of the box, such as automatically reducing the interactive frame rate when battery is low.
  • New screenshot APIs so users can see previews of their watch face changes in real time on both the watch and phone.

If you are still using the Wearable Support Library, we strongly encourage migrating to the new Jetpack libraries to take advantage of the new APIs and upcoming features and bug fixes.


Below is an example of configuring a watch face from the phone with no code written on or for the phone.

Editing a watch face using the Galaxy Wearable mobile companion app


If you use the Jetpack Watch Face library to save your watch face configuration options, the values are synced with the mobile companion app. That is, all the cross-device communication is handled for you.

The mobile app will automatically present those options to the user in a simple, intuitive user interface where they can change them to whatever works best for their style. It also includes previews that update in real time.

Let’s dive into the API with an overview of the most important components for creating a custom watch face!


WatchFaceService

A subclass of WatchFaceService forms the entry point of any Jetpack watch face. Implementing a WatchFaceService requires creating 3 objects: a UserStyleSchema, a ComplicationSlotsManager, and a WatchFace:

Diagram showing the 3 main parts of a WatchFaceService

These 3 objects are specified by overriding 3 abstract methods from WatchFaceService:

class CustomWatchFaceService : WatchFaceService() {

    /**
     * The specification of settings the watch face supports.
     * This is similar to a database schema.
     */
    override fun createUserStyleSchema(): UserStyleSchema = // ...

    /**
     * The complication slot configuration for the watchface.
     */
    override fun createComplicationSlotsManager(
        currentUserStyleRepository: CurrentUserStyleRepository
    ): ComplicationSlotsManager = // ...

    /**
     * The watch face itself, which includes the renderer for drawing.
     */ 
    override suspend fun createWatchFace(
        surfaceHolder: SurfaceHolder,
        watchState: WatchState,
        complicationSlotsManager: ComplicationSlotsManager,
        currentUserStyleRepository: CurrentUserStyleRepository
    ): WatchFace = // ...

}

Let’s take a more detailed look at each one of these in turn, and some of the other classes that the library creates on your behalf.


UserStyleSchema

The UserStyleSchema defines the primary information source for a Jetpack watch face. The UserStyleSchema should contain a list of all customization settings available to the user, as well as information about what those options do and what the default option is. These settings can be boolean flags, lists, ranges, and more.

By providing this schema, the library will automatically keep track of changes to settings by the user, either through the mobile companion app on a connected phone or via changes made on the smartwatch in a custom editor activity.

    override fun createUserStyleSchema(): UserStyleSchema =
        UserStyleSchema(
            listOf(
                // Allows user to change the color styles of the watch face
                UserStyleSetting.ListUserStyleSetting(
                    UserStyleSetting.Id(COLOR_STYLE_SETTING),
                    // ...
                ),
                // Allows user to toggle on/off the hour pips (dashes around the outer edge of the watch face)
                UserStyleSetting.BooleanUserStyleSetting(
                    UserStyleSetting.Id(DRAW_HOUR_PIPS_STYLE_SETTING),
                    // ...
                ),
                // Allows user to change the length of the minute hand
                UserStyleSetting.DoubleRangeUserStyleSetting(
                    UserStyleSetting.Id(WATCH_HAND_LENGTH_STYLE_SETTING),
                    // ...
                )
            )
        )

CurrentUserStyleRepository

The current user style can be observed via the CurrentUserStyleRepository, which is created by the library based on the UserStyleSchema.

It gives you a UserStyle which is just a Map with keys based on the settings defined in the schema:

Map<UserStyleSetting, UserStyleSetting.Option>

As the user’s preferences change, a MutableStateFlow of UserStyle will emit the latest selected options for all of the settings defined in the UserStyleSchema.

currentUserStyleRepository.userStyle.collect { newUserStyle ->
    // Update configuration based on user style
}

ComplicationSlotsManager

Complications allow a watch face to display additional information from other apps on the watch, such as events, health data, or the day.

The ComplicationSlotsManager defines how many complications a watch face supports, and where they are positioned on the screen. To support changing the location or number of complications, the ComplicationSlotsManager also uses the CurrentUserStyleRepository.

    override fun createComplicationSlotsManager(
        currentUserStyleRepository: CurrentUserStyleRepository
    ): ComplicationSlotsManager {
        val defaultCanvasComplicationFactory =
            CanvasComplicationFactory { watchState, listener ->
                // ...
            }
    
        val leftComplicationSlot = ComplicationSlot.createRoundRectComplicationSlotBuilder(
            id = 100,
            canvasComplicationFactory = defaultCanvasComplicationFactory,
            // ...
        )
            .setDefaultDataSourceType(ComplicationType.SHORT_TEXT)
            .build()
    
        val rightComplicationSlot = ComplicationSlot.createRoundRectComplicationSlotBuilder(
            id = 101,
            canvasComplicationFactory = defaultCanvasComplicationFactory,
            // ...
        )
            .setDefaultDataSourceType(ComplicationType.SHORT_TEXT)
            .build()

        return ComplicationSlotsManager(
            listOf(leftComplicationSlot, rightComplicationSlot),
            currentUserStyleRepository
        )
    }

WatchFace

The WatchFace describes the type of watch face and how to draw it.

A WatchFace can be specified as digital or analog and can optionally have a tap listener for when the user taps on the watch face.

Most importantly, a WatchFace specifies a Renderer, which actually renders the watch face:

    override suspend fun createWatchFace(
        surfaceHolder: SurfaceHolder,
        watchState: WatchState,
        complicationSlotsManager: ComplicationSlotsManager,
        currentUserStyleRepository: CurrentUserStyleRepository
    ): WatchFace = WatchFace(
        watchFaceType = WatchFaceType.ANALOG,
        renderer = // ...
    )

Renderer

The prettiest part of a watch face! Every watch face will create a custom subclass of a renderer that implements everything needed to actually draw the watch face to a canvas.

The renderer is in charge of combining the UserStyle (the map from CurrentUserStyleRepository), the complication information from ComplicationSlotsManager, the current time, and other state information to render the watch face.

class CustomCanvasRenderer(
    private val context: Context,
    surfaceHolder: SurfaceHolder,
    watchState: WatchState,
    private val complicationSlotsManager: ComplicationSlotsManager,
    currentUserStyleRepository: CurrentUserStyleRepository,
    canvasType: Int
) : Renderer.CanvasRenderer(
    surfaceHolder = surfaceHolder,
    currentUserStyleRepository = currentUserStyleRepository,
    watchState = watchState,
    canvasType = canvasType,
    interactiveDrawModeUpdateDelayMillis = 16L
) {
    override fun render(canvas: Canvas, bounds: Rect, zonedDateTime: ZonedDateTime) {
        // Draw into the canvas!
    }

    override fun renderHighlightLayer(canvas: Canvas, bounds: Rect, zonedDateTime: ZonedDateTime) {
        // Draw into the canvas!
    }
}

EditorSession

In addition to the system WYSIWYG editor on the phone, we strongly encourage supporting configuration on the smartwatch to allow the user to customize their watch face without requiring a companion device.

To support this, a watch face can provide a configuration Activity and allow the user to change settings using an EditorSession returned from EditorSession.createOnWatchEditorSession. As the user makes changes, calling EditorSession.renderWatchFaceToBitmap provides a live preview of the watch face in the editor Activity.
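
Below is a minimal sketch of such a configuration Activity. The two entry points named above (createOnWatchEditorSession and renderWatchFaceToBitmap) are real; the surrounding property names and parameters (userStyle, previewReferenceInstant, complicationsPreviewData, RenderParameters.DEFAULT_INTERACTIVE) are written from memory and should be checked against the androidx.wear.watchface.editor reference documentation.

import android.graphics.Bitmap
import android.os.Bundle
import androidx.activity.ComponentActivity
import androidx.lifecycle.lifecycleScope
import androidx.wear.watchface.RenderParameters
import androidx.wear.watchface.editor.EditorSession
import kotlinx.coroutines.launch

class WatchFaceConfigActivity : ComponentActivity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)

        lifecycleScope.launch {
            // Create an editor session scoped to this Activity (named in the text above).
            val editorSession =
                EditorSession.createOnWatchEditorSession(this@WatchFaceConfigActivity)

            // Re-render a preview whenever the user changes a style option. Property and
            // parameter names below are assumptions; verify them against the current API.
            editorSession.userStyle.collect {
                val preview: Bitmap = editorSession.renderWatchFaceToBitmap(
                    RenderParameters.DEFAULT_INTERACTIVE,
                    editorSession.previewReferenceInstant,
                    editorSession.complicationsPreviewData.value
                )
                // Show `preview` in this Activity's UI, e.g. in an ImageView.
            }
        }
    }
}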

To see how the whole puzzle fits together to tell the time, check out the watchface sample on GitHub. To learn more about developing for Wear OS, check out the developer website.

Watch out for Wear OS at Android Dev Summit 2021

Posted by Jeremy Walker, Developer Relations Engineer


This year’s Android Dev Summit had many exciting announcements for Android developers, including some major updates for the Wear OS platform. At Google I/O, we announced the launch of the new Wear OS. Since then, Wear OS Powered by Samsung has launched on the Galaxy Watch4 series. Many developers such as Strava, Spotify, and Calm have already created helpful experiences for the latest version of Wear OS, and we’re looking forward to seeing what new experiences developers will help bring to the watch. To learn more and create better apps for the wrist, read more about the updates to our APIs, design tools, and the Play store.


Compose for Wear OS

The Jetpack Compose library simplifies and accelerates UI development, and we’re bringing Compose support to Wear OS. You can design your app with familiar UI components, adapted for the watch. These components include Material You, so you can create beautiful apps with less code.
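
As a small illustrative sketch (component names and parameters are from the androidx.wear.compose.material developer preview and may evolve before the stable release), a watch-friendly screen uses the same Compose patterns you already know from mobile:

import androidx.compose.runtime.Composable
import androidx.wear.compose.material.Chip
import androidx.wear.compose.material.ChipDefaults
import androidx.wear.compose.material.MaterialTheme
import androidx.wear.compose.material.Text

// Sketch only: a tiny Wear OS screen built with Compose for Wear OS preview components.
@Composable
fun StartWorkoutScreen(onStartWorkout: () -> Unit) {
    MaterialTheme {
        // Chip is a watch-optimized, full-width tappable row from the Wear Material library.
        Chip(
            onClick = onStartWorkout,
            colors = ChipDefaults.primaryChipColors(),
            label = { Text(text = "Start workout") }
        )
    }
}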

Compose for Wear OS is now in developer preview. To learn more and get started, check out the Compose for Wear OS documentation and samples.

Try it out and share your feedback here, or join the #compose-wear channel on the JetBrains Slack and let us know there! Make sure you do it before we finalize the APIs during beta!


Watch Face Studio


Watch faces are one of the most visible ways that users can express themselves on their smartwatches. Creating a watch face is a great way to showcase your brand for users on Wear OS. We’ve partnered with Samsung to provide better tools for watch face creation and make it easier to design watch faces for the Wear OS ecosystem.

Watch Face Studio is a design tool created by Samsung that allows you to produce and distribute your own watch faces without any coding. It includes intuitive graphics tools to allow you to easily design watch faces. You can create watch faces for your personal use, or upload them in Google Play Console to share with your users on Wear OS devices that support API level 28 and above.


Library updates

We recently released a number of Android Jetpack Wear OS libraries to help you follow best practices, reduce boilerplate, and create performant, glanceable experiences for your users.

Tiles are now enabled for most devices in the market, providing predictable, glanceable access to information and quick actions. The API is now in beta – check it out!

For developers who want more fine-grained control of their watch faces (outside of Watch Face Studio), we've launched the new Jetpack Watch Face APIs beta, built from the ground up in Kotlin.

The new API offers a number of new features:

  • Watch face styling which persists across both the watch and phone (no need for your own database).
  • Support for a WYSIWYG watch face configuration UI on the phone.
  • Smaller, separate libraries (only include what you need).
  • Battery improvements by encouraging good battery usage patterns out of the box; for example, reducing the interactive frame rate when battery is low.
  • New Screenshot APIs so users can see their watch face changes in real time.
  • And many more...

This is a great time to start moving from the older Wearable Support Library to this new version.


Play Store updates

We’re making it easier for people to discover your Wear OS apps in the Google Play Store. Earlier this year, we enabled searching for watch faces and made it easier for people to find your apps in the Wear category. We also launched the capability for people to download apps onto their watches directly from the mobile Play Store. You can read more about these changes here.

We’ve also released updated Wear OS quality guidelines to help you meet your users’ expectations, as well as new screenshot guidelines to help your users have a better understanding of what your app will look like. To help people better understand how your app would work on their device in their location, we will be launching form factor and location specific ratings in 2022.

To learn more about developing for Wear OS, check out the developer website.

Privacy protections for physical activity in Android 10

Since Google Fit was released in 2015, apps with an abundance of features for health and fitness tracking have integrated with the Google Fit APIs. Over the years, the number of users using Google Fit as a central repository for their fitness and wellness data has grown significantly.

With Android 10, we're making further updates to give users even more control over this personal data. One key change concerns how Android apps can monitor a user’s physical activity and retrieve data from Android sensor APIs and the Google Fit platform.

In Android 10: Activity recognition permission

Android 10 introduces a new runtime permission for activity recognition, for apps that make use of the user's step and calorie count or classify the user's physical activity (such as walking, biking, or moving in a vehicle) through one of the affected activity recognition and fitness APIs.

If your app relies only on raw data from other built-in sensors on the device, such as the accelerometer and gyroscope, you don't need to declare this new permission in your app.

Activity Recognition Permission Enforcement

  • Starting December 2019, data will be restricted from apps that do not include the Google Play services legacy activity recognition permission (com.google.android.gms.permission.ACTIVITY_RECOGNITION) in their manifest. If your app doesn’t currently request this permission, you should add it today to ensure no loss of service for your users.
  • When a user upgrades to Android 10, the system auto-grants this permission to your app if it previously requested the legacy permission.
  • As you begin targeting Android 10, you should declare the ACTIVITY_RECOGNITION permission and adopt the new runtime permission model to adhere to the new policy (see the sketch below).
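
A minimal sketch of that runtime check and request when targeting API level 29 (the helper name and request code are illustrative):

import android.Manifest
import android.app.Activity
import android.content.pm.PackageManager
import android.os.Build
import androidx.core.app.ActivityCompat
import androidx.core.content.ContextCompat

private const val REQUEST_CODE_ACTIVITY_RECOGNITION = 1 // illustrative value

// Also declare android.permission.ACTIVITY_RECOGNITION in your AndroidManifest.xml.
fun ensureActivityRecognitionPermission(activity: Activity) {
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q &&
        ContextCompat.checkSelfPermission(
            activity, Manifest.permission.ACTIVITY_RECOGNITION
        ) != PackageManager.PERMISSION_GRANTED
    ) {
        ActivityCompat.requestPermissions(
            activity,
            arrayOf(Manifest.permission.ACTIVITY_RECOGNITION),
            REQUEST_CODE_ACTIVITY_RECOGNITION
        )
        // Handle the user's choice in Activity.onRequestPermissionsResult().
    }
}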

Google Fit physical activity APIs

This new permission affects a subset of data types available in the Google Fit APIs on Android. If your app accesses these types from Google Fit today, then you need to update your app in line with the new permission model.

The activity recognition runtime permission is required for accessing the following APIs / data types:

  • Recording API - recording the following data types:
    • com.google.step_count.delta
    • com.google.step_count.cadence
    • com.google.activity.segment
    • com.google.calories.expended
  • History API - reading the following data types:
    • com.google.step_count.delta
    • com.google.step_count.cadence
    • com.google.activity.segment
    • com.google.activity.exercise
    • com.google.activity.summary

With Android 10 now launched and SDK 29 becoming your primary development target, now is the time to make sure your apps are compatible with the new runtime permission.

Android Wear SDK and Emulator Update

Posted by Hoi Lam, Lead Developer Advocate, Android Wear
Today we launched the latest version of the Android Wear SDK (2.2.0) with several watch face related enhancements. These include the addition of an unread notification indicator for all watch faces, which is planned to be part of the upcoming consumer release of Android Wear. With the Wear SDK 2.2.0, you can customize the notification indicator or display your own. This feature is available to the developer community early, via the SDK and emulator, so you can verify that the indicator fits the design of your watch face. In addition, we are adding enhancements to the ComplicationDrawable class and publishing the final version of the Wear emulator based on Android Oreo.

Introducing the unread notification indicator


Notifications are a vital part of the Wear experience. As a result, starting from the next consumer release of Wear (version 2.9.0), a dot-shaped indicator will be displayed by default at the bottom of the watch face if there are new, unread notifications. Watch face developers can preview the indicator with their watch faces by using the latest version of the emulator. Developers can customise the indicator's accent color via WatchFaceStyle.setAccentColor - the default color is white as shown in the example below, but developers can set the color for the ring around the dot to an accent color of their choice, to match the rest of the watch face.
If the new indicator does not fit with the design of your watch face, you can switch it off using WatchFaceStyle.setHideNotificationIndicator and choose another option for displaying the notification, including: 1) displaying the number of unread notifications in the system tray using WatchFaceStyle.setShowUnreadCountIndicator, or 2) getting the number of unread notifications using WatchFaceStyle.getUnreadCount and displaying the number in a way that fits your watch face's unique style.
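
As a minimal sketch (the service class name is illustrative; the setters are the ones named above), a watch face that keeps the default dot but matches the ring to its own accent color would configure its style like this, inside its CanvasWatchFaceService.Engine's onCreate:

// Inside your CanvasWatchFaceService.Engine's onCreate(); MyWatchFaceService is illustrative.
setWatchFaceStyle(
    WatchFaceStyle.Builder(this@MyWatchFaceService)
        .setAccentColor(Color.parseColor("#00B0FF")) // ring color around the unread dot
        .setHideNotificationIndicator(false)         // keep the system-drawn indicator
        // Alternatively, hide the dot and show a count in the system tray instead:
        // .setHideNotificationIndicator(true).setShowUnreadCountIndicator(true)
        .build()
)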

Enhancement to ComplicationDrawable


We launched the ComplicationDrawable class at last year's Google I/O, and we are continuing to improve it. In this latest SDK release, we added two enhancements:
  • Permission Handling - If the watch face lacks the correct permission to display the content of a complication, the complication type of TYPE_NO_PERMISSION is issued. ComplicationDrawable now handles this automatically and will launch a permission request in onTap. If you previously implemented your own code to start the permission screen, please check that the permission screen is not triggered twice and, if necessary, remove unneeded code.
  • Drawable Callback - If a complication contains an image or an icon, it can take a small amount of time to load after the other initial data arrives. Our previous recommendation was therefore to redraw the screen every second. But this is unnecessary for watch faces that only update once per minute, for example. As a result, we have added support for Drawable.Callback to ComplicationDrawable. Developers who update the screen less frequently than once per second should adopt this new callback to redraw the watch face when images have loaded (see the sketch below).
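
A minimal registration sketch, assuming the code lives inside a CanvasWatchFaceService.Engine that owns the ComplicationDrawable (variable names are illustrative):

// Redraw the watch face when a complication's image or icon finishes loading, instead of
// invalidating every second. Register this once, e.g. right after creating the drawable.
complicationDrawable.callback = object : Drawable.Callback {
    override fun invalidateDrawable(who: Drawable) {
        invalidate() // Engine.invalidate() triggers a redraw of the watch face
    }

    override fun scheduleDrawable(who: Drawable, what: Runnable, whenMillis: Long) {
        // Not needed for this simple case.
    }

    override fun unscheduleDrawable(who: Drawable, what: Runnable) {
        // Not needed for this simple case.
    }
}
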
For more, please see the Android Wear Release Notes which includes other information regarding the emulator.

More improvements to come


Many of you have noticed a steady release of enhancements to Android Wear over the last few months since the launch of Wear 2.0. We are developing many more for the months ahead and look forward to sharing more when the features are ready.



Android Wear Beta

Posted by Hoi Lam, Lead Developer Advocate, Android Wear
LG Watch Sport

Today, we are launching the beta of the next Android Wear update. As we mentioned at Google I/O, this will mainly be a technical upgrade to API 26 with enhancements to background limits and notification channels. LG Watch Sport users can go to this webpage to sign up and the factory image will automatically be downloaded to the watch you enroll. As this is a beta, please be sure to review the known issues before enrolling. If you don't have a watch to test on, you can use the Android emulator. For developers working with Android Wear for China, an updated emulator image is also available.

Notification Channels

In this update, users can choose the types of notifications they receive from an app through notification channels. This gives users finer-grained control than muting all notifications from the app. For notifications generated locally by Android Wear apps, users will be able to customise the notification channels they want to see, right on their watch. Please refer to the Wear notification sample for more details. For notifications bridged from the phone, the phone's notification channel settings will dictate what is shown on the watch.

if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
    mNotificationManager.createNotificationChannel(
        NotificationChannel("1001", "New Follower",
            NotificationManager.IMPORTANCE_DEFAULT))

    mNotificationManager.createNotificationChannel(
        NotificationChannel("1002", "Likes",
            NotificationManager.IMPORTANCE_LOW))
}

Background Limits

There are increased restrictions on background services. Developers should assume services can no longer run in the background without a visible notification. In addition, the background location update frequency will be reduced. Battery-saving best practices such as using JobScheduler should be adopted to ensure your app is battery-efficient and able to perform background tasks when possible.
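
For example, a minimal JobScheduler sketch that defers periodic sync work to battery-friendly conditions (the job ID and the SyncJobService class are illustrative; SyncJobService would be a JobService you implement and declare in the manifest):

import android.app.job.JobInfo
import android.app.job.JobScheduler
import android.content.ComponentName
import android.content.Context
import java.util.concurrent.TimeUnit

// Sketch only: schedule periodic background work instead of keeping a service running.
fun schedulePeriodicSync(context: Context) {
    val jobScheduler = context.getSystemService(Context.JOB_SCHEDULER_SERVICE) as JobScheduler

    val syncJob = JobInfo.Builder(42, ComponentName(context, SyncJobService::class.java))
        .setRequiredNetworkType(JobInfo.NETWORK_TYPE_UNMETERED) // wait for an unmetered network
        .setRequiresCharging(true)                              // be kind to the watch battery
        .setPeriodic(TimeUnit.HOURS.toMillis(1))                // batch work instead of polling
        .build()

    jobScheduler.schedule(syncJob)
}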

Please give us your feedback

We expect this to be the only beta release before the final production release. Thank you for your feedback so far. Please submit any bugs you find via the Android Wear issue tracker. The earlier you submit them, the higher the likelihood that we can include the fixes in the final release.