I/O 2023: What’s new in Google Play

Posted by Alex Musil, Senior Director of Engineering and Product, Google Play

Over the past year, our teams have built exciting new features and made major changes to help you thrive with us. These updates have focused on:

  • Being the best partner to help you grow your audiences across the lifecycle of your business,
  • Being the best platform to help you effectively monetize your users at scale, and
  • Being the safest place to publish and distribute your hard work with Android.

Watch our video for more details, or keep reading to get the highlights.



More store listing enhancements designed to drive growth

Attracting users is the foundation of any app business, and it all starts with your store listing. These updates can help you craft better and more personalized content to drive more audience growth.

  • Last year, we gave every title the ability to create at least 50 custom store listings. Now, in addition to tailoring by country and pre-registration status, you can also customize your listing for inactive users, highlighting why they should give your app or game another chance.
  • Soon, we’ll launch custom store listings for Google Ads App campaign ad groups. These will allow you to serve custom listings to users coming from specific ads on AdMob and YouTube so you can create a more seamless user experience from Google Ads to Google Play.
  • All these new tools mean managing more listings, so we’re launching store listing groups to streamline the process. Now you can design for different audiences by simply creating a base listing, then overriding specific elements.
Image showing an example of a store listing group in Google Play
Create a base listing as your primary template and modify elements for different audiences with store listing groups.
  • To help you connect with people in their native language, we just launched new machine translation models for 10 languages from Google Translate in Play Console, which can translate your app and store listing in minutes, at no cost.

AI-powered features to highlight the best of your app

We’re bringing the benefits of AI to Google Play to make it easier for you and your users to get things done. From helping you showcase your app or game in the best possible light to helping users discover your title, these AI-powered features help you highlight the best of your app experience with ease.

  • Starting today, you can use Google’s generative AI technology to help you get started with store listings in English. This is an experimental feature to help you draft content with less effort. Just open our AI helper, enter a couple of prompts like audience and key theme, and it will generate a draft you can edit, discard, or use. You’re always in complete control of what you submit and publish.
Moving image of using Generative AI to create a custom store listing in Google Play
Draft an AI-generated store listing with just a few prompts
  • To help users learn from each other about what makes your app or game special at a glance, we’re launching review summaries powered by Google’s generative AI technology. We’re starting with an experiment in English and expanding later this year.
screenshot of user review summaries in Google Play on a mobile device
Review summaries highlight what users are saying about your app or game at a glance


New opportunities to boost user discovery

Google Play can also help you grow your audience by partnering with you to promote important events, new content, or exciting offers. Use Promotional content to let us know when these are happening so we can amplify your growth. Almost 25,000 apps and games already have access to Promotional content, and we’re rolling it out to more titles later this year.

  • We’re launching multiple new, dedicated high-traffic surfaces to showcase your most exciting content, including via Play notifications. Participating games are seeing a median 20% uplift in store-wide acquisitions and reacquisitions, driven by increases of over 60% from organic Explore traffic. 
image of four mobile screens displays side-by-side showing new high-traffic surfaces
New Play surfaces showcase your most exciting content
  • To enhance how and where your Promotional content is viewed on Play, we’re updating our reporting so you can track and optimize your events’ direct performance. Check it out in Play Console under “Promotional content performance reports.”

To be eligible for these new growth opportunities, your app or game needs to be of high quality and deliver the great experiences your users expect. Because it’s so important, we’re sharing more insights into how we think about quality and improving our tooling to help you meet these goals.

  • Today, we launched a unified framework for app and game quality that explains how we evaluate quality across a number of dimensions for promotion and featuring. Learn more with this article and I/O session, “What great quality looks like on Play.”

More effective monetization features

We’re also rolling out new features that leverage Play’s reach, expertise, and technologies to help you more effectively generate revenue.

  • Soon, you’ll be able to run price experiments for in-app products right within Play Console. Experiment with different price points across markets and identify when you may be pricing yourself out of a sale or undervaluing your in-app products.
View of price experiments in Google Play Console
Find the right price point for your in-app products with our experiments tool in Play Console
  • Also coming soon is a new type of Promotional content called “featured products” that will allow you to sell your in-app items directly on Play. Feature specific in-app items in different countries or offer discounts to excite users and increase conversions.
moving image showing in-app items on the details page in Google Play
Feature in-app products on your store listing and nominate them for further promotion across Play surfaces
  • We’ve also made new updates to subscriptions to help you expand your reach, increase conversions, and improve retention. This year, we launched multiple prices per billing period so you can provide different auto-renewing and prepaid plan prices as desired, like giving “VIP” users recurring discounts.
  • Our commerce platform continues to evolve by improving access to buyers with new payment methods, exploring expanded billing options through our user choice billing pilot, and investing in secure purchase experiences that prevented over $2 billion in fraudulent and abusive transactions in 2022.

Learn more in our “Boost your revenue with Play Commerce” session.

Finally, we’re also working to increase the effectiveness of your marketing-to-sales funnel.

  • Last year, we launched a Play Console page dedicated to deep links. This page flags broken deep links and provides contextual guidance on how to fix them. Coming soon, we’ll make it easier for you to rationalize your web-to-app mapping with a convenient way to review your top website URLs alongside their deep link status. To help you validate your deep links, we're adding a simple way to compare your app to your web experience for a given URL, side-by-side.

Find out more in our deep links talk, “Optimize app experiences with deep linking.”
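As background, a deep link is declared in the app manifest with an intent filter, and the Play Console page above surfaces problems in these declarations. A minimal, hypothetical example (`example.com` and `ProductActivity` are placeholders, not names from this post):

```xml
<!-- Hypothetical example: routes https://example.com/products/* to ProductActivity.
     android:autoVerify="true" enables Android App Links verification against
     the site's assetlinks.json. -->
<activity android:name=".ProductActivity" android:exported="true">
    <intent-filter android:autoVerify="true">
        <action android:name="android.intent.action.VIEW" />
        <category android:name="android.intent.category.DEFAULT" />
        <category android:name="android.intent.category.BROWSABLE" />
        <data android:scheme="https"
              android:host="example.com"
              android:pathPrefix="/products" />
    </intent-filter>
</activity>
```

If the verification or the path mapping is wrong, the URL opens in the browser instead of the app, which is exactly the class of issue the deep links page flags.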


Enhanced privacy and security protection for developers and users

Protecting your users and your work is critical to a successful ecosystem, so we’ve continued to strengthen our platform-wide protections and roll out more tools to help you protect your apps.

  • Google Play Protect scans billions of apps each day across billions of Android devices to keep users safe from threats like malware and unwanted software. Last year, we prevented 1.4 million policy-violating apps from entering Google Play.
  • Play Integrity API lets you check that user actions and server requests come from unmodified versions of your app, running on genuine Android devices. We’re rolling out a new beta integration option which gives Play Integrity API verdicts 10x faster. We launched status.play.google.com so you can monitor Play Integrity API service status and be notified of any issues.
  • We’re also expanding access to Automatic integrity protection for apps and games so anti-tamper and anti-piracy protection can be applied in “one-click” with no need to integrate an API in a backend server. Developers who use these products see a reduction in unauthorized usage of 80% on average.
  • Finally, we are building new tools to help you steer users away from broken app versions with prompts to update. First, automatic update prompts for crashing apps are triggered if your app crashes in the foreground and a more stable version is available. And second, you can prompt users on specific app versions to update. No prior integration is required and it will be available to all apps built with app bundles in the coming months.
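For context, a classic Play Integrity API request from the client side looks roughly like the sketch below. This is a simplified illustration, not the full recommended flow: the nonce should be generated by your server and bound to the action being protected, and the token must be decrypted and verified server-side.

```kotlin
import android.content.Context
import com.google.android.play.core.integrity.IntegrityManagerFactory
import com.google.android.play.core.integrity.IntegrityTokenRequest

// Sketch: request an integrity token and hand it to the caller, which
// should forward it to a backend for decryption and verdict checks.
fun requestIntegrityVerdict(
    context: Context,
    nonce: String,                  // generated server-side, Base64 web-safe
    onToken: (String) -> Unit
) {
    val integrityManager = IntegrityManagerFactory.create(context)
    integrityManager
        .requestIntegrityToken(
            IntegrityTokenRequest.builder().setNonce(nonce).build()
        )
        .addOnSuccessListener { response -> onToken(response.token()) }
        .addOnFailureListener { e -> /* handle API errors, e.g. retry later */ }
}
```

The verdict itself (device integrity, app licensing, account details) is only meaningful after your server decrypts and validates the token.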

We’re also continuing to improve Google Play and Play Console to help you provide safe, trustworthy experiences to users.

  • Last year, we launched the Data safety section to help explain what data your app may collect or share, and why. Since the launch, we’ve seen millions of users engaging with this feature every day, and it’s become an important way for users to evaluate an app’s safety before installing it.

    Now, we're enhancing this feature with new data deletion options both inside and outside an app, and policy requirements to help you build trust and empower users with greater clarity and control. You can also give users the choice to clean up their account by requesting that data like activity history, images, and videos be deleted, rather than deleting their entire account.

  • The redesigned App content page makes outstanding tasks clearer, so you can quickly identify what you need to do to comply with our policies. And soon, you’ll see upcoming declaration requirements and deadlines, so you have more time to plan.

Finally, we rebuilt the Play Console App around modern developer needs. The new app is more customized, so you can tailor the homepage with the metrics you care about most, and integrates Inbox so you can stay up to date with key messages from Google Play. Join the open beta and let us know what you think.

We understand how exciting and challenging building and running a mobile business can be, and our teams are dedicated to building the tools and opportunities you need to succeed across your app lifecycle. Thank you for partnering with us, and please continue to share your feedback as we work together to build the future of Google Play.


Price in-app products with confidence by running price experiments in Play Console

Posted by Phalene Gowling, Product Manager, Google Play

At this year’s Google I/O, our “Boost your revenue with Play Commerce” session highlights the newest monetization tools that are deeply integrated into Google Play, with a focus on helping you optimize your pricing strategy. Pricing your products or content correctly is foundational to driving better user lifetime value and can result in reaching new buyers, improving conversion, and encouraging repeat orders. It can be the difference between a successful sale and pricing yourself out of one, or even undervaluing your products and missing out on key sales opportunities.

To help you price with confidence, we’re excited to announce price experiments for in-app products in Play Console, allowing you to test price points and optimize for local purchasing power at scale. Price experiments will launch in the coming weeks, so read on to get the details on the new tool and learn how you can prepare to take full advantage when it's live.

  • A/B test to find optimal local pricing that’s sensitive to the purchasing power of buyers in different markets. Adjusting prices for local markets is already an industry-wide practice among developers, and at launch you will be able to test and manage your global prices all within Play Console. An optimized price helps you reach both new and existing buyers who may previously have been priced out of monetized experiences in apps and games, and can help increase repeat purchases by buyers of their favorite products.
Image of two mobile devices showing A/B price testing in Google Play Console
Illustrative example only. A/B test price points with ease in Play Console
  • Experiment with statistical confidence: price experiments enable you to track how close you are to statistical significance with confidence interval tracking, or, for a quick summary, you can view the analysis at the top of the page once enough data has been collected to determine a statistically significant result. To make the decision of whether to apply the ‘winning’ price easier, we’ve also included support for tracking key monetization metrics such as revenue uplift, revenue derived from new installers, buyer ratio, orders, and average revenue per paying user. This gives you a more detailed understanding of how buyers behave differently in each experiment arm per market, and can inspire further refinements toward a robust global monetization strategy.
  • Improve return on investment in user acquisition. Having a localized price and a better understanding of buyer behavior in each market allows you to optimize your user acquisition strategy, since you know how buyers will react to market-specific products or content. It can also inform which products you choose to feature on Google Play.
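To illustrate the statistics behind such A/B pricing tests (this is generic math, not the Play Console implementation), a two-proportion z-test can tell you whether the difference in buyer conversion between two experiment arms is likely to be real rather than noise:

```kotlin
import kotlin.math.abs
import kotlin.math.sqrt

// Two-proportion z-test on buyer conversion between experiment arms.
// buyersA/usersA = control arm, buyersB/usersB = treatment (new price) arm.
fun conversionZScore(buyersA: Int, usersA: Int, buyersB: Int, usersB: Int): Double {
    val pA = buyersA.toDouble() / usersA
    val pB = buyersB.toDouble() / usersB
    // Pooled conversion rate under the null hypothesis (no difference).
    val pooled = (buyersA + buyersB).toDouble() / (usersA + usersB)
    val se = sqrt(pooled * (1 - pooled) * (1.0 / usersA + 1.0 / usersB))
    return (pB - pA) / se
}

// |z| > 1.96 corresponds to significance at the usual 95% confidence level.
fun isSignificantAt95(z: Double): Boolean = abs(z) > 1.96
```

In practice the Play Console tool handles this (plus sensitivity and audience settings) for you; the sketch just shows why "enough data" matters before declaring a winner.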

Set up price experiments in minutes in Play Console

Price experiments will be easy to run with the new dedicated section in Play Console under Monetize > Products > Price experiments. You’ll first need to determine the in-app products, markets, and the price points you’d like to test. The intuitive interface will also allow you to refine the experiment settings by audience, confidence level and sensitivity. And once your experiment has reached statistical significance, simply apply the winning price to your selected products within the tool to automatically populate your new default price point for your experiment markets and products. You also have the flexibility to stop any experiment before it reaches statistical significance if needed.

You’ll have full control of what and how you want to test, reducing any overhead of managing tests independently or with external tools – all without requiring any coding changes.

Learn how to run an effective experiment with Play Academy

Get Started

You can start preparing now by strategizing what type of price experiment you might want to run first. For a metric-driven source of inspiration, game developers can explore strategic guidance, which can identify country-specific opportunities for buyer conversion. Alternatively, start building expertise in running effective pricing experiments for in-app products by taking our new Play Academy course, in preparation for price experiments rolling out in the coming weeks.



Build smarter Android apps with on-device Machine Learning

Posted by Thomas Ezan, Developer Relations

In the past year, the Android team made significant improvements to on-device machine learning to help developers create smarter apps with more features to process images, sound, and text. In the Google I/O talk “Build smarter Android apps with on-device Machine Learning,” David Miro-Llopis, Product Manager on ML Kit, and Thomas Ezan, Android Developer Relations Engineer, review new Android APIs and solutions and showcase applications using on-device ML.

Running ML processes on-device enables low latency, increases data privacy, enables offline support, and can reduce your cloud bill. Applications such as Lens AR Translate or the document scanning feature available in Files in India benefit from the advantages of on-device ML.

To deploy ML features on Android, developers have two options:

  • ML Kit, which offers production-ready ML solutions for common user flows via easy-to-use APIs.
  • Android’s custom ML stack, which is built on top of TensorFlow Lite and provides control over the inference process and the user experience.

ML Kit released new APIs and improved existing features

Over the last year, the ML Kit team worked on both improving existing APIs and launching new ones: face mesh and document scanner. ML Kit is launching a new document scanner API in Q3 2023 that will provide a consistent scanning experience across apps on Android. Developers will be able to use it with only a few lines of code, without needing camera permission, and with low APK size impact (given that it will be distributed via Google Play Services). In a similar fashion, Google code scanner is now generally available and provides a consistent scanning experience across apps, without needing camera permission, via Google Play Services.
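To give a feel for how little code is involved, invoking the generally available Google code scanner looks roughly like this sketch:

```kotlin
import android.content.Context
import com.google.mlkit.vision.codescanner.GmsBarcodeScanning

// Sketch: launch the Play Services-provided scanning UI.
// No camera permission is needed; the scan itself runs in Google Play Services.
fun scanBarcode(context: Context) {
    val scanner = GmsBarcodeScanning.getClient(context)
    scanner.startScan()
        .addOnSuccessListener { barcode ->
            // rawValue holds the decoded content, e.g. a URL or product code.
            val value: String? = barcode.rawValue
        }
        .addOnCanceledListener { /* user backed out of the scan UI */ }
        .addOnFailureListener { e -> /* scanner unavailable or scan failed */ }
}
```

Because the UI and models live in Google Play Services, the APK size impact on your app is minimal.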

Image of a series of three photos of two girls smiling, showing how face mesh improves facial recognition

Additionally, ML Kit improved the performance of the following APIs: barcode detection (by 17%), text recognition, digital ink recognition, pose detection, translation, and smart reply. ML Kit also integrated some APIs into Google Play Services so you don’t have to bundle the models with your application. Many developers are using ML Kit to easily integrate machine learning into their apps; for example, WPS uses ML Kit to translate text in 43 languages and save $65M a year.
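On-device translation, as in the WPS example above, follows a download-then-translate pattern; a simplified sketch (error handling and translator cleanup are abbreviated):

```kotlin
import com.google.mlkit.nl.translate.TranslateLanguage
import com.google.mlkit.nl.translate.Translation
import com.google.mlkit.nl.translate.TranslatorOptions

// Sketch: translate English text to Spanish fully on-device.
// The language model downloads once; after that, translation works offline.
fun translateGreeting(onResult: (String) -> Unit) {
    val options = TranslatorOptions.Builder()
        .setSourceLanguage(TranslateLanguage.ENGLISH)
        .setTargetLanguage(TranslateLanguage.SPANISH)
        .build()
    val translator = Translation.getClient(options)
    translator.downloadModelIfNeeded()
        .addOnSuccessListener {
            translator.translate("Hello, world")
                .addOnSuccessListener { translated -> onResult(translated) }
        }
        .addOnFailureListener { e -> /* model download failed, e.g. no network */ }
}
```

In a real app you would also close the translator when done and gate model downloads on Wi-Fi if the models are large.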


Acceleration Service in Android’s custom ML stack is now in public beta

To support custom machine learning, the Android ML team is actively developing Android’s custom ML stack. Last year, TensorFlow Lite and GPU delegates were added to Google Play Services, which lets developers use TensorFlow Lite without bundling it with their app and provides automatic updates. With improved inference performance, hardware acceleration can in turn significantly improve the user experience of your ML-enabled Android app. This year, the team is also announcing Acceleration Service, a new API enabling developers to pick the optimal hardware acceleration configuration at runtime. It is now in public beta, and developers can learn more and get started here.
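Using TensorFlow Lite from Google Play Services typically means initializing the runtime once, then creating an interpreter bound to the system-provided runtime; a rough sketch (model loading is elided, and exact artifact names may differ across versions):

```kotlin
import android.content.Context
import com.google.android.gms.tflite.java.TfLite
import org.tensorflow.lite.InterpreterApi
import org.tensorflow.lite.InterpreterApi.Options.TfLiteRuntime
import java.nio.ByteBuffer

// Sketch: initialize the Play Services TFLite runtime, then create an
// interpreter that uses the system runtime (auto-updated, not bundled).
fun createInterpreter(
    context: Context,
    model: ByteBuffer,                    // your .tflite model, already loaded
    onReady: (InterpreterApi) -> Unit
) {
    TfLite.initialize(context).addOnSuccessListener {
        val interpreter = InterpreterApi.create(
            model,
            InterpreterApi.Options().setRuntime(TfLiteRuntime.FROM_SYSTEM_ONLY)
        )
        onReady(interpreter)
    }
}
```

Acceleration Service then helps you decide, per device and at runtime, whether a configuration such as the GPU delegate actually speeds up this interpreter.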

To learn more, watch the video:

What’s new in Android Health

Posted by Sara Hamilton, Developer Relations Engineer

Health and fitness data is interconnected – sleep, nutrition, workouts and more all inform one another. For example, consider that your sleep impacts your recovery, which impacts your readiness to run your favorite 5k. Over time, your recovery and workout habits drive metrics like heart rate variability, resting heart rate, VO2Max and more! Often this data exists in silos, making it hard for users to get a holistic view of their health data.

We want to make it simple for people to use their favorite apps and devices to track their health by bringing this data together. They should have full control of what data they share, and when they share it. And, we want to make sure developers can enable this with less complexity and fewer lines of code.

This is why we’ve continued to improve our Android Health offerings, and why today at I/O 2023, we’re announcing key updates across both Health Connect and Health Services for app developers and users.

What is Android Health?

Android Health brings together two important platforms for developers to deliver robust health and fitness apps to users: Health Connect and Health Services.

Health Connect is an on-device data store that provides APIs for storing and sharing health and fitness data between Android apps. Before Health Connect, there was not a consistent way for developers to share data across Android apps. They had to integrate with many different APIs, each with a different set of data types and different permissions management frameworks.

Now, with Health Connect, there is less fragmentation. Health Connect provides a consistent set of 40+ data types and a single permissions management framework for users to control data permissions. This means that developers can share data with less effort, enabling people to access their health data in their favorite apps, and have more control over data permissions.
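As an illustration of that single, consistent API surface, reading a data type from Health Connect after the user has granted permission looks roughly like this sketch:

```kotlin
import android.content.Context
import androidx.health.connect.client.HealthConnectClient
import androidx.health.connect.client.records.StepsRecord
import androidx.health.connect.client.request.ReadRecordsRequest
import androidx.health.connect.client.time.TimeRangeFilter
import java.time.Instant
import java.time.temporal.ChronoUnit

// Sketch: read the last 24 hours of step data from Health Connect.
// Assumes the read permission for StepsRecord has already been granted.
suspend fun readRecentSteps(context: Context): Long {
    val client = HealthConnectClient.getOrCreate(context)
    val now = Instant.now()
    val response = client.readRecords(
        ReadRecordsRequest(
            recordType = StepsRecord::class,
            timeRangeFilter = TimeRangeFilter.between(
                now.minus(24, ChronoUnit.HOURS), now
            )
        )
    )
    // Sum step counts across all apps that wrote step records.
    return response.records.sumOf { it.count }
}
```

The same request shape works for the other 40+ data types; only the record class changes.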

Screenshot of permissions via Health Connect

Health Services is our API surface for accessing sensor data on Wear OS devices in a power-efficient way. Before Health Services, developers had to work directly with low-level sensors, which required different configurations on different devices, and was not battery-efficient.

With Health Services, there is now a consistent API surface across all Wear OS 3+ devices, allowing developers to write code once and run it across all devices. And, the Health Services architecture means that developers get great power savings in the process, allowing people to track longer workouts.
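For example, streaming heart rate through Health Services is a matter of registering a callback rather than configuring raw sensors; a simplified sketch:

```kotlin
import android.content.Context
import androidx.health.services.client.HealthServices
import androidx.health.services.client.MeasureCallback
import androidx.health.services.client.data.Availability
import androidx.health.services.client.data.DataPointContainer
import androidx.health.services.client.data.DataType
import androidx.health.services.client.data.DeltaDataType

// Sketch: stream heart rate on Wear OS via Health Services' MeasureClient.
// The same code runs across all Wear OS 3+ devices; unregister when done.
fun startHeartRateStream(context: Context) {
    val measureClient = HealthServices.getClient(context).measureClient
    val callback = object : MeasureCallback {
        override fun onAvailabilityChanged(
            dataType: DeltaDataType<*, *>,
            availability: Availability
        ) { /* sensor became available or unavailable */ }

        override fun onDataReceived(data: DataPointContainer) {
            // Most recent heart-rate sample in this batch, if any.
            val bpm = data.getData(DataType.HEART_RATE_BPM).lastOrNull()?.value
        }
    }
    measureClient.registerMeasureCallback(DataType.HEART_RATE_BPM, callback)
}
```

Health Services batches and delivers the data in a power-efficient way, which is where the battery savings mentioned above come from.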

Health Connect is coming to Android 14 with new features

Health Connect and Android 14 logos with an X between them to indicate collaboration

Health Connect is currently available for download as an app on the Play Store. We are excited to announce that starting with the release of Android 14 later this year, Health Connect will be a core part of Android and available on all Android mobile devices. Users will be able to access Health Connect directly from Settings on their device, helping to control how their health data is shared across apps.

Screenshot showing Health Connect available in the privacy settings of an Android device

Several new features will be shipped with the Health Connect Android 14 release. We’re adding a new exercise routes feature to allow users to share maps of their workouts through Health Connect. We’ve also made improvements to make it easier for people to log their menstrual cycles. And, Health Connect updates will be delivered through Google Play System Updates, which will allow new features to be updated often.

Health Services now supports more use cases with new API capabilities

We’ve released several exciting changes to Health Services this year to support more use cases. Our new Batching Modes feature allows developers to adjust the data delivery frequency of heart rate data to support home gym use cases. We’ve also added new API capabilities, like golf shot detection.

The new version of Wear OS arrives later this year. Wear OS 4 will be the most performant yet, delivering improved battery life for the next generation of Wear OS watches. We will be releasing additional Health Services updates with this change, including improved background body sensor permissions.

Our developer ecosystem is growing

There are over 50 apps already integrated with Health Connect, including Peloton, Withings, Oura, and more. These apps are using Health Connect to incorporate new data and give people an interconnected health experience, without building out many new API integrations. Learn more about how these health and fitness apps are creating new experiences for users in areas like sleep, exercise, nutrition, and more in our I/O technical session.

We also have over 100 apps integrated with Health Services. Apps using Health Services are seeing higher engagement from users with Wear apps, and are giving their users longer battery life in the process. For example, Strava found that users with their Wear app did 25% more activities than those without.

Get started with Health Connect

We hope many more developers will join us in bringing unique experiences within Android Health to your users this year.

If you’d like to create a more interconnected health experience for your users, we encourage you to integrate with Health Connect. And if you are a Wear developer, make sure you are using Health Services to get the best battery performance and future-proofing for all upcoming Wear OS devices.

Check out our Health Services documentation, Health Connect documentation, and code samples to get started!

To learn more, watch the I/O session:

Introducing the Watch Face Format for Wear OS

Posted by Anna Bernbaum, Product Manager

We are excited to announce the launch of the Watch Face Format! We worked in partnership with Samsung to introduce a new way for you to build watch faces for Wear OS smartwatches.

The Watch Face Format is a declarative XML format to design the appearance and behavior of watch faces. This means that there is no executable code involved in creating a watch face, and there will be no code embedded in your watch face APK.

The Wear OS platform takes care of the logic needed to render the watch face so you no longer have to worry about code optimizations or battery performance.

Watch faces that are built with this new format require less maintenance and fewer updates than the ones built using the Jetpack Watch Face library. For example, you no longer need to update your watch face to benefit from improvements in performance or battery consumption, or to get the latest bug fixes.
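To give a feel for the declarative format, a stripped-down digital watch face might look something like the following. This is an illustrative sketch only; consult the Watch Face Format reference for the authoritative schema, element names, and attributes.

```xml
<!-- Illustrative sketch of a minimal Watch Face Format file.
     No executable code: the platform renders everything described here. -->
<WatchFace width="450" height="450">
  <Metadata key="CLOCK_TYPE" value="DIGITAL" />
  <Scene backgroundColor="#ff000000">
    <DigitalClock x="0" y="175" width="450" height="100">
      <TimeText format="hh:mm" hourFormat="SYNC_TO_DEVICE" align="CENTER">
        <Font family="SYNC_TO_DEVICE" size="96" weight="NORMAL" color="#ffffffff" />
      </TimeText>
    </DigitalClock>
  </Scene>
</WatchFace>
```

Because the whole watch face is data rather than code, the platform can optimize rendering and battery behavior on your behalf.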

Starting today, you can build watch faces in this new format and publish them on Google Play, ready for when the first Wear OS 4 watches are available.

The Watch Face Format lets you create…

Analog and digital watch faces:

Three watch faces illustrating different analog and digital styles.

Watch faces with complications:

Three watch faces displaying different complication formats.

Customizable watch faces:

Three instances of the same watch face displayed in different color choices.

And more...

A planets-based watch face, a rainbow-inspired watch face and one based on the sun and moon

Watch Face Editing

With the Watch Face Format, we have included the watch face editor as part of Wear OS itself, so users can customize every watch face using the same editor UI. You no longer need to build your own watch face editor for users to customize their watch face.

Customizing the watch face with the in-built watch face editor
Wear OS 4’s editor for watch faces made using the Watch Face Format

Build Watch Faces, or Watch Face Tools

The new Watch Face Format can be used to build watch faces directly, or it can be integrated into creation tools, allowing designers to create watch faces without having to write any executable code.


Watch Face Studio

Today, Samsung has released the latest version of Watch Face Studio, ready for you to try now. As an alternative to directly writing XML using the Watch Face Format, Watch Face Studio makes it easy for designers to create watch faces without any coding experience.

Watch faces made in the latest version of Watch Face Studio use the Watch Face Format by default when run on a Wear OS 4 watch, and run as traditional watch faces on Wear OS 3 watches.

Using Watch Face Studio to create a watch face.
Using Watch Face Studio to create a watch face

Learn more

Build watch faces using the Watch Face Format today:

What’s new with Android for Cars: I/O 2023

Posted by Jennifer Tsau, Product Management Lead and David Dandeneau, Engineering Lead

For more than a decade, Google has been committed to bringing safe and seamless connected experiences to cars. We’re continuing to see strong momentum and adoption across Android for Cars. Android Auto is supported by nearly every major car maker, and will be in nearly 200 million cars by the end of this year. And the number of cars powered by Android Automotive OS with Google built-in — which includes top brands like Chevrolet, Volvo, Polestar, Honda, Renault and more — is expected to nearly double by the end of this year.

With cars becoming more connected and equipped with immersive displays, there are more opportunities for developers to bring app experiences to cars. We’re excited to share updates and new ways for developers to reach more users in the car.


Apps designed for driving experiences

Helping drivers while on the road - whether they are navigating, listening to music, or checking the weather - is a top priority. We’re continuing to invest in tools and resources, including the Android for Cars App Library, to make it even easier for developers to build new apps or port existing Android apps over to cars.

New capabilities for navigation apps

Today, we announced Waze rolling out globally on the Google Play Store for all cars with Google built-in, expanding its availability beyond Android Auto. As a part of this launch, we created more templates in Android for Cars App Library to help speed up development time across a number of app categories, including navigation.

For navigation apps, it’s also now possible to integrate with the instrument cluster, providing turn-by-turn directions right in the driver's line of sight. And developers can also access car sensor data to surface helpful information like range, fuel level, and speed to provide more contextual assistance to drivers.
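With the Android for Cars App Library, each screen returns a template rather than drawing its own UI; a minimal navigation-style sketch (the actual routing, maneuvers, and travel estimates are elided):

```kotlin
import androidx.car.app.CarContext
import androidx.car.app.Screen
import androidx.car.app.model.Action
import androidx.car.app.model.ActionStrip
import androidx.car.app.model.Template
import androidx.car.app.navigation.model.NavigationTemplate

// Sketch: a bare-bones navigation screen. Real apps populate the template
// with routing state, maneuvers, and travel estimates via builder methods.
class MinimalNavScreen(carContext: CarContext) : Screen(carContext) {
    override fun onGetTemplate(): Template {
        return NavigationTemplate.Builder()
            .setActionStrip(
                // NavigationTemplate requires an action strip.
                ActionStrip.Builder().addAction(Action.BACK).build()
            )
            .build()
    }
}
```

Because the host renders the template, the same screen adapts automatically to different car displays, including the instrument cluster integrations mentioned above.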

A car dashboard shows the Waze app open on the display panel
The Waze app is coming to all cars with Google built-in, including the first-ever Chevrolet Blazer EV launching this year.

Tools to easily port your media apps across Android for Cars

Media apps continue to be a top use case in the car, and it’s quicker than ever to bring your media apps to Android Auto and Android Automotive OS. Audible recently joined popular streaming audio apps like Deezer, Soundcloud, and Spotify to offer their apps across both Android Auto and cars with Google built-in. If you have a media app on mobile, port it over to reach new users in the car.

New app categories for driving experiences

The Android for Cars App Library now allows developers to bring new kinds of apps to cars, including internet of things (IoT) and weather apps. The IoT category is available for all developers, while weather is in an early access program. In the weather category, The Weather Channel app will join other weather apps like Weather & Radar later this year.

We’re also working with messaging apps like Zoom, Microsoft Teams, and Webex by Cisco to allow you to join meetings by audio from your car display in the coming months.

A car display shows a Zoom meeting schedule next to a route in Google Maps.
Coming soon, join meetings by audio from your car display.

Apps designed for parked and passenger experiences

With screens expanding in size and more being added for passengers, there is growing demand for parked and passenger experiences in cars.

Video, gaming, and browsing in cars

Now, video and gaming app categories are available in the car, with an early access program for browsing apps coming soon. YouTube is now available for car makers to offer in cars with Google built-in. And drivers of cars with Google built-in will soon have access to popular titles like Beach Buggy Racing 2, Solitaire FRVR, and My Talking Tom Friends from publishers like Vector Unit, FRVR and Outfit7 Limited. Developers can now port their large screen optimized apps to cars to take advantage of this opportunity.

A car display shows a YouTube video of an animated character singing.
YouTube is coming to cars with Google built-in, like the Polestar 2.

More screens in cars allow for new experiences between drivers and passengers, including individual and shared entertainment experiences. We're excited to announce that multi-screen support is coming to Android Automotive OS 14 — stay tuned for more updates.

A car with a panoramic front display and screens in headrests showing apps and video content.
Support for multiple screens is coming to Android Automotive OS 14.

Start developing apps for cars today

To learn how to bring your apps to cars, check out the technical session, codelab and documentation on the Android for Cars developer site. With all the opportunities across car screens, there has never been a better time to bring your apps and experiences to cars. Thanks for all the contributions to the Android ecosystem. See you on the road!

Building pixel-perfect living room experiences with Compose for TV

Posted by Paul Lammertsma, Developer Relations Engineer

Over the past year, we’ve continued to see significant growth on Android TV OS, now with over 150 million monthly active devices. In fact, according to Strategy Analytics, the Android TV streaming platform shipped on more devices worldwide than any other streaming TV platform in 2022.

Today, we’re launching the Alpha release of Compose for TV, the latest UI framework for developing beautiful and functional apps for Android TV.



Compose for TV unlocks all the benefits of Jetpack Compose for your TV apps, allowing you to build apps with less code, easier maintenance and a modern Material 3 look straight out of the box:

  • Less code: Do more with less code and avoid entire classes of bugs, so code is simple and easy to maintain. 
  • Intuitive: Describe your UI, and Compose takes care of the rest. As the app state changes, your UI automatically updates. 
  • Accelerate development: Compose for TV is compatible with all your existing code so you can adopt when and where you want. Iterate fast with live previews and full Android Studio support. 
  • Powerful & flexible: Create beautiful apps with direct access to the Android platform APIs that can be easily reused between other form factors, including your existing mobile, tablet, foldable, wearable and TV interfaces.

TV design guidelines

We're also excited to announce the launch of our new TV Design Guidelines for Android TV. This comprehensive guide gives you the tools you need to create TV apps that are visually appealing, intuitive, and immersive. The guidelines cover everything from typography and color to navigation and layout. Follow these guidelines to create high-quality TV apps that are easy to use.

image of a wall mounted, flat screen television in a modern home. The screen is showing the preview for a show titled 'Paws' with an adorable puppy as the show's star, and a Watch Now button

Components you can use today

Here are some components from the TV library that are optimized for the living room experience. You can use them alongside the Material components in Compose you’re already familiar with.


Scroll containers

TvLazyColumn {
    items(contentList) { content ->
        TvLazyRow {
            items(content) { cardItem ->
                Card(cardItem)
            }
        }
    }
}

moving image of a grid of content cards
A grid of content cards

TvLazyRow(
    pivotOffsets = PivotOffsets(0.0f)
) {
    items(movies) { movie ->
        MyContentCard(movie)
    }
}

moving image of a grid of content cards
Adjusting the pivot of a TvLazyRow


Immersive List

ImmersiveList(
    modifier = Modifier
        .height(130.dp)
        .fillMaxWidth(),
    background = { index, _ ->
        AnimatedContent(targetState = index) {
            MyImmersiveListBackground(it)
        }
    },
) {
    TvLazyRow {
        items(featuredContentList.size) { index ->
            MyCard(
                Modifier.focusableItem(index),
                featuredContentList[index]
            )
        }
    }
}

moving image of a grid of content cards
ImmersiveList allows TvLazyRows to be combined with content featuring

Featured carousel

Carousel(
    itemCount = featuredContentList.size,
) { index ->
    CarouselItem(
        background = { MyBackground(index) },
        content = { MyFeaturedContent(featuredContentList[index]) }
    )
}

moving image of a grid of content cards
Carousel features content with custom content and backgrounds

Navigation

var selectedTabIndex by remember { mutableStateOf(0) }

TabRow(selectedTabIndex = selectedTabIndex) {
    tabs.forEachIndexed { index, tab ->
        Tab(
            selected = selectedTabIndex == index,
            onFocus = { selectedTabIndex = index },
        ) {
            Text(tab)
        }
    }
}

MyContentBody(selectedTabIndex)

moving image of a grid of content cards
TabRows can be placed at the top of the screen to provide top navigation

Side navigation with navigation drawer


NavigationDrawer(
    drawerContent = {
        if (DrawerValue.Open == it) {
            MyExpandedSideMenu()
        } else {
            MyCompactSideMenu()
        }
    }
) {
    MyContentBody()
}

moving image of a grid of content cards
NavigationDrawer makes it easy to implement side navigation that expands and collapses

TV-optimized components

Subtle focus hints that work on phones and tablets might not be optimal for TVs, due to environmental factors such as distance from the screen and contrast ratio. To address this, we’ve built dedicated Material 3-inspired components that provide big, bold focus for selected elements like Buttons and Cards, designed with accessibility in mind. You can use these Indications for your own custom surfaces as well.


moving image of a grid of content cards
Component focus can be customized through different indication types: Scale, Border, Glow and Color

Built with developers

We worked closely with a group of early adopters to get their feedback on Compose for TV. Here’s what they have to say:


Quote card with headshot of Dai Williams, Plex, smiling and text reads,'TV focus and scrolling support on Compose from Google has greatly improved our developer productivity and app performance. We are excited to launch more and more features using Compose this year.'

Quote card with headshot of Danny Preussler, Android Platform Lead, Soundcloud, smiling and text reads,'Thanks to Compose for TV, we are able to reuse components and move much faster than the old Leanback View APIs would have ever allowed us to'

Quote card with headshot of Petr Introvič, Showmax, smiling and text reads,'Dev-friendly components like ImmersiveList or Carousel helped us put front the top content, and NavigationDrawer added a top-level navigation—the final piece for our TV app migration to Compose.'

Quote card with headshot of Kishore AK, CTO, Zee5, smiling, and text reads 'We are constantly striving to ensure our users have the best possible experience. We started early on Compose for TV and are confident that its implementation will help in making our app rendering faster and smoother.'

Learning more

To get started, check out the developer guides, design reference, our new codelab, and sample code. Be sure to check the latest release notes to keep up to date with the latest updates to Compose for TV.


Feedback from developers & designers like you

We’ve heard your feedback about the Leanback API and your desire for a modern UI framework that looks great out of the box but also lends itself to thorough theming and customization. Please continue to give us feedback so we can keep shaping Compose for TV to fit your needs.

Watch the Wear OS updates at I/O 2023

Posted by Kseniia Shumelchyk, Android Developer Relations Engineer

As we continue to evolve the Wear OS platform, we're excited to share with you some of the newest features and improvements that have been added to help you create innovative and engaging experiences for your users.

Partners like Peloton and Todoist have been building exceptional experiences for Wear OS - and seeing the impact on their feature adoption and engagement. Hear directly from Peloton engineers about how they built a differentiated experience for the watch with Compose for Wear OS.


In this blog post, we’ll be highlighting some of the key updates we announced at Google I/O this year, so let’s dive in and explore the latest advancements in Wear OS!

Wear OS 4 Developer Preview

Today we’re releasing the first Developer Preview of Wear OS 4, the next version of Google’s smartwatch platform arriving later this year. It has enhancements to security, user customization, and power optimizations.

This preview introduces several new tools to help enhance your Wear OS app experience:

Watch Face Format

We are launching the Watch Face Format, a new way to create watch faces for Wear OS. The format makes it easier to create customizable and more power-efficient watch faces for Wear OS 4. Developed in partnership with Samsung, the Watch Face Format is a declarative XML format, so there is no executable code involved in creating a watch face and there will be no code embedded in your watch face APK. Read more.

Watch faces created using the new Format
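
To make the idea concrete, here is a rough sketch of what a minimal digital watch face might look like in the declarative XML format. This is an illustrative approximation only: the element and attribute names below are drawn from memory of the format's public documentation and may not match the schema exactly, so consult the Watch Face Format reference before using them.

```xml
<!-- Illustrative sketch only: element and attribute names approximate the
     published Watch Face Format schema and may not match it exactly. -->
<WatchFace width="450" height="450">
  <Metadata key="CLOCK_TYPE" value="DIGITAL" />
  <Scene backgroundColor="#ff000000">
    <!-- A centered digital clock; no executable code is involved. -->
    <DigitalClock x="0" y="175" width="450" height="100">
      <TimeText format="hh:mm" align="CENTER">
        <Font family="SYNC_TO_DEVICE" size="96" color="#ffffffff" />
      </TimeText>
    </DigitalClock>
  </Scene>
</WatchFace>
```

Because the whole watch face is data rather than code, the system can render and optimize it on the user's behalf, which is where the power-efficiency gains come from.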

Tiles

Wear OS tiles give users fast, predictable access to the information and actions they rely on most. Version 1.2 of the Jetpack Tiles library introduces support for platform data bindings, so if your tile uses platform data sources such as heart rate, step count, or time, your tile is updated once per second.

The new version of tiles also adds support for animations. You can use tween animations to create smooth transitions on changes to part of your layout, and transition animations can animate new or disappearing elements from the tile.

Image showing examples of animated Tiles
Examples of animated Tiles

Get your app ready

Wear OS 4 is based on Android 13, which is several versions newer than the current Wear OS version, so your app will need to handle the system behavior changes that took effect in Android 12 and Android 13. We recommend you start by testing your app and releasing a compatible update first – as devices get upgraded to Wear OS 4, this is a basic but critical level of quality that provides a good app experience for users.
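
In practice, a compatible update usually starts with compiling against and targeting API level 33 (Android 13) in your Wear module's build file. A minimal sketch in the Gradle Kotlin DSL (the version numbers here are illustrative placeholders, not a prescribed configuration):

```kotlin
// Module-level build.gradle.kts (sketch; SDK levels shown are illustrative)
android {
    compileSdk = 33  // Android 13, the platform version Wear OS 4 is based on

    defaultConfig {
        minSdk = 26      // placeholder; keep your app's existing minimum
        targetSdk = 33   // opts your app in to Android 12/13 behavior changes
    }
}
```

Raising `targetSdk` is what activates the behavior changes, so test thoroughly on the Wear OS 4 emulator after bumping it.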

Download the Wear OS 4 emulator in Android Studio Hedgehog to explore new features and test your app on Wear OS 4 Developer Preview.


Tooling and library updates

Wear OS support in Firebase Test Lab

Firebase Test Lab will support running tests for your standalone app on physical Google Pixel Watches in the next few weeks. You can run your automated tests on the Google Pixel Watch via Gradle Managed Devices, or use the Firebase Console to also run Robo tests. To learn more, check out available devices.

Wear OS support in the Pre-launch reports

Today we are also excited to announce Wear OS support in Google Play Pre-launch reports for standalone apps. The Pre-launch report helps to identify issues proactively before your app reaches users, so it’s an important tool to help you launch a high-quality app. You can test for stability, accessibility, security and trust, and screenshot previews! At the moment the analysis runs on Wear emulators and it is soon launching on Google Pixel Watches.

Emulator improvements

The Wear OS 4 emulator brings support for emulated Bluetooth, which lets you test more use cases, for example Bluetooth audio.

The new Wear OS 4 emulator doesn’t support unmanaged 32-bit code, so if your app uses native code, make sure it includes both 32-bit and 64-bit native libraries. This will also prepare your app for upcoming 64-bit-only hardware.
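
If your app ships native libraries, one way to make sure both 32-bit and 64-bit variants are packaged is an ABI filter in your Gradle configuration. A sketch in the Gradle Kotlin DSL, assuming your native code already builds for these ABIs:

```kotlin
// Module-level build.gradle.kts (sketch)
android {
    defaultConfig {
        ndk {
            // Package both 32-bit and 64-bit ARM libraries so the app runs on
            // current watches and on upcoming 64-bit-only hardware.
            abiFilters += listOf("armeabi-v7a", "arm64-v8a")
        }
    }
}
```

After building, you can confirm both ABIs made it into the bundle by inspecting the `lib/` directory of your APK or app bundle.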

In Android Studio Hedgehog we also added capabilities for capturing screenshots and Logcat snapshots in the Wear OS emulator, so it is now much easier to generate screenshots for your app’s store listing.

Jetpack libraries

Since the latest stable Compose for Wear OS 1.1 release, we continue to bring new features and improvements to the toolkit. Version 1.2 already has a number of alpha releases – check out release notes to find out more.

Health Services version 1.0 has introduced a few new features in its latest beta releases. Most notably, it includes BatchingMode to deliver batched exercise data at a configured interval instead of the default interval, as well as an ExerciseTypeConfig API that enables updates during ongoing exercises, such as golfing. If you're interested in learning what's new in Android Health, check out this blog.


Start building for Wear OS now

Wear OS active devices have grown 5x since launching Wear OS 3, and it's the fastest growing smartwatch platform.

We’re excited to share our brand new Wear OS Gallery, where you can find even more guidance with proven design and development patterns for messaging, media, and health & fitness apps!

With the latest updates, you'll have even more tools at your disposal to create beautiful, high-quality wearable experiences.


Learn more

Get started building for Wear OS with hands-on experience! Try our Compose for Wear OS codelab, and check out the documentation and samples.

The new Wear OS quality requirements will come into effect on August 31, 2023, so consider them early when designing and developing your app.

We’re looking forward to seeing the experiences that you build!

Introducing the Android Design Hub: The ultimate resource for building exceptional user interfaces across all form factors

Posted by Adhithya Ramakumar, Design Lead, Android Developer Experience and Rebecca Gutteridge, Senior Developer Relations Engineer, Android

Cross posted from Android Medium

We’re introducing the Android Design Hub to make it easier to build compelling UI across form factors.

What is the Android UI design hub?

The design hub is a comprehensive resource with opinionated guidance designed to empower designers and developers like you to create stunning and user-friendly interfaces for Android apps. It shares takeaways, examples, do’s and don’ts, starter Figma kits, UI samples, and inspiring galleries. This is the beginning of a journey we'd love for you to join us on.

Why the Android UI design hub?

A well-designed UI is crucial to the success of an app, and that's why we created a one-stop shop designed to help create outstanding user interfaces across all Android form factors:

The design hub goes into depth about what it means to design for Android, while complementing and extending our Material Design open-source design system. As the Android ecosystem adds an increasing variety of devices, it's more important than ever to create seamless and adaptable experiences for your users. The Android UI Design Hub is your comprehensive resource for mastering the art of designing and implementing UI on a diverse range of devices, from smartphones and tablets to foldables, wearables, and TV.

Here’s a sneak peek of what you'll find in the design hub:

Resources organized by form factors

Gain access to a library tailored for each form factor, including guidance, design templates, and sample projects. With these resources at your fingertips, you're well equipped to create exceptional UI experiences for every device category.

moving image of various form factors

Galleries with UI inspiration across form factors

Spark your creativity by exploring our collection of Android app designs for a variety of popular categories of apps such as Social, Productivity, Health & Fitness, Shopping, and more. Get inspired by thoughtfully curated examples and discover design patterns to tackle common design challenges across form factors.

moving image of various Android app designs

Design guidelines

Dive into the nitty-gritty of Android design principles, learning how to implement high-quality Android designs and understanding app layout with Android system bars, navigation modes, theming, and more. Our comprehensive documentation helps ensure that your app's design adheres to the highest standards and directs you to the latest Material Design guidelines.


Design tools and resources

image of a 6 x 3 grid of illustrated designs

Equip yourself with an extensive selection of design tools, templates, and resources, specially tailored for Android development. Streamline your workflow and bring your app ideas to life more efficiently:

Check out the code samples with their corresponding Figma Kits:

Whether you're an experienced designer or a developer looking to enhance your design skills, the Android Design Hub is here to support and guide you throughout your journey.

So what are you waiting for? Visit the Android UI design hub today, and start creating exceptional user interfaces that captivate your audience and leave a lasting impression. Happy designing! 🚀