Tag Archives: Android

Take the 2023 Google Mobile Ads SDK developer survey

Today, we’re excited to announce the launch of our 2023 Google Mobile Ads SDK Developer Survey. As part of our efforts to continue updating the AdMob and Ad Manager products, we’d like to hear from you about where we should focus our efforts. This includes product feedback as well as feedback on our guides, code samples and other resources. Your feedback will help shape our future product and resource roadmap.

Take the survey

This anonymous survey should only take about 15 minutes to complete and will provide our team with your valuable feedback as we plan for the months ahead. Whether you’re an engineer, Ad Ops personnel, or a PM, your feedback on AdMob, Ad Manager, and the Google Mobile Ads SDK is valuable to us. We appreciate you taking the time to help improve our developer experience!


Concepts users spend 70% more time using the app on tablets than on phones

Posted by the Android team

Concepts is a digital illustration app created by TopHatch that helps creative thinkers bring their visions to life. The app uses an infinitely-large canvas format, so its users can sketch, plan, and edit all of their big ideas without limitation, while its vector-based ink provides the precision needed to refine and reorganize their ideas as they go.

For Concepts, having more on-screen real estate means more comfort, more creative space, and a better user experience overall. That’s why the app was specifically designed with large screens in mind. Concepts’ designers and engineers are always exploring new ways to expand the app’s large screen capabilities on Android. Thanks to Android’s suite of developer tools and resources, that’s easier than ever.

Evaluating an expanding market of devices

Large screens are the fastest growing segment of Android users, with more than 270 million users on tablets, foldables, and ChromeOS devices. It’s no surprise then that Concepts, an app that benefits users by providing them with more screen space, was attracted to the format. The Concepts team was also excited about innovation with foldables because having the large screen experience with greater portability gives users more opportunities to use the app in the ways that are best for them.

The team at Concepts spends a lot of time evaluating new large screen technologies and experiences, trying to find what hardware or software features might benefit the app the most. The team imagines and storyboards several scenarios, shares the best ones with a close-knit beta group, and quickly builds prototypes to determine whether these updates improve the UX for its larger user base.

For instance, Concepts’ designers recently tested the Samsung Galaxy Fold and found that users benefited from having more screen space when the device was folded. With help from the Jetpack WindowManager library, Concepts’ developers implemented a feature to automatically collapse the UI when the Galaxy’s large screen was folded, allowing for more on-screen space than if the UI were expanded.
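
To illustrate the kind of integration described here, below is a minimal sketch (not Concepts’ actual code) of observing fold state with the Jetpack WindowManager library. The collapseToolbars() and expandToolbars() helpers are hypothetical placeholders for app-specific UI changes.

// Requires androidx.window:window and androidx.lifecycle:lifecycle-runtime-ktx
class DrawingActivity : AppCompatActivity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        lifecycleScope.launch {
            repeatOnLifecycle(Lifecycle.State.STARTED) {
                // Emits a new WindowLayoutInfo whenever the device posture changes
                WindowInfoTracker.getOrCreate(this@DrawingActivity)
                    .windowLayoutInfo(this@DrawingActivity)
                    .collect { layoutInfo ->
                        val fold = layoutInfo.displayFeatures
                            .filterIsInstance<FoldingFeature>()
                            .firstOrNull()
                        if (fold != null && fold.state == FoldingFeature.State.HALF_OPENED) {
                            collapseToolbars() // hypothetical: free up canvas space when folded
                        } else {
                            expandToolbars()   // hypothetical: restore the full UI
                        }
                    }
            }
        }
    }
}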

Foldable UI

Concepts’ first release for Android was optimized for ChromeOS and, because of this, supporting resizable windows was important to their user experience from the very beginning. Initially, they needed to use a physical device to test for various screen sizes. Now, the Concepts team can use Android’s resizable emulator, which makes testing for different screen sizes much easier.

Android’s APIs and toolkit carry the workload

The developers’ goal with Concepts is to make the illustration experience feel as natural as putting pen to paper. For the Concepts team, this meant achieving as close to zero lag as possible between the stylus tip and the lines drawn on the Concepts canvas.

When Concepts’ engineers first created the app, they put a lot of effort into creating low-latency drawing themselves. Now, Android’s graphical APIs eliminate the complexity of creating efficient inking.

“The hardware to support low-latency inking with higher refresh rate screens and more accurate stylus data keeps getting better,” said David Brittain, co-founder and CEO of TopHatch, parent company of Concepts. “Android’s mature set of APIs make it easy.”

Concepts’ engineers also found that the core Android View APIs take care of most of the workload for supporting tablets and foldables, and the team makes heavy use of custom Views and ViewGroups in Concepts. The app’s color wheel, for example, is a custom View drawing to a Canvas, which uses Animators for the reveal animation. View, Canvas, and Animator are all classes from the Android SDK.
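
As a rough illustration of that pattern (not Concepts’ actual implementation), a custom View can draw directly to its Canvas and use a ValueAnimator to drive a reveal animation:

class ColorWheelView(context: Context, attrs: AttributeSet? = null) : View(context, attrs) {
    private val paint = Paint(Paint.ANTI_ALIAS_FLAG)
    private var revealFraction = 0f

    // Animate the wheel from invisible to fully revealed
    fun reveal() {
        ValueAnimator.ofFloat(0f, 1f).apply {
            duration = 300
            addUpdateListener { animator ->
                revealFraction = animator.animatedValue as Float
                invalidate() // redraw with the new fraction
            }
            start()
        }
    }

    override fun onDraw(canvas: Canvas) {
        super.onDraw(canvas)
        val radius = minOf(width, height) / 2f * revealFraction
        // Sweep through the hues one degree at a time
        for (angle in 0 until 360) {
            paint.color = Color.HSVToColor(floatArrayOf(angle.toFloat(), 1f, 1f))
            canvas.drawArc(
                width / 2f - radius, height / 2f - radius,
                width / 2f + radius, height / 2f + radius,
                angle.toFloat(), 1f, true, paint
            )
        }
    }
}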

“Android’s tools and platform are making it easier to address the variety of screen sizes and input methods, with well-structured APIs for developing and increasing the number of choices for testing. Plus, Kotlin allows us to create concise, readable code,” said David.


Concepts’ users prefer large screens

Tablets and foldables represent the bulk of Concepts’ investments and user base, and the company doesn’t see that changing any time soon. Currently, tablet users deliver 50% higher revenue per user than smartphone users. Tablets also account for eight of the top 10 most frequently used devices among Concepts’ users, with the other two being ChromeOS devices.

Additionally, Concepts’ monthly users spend 70% more time engaging with the app on tablets than on traditional smartphones. The application’s rating is also 0.3 stars higher on tablets.

“We’re looking forward to future improvements in platform usability and customization while increasing experimentation with portable form factors. Continued efforts in this area will ensure high user adoption well into the future,” said David.

Start developing for large screens today

Learn how you can reach a growing audience of users by increasing development for large screens and foldables today.

Play Commerce prevented over $2 billion in fraudulent and abusive transactions in 2022

Posted by Sheenam Mittal, Product Manager, Google Play

Google Play Commerce enables you to monetize your apps and games at scale in over 170 markets, without the complexity and time required to run your own global commerce platform. It enables you to easily transact with millions of users around the world and gives users trusted and safe ways to pay for your digital products and content. Ensuring developers and users have a secure purchase experience has been a key pillar of Play Commerce, and we achieve this by continuously preventing and monitoring for bad actors looking to defraud and abuse your apps.

Preventing fraud and securing purchases

In 2022, we prevented over $2 billion in fraudulent and abusive transactions. Bad actors looking to abuse apps employ an array of strategies across both one-time purchases and auto-renewing payments. For example, they may attempt to purchase an item in your app with a compromised form of payment, request a refund for an in-app purchase that’s already been consumed or sold, or use scammed gift cards for purchases. When a combined or coordinated attempt is carried out by bad actors, it can result in large-scale abuse of your app. Preventing such fraud and abuse requires a comprehensive approach, consisting of automated solutions and an array of internal monitoring tools combined with human expertise.

Empower developers with tools to mitigate app abuse

Information asymmetry between Google Play and developers is commonly exploited by bad actors. Two of the most effective solutions that you can implement to help address this are Voided Purchases API and Obfuscated Account ID. Over 70% of our top 200 monetizing developers have integrated these solutions to reduce fraud and abuse on their apps.

  • Voided Purchases API provides you with a list of in-app and subscription orders for each user that have been voided. You can implement revocation that prevents the user from accessing products from those orders.
Diagram: benefits of the Voided Purchases API (reduced losses, a preserved app economy, and secured game integrity)
  • Obfuscated Account ID helps Play detect fraudulent transactions, such as many devices making purchases on the same account in a short period of time (a minimal sketch of attaching this ID follows this list).
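
As a minimal sketch (assuming Play Billing Library 5 or later), an obfuscated account ID can be attached when launching the billing flow. Here, hashedAccountId is a hypothetical non-PII identifier your app derives from its own user ID, and productDetailsParamsList, billingClient, and activity are assumed to be set up elsewhere:

val billingFlowParams = BillingFlowParams.newBuilder()
    .setProductDetailsParamsList(productDetailsParamsList) // prepared from queryProductDetailsAsync()
    .setObfuscatedAccountId(hashedAccountId) // lets Play correlate purchases per account
    .build()
billingClient.launchBillingFlow(activity, billingFlowParams)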

You can also use Play Integrity API to protect your apps and games from potentially risky and fraudulent interactions, such as cheating and unauthorized access. You call the Play Integrity API at important moments to check that user actions or server requests are coming from your unmodified app, installed by Google Play, running on a genuine Android device. If something is wrong, your app’s backend server can respond with appropriate actions to prevent attacks and reduce abuse. Developers using the API have seen an average of over 50% reduction in unauthorized access of their apps and games. Stay tuned for new highly-requested feature updates.
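
For example, a minimal client-side sketch of requesting a verdict with the Play Integrity client library might look like the following; the nonce and the sendTokenToServer() helper are hypothetical, and the real decision-making happens on your backend:

val integrityManager = IntegrityManagerFactory.create(applicationContext)
integrityManager
    .requestIntegrityToken(
        IntegrityTokenRequest.builder()
            .setNonce(nonce) // server-generated value bound to this user action
            .build()
    )
    .addOnSuccessListener { response ->
        // Forward the token to your backend, which decodes the verdict and
        // decides whether to honor the user action or server request.
        sendTokenToServer(response.token())
    }
    .addOnFailureListener { exception ->
        // Handle the error, e.g. retry with backoff or fail closed
    }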

Flowchart: how the Play Integrity API works. A user action or server request prompts your app to request a Play Integrity API verdict; Play returns the verdict, and your backend server decides what to do next.

Looking forward

This month, we launched Purchases.product.consume, which allows you to consume in-app items using the Play Developer API, reducing the risk of client-side abuse by shifting more business logic to your secure backend. For example, if a bad actor purchases an item from your app but tampers with the client side, the purchase will be automatically refunded after 3 days due to lack of acknowledgement. Using server-side consumption prevents this type of app abuse.
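
As a hedged, server-side sketch: consuming a purchase from your backend is a single authenticated call to the Play Developer API. The URL below mirrors the shape of the existing acknowledge endpoint and is an assumption here, so check the consume method's reference documentation for the exact path; PACKAGE_NAME, PRODUCT_ID, PURCHASE_TOKEN, and ACCESS_TOKEN are placeholders your backend supplies.

// Kotlin/JVM sketch using java.net.http; endpoint path is assumed, see the API reference
val url = "https://androidpublisher.googleapis.com/androidpublisher/v3/applications/" +
    "$PACKAGE_NAME/purchases/products/$PRODUCT_ID/tokens/$PURCHASE_TOKEN:consume"
val request = HttpRequest.newBuilder(URI.create(url))
    .header("Authorization", "Bearer $ACCESS_TOKEN") // OAuth token for a service account
    .POST(HttpRequest.BodyPublishers.noBody())
    .build()
val response = HttpClient.newHttpClient().send(request, HttpResponse.BodyHandlers.ofString())
// A 2xx status means Play has recorded the item as consumed on the server side.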

Google Play Commerce is committed to providing developers and users a secure purchase experience. Learn more about how to prevent bad actors from harming users and abusing your app by visiting this guide, as well as other 2023 initiatives helping keep Android and Google Play safe.

Media3 is ready to play!

Posted by Nevin Mital - Developer Relations Engineer, Android Media

Today, we’re pleased to announce the full release of the Jetpack Media3 library. After sharing a first look at the library at Android Developer Summit 2021, we published several alpha and beta releases over the past several months to ensure a high-quality set of APIs that we now encourage everyone to adopt.

Media3 is the new home for APIs that enable you to create rich audio and video experiences. If you’ve used libraries like ExoPlayer, MediaCompat, or Media2, you’ll find Media3 to be familiar. However, instead of using these separate libraries, Media3 provides a unified API for playback use-cases and also expands to cover new use-cases like video editing and transcoding. The APIs are simple to use yet powerful, customizable to meet your needs, and reliable and optimized so you can build for the diverse Android device ecosystem.

In this blog post, we’ll focus on the playback APIs in Media3, so please stay tuned for an upcoming post where we’ll dive deeper into the video editing and transcoding APIs. As a brief introduction, the following table describes key components for playback in Media3:

Player: An interface that defines traditional high-level functionality for an audio or video player, such as playback controls.

ExoPlayer: The default implementation of the Player interface in Media3.

MediaSession: An API that advertises media playback to and receives playback command requests from external clients.

MediaSessionService: A service that holds a MediaSession to enable background playback.

MediaLibraryService: A service that additionally allows you to expose a content library to external clients.

MediaController: An API that is generally used by external clients to retrieve playback information and send playback command requests to your media app. Complementary to a MediaSession. Examples of external clients include the notification and lock screen media controls on mobile and large screen devices, Android Auto, WearOS, and Google Assistant.

MediaBrowser: An API that additionally enables external clients to navigate your media app’s content library. Complementary to a MediaLibraryService.

Our developer documentation has more details on these components. Let’s take a closer look into what this new library offers and how you can start using it.

Keeping it simple

By consolidating the APIs for the playback developer journey into a single library, Media3 is able to introduce a Player interface that is used by several components, such as MediaSession and MediaController. This interface outlines traditional high-level functionality for audio and video playback, such as playback controls and the ability to query properties of the currently playing media.

Having a common interface for all “player-like” components means that creating new instances of these objects is straightforward:

val player = ExoPlayer.Builder(context).build()
val session = MediaSession.Builder(context, player).build()
// MediaController connects to the session asynchronously, so the builder returns a ListenableFuture
val controllerFuture = MediaController.Builder(context, session.token).buildAsync()

Media3's MediaSession and MediaController will automatically reflect the state of the components they're connected to. As a result, you can also simplify your app’s architecture by removing connectors like ExoPlayer’s MediaSessionConnector and more easily follow the flow of logic through your app. Calling play() on the MediaController will forward the action to the MediaSession, which will then forward it to the player.

Similarly, Media3 aims to make background playback cases easier to handle. The PlayerNotificationManager from ExoPlayer is no longer needed, as Media3’s MediaSessionService and MediaLibraryService automatically handle publishing a media notification as needed. The library handles configuring, starting, and stopping a foreground service for you as needed, but please also note some known issues summarized in this comment.
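
As a minimal sketch of that setup (PlaybackService is a placeholder name, and the service still needs to be declared in your manifest), a MediaSessionService only needs to create the player and session, and hand the session to connecting controllers:

class PlaybackService : MediaSessionService() {
    private var mediaSession: MediaSession? = null

    override fun onCreate() {
        super.onCreate()
        val player = ExoPlayer.Builder(this).build()
        mediaSession = MediaSession.Builder(this, player).build()
    }

    // System media controls and other controllers connect through this callback
    override fun onGetSession(controllerInfo: MediaSession.ControllerInfo): MediaSession? =
        mediaSession

    override fun onDestroy() {
        mediaSession?.run {
            player.release()
            release()
        }
        mediaSession = null
        super.onDestroy()
    }
}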

ExoPlayer is deprecated, long live ExoPlayer!

ExoPlayer has a new home and is the default implementation of the aforementioned Player interface in Media3. The standalone ExoPlayer project, with package name com.google.android.exoplayer2, will soon be discontinued, and future updates will be published in Media3. For the next few months, we’ll continue publishing equivalent releases of both Media3 and ExoPlayer to help you make the transition to Media3. For example, this means that ExoPlayer 2.18.5 and ExoPlayer in Media3 1.0.0 are identical aside from their package names. However, this is only temporary and we will deprecate the standalone ExoPlayer later this year, so we highly recommend migrating to Media3 as soon as possible. The “Migrating to Media3” section below describes the process in more detail, which includes a script that handles most of the work for you.

Note that Media3 is developed with the same philosophy as ExoPlayer (and in fact, is developed by the same team!). In other words, Media3 retains ExoPlayer’s customizable components, open source development on GitHub, receptivity to pull requests, and public issue tracker, to name a few similarities.

Migrating to Media3

As mentioned previously, the standalone ExoPlayer project, with package name com.google.android.exoplayer2, will soon be discontinued, so to continue receiving updates, you will need to migrate to Media3 ExoPlayer. Other media APIs that should be migrated to Media3 include, but are not limited to, MediaSessionConnector, MediaBrowserServiceCompat, and MediaBrowserCompat.

We’ve prepared two key resources to help you achieve this migration as smoothly as possible:

  1. A migration guide to walk you through the process step-by-step
  2. A migration script to convert your standalone ExoPlayer project packages to the corresponding new modules and packages under Media3

The good news is that if you’re currently using ExoPlayer, there’s no need for any code changes and no need to re-integrate or re-write any customizations. The standalone ExoPlayer and Media3 ExoPlayer are identical aside from the package name, and the conversion can be done automatically with the aforementioned migration script. Just make sure you’ve updated your project to use the latest version of ExoPlayer before getting started. For full details and steps, please refer to the migration guide.

Furthermore, since Media3 is fully backwards-compatible with prior media APIs such as MediaControllerCompat and MediaMetadataCompat, your existing integrations will continue to work as before even after the migration. Note that new features such as per-controller customization of commands are only available for clients using Media3. That is to say, for example, all legacy controllers, such as MediaControllerCompat, will receive the same set of available commands. You can identify a legacy controller by checking if getControllerVersion() returns 0 in the MediaSession.ControllerInfo.

The power of Media3, in the palm of your hand

Media3 offers several options for you to adjust its behavior to better fit your needs. The next few sections describe some such mechanisms.

Play it your own way

Although ExoPlayer is the recommended Player implementation to use for audio and video streaming apps, Media3 also introduces the SimpleBasePlayer to minimize the number of methods you need to implement to integrate with a custom player. Start by implementing the getState method. This is where you can declare the Command set supported by your player and configure metadata such as the currently playing media item index and the current timestamp.

class CustomPlayer : SimpleBasePlayer(looper) {
    override fun getState(): State {
        // Set available Commands
        // Configure playWhenReady, mediaItemIndex, currentPosition, etc.
    }

    // Implement methods required by available Commands
}

The SimpleBasePlayer class will enforce valid player state and handle informing listeners of state changes. Additionally, any methods related to a Command you don’t declare as available are ignored, so beyond getState, you only need to implement the methods that will actually be used.

Better control over your commands

The MediaSession and MediaController APIs have also been updated to give you more control. With Media3, you can advertise your app’s playback capabilities on a per-controller basis. Modify the commands available to a client app in the onConnect method of your MediaSession.Callback. For example, to prevent a client app with package name com.example.myClient from having access to the “seek to next media item” Player.Command:

var sessionCallback = object : MediaSession.Callback {
    override fun onConnect(
        session: MediaSession,
        controller: MediaSession.ControllerInfo
    ): MediaSession.ConnectionResult {
        val connectionResult = super.onConnect(session, controller)
        if (controller.packageName == "com.example.myClient") {
            val availablePlayerCommands =
                connectionResult.availablePlayerCommands.buildUpon()
                    // Disallow myClient from being able to skip to the next media item
                    .remove(Player.COMMAND_SEEK_TO_NEXT_MEDIA_ITEM)
                    .build()
            return MediaSession.ConnectionResult.accept(
                connectionResult.availableSessionCommands,
                availablePlayerCommands
            )
        }
        return connectionResult // Other clients retain normal command access
    }
}

var mediaSession = MediaSession.Builder(context, player)
    .setCallback(sessionCallback) // Remember to set the callback on your MediaSession!
    .build()

Creating custom commands

Of course, as with the previous media APIs, you can add custom commands tailored to your app. To implement a custom command, create a new SessionCommand. Similar to as shown above, you can give controllers access to this custom command by including it in the list of available session commands. You can handle custom command behavior in the onCustomCommand method of the same Callback:

override fun onCustomCommand(
    session: MediaSession,
    controller: MediaSession.ControllerInfo,
    customCommand: SessionCommand,
    args: Bundle
): ListenableFuture<SessionResult> {
    if (customCommand.customAction == MY_CUSTOM_COMMAND) {
        // Do custom action
        return Futures.immediateFuture(SessionResult(SessionResult.RESULT_SUCCESS))
    }
    // Return error for invalid custom command
    return Futures.immediateFuture(SessionResult(SessionResult.RESULT_ERROR_BAD_VALUE))
}

You can also ask client apps to display your custom command by including it in a setCustomLayout call in the onPostConnect method of the MediaSession.Callback.
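
A minimal sketch of that step might look like the following, where MY_CUSTOM_COMMAND and R.drawable.ic_custom_action are placeholders from your own app:

override fun onPostConnect(session: MediaSession, controller: MediaSession.ControllerInfo) {
    val customButton = CommandButton.Builder()
        .setDisplayName("My action")
        .setIconResId(R.drawable.ic_custom_action)
        .setSessionCommand(SessionCommand(MY_CUSTOM_COMMAND, Bundle.EMPTY))
        .build()
    // Ask this controller to surface the custom command, e.g. in the media notification
    session.setCustomLayout(controller, listOf(customButton))
}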

Next steps

We’d love for you to start using Media3 in your app! 

To start exploring the library, feel free to check out the demo app to see an example of audio and video playback, including how to integrate with a media session. Stay tuned to our developer guides for more detailed guidance on the different components in Media3 landing soon. Our sample app, the Universal Android Music Player, and our testing tool, the Media Controller Test app, will also be updated to Media3 on their main branches in the coming weeks.

If you run into any issues, have any feature requests, or would like to share any other sort of feedback, please let us know using the Media3 issue tracker on GitHub. We look forward to hearing from you!

Launching new #WeArePlay stories from India

Posted by Parul Tyagi, Developer Marketing

Every month, over 2.5 billion people visit Google Play to discover millions of apps and games, which are created by people with all sorts of backgrounds, who founded companies big and small.

#WeArePlay celebrates this community of people building apps and games businesses, with monthly spotlights of founders from across the world.

Last summer we went on a virtual tour of the USA, sharing stories from every state, and today we’re continuing our tour across the world with our next stop: India.

To kick us off, we are spotlighting 20 stories from across the country, with many more coming throughout the year.

Moving text reads #WeArePlay INDIA Discover now g.co/play/weareplay-india Google Play

First, we begin with Pramit from Gurugram, Haryana. He was climbing the corporate ladder when medication he was taking damaged his retina, therefore losing his vision. No longer able to read, he required help from friends and family to perform daily tasks. One day, when a friend was booking a driver for him, Pramit got the idea to create a tool that could function exactly like a virtual friend through voice-activated commands. Using his app Louie Voice Control, people can operate other apps using their voice, making technology infinitely more accessible for the visually impaired.

#WeArePlay Pramit Visioapps Technology Gurugram, Haryana g.co/play/weareplay-india Google Play

Next, meet Sourav and Gunjan from Kolkata, West Bengal. When Sourav and Gunjan had their son, they noticed how fascinated he was watching videos on their phones. This gave Gunjan the idea to provide meaningful screen time for him by making educational games for young children. Fast forward to today and they have 42 apps, including Yoga for Kids where youngsters follow along with simple yoga poses and unlock animated pets as rewards.

#WeArePlay Sourav & Gunjan Gunjanapps Studios Kolkata, West Bengal g.co/play/weareplay-india Google Play

Now onto Tejas from Rajkot, Gujarat. He was always determined to go his own way in life and pursue programming, rather than his family's construction business. After discovering how popular cooking games are, his company TheAppGuruz makes versions catered specifically for Asian audiences - with some full of Indian dishes and specialties. Now, Tejas and his team are developing more cooking simulation titles, as well as traditional board games for a global audience.

#WeArePlay Tejas TheAppGuruz Rajkot, Gujarat g.co/play/weareplay-india Google Play

And last but not least, Anshul and Rohan from Mumbai, Maharashtra. After bonding over their experiences in overcoming mental health struggles, they discovered they had the same goal: to create something in the mental wellness space. So they built Evolve - an app with guided meditations, breathing exercises and daily affirmations. During the pandemic, the pair realized the LGBTQ+ community was one of the most underserved in mental health support, so they adapted Evolve to meet their needs.

#WeArePlay Rohan & Anshul Evolve Mumbai, Maharashtra g.co/play/weareplay-india Google Play

Check out all the stories now at g.co/play/weareplay-india and stay tuned for even more coming soon.


How useful did you find this blog post?

What’s new in the Jetpack Compose March ’23 release

Posted by Jolanda Verhoef, Android Developer Relations Engineer

Today, as part of the Compose March ‘23 Bill of Materials, we’re releasing version 1.4 of Jetpack Compose, Android's modern, native UI toolkit that is used by apps such as Booking.com, Pinterest, and Airbnb. This release contains new features like Pager and Flow Layouts, and new ways to style your text, such as hyphenation and line-break behavior. It also improves the performance of modifiers and fixes a number of bugs.

Swipe through content with the new Pager composable

Compose now includes out-of-the-box support for vertical and horizontal paging between different content. Using VerticalPager or HorizontalPager enables similar functionality to the ViewPager in the view system. However, just like the benefits of using LazyRow and LazyColumn, you no longer need to create an adapter or fragments! You can simply embed a composable inside the Pager:

// Display 10 items
HorizontalPager(pageCount = 10) { page ->
    // Your specific page content, as a composable:
    Text(
        text = "Page: $page",
        modifier = Modifier.fillMaxWidth()
    )
}


These composables replace the implementation in the Accompanist library. If you already use the Accompanist implementation, check out the migration guide. See the Pager documentation for more information.

Get your content flowing with the new Flow Layouts

FlowRow and FlowColumn provide an efficient and compact way to lay out items in a container when the size of the items or the container are unknown or dynamic. These containers allow the items to flow to the next row in the FlowRow or next column in the FlowColumn when they run out of space. These flow layouts also allow for dynamic sizing using weights to distribute the items across the container.

Here’s an example that implements a list of filters for a real estate app:


@Composable
fun Filters() {
    val filters = listOf(
        "Washer/Dryer", "Ramp access", "Garden", "Cats OK", "Dogs OK", "Smoke-free"
    )
    FlowRow(
        horizontalArrangement = Arrangement.spacedBy(8.dp)
    ) {
        filters.forEach { title ->
            var selected by remember { mutableStateOf(false) }
            val leadingIcon: @Composable () -> Unit = { Icon(Icons.Default.Check, null) }
            FilterChip(
                selected,
                onClick = { selected = !selected },
                label = { Text(title) },
                leadingIcon = if (selected) leadingIcon else null
            )
        }
    }
}

Performance improvements in Modifiers

The major internal Modifier refactor we started in the October release has continued, with the migration of multiple foundational modifiers to the new Modifier.Node architecture. This includes graphicsLayer, lower level focus modifiers, padding, offset, and more. This refactoring should bring performance improvements to these APIs, and you don't have to change your code to receive these benefits. Work on this continues, and we expect even more gains in future releases as we migrate Modifiers outside of the ui module. Learn more about the rationale behind the changes in the ADS talk Compose Modifiers deep dive.

Increased flexibility of Text and TextField

Along with various performance improvements, API stabilizations, and bug fixes, the compose-text 1.4 release brings support for the latest emoji version, including backwards compatibility with older Android versions 🎉🙌. Supporting this requires no changes to your application. If you’re using a custom emoji solution, make sure to check out PlatformTextStyle(emojiSupportMatch).

In addition, we’ve addressed one of the main pain points of using TextField. In some scenarios, a text field inside a scrollable Column or LazyColumn would be obscured by the on-screen keyboard after being focused. We re-worked core parts of scroll and focus logic, and added key APIs like PinnableContainer to fix this bug.

Finally, we added a lot of new customization options to Text and its TextStyle (a combined sketch follows this list):

  • Draw outlined text using TextStyle.drawStyle.
  • Improve text transition and legibility during animations using TextStyle.textMotion.
  • Configure line breaking behavior using TextStyle.lineBreak. Use built-in semantic configurations like Heading, Paragraph, or Simple, or construct your own LineBreak configuration with the desired Strategy, Strictness, and WordBreak values.
  • Add hyphenation support using TextStyle.hyphens.
  • Define a minimum number of visible lines using the minLines parameter of the Text and TextField composables.
  • Make your text move by applying the basicMarquee modifier. As a bonus, because this is a Modifier, you can apply it to any arbitrary composable to make it move in a similar marquee-like fashion!
    Marquee text using an outline, with shapes stamped on it using the drawStyle API.
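
As a hedged sketch combining several of these options in one place (the exact parameter shapes may differ slightly across Compose versions, and basicMarquee is experimental):

@OptIn(ExperimentalFoundationApi::class)
@Composable
fun FancyText() {
    Text(
        text = "Compose 1.4 adds hyphenation, line-break control, and outlined text.",
        style = TextStyle(
            fontSize = 24.sp,
            lineBreak = LineBreak.Paragraph,   // semantic line-breaking configuration
            hyphens = Hyphens.Auto,            // enable hyphenation
            drawStyle = Stroke(width = 4f)     // draw the glyphs as an outline
        ),
        modifier = Modifier.basicMarquee()     // scroll the text like a marquee
    )
}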

Improvements and fixes for core features

In response to developer feedback, we have shipped some particularly in-demand features & bug fixes in our core libraries:
  • Test waitUntil now accepts a matcher! You can use this API to easily synchronize your test with your UI, with specific conditions that you define.
  • AnimatedContent now correctly supports getting interrupted and returning to its previous state.
  • Accessibility services focus order has been improved: the sequence is now more logical in common situations, such as with top/bottom bars.
  • AndroidView is now reusable in LazyList if you provide an optional onReset lambda (see the sketch after this list). This improvement lets you use complex non-Compose-based Views inside LazyLists.
  • Color.lerp performance has been improved and now does zero allocations: since this method is called at high frequency during fade animations, this should reduce the amount of garbage collection pauses, especially on older Android versions.
  • Many other minor APIs and bug fixes as part of a general cleanup. For more information, see the release notes.
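
For instance, here is a hedged sketch of the reusable-AndroidView pattern inside a LazyColumn. The onReset parameter name comes from the 1.4 release notes and the exact overload may still be experimental; MyChartView, bind(), clear(), and dataPoints are hypothetical placeholders.

LazyColumn {
    items(dataPoints) { point ->
        AndroidView(
            factory = { context -> MyChartView(context) }, // created once per pooled instance
            update = { view -> view.bind(point) },         // called whenever this item (re)binds
            onReset = { view -> view.clear() },            // opting in enables reuse across list items
            modifier = Modifier.fillMaxWidth()
        )
    }
}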

Get started!

We’re grateful for all of the bug reports and feature requests submitted to our issue tracker - they help us to improve Compose and build the APIs you need. Continue providing your feedback, and help us make Compose better!

Wondering what’s next? Check out our updated roadmap to see the features we’re currently thinking about and working on. We can’t wait to see what you build next!

Happy composing!

Evolution of Crash Management: Behind the Scenes with App Quality Insights

Posted by Rebecca Gutteridge, Senior Developer Relations Engineer

Hey there! I’m Rebecca Gutteridge, Senior Developer Relations Engineer at Google. As someone who has been working closely with developers to understand how we can make the Android platform better, I’m passionate about helping developers improve their app quality to create amazing experiences for users. In 2022, we announced Android Studio’s App Quality Insights (AQI) window, which enables developers to discover, investigate, and reproduce issues reported by Firebase Crashlytics, directly within the context of your local Android Studio project. This is a big step in how Android developers can improve their app stability, and I wanted to learn more about the evolution of how mobile developers have managed crashes throughout the years. You can watch the behind-the-scenes video on AQI here, and within the latest episode of #TheAndroidShow.


Early Days of Crash Management

I first chatted with Annyce Davis, VP of Engineering at Meetup and Android GDE. She has been in the mobile development space since 2010 and had a lot of hands on experience helping debug user experiences.

“In the early days, developers cared deeply about user crashes, but they didn’t have the tools to replicate or debug the issue, or to understand which users were being impacted. I remember spending lots of time trying to reproduce issues based on minimal information from bug reports.

One time I remember attempting to debug an experience only happening in a specific country, and no matter how many times I tried, I was unable to reproduce it. It wasn’t until I traveled there in person, I realized people were often using 2G. It never dawned on me to check the connection type!” -Annyce Davis

moving image of Annyce Davis, VP of Engineering at Meetup and Android GDE during the App Quality Insights segment of #TheAndroidShow


Firebase Crashlytics Changes the Game

Crashlytics was introduced in 2011 and it has helped developers track, prioritize, and fix app crashes faster. Annyce told me this was a game changer for crash management.

Moving image of text reads 'Crashlytics helps developers track, prioritize, and fix crashes faster'

“We could now know which devices were experiencing issues, could be notified of trending issues, and finally we were able to show non-technical stakeholders crashes visually, to create buy-in for urgent work.

My team received crash reports for a particular screen of the Meetup app, but we could never reproduce the issue given how inconsistent it was. First, Crashlytics helped us narrow down which feature to examine. We found a crash that was due to a null pointer exception on data that we never expected to be null, so it didn’t seem like the crash could even be possible! An engineer on my team was able to use this data from Crashlytics to uncover that the source was a race condition that would lead to the null, and then he was able to fix it.” -Annyce Davis

What a tricky bug, how fascinating!

Behind the Scenes of AQI

I wanted to learn more about the idea behind AQI, so I chatted with David Motsonashvili, a software engineer on the Firebase team who worked on the initial prototype.

“The original idea for the integration came from a quarterly Hackweek, where we were able to experiment on our own projects. We know Android developers use both Firebase console and Android Studio, so I had an idea to integrate Firebase into Android Studio to reduce their need to switch between the two.

The first prototype for this project was actually an integration with Firebase Performance Monitoring and Android Studio, but we realized Crashlytics would have a much bigger impact on developer workflow as an integration in Android Studio, so we pivoted in that direction instead, and the rest is history!” -David Motsonashvili

Moving stylized image of Android and Firebase logos

I loved that the idea came from wanting to help developers and make our tools easier for them to use! I asked David if he had any fun stories about the project.

“We had to be really scrappy about showing our test app's Crashlytics crash data in the IDE because of limitations we had with the API. It was a really fun project to figure out how to work around this during Hackweek!” -David Motsonashvili

I wanted to better understand how AQI evolved from being an idea during Hackweek, to where it is today.

“Once we launched the early developer preview we tested this with a few internal Google teams, and they loved it! We also started testing this with Android developers as part of an early access program. Some of the companies we talked to were Adobe, Luno, and Meetup. They had really valuable feedback that directly contributed to the roadmap. One example is when we learned many teams needed a place to collaborate within AQI, so we of course moved forward with adding the Crashlytics notes feature into AQI.” -David Motsonashvili

Moving image of quote text reads 'Directly solves one of our big pain points - Adobe Acrobat Reader' and 'Helps keep my finger on the pulse and resolve issues quickly [...] without leaving Android Studio - Maia Grotepass, Luno'


Modern Crash Management

Annyce and her team were early testers of AQI, and it was fun to learn about what they thought of the feature.

“I was truly happy to be able to go directly from a link in the stacktrace to the code. It was the feature in Android Studio that you never knew you needed! I especially like that you can filter issues based on the different variants in your app. Every engineer that I know and work with is passionate about delivering performant, quality code. App Quality Insights is the next step in the evolution of crash management, it can help engineers have more agency over addressing crashes while they also work on exciting new features.” -Annyce Davis

We’ve certainly come a long way with the tools developers have to manage bugs and crashes.

moving image of Annyce Davis, VP of Engineering at Meetup and Android GDE during the App Quality Insights segment of #TheAndroidShow with quote text reads 'It was the feature in Android Studio that you never knew you needed'


Get started with AQI

If you’re ready to try AQI out for yourself, download the latest version of Android Studio. You can also view the documentation, guide on medium, and our demo video to learn more about how to use it.

4 updates from the Google for Games Developer Summit

Posted by Alex Chen, Google for Games

This week, we announced new games solutions and updates to our tools at the Google for Games Developer Summit, a free digital event for developers, publishers and advertisers. From highlighting viewership growth trends on YouTube gaming to reaching more players on different devices with Google Play Games on PC, here’s a quick recap with some of our top announcements and key updates.

1. Build high-quality games on Android

The Android team talked about how they’ve made it easier to develop fun and engaging games with updates to Android vitals and the Android Game Development Kit. They also shared how you can get these games to more users on more devices, with Android support for form factors like foldables, Chromebooks and PCs. Learn more about these announcements, including new ways to connect with a global audience, on the Android Developers blog.

2. Strengthen your ads monetization and growth strategies

Google Ads showed advertisers how to get more value from both in-app ads and in-app purchases with a new feature called target return on ad spend for hybrid monetization. And AdMob showed publishers how to save time and costs with a more efficient way to manage ad mediation, with a revamped buyer management interface and streamlined ad unit mapping workflow. See more in the Google Ads blog post.

3. Create connections with your community

As a home of popular gaming creators, videos, and livestreams worldwide, YouTube continues to see incredible growth. The YouTube team announced that over 2 trillion hours of gaming content was consumed in 2022. Through different formats, availability on multiple devices and culture-shaping Creators, they’re committed to being the place where game publishers and Creators reach players and build communities around their favorite games.

4. Keep players engaged with live service games

Google Cloud shared their strategy for live service game development. They’re combining technology that brings together players from all over the world, databases that store critical data for an optimal player experience, and the analytics that allow game companies to foster a relationship with their players. Learn more on Google Cloud’s blog.

Whether it’s creating the newest hit game, connecting with an enthusiastic community or growing your business to reach more players everywhere, Google is glad to be your partner along the way. To learn more, you can access all content on demand. And if you’re planning to attend Game Developers Conference next week in San Francisco, come say hi at one of our in-person developer sessions.