Tag Archives: App quality

Top things to know in Android Platform and Quality at Google I/O ’23

Posted by Dan Galpin, Developer Relations Engineer

Google I/O was HUGE for developers, with exciting news across the platform and plenty more around quality. Here are the top three announcements around Android and app quality from Google I/O 2023:

#1 Android 14 comes with new features in privacy and security, system UI, and more

Android 14 continues our effort to improve privacy and security on the platform with Credential Manager, which provides a unified API that brings support for passkeys and federated login. Health Connect is also now a core part of the platform and available on all Android mobile devices directly in Settings, helping users control how their health and fitness data is shared across apps. In addition, the beta of Privacy Sandbox on Android supports effective, privacy-preserving personalized advertising experiences.
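
As an illustration, here is a minimal sketch of retrieving a saved password or passkey with the Jetpack Credential Manager library (assuming androidx.credentials 1.2+; the request options and handling below are illustrative, not a complete sign-in integration):

import android.app.Activity
import androidx.credentials.CredentialManager
import androidx.credentials.GetCredentialRequest
import androidx.credentials.GetPasswordOption
import androidx.credentials.GetPublicKeyCredentialOption
import androidx.credentials.PasswordCredential
import androidx.credentials.PublicKeyCredential
import androidx.credentials.exceptions.GetCredentialException

suspend fun signInWithCredentialManager(activity: Activity, passkeyRequestJson: String) {
    val credentialManager = CredentialManager.create(activity)
    // Ask for either a saved password or a passkey in a single request.
    val request = GetCredentialRequest(
        listOf(GetPasswordOption(), GetPublicKeyCredentialOption(passkeyRequestJson))
    )
    try {
        val response = credentialManager.getCredential(activity, request)
        when (val credential = response.credential) {
            is PasswordCredential -> { /* sign in with credential.id / credential.password */ }
            is PublicKeyCredential -> { /* send credential.authenticationResponseJson to your server */ }
        }
    } catch (e: GetCredentialException) {
        // The user cancelled, or no credential was available.
    }
}

Because both options go into one request, the user sees a single unified sheet whether they saved a password or created a passkey.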

Additionally, you’ll find Foreground Service changes, with required types, new permissions, system runtime checks, and new purpose-built APIs for user-initiated data transfers and VoIP telephony that behave more consistently across our entire ecosystem. Android 14 also introduces Grammatical Inflection to help your app correctly address your users, along with updated per-app language and regional preferences. Finally, check out the updated Predictive Back APIs that support in-app animations.
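
For the Predictive Back piece, here is a minimal sketch of driving an in-app animation from the back-gesture progress callbacks (assuming the AndroidX Activity library 1.8+; the view being scaled and the scaling amounts are made up for illustration):

import android.view.View
import androidx.activity.BackEventCompat
import androidx.activity.ComponentActivity
import androidx.activity.OnBackPressedCallback

fun ComponentActivity.animateOnPredictiveBack(sheet: View) {
    // On Android 13+ the manifest also needs android:enableOnBackInvokedCallback="true".
    onBackPressedDispatcher.addCallback(this, object : OnBackPressedCallback(true) {
        override fun handleOnBackStarted(backEvent: BackEventCompat) {
            // The user started a back gesture; prepare the animation here.
        }

        override fun handleOnBackProgressed(backEvent: BackEventCompat) {
            // Scale the sheet down slightly as the gesture progresses (progress is 0f..1f).
            val scale = 1f - 0.1f * backEvent.progress
            sheet.scaleX = scale
            sheet.scaleY = scale
        }

        override fun handleOnBackCancelled() {
            // The gesture was abandoned; animate back to the resting state.
            sheet.animate().scaleX(1f).scaleY(1f).start()
        }

        override fun handleOnBackPressed() {
            finish() // The gesture was committed; actually navigate back.
        }
    })
}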

Watch the sessions that will help you get your app ready.

#2 Premium devices mean premium app experiences with camera & media and on-device ML

To help devices become creative powerhouses, Media3's Transformer supports video editing and transcoding and Android 14 introduces Ultra HDR images and more premium camera extensions. To leverage that CPU and GPU power to enable new productivity experiences, ML Kit adds new, production-ready on-device machine learning models such as document scanning and face mesh, and the Acceleration service for your custom ML models is in public beta.
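
As a rough sketch of the Transformer piece alone (assuming androidx.media3 1.1+; the URIs and output path are placeholders, a real integration would also register a Transformer.Listener, and these APIs are marked @UnstableApi, so an opt-in is required):

import android.content.Context
import android.net.Uri
import androidx.media3.common.MediaItem
import androidx.media3.transformer.ProgressHolder
import androidx.media3.transformer.Transformer

// Starts an asynchronous export/transcode of the input media into outputPath.
// Transformer must be used from a thread with a Looper (typically the main thread).
fun transcodeClip(context: Context, inputUri: Uri, outputPath: String): Transformer {
    val transformer = Transformer.Builder(context).build()
    transformer.start(MediaItem.fromUri(inputUri), outputPath)
    return transformer
}

// Polls export progress; holder.progress is 0..100 while progress is available.
fun currentProgress(transformer: Transformer): Int {
    val holder = ProgressHolder()
    transformer.getProgress(holder)
    return holder.progress
}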

Check out the sessions from I/O to learn more.

#3 More around app quality: a new quality framework, quality hub, and design hub

We've introduced a quality framework and quality hub which includes insights into how Google Play views app quality. We also created a new UI design hub that gives you a centralized destination for guidance, Figma starter kits, UI samples, and inspirational galleries to help apply our best practices for phones, large screens, wearables, and TVs.

Be sure to catch the full Android Platform and Quality playlist from Google I/O for all these videos!

Mindful architecture: Headspace’s refactor to scale

Posted by Manuel Vicente Vivo, Android Developer Relations Engineer

Contributors: Mauricio Vergara, Product Marketing Manager, Developer Marketing, Marialaura Garcia, Associate Product Marketing Manager, Developer Marketing

Headspace Technical case study graphic


Executive Summary

Headspace was ready to launch new wellness and fitness features, but their app architecture wasn’t. They spent eight months refactoring to a Model-View-ViewModel architecture, rewriting the app in Kotlin, and improving test coverage from 15% to 80%. The improved app experience increased MAU by 15% and raised review scores from 3.5 to 4.7 between Q2 and Q4 of 2020. To learn more about how Headspace’s focus on Android Excellence impacted their business, read the accompanying case study here.


Introduction

Headspace has grown into a leader in mindfulness by creating an app that helps millions of people meditate daily. Mindfulness goes far beyond meditation; it connects to all aspects of a person’s life. That idea prompted the most recent stage in Headspace’s evolution. In 2019, they decided to expand beyond meditation and add new fitness and wellness features to their Android app. Headspace realized they would need a cross-functional team of engineers and designers to deliver on the new product vision and create an excellent app experience for users. It was an exciting new phase for the company: their design team started the process by creating prototypes for the new experience, with fresh new designs.

With designs in hand, the only thing stopping Headspace from expanding their app and broadening users’ horizons was their existing Android software architecture. It wasn’t well structured to support all these new features. Headspace’s development team made the case to their leadership that building on the existing code would take longer than a complete rewrite. After sharing the vision and getting everyone on board, the team set out on a collective journey to write a new Android app in pursuit of app excellence.


The Android Rewrite

Headspace’s Android development team first needed a convenient way to standardize how they built and implemented features. "Before we wrote a single line of code, our team spent a week evaluating some important implementation choices for the foundation of our app,” Aram Sheroyan, an Android developer at Headspace, explains:

“This was crucial pre-work so that we were all on the same page when we actually started to build."

Immersing themselves in Google’s literature on the latest best practices for Android development and app architecture, the team found a solution they could all confidently agree on: Google’s guidance recommended refactoring the app around a new base architecture, Model-View-ViewModel (MVVM). MVVM is a widely supported software pattern that is progressively becoming an industry standard because it allows developers to create a clear separation of concerns, helping streamline an app’s architecture. “It allowed us to nicely separate our view logic," Sheroyan explained.

With MVVM as the base architecture, they adopted Android Jetpack libraries, along with Dagger and Hilt for dependency injection. The new tools made boilerplate code smaller and easier to structure, not to mention more predictable and efficient. Combined with MVVM, the libraries gave the team a more detailed understanding of how new features should be implemented. The team was also able to improve how arguments were passed between screens. The app had previously suffered from crashes caused by NullPointerException errors and incorrect arguments; adopting the Safe Args library helped eliminate those errors when passing arguments between navigation destinations.
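
To make that concrete, here is a minimal, hypothetical sketch of the pattern (the names are illustrative, not Headspace’s actual code): a Hilt-injected ViewModel that keeps view logic out of the UI layer.

import androidx.lifecycle.LiveData
import androidx.lifecycle.MutableLiveData
import androidx.lifecycle.ViewModel
import androidx.lifecycle.viewModelScope
import dagger.hilt.android.lifecycle.HiltViewModel
import javax.inject.Inject
import kotlinx.coroutines.launch

// Hypothetical domain types for the sketch.
data class Session(val id: String, val title: String)

interface SessionRepository {
    suspend fun getSession(id: String): Session
}

sealed class SessionUiState {
    object Loading : SessionUiState()
    data class Content(val session: Session) : SessionUiState()
}

@HiltViewModel
class SessionViewModel @Inject constructor(
    private val repository: SessionRepository // bound in a Hilt module elsewhere
) : ViewModel() {

    private val _state = MutableLiveData<SessionUiState>()
    val state: LiveData<SessionUiState> = _state

    fun loadSession(id: String) {
        viewModelScope.launch {
            _state.value = SessionUiState.Loading
            _state.value = SessionUiState.Content(repository.getSession(id))
        }
    }
}

The Fragment or Activity then only observes state and forwards user events, which is the separation of view logic the team describes.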

In rewriting the app, the team further made sure to follow the Repository pattern to support a clearer separation of concerns. For example, instead of having one huge class that saves data in shared preferences, they decided that each repository’s local data source should handle the respective logic. This separation of data sources enables the team to test and reproduce business code outside of the live app for unit testing without having to change production code. Separating concerns in this way made the app more stable and the code more modular.
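
A hypothetical sketch of that structure (names invented for illustration): the repository depends on a small local data source interface, so the SharedPreferences details stay in one place and the business logic can be unit tested against an in-memory fake.

import android.content.Context

// Each repository's local data source owns its own persistence logic.
interface StreakLocalDataSource {
    fun readStreakDays(): Int
    fun writeStreakDays(days: Int)
}

class SharedPrefsStreakDataSource(context: Context) : StreakLocalDataSource {
    private val prefs = context.getSharedPreferences("streak", Context.MODE_PRIVATE)
    override fun readStreakDays(): Int = prefs.getInt("days", 0)
    override fun writeStreakDays(days: Int) {
        prefs.edit().putInt("days", days).apply()
    }
}

// Business logic knows nothing about SharedPreferences, so it can be tested
// without production code changes by passing a fake StreakLocalDataSource.
class StreakRepository(private val local: StreakLocalDataSource) {
    fun recordDailyPractice(): Int {
        val updated = local.readStreakDays() + 1
        local.writeStreakDays(updated)
        return updated
    }
}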

The team also took the opportunity to fully translate their app into the Kotlin programming language, which offered useful helper functions, sealed classes, and extension functions. Removing legacy code and replacing the mix of Java and Kotlin with pure Kotlin code decreased build time for the app. The new architecture also made it easier to write tests and allowed them to increase test coverage from around 15% to more than 80%. This resulted in faster deployments, higher quality code, and fewer crashes.

To capture the new user experience in the app’s reviews, Headspace implemented the Google Play In-App Review API. The new API allowed them to encourage all users to share reviews from within the app. The implementation increased review scores by 24%, and — as store listing reviews are tied to visibility on Google Play — helped draw attention to the app’s recent improvements.
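
The API itself is small; here is a minimal sketch of the flow using the Play In-App Review library (when and where to trigger it is an app-specific decision, and Play decides whether the dialog actually appears):

import android.app.Activity
import com.google.android.play.core.review.ReviewManagerFactory

fun maybeAskForReview(activity: Activity) {
    val manager = ReviewManagerFactory.create(activity)
    manager.requestReviewFlow().addOnCompleteListener { task ->
        if (task.isSuccessful) {
            // Launching the flow may show the in-app rating dialog, subject to
            // Play's quota; there is no callback that reveals the outcome.
            manager.launchReviewFlow(activity, task.result)
        }
    }
}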


Achieving App Excellence

The rewrite took eight months, and with it came new confidence in the code. Now that the codebase had more than 80% unit test coverage, they could develop and test new features with confidence rather than worry. The new architecture made this possible through its improved separation of logic and more reusable code, making it easier to plan and implement new features.

The build time for the app decreased dramatically and development velocity picked up. The team’s new clarity around best practices and architecture also reduced friction for onboarding new developers, since it was now based on Android industry standards. They could communicate more clearly with potential candidates during the interview process, as they now had a shared architectural language for discussing problem sets and potential solutions.

With velocity came faster implementation of features and an improved retention flow. They could now optimize their upsell process, which led to a 20% increase in the number of paid Android subscribers relative to other platforms where the app is published. The combination of a new app experience and the implementation of the new In-App Review API led to their review scores improving from 3.5 to 4.7 stars between Q2 and Q4 of 2020! Overall, the new focus on Android app excellence and the improved ratings earned Headspace a 15% increase in MAU globally.

These were just a few of the payoffs from the significant investment Headspace made in app excellence. Their laser focus on quality paid off across the board, enabling them to continue to grow their community of users and lay a solid foundation for the future evolution of their app experience.


Get your own team on board

If you’re interested in getting your team on board for your own App Excellence journey, check out our condensed case study for product owners and executives linked here. To learn more about how consistent, intuitive app user experiences can grow your business, visit the App Excellence landing page.

Performance and Velocity: How Duolingo Adopted MVVM on Android

Posted by Kateryna Semenova, Android Developer Relations Engineer

illustration of hand holding up a chart with the Duolingo bird sitting on top

Executive Summary

Duolingo’s app began to experience growing pains due to scalability issues in their Android software architecture. They solved these performance problems and regained developer productivity by refactoring to a Model-View-ViewModel architecture and using Dagger and Hilt for dependency injection. To learn more about how this impacted their business, read the accompanying article here.

Introduction

Duolingo is the world’s most popular language learning app, with over ten million daily learners, because they’ve managed to make something people found daunting feel easy and fun. This continued success relies on a constant stream of innovations and updates, and on a smooth-running app that can deliver all of them. To Duolingo, a single unresponsive app on a device anywhere in the world could mean a discouraged learner. This commits them to app excellence, particularly on the Android devices used by sixty percent of their learners, including their CEO, who keeps track of the app from an entry-level phone. And so, when Duolingo's Android development team registered an increase in “App not Responding” errors and dropped frames, and even received a hand-written complaint, they took action immediately.

Their situation wasn’t that uncommon. Apps that lack scalable architecture and clear best practices often perform well at the beginning but show signs of technical debt as they grow. Duolingo’s Android codebase was designed to allow them to add and release new features rapidly, but the lack of an agreed-upon architecture was manifesting in increasingly frequent performance regressions. It was starting to suffer from unreliable frame rates, visually inconsistent or broken interactions, and a growing assortment of new bugs. These regressions not only inconvenienced learners but also cost the team substantial development effort to diagnose and repair. Duolingo’s Android development team realized that if they wanted to keep shipping new features while providing the target level of user experience, a new approach to their codebase was needed.

Discovery

First, they had to get to the bottom of what exactly was going on. A deep dive into the numbers uncovered that, as they added new functionality, the app’s rendering performance was regressing 5-10% every month. In fact, one particularly unwieldy release had increased crashes by 10%, slowed frame renders by 25%, and saw lessons starting 70% slower on entry-level devices.

Further analysis of their code led them to the conclusion that most of the app’s issues could be traced back to a single bottleneck: a global state object called DuoState, which was responsible for maintaining state across different features of the app. A number of popular features (like experience points and daily streak tracking) were using it to access vital information. Centralizing their data in this way had once enabled the team to iterate rapidly. They simply added properties to DuoState whenever a new feature needed to share information across the app. But now the unoptimized and frequent access to the object was causing increasing performance regressions.

DuoState was so tightly coupled to the entire codebase that even small changes could impact the rest of the app. The team feared that a minor new feature could have the unforeseen side effect of triggering many internal updates to the app, causing the entire release to be too slow for many devices. These performance regressions became more frequent as the app grew, and the team onboarded new engineers to keep up with the accelerating product roadmap. In 2020, as they added more developers, they were starting to see significant regressions cropping up as often as every 90 days. Upon closer inspection, the likelihood of a regression in a given release was proportional to the number of changes it implemented. At this rate, these regressions would completely derail the product roadmap within a few years.

This outdated architecture had become a bottleneck for both the performance of the app and the velocity of the team. After much internal debate, they stopped development of new features, including some closely tied to their bottom line. For two full months, Duolingo’s development team went all-in on refactoring their Android app in an effort they called the “Android Reboot”.

The Android Reboot

One of the team’s first key takeaways was that their code lacked clear boundaries. The DuoState object was readily available at any point in the code, inviting developers to access it frequently in inefficient ways. They needed to create a greater separation of concerns within the codebase. They decided to tease apart each feature into its own, clearly-defined module, using the Model-View-ViewModel architectural pattern. MVVM allowed them to remove calls to the monolithic DuoState object, letting many modules work in separate threads.

Diagram showing before and after implementing the Model-View-ViewModel architectural pattern

The team’s familiarity with MVVM, and Google’s support for it, made it an obvious choice. It allowed them to clearly document what logic should go into what files (including views, view models and repositories). This helped make their feature architecture more consistent. With a clear path to follow, the team quickly began refactoring their monolithic code into sets of classes with clear boundaries and responsibilities.

Along with MVVM, the team used Dagger and Hilt (Hilt is part of Android Jetpack) to implement repository patterns in place of DuoState. Dagger generates clear, readable code with verbose error logging, designed to help developers understand exactly what their code is doing and to eliminate dead stack traces into reflected properties; Hilt reduces the boilerplate needed to set all of this up.
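
A hypothetical sketch of that wiring (names invented for illustration): a Hilt module binds a repository implementation to its interface once, and feature code asks for the interface instead of reaching into a global object.

import dagger.Binds
import dagger.Module
import dagger.hilt.InstallIn
import dagger.hilt.components.SingletonComponent
import javax.inject.Inject
import javax.inject.Singleton

// Feature code depends only on this interface.
interface ExperiencePointsRepository {
    suspend fun totalXp(userId: String): Long
}

@Singleton
class DefaultExperiencePointsRepository @Inject constructor() : ExperiencePointsRepository {
    override suspend fun totalXp(userId: String): Long {
        // A real implementation would read from a local store or the network.
        return 0L
    }
}

@Module
@InstallIn(SingletonComponent::class)
abstract class ExperiencePointsModule {
    @Binds
    abstract fun bindXpRepository(
        impl: DefaultExperiencePointsRepository
    ): ExperiencePointsRepository
}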

This new architecture allowed the team to split DuoState into smaller objects. This immediately reduced unnecessary coupling between domains. For example, the code responsible for tracking a user’s progress could now access their experience points but not the number of times they’ve logged in during a month. These new architecture guidelines meant that while no single thing was too difficult to change, it took coordination and planning to change it across the app. Implementing the new architecture across the code base drove significant performance gains in aggregate.
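
For example, a simplified, hypothetical sketch of that decoupling (these types are invented for illustration): progress-tracking code depends only on a narrow XP interface, so it can no longer reach the login data that once sat next to it in the global state object.

// Two narrow interfaces carved out of the old monolithic state.
interface XpStore {
    fun totalXp(): Long
}

interface LoginStats {
    fun loginsThisMonth(): Int
}

// Progress tracking only sees XP; it cannot accidentally couple itself to login data.
class ProgressTracker(private val xp: XpStore) {
    fun hasReachedGoal(goalXp: Long): Boolean = xp.totalXp() >= goalXp
}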

MVVM architecture facilitates a separation of concerns between the domain data, the interface the learners see, and the logic for how these two realms interact. It gave Duolingo’s developers a more deliberate way to control how the app responds to internal state updates. They could now develop more complex user experiences without the risk of triggering regressions, or affecting the underlying business rules.

Developer Productivity

In the past, inconsistent application of development patterns made different parts of the codebase harder to understand and maintain. Without consensus, each developer implemented code as they saw fit.

MVVM, Dagger, and Hilt provided the team with a more detailed understanding of how new features should be implemented. Following these best practices made the code easier to read and more predictable. Developers could now assist in debugging features that they hadn’t originally worked on. And new developers could be onboarded more efficiently; as long as they understood the architecture, they could contribute meaningfully right away. This new clarity significantly boosted the team’s development velocity.

Ensuring Quality

Crucially, the new architecture also revealed that certain animation features in the app were underperforming on entry-level devices. Accordingly, the other core focus of the Android Reboot was the reduction of jank, dropped frames, and "App Not Responding” (ANR) errors. The team used repository patterns to help streamline the sharing of data between threads, ensuring that device resources could be used more efficiently across multiple threaded modules. Moving work off the main thread improved responsiveness and overall frame rate, and led to smoother animations on entry-level devices. Performance on flagship devices improved as well.
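
As an illustration (not Duolingo’s actual code), a repository can pin its heavy work to a background dispatcher with Kotlin coroutines, so callers on the main thread never block while reading or writing data:

import kotlinx.coroutines.Dispatchers
import kotlinx.coroutines.withContext

// Hypothetical persistence interface and model used by the sketch.
interface LessonDao {
    fun lessonById(id: String): Lesson
}

data class Lesson(val id: String, val title: String)

class LessonRepository(private val lessonDao: LessonDao) {

    // Suspend functions let ViewModels call this safely from the main thread;
    // the actual I/O is moved onto the IO dispatcher.
    suspend fun loadLesson(id: String): Lesson = withContext(Dispatchers.IO) {
        lessonDao.lessonById(id)
    }
}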

A Better Overall Android Experience

In the six months working with the new architecture, Duolingo’s Android team has continued to ship new features without recording significant performance regressions. The days where they had to halt feature production to hunt and fix bugs are safely behind them.

The app’s daily ANR rate dropped 41%. The percentage of time that the app’s frame rate fell below target decreased by 28%. And importantly, users experienced a 40% increase in speed when scrolling through lessons, the leaderboard, and stories in the app.

The reboot allowed Duolingo to consistently provide their trademark fun, effective, and delightful language learning experience on a much wider range of Android devices.

Conclusion

Duolingo’s dedication to their mission made them the world's top app in the language learning space. Their commitment to app excellence — creating cutting edge educational experiences without compromising accessibility — is what kept them there.

If you’re interested in getting your team on board for your own Android Reboot, check out our condensed case study for product owners and executives linked here.

Working Towards Android App Excellence

Posted by Jacob Lehrbaum, Director of Developer Relations, Android

illustration of freckled hand over mobile phone with graphs

Great app experiences are great for business. In fact, nearly three-quarters of Android app users who leave a 5-star review on Google Play mention the quality of their experience with the app¹: its speed, design, and usability. At Google, we want to help all developers achieve app excellence, and in turn help you drive user acquisition, retention, and monetization.

So what is “app excellence”? This may sound aspirational, but it is within reach for many apps. It starts with a laser focus on the user, and more specifically, with intuitive user experiences that get people to the main functionality of your app as quickly as possible — but that is just the beginning. Excellent apps are consistent across all of their screens and experiences. They perform well, no matter the device used. App excellence is achievable when all of the stakeholders who influence your app are invested in the experience of using your app.

One of the blockers that gets in the way of app excellence is shared or unclear accountability. Some of the primary measures of app quality, such as crashes and load times, are often seen as the responsibility of one group in the company, such as the engineering team. However, when we talk to best-in-class organizations² about how they achieve app quality, it is clear that taking a cross-functional approach is key, with engineering, design, product, and business teams working toward a common goal.

So what are some internal best practices behind app excellence?

Make app quality a cross-organizational focus — not just an engineering concern

It’s a way easier conversation for me at the business end because I can say “these competitors’ apps are faster than ours; we need to reduce our load time down from 5 seconds to 4 seconds”.
Software engineer, x-platform app

App excellence helps drive business performance. New features are great, but if they slow down app start-up times or take up too much device space, people will eventually use your app less often or even delete it. Engineers who have built a company-wide focus on quality have often done so by quantifying the impact of quality issues on business performance, through:

  • Case studies showing the impact of responsiveness, APK size, start-up time, and memory usage on business KPIs. Here you can find practical case studies showcasing how developers such as Headspace and Duolingo achieved app excellence.
  • Benchmarking against competitor apps. Check out peer benchmarks and other metrics on the Google Play Console.

Organize teams around features and/or app user journey stages

Companies that organize teams around features, or around stages in the user journey, are more likely to deliver consistent experiences across each operating system they support, bring new apps or features to market faster, and deliver a better app experience for all their customers. These teams are often cross-functional groups that span engineering, marketing, UX, and product, and are responsible for the success of a feature or user journey stage³ across all devices and platforms. In addition to better experiences and feature parity, this structure enables alignment of goals across functional areas while reducing silos, and it also helps teams hyper-focus on addressing specific objectives.

Feature organized team graph

Squads focused on business objectives heighten focus on the user.

Use the same devices your customers use

If a majority of your users are on a specific type of device, you can build empathy for their experience by using the same phone, tablet, or smartwatch as your primary device. This is especially relevant for senior leadership in your organization, who make decisions that impact the day-to-day experience of millions of users. For example, Duolingo has built this into their company DNA: every Duolingo employee, including their CEO, either exclusively uses, or has access to, an entry-level Android device to reflect a significant portion of their user base.

A user-centric approach to quality and app excellence is essential to business growth. If you are interested in learning how to achieve app excellence, read our case studies with practical tips, and sign up to attend our App Excellence Summit by visiting the Android app excellence webpage.

In subsequent blog posts, we will dig deep into two drivers of excellent app experiences: app performance and how it is linked to user behavior, and creating seamless user experiences across devices. Sign up to the Android developer newsletter here to be notified of the next installment, and get news and insights from the Android team.

Notes


  1. Internal Google Play data, 2021.

  2. Google App Quality Research, 2021.

  3. The series of steps each user takes as they interact with your app is referred to as the “user journey.” Examples of user journey stages include installs, onboarding, engagement, and retention.

Quality to match with your user’s expectations

Posted by Hoi Lam, Android App Quality

Since the launch of Android more than 10 years ago, both the platform and users’ expectations have grown. The platform has improved across many areas, from the user experience with Material Design to significant advances in privacy. We know you want your apps to offer a great user experience. At the same time, we also know that it’s not always straightforward to know which area to tackle first. That’s why we are launching a new App Quality section on our developer site to help you keep up to date with key aspects of app quality and provide related resources.

In the first release, we have updated the Core App Quality checklist to take into account recent Android releases as well as the current trends of the app ecosystem. Here are some highlights in this update:

  • Visual Experience - We highlight the best practice of using Material Design Components in place of platform components such as buttons. This gives your app a modern look and makes features such as dark theme easier to implement. In addition to advice on the back stack, we have expanded the guidance to cover preserving app state. This is becoming more important as edge-to-edge screens and gesture navigation become commonplace, even on entry-level phones.
  • Functionality - There are three areas where we have updated our guidance. For media applications, we have updated our recommendations around the playback experience, as well as support for HEVC compression for video encoding. For sharing between apps, we highlight the importance of using the Android Sharesheet (see the sketch after this list). This will be critical going forward, as apps have limited visibility into other installed apps by default starting with API level 30. Lastly, we expanded our recommendations around background services. Helping users conserve battery is a priority for Android, and we will continue to share updates on this topic.
  • Performance & Stability - We have highlighted tooling that is now available, such as Android vitals in the Google Play Console. One important point to call out here is Application Not Responding (ANR) errors. ANRs are caused by threading issues and are something developers can fix. The ANR troubleshooting guide can help you diagnose and resolve any ANRs that exist in the app.
  • Privacy & Security - We have summarized our latest recommendations to take into account the latest safeguards, from runtime permissions to using WebView securely. We have also expanded the checklist to cover privacy norms that users have come to expect, from protecting private data to avoiding non-resettable hardware IDs.
  • Google Play - In this section, we highlight some of the most important policies for developers and link you to more information on the guidelines.
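
For the Sharesheet item above, a minimal sketch of invoking it for plain text (the text passed in is a placeholder):

import android.app.Activity
import android.content.Intent

fun shareText(activity: Activity, text: String) {
    val sendIntent = Intent(Intent.ACTION_SEND).apply {
        type = "text/plain"
        putExtra(Intent.EXTRA_TEXT, text)
    }
    // createChooser() shows the Android Sharesheet rather than a specific target app.
    activity.startActivity(Intent.createChooser(sendIntent, null))
}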

Going forward, we aim to update this list on a quarterly basis to keep it current. In addition, we will be updating the quality checklists for other form factors.

We are working on additional tools and best practices to make it easier for you to build quality applications on Android. We can’t wait to introduce these new improvements to you. Stay tuned!

Android Developer Story: Music app developer DJiT builds higher quality experiences and a successful business on Android

Posted by Lily Sheringham, Google Play team

Paris-based DJiT is the creator of edjing, one of the most downloaded DJ apps in the world, with more than 60 million downloads and a presence in 182 countries. Following their launch on Android, the platform became the largest contributor to the business’s growth, with 50 percent of total revenue and more than 70 percent of new downloads coming from Android users.

Hear from Jean-Baptiste Hironde, CEO & Co-founder, Séverine Payet, Marketing Manager, and Damien Delépine, Android Software Engineer, to learn how DJiT improved latency on Android Marshmallow, as well as leveraged other Android and Google Play features to create higher quality apps.



Find out more about building great audio apps and how to find success on Google Play.

Tablet changes in Google Play

Posted by Ellie Powers, Google Play team

Fueled by the Nexus 7 and other great devices, more than 70 million Android tablets have been activated. Thousands of developers have already designed their apps to look great on tablets, and with the holidays fast approaching, we’re making it even easier for the next wave of tablet owners to discover great apps and games.

Play Store tablet changes coming up on November 21

Last year, Google Play added a “designed for tablets” section, where users could easily discover apps that look great on their 7" and 10" tablets. This section includes only apps and games that meet the criteria and guidelines we established last year. (Here’s an overview if you missed it.) Developers who invest the time to meet the criteria are seeing great results; take Remember The Milk, which saw an 83% increase in tablet downloads from being featured in this section (see the whole story here).

On November 21, 2013, the Play Store will make a series of changes so it’s even easier for tablet users to find those apps that are best for their devices. First, by default, users browsing Google Play on a tablet will now see apps and games that are designed for tablets on the top lists (Top Paid, Top Free, Top Grossing, Top New Paid, Top New Free, and Trending). Tablet users will still be able to switch the view so they can see all apps or games if they choose. Also starting November 21, apps and games that do not meet the “designed for tablets” criteria will be marked as “designed for phones” for users who browse the Play Store on tablets.

You’ll want to make sure that your app is designed for tablets; read more about how to do this at the end of this blog post.

Make sure your app is ready!

If you want to be sure your app is included in the “Designed for tablets” view, go to the Developer Console to check your tablet optimization tips. If you see any issues listed there, you’ll need to address them in your app and upload a new binary for distribution. If there are no issues listed, your app is eligible to be included in the “Designed for tablets” view in the top lists.

Also, make sure to read the full tablet quality checklist to understand how to build outstanding tablet experiences.

Every day, thousands of Android developers are taking advantage of the tremendous Android tablet opportunity. The flood of new users, coupled with the increased screen size, means new user experiences, more engagement, and more monetization opportunities. We’re excited to see what you do!

Respecting Audio Focus

Posted by Kristan Uccello, Google Developer Relations

It’s rude to talk during a presentation: it disrespects the speaker and annoys the audience. If your application doesn’t respect the rules of audio focus, it disrespects other applications and annoys the user. If you have never heard of audio focus, you should take a look at the Android developer training material.

With multiple apps potentially playing audio it's important to think about how they should interact. To avoid every music app playing at the same time, Android uses audio focus to moderate audio playback—your app should only play audio when it holds audio focus. This post provides some tips on how to handle changes in audio focus properly, to ensure the best possible experience for the user.

Requesting audio focus

Audio focus should not be requested when your application starts (don’t get greedy); instead, delay requesting it until your application is about to do something with an audio stream. By requesting audio focus through the AudioManager system service, an application can use one of the AUDIOFOCUS_GAIN* constants (see Table 1) to indicate the desired level of focus.

Listing 1. Requesting audio focus.

1. AudioManager am = (AudioManager) mContext.getSystemService(Context.AUDIO_SERVICE);
2.     
3.  int result = am.requestAudioFocus(mOnAudioFocusChangeListener,
4.    // Hint: the music stream.
5.    AudioManager.STREAM_MUSIC,
6.    // Request permanent focus.
7.    AudioManager.AUDIOFOCUS_GAIN);
8.  if (result == AudioManager.AUDIOFOCUS_REQUEST_GRANTED) {
9.    mState.audioFocusGranted = true;
10. } else if (result == AudioManager.AUDIOFOCUS_REQUEST_FAILED) {
11.   mState.audioFocusGranted = false;
12. }

In line 7 above, you can see that we have requested permanent audio focus. An application could instead request transient focus using AUDIOFOCUS_GAIN_TRANSIENT which is appropriate when using the audio system for less than 45 seconds.

Alternatively, the app could use AUDIOFOCUS_GAIN_TRANSIENT_MAY_DUCK, which is appropriate when the use of the audio system may be shared with another application that is currently playing audio (e.g. for playing a "keep it up" prompt in a fitness application and expecting background music to duck during the prompt). The app requesting AUDIOFOCUS_GAIN_TRANSIENT_MAY_DUCK should not use the audio system for more than 15 seconds before releasing focus.

Handling audio focus changes

In order to handle audio focus change events, an application should create an instance of OnAudioFocusChangeListener. In the listener, the application will need to handle the AUDIOFOCUS_GAIN* event and AUDIOFOCUS_LOSS* events (see Table 1). It should be noted that AUDIOFOCUS_GAIN has some nuances which are highlighted in Listing 2, below.

Listing 2. Handling audio focus changes.

1.  mOnAudioFocusChangeListener = new AudioManager.OnAudioFocusChangeListener() {
2.
3.    @Override
4.    public void onAudioFocusChange(int focusChange) {
5.      switch (focusChange) {
6.      case AudioManager.AUDIOFOCUS_GAIN:
7.        mState.audioFocusGranted = true;
8.
9.        if (mState.released) {
10.         initializeMediaPlayer();
11.       }
12.
13.       switch (mState.lastKnownAudioFocusState) {
14.       case UNKNOWN:
15.         if (mState.state == PlayState.PLAY && !mPlayer.isPlaying()) {
16.           mPlayer.start();
17.         }
18.         break;
19.       case AudioManager.AUDIOFOCUS_LOSS_TRANSIENT:
20.         if (mState.wasPlayingWhenTransientLoss) {
21.           mPlayer.start();
22.         }
23.         break;
24.       case AudioManager.AUDIOFOCUS_LOSS_TRANSIENT_CAN_DUCK:
25.         restoreVolume();
26.         break;
27.       }
28.
29.       break;
30.     case AudioManager.AUDIOFOCUS_LOSS:
31.       mState.userInitiatedState = false;
32.       mState.audioFocusGranted = false;
33.       teardown();
34.       break;
35.     case AudioManager.AUDIOFOCUS_LOSS_TRANSIENT:
36.       mState.userInitiatedState = false;
37.       mState.audioFocusGranted = false;
38.       mState.wasPlayingWhenTransientLoss = mPlayer.isPlaying();
39.       mPlayer.pause();
40.       break;
41.     case AudioManager.AUDIOFOCUS_LOSS_TRANSIENT_CAN_DUCK:
42.       mState.userInitiatedState = false;
43.       mState.audioFocusGranted = false;
44.       lowerVolume();
45.       break;
46.     }
47.     mState.lastKnownAudioFocusState = focusChange;
48.   }
49. };

AUDIOFOCUS_GAIN is used in two distinct scopes of an application’s code. First, it can be used when registering for audio focus as shown in Listing 1. This does NOT translate to an event for the registered OnAudioFocusChangeListener, meaning that on a successful audio focus request the listener will NOT receive an AUDIOFOCUS_GAIN event for the registration.

AUDIOFOCUS_GAIN is also used in the implementation of an OnAudioFocusChangeListener as an event condition. As stated above, the AUDIOFOCUS_GAIN event will not be triggered on audio focus requests. Instead, the AUDIOFOCUS_GAIN event will occur only after an AUDIOFOCUS_LOSS* event has occurred. It is the only constant in the set shown in Table 1 that is used in both scopes.

There are four cases that need to be handled by the focus change listener. When the application receives an AUDIOFOCUS_LOSS this usually means it will not be getting its focus back. In this case the app should release assets associated with the audio system and stop playback. As an example, imagine a user is playing music using an app and then launches a game which takes audio focus away from the music app. There is no predictable time for when the user will exit the game. More likely, the user will navigate to the home launcher (leaving the game in the background) and launch yet another application or return to the music app causing a resume which would then request audio focus again.

However, another case exists that warrants some discussion. There is a difference between losing audio focus permanently (as described above) and temporarily. When an application receives an AUDIOFOCUS_LOSS_TRANSIENT, it should suspend its use of the audio system until it receives an AUDIOFOCUS_GAIN event. When the AUDIOFOCUS_LOSS_TRANSIENT occurs, the application should make a note that the loss is temporary; that way, on audio focus gain, it can reason about what the correct behavior should be (see lines 13-27 of Listing 2).

Sometimes an app loses audio focus (receives an AUDIOFOCUS_LOSS) and the interrupting application terminates or otherwise abandons audio focus. In this case the last application that had audio focus may receive an AUDIOFOCUS_GAIN event. On the subsequent AUDIOFOCUS_GAIN event, the app should check whether it is receiving the gain after a temporary loss, and can thus resume use of the audio system, or whether it is recovering from a permanent loss and should set up for playback again.

If an application will only be using the audio capabilities for a short time (less than 45 seconds), it should use an AUDIOFOCUS_GAIN_TRANSIENT focus request and abandon focus after it has completed its playback or capture. Audio focus is handled as a stack on the system — as such the last process to request audio focus wins.

When audio focus has been gained this is the appropriate time to create a MediaPlayer or MediaRecorder instance and allocate resources. Likewise when an app receives AUDIOFOCUS_LOSS it is good practice to clean up any resources allocated. Gaining audio focus has three possibilities that also correspond to the three audio focus loss cases in Table 1. It is a good practice to always explicitly handle all the loss cases in the OnAudioFocusChangeListener.

Table 1. Audio focus gain and loss implication.

GAIN                               | LOSS
AUDIOFOCUS_GAIN                    | AUDIOFOCUS_LOSS
AUDIOFOCUS_GAIN_TRANSIENT          | AUDIOFOCUS_LOSS_TRANSIENT
AUDIOFOCUS_GAIN_TRANSIENT_MAY_DUCK | AUDIOFOCUS_LOSS_TRANSIENT_CAN_DUCK

Note: AUDIOFOCUS_GAIN is used in two places. When requesting audio focus it is passed in as a hint to the AudioManager, and it is used as an event case in the OnAudioFocusChangeListener. The gain constants are only used when requesting audio focus. The loss constants are only used in the OnAudioFocusChangeListener.

Table 2. Audio stream types.

Stream Type         | Description
STREAM_ALARM        | The audio stream for alarms
STREAM_DTMF         | The audio stream for DTMF tones
STREAM_MUSIC        | The audio stream for "media" (music, podcast, video) playback
STREAM_NOTIFICATION | The audio stream for notifications
STREAM_RING         | The audio stream for the phone ring
STREAM_SYSTEM       | The audio stream for system sounds

An app will request audio focus (see an example in the sample source code linked below) from the AudioManager (Listing 1, line 1). The three arguments it provides are an audio focus change listener object (optional), a hint as to which audio stream to use (Table 2; most apps should use STREAM_MUSIC), and the type of audio focus from Table 1, column 1. If audio focus is granted by the system (AUDIOFOCUS_REQUEST_GRANTED), only then handle any initialization (see Listing 1, line 9).

Note: The system will not grant audio focus (AUDIOFOCUS_REQUEST_FAILED) if there is a phone call currently in process and the application will not receive AUDIOFOCUS_GAIN after the call ends.

Within an implementation of OnAudioFocusChangeListener, what to do when the application receives an onAudioFocusChange() event is summarized in Table 3.

In the cases of losing audio focus be sure to check that the loss is in fact final. If the app receives an AUDIOFOCUS_LOSS_TRANSIENT or AUDIOFOCUS_LOSS_TRANSIENT_CAN_DUCK it can hold onto the media resources it has created (don’t call release()) as there will likely be another audio focus change event very soon thereafter. The app should take note that it has received a transient loss using some sort of state flag or simple state machine.

If an application were to request permanent audio focus with AUDIOFOCUS_GAIN and then receive an AUDIOFOCUS_LOSS_TRANSIENT_CAN_DUCK, an appropriate action would be to lower its stream volume (making sure to store the original volume state somewhere) and then raise the volume upon receiving an AUDIOFOCUS_GAIN event.

Table 3. Appropriate actions by focus change type.

Focus Change Type | Appropriate Action
AUDIOFOCUS_GAIN | Gain after a loss event: resume playback of media unless other state flags set by the application indicate otherwise (for example, the user paused the media prior to the loss event).
AUDIOFOCUS_LOSS | Stop playback. Release assets.
AUDIOFOCUS_LOSS_TRANSIENT | Pause playback and keep a state flag noting that the loss is transient, so that when the AUDIOFOCUS_GAIN event occurs you can resume playback if appropriate. Do not release assets.
AUDIOFOCUS_LOSS_TRANSIENT_CAN_DUCK | Lower volume or pause playback, keeping track of state as with AUDIOFOCUS_LOSS_TRANSIENT. Do not release assets.

Conclusion and further reading

Understanding how to be a good audio citizen application on an Android device means respecting the system's audio focus rules and handling each case appropriately. Try to make your application behave in a consistent manner and not negatively surprise the user. There is a lot more that can be talked about within the audio system on Android and in the material below you will find some additional discussions.

Example source code is available here:

https://android.googlesource.com/platform/development/+/master/samples/RandomMusicPlayer

Unlocking More Users, with Tablets and Games

Posted by Purnima Kochikar, Director of Business Development, Games & Applications

Last week, we unveiled a number of new things in the world of Android. And while we already showcased the new tools at your disposal in Android 4.3, we also unveiled a new Nexus 7 tablet, as well as the Google Play Games app, both of which represent opportunities to reach a growing number of users.

Nexus 7 and the Android tablet revolution

If you’re a developer optimizing your app for Android tablets, no doubt you’re familiar with the original Nexus 7. It was Google’s statement on what a great Android tablet experience should look like, and since then, the Android tablet ecosystem has come a long way. There have already been more than 70 million activations of Android tablets, with more than 1 in 2 tablets sold today running Android. We’re starting to see with Android tablets what could be the hockey stick growth all of us experienced a couple of years ago with Android smartphones, and we hope that the new Nexus 7 continues to fuel this growth even further.

Most top developers on Android have already prepared their applications for this wave of new Android tablet users, including many of the essentials, like the New York Times, Zappos, Evernote, Flipboard, Pinterest and more. To help users find your tablet-designed apps more easily on Google Play, you can now choose to only see apps designed for tablets in the top lists. There are also over 50 new collections, which highlight outstanding tablet apps.

To take advantage of the Android tablet revolution, check out our Tablet App Quality Checklist, which has tips and techniques on how to deliver a great app experience for tablet users. It details all of the key things you need to do to optimize your app for tablets, from taking advantage of the extra screen real estate and adjusting font sizes and touch targets, to things you can do on the distribution side, like declaring support for tablet screens and showcasing your tablet UI on Google Play by uploading tablet-specific screenshots. Optimizing your app for Android tablets will unlock a whole new group of users, like those who are about to receive their new Nexus 7 tablets.

Taking your game to the next level

The Android games category on Google Play is on fire; in fact, the vast majority of top mobile game developers are building Android tablet games, and most new titles launch immediately on Android. To help game developers take advantage of the next generation of games, at Google I/O in May, we introduced Google Play game services, our gaming platform for Android, iOS, and the web. By building on Google’s strengths in mobile and cloud services, Google Play game services allows game developers to focus on what they’re good at: creating great gaming experiences for their users.

Turbocharging that growth even more, on Wednesday we introduced the Google Play Games app, which brings your friends together with the games you love: you can invite a friend, challenge gamers around the world, compete for top achievements, and race to the top of the leaderboards.

Since the launch at Google I/O, just over two months ago, over one thousand games have added Google Play game services, with millions of users enjoying features like leaderboards and multiplayer inside of the games they love. Some of those early developers using Google Play game services are reporting incredible upticks in vital engagement metrics; for instance, Concrete Software is seeing session length up 15%, and Glu is reporting a 40% increase in 7-day user retention.

Here are a few things you can do to take your game to the next level with Google Play:

  • Integrate with Play Games using achievements and leaderboards to activate your players (see the sketch after this list).
  • Add real-time multiplayer to competitive and cooperative games and increase engagement.
  • Use Play Games branding guidelines and create rich visuals that bolster your presence in the Google Play Games app.
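
As a rough, hedged illustration of the first two items using today's Play Games Services v2 client (com.google.android.gms:play-services-games-v2 assumed; the IDs below are placeholders, and the 2013-era API this post originally described looked different):

import android.app.Activity
import com.google.android.gms.games.PlayGames
import com.google.android.gms.games.PlayGamesSdk

fun reportProgress(activity: Activity) {
    // Initialize once, typically in Application.onCreate().
    PlayGamesSdk.initialize(activity)

    // Unlock an achievement and post a score to a leaderboard (placeholder IDs).
    PlayGames.getAchievementsClient(activity).unlock("achievement_first_win")
    PlayGames.getLeaderboardsClient(activity).submitScore("leaderboard_high_scores", 1337L)
}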

Whether it be getting your app ready for the wave of new Android tablets that are lighting up each day, or opening up a whole new set of features for your users with Google Play game services, a great Android experience starts with a great app or game. That’s why we’re working hard to help provide you with the tools and features needed to create those great experiences for your users, and to help you reach as many of them as possible in the process, with Google Play.