11 Weeks of Android: Games, media, and 5G

Posted by Dan Galpin, Developer Advocate

This blog post is part of a weekly series for #11WeeksOfAndroid. Each week we’re diving into a key area of Android so you don’t miss anything. This week, we spotlighted games, media, and 5G; here’s a look at what you should know.

What's the buzz in Android 11?

  • Users can now control media applications from a dedicated space within the notification area, and apps can opt in to features such as playback resumption and seamless transfer.
  • New and updated 5G APIs help you unlock transformative new user experiences.
  • Android 11 adds new support for key game tools and technologies. On top of that foundation, we're building tools to both improve your game developer experience and help you better characterize the performance of your game, services to help you expand the reach of your game to more devices and new audiences, and new and improved features to support your games' go-to-market with Google Play.

Android 11 media

We covered how to take advantage of Android 11's new media controls by making sure your app is using MediaStyle with a valid MediaSession token. We showed how to support media resumption by making your app discoverable with a MediaBrowserServiceCompat, using the EXTRA_RECENT hint to help with resuming content, and handling the onPlay and onGetRoot callbacks. Finally, we showed you how to leverage the MediaRouter Jetpack library to support seamless media transfer between devices. Check out the updated version of the UAMP sample, which contains a reference implementation for media controls and playback resumption.

Android 11 and 5G

We covered some of the primary ways apps can benefit from 5G, including:

  • Turning indoor use cases into outdoor use cases
  • Turning photo-centric UX into video-centric or AR-centric UX
  • Prefetching content to make your app even more responsive
  • Turning niche use cases into mainstream use cases, such as streaming content everywhere

Android 11 adds new APIs and updates existing APIs to ensure you have all the tools you need to leverage the capabilities of 5G, such as an enhanced bandwidth estimation API, 5G detection capabilities, and a new meteredness flag from cellular carriers. The Android emulator now enables you to develop and test these APIs without needing a 5G device or network connection. All of this and more is available from our dedicated 5G page.
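To make that concrete, here's a minimal Kotlin sketch (assuming the ACCESS_NETWORK_STATE permission) of how an app might combine the bandwidth estimate with the new meteredness flag when deciding whether to prefetch; prefetchHighResContent() is a hypothetical placeholder for your own logic.

val connectivityManager = getSystemService(ConnectivityManager::class.java)

connectivityManager.registerDefaultNetworkCallback(object : ConnectivityManager.NetworkCallback() {
    override fun onCapabilitiesChanged(network: Network, caps: NetworkCapabilities) {
        // New in Android 11: carriers can mark a connection as temporarily unmetered,
        // for example on certain 5G plans.
        val temporarilyUnmetered = Build.VERSION.SDK_INT >= Build.VERSION_CODES.R &&
                caps.hasCapability(NetworkCapabilities.NET_CAPABILITY_TEMPORARILY_NOT_METERED)

        // Rough downstream bandwidth estimate for the current connection, in kbps.
        val downstreamKbps = caps.linkDownstreamBandwidthKbps

        if (temporarilyUnmetered && downstreamKbps > 10_000) {
            prefetchHighResContent() // Placeholder for your own prefetch logic.
        }
    }
})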

Catch up on what's happening with game development

We presented a special "11 Weeks" episode of The Android Game Developer Show providing an update on the tools, services, and technologies we're bringing to help you build, optimize, and distribute great games.

Check out d.android.com/games to learn about everything we've covered this week and more, and stay up to date by signing up for the games quarterly newsletter.

Android game development tooling

In Android Studio 4.1, we enhanced the System Trace view of the CPU Profiler and added the Native Memory Profiler, and both can now be launched standalone from Android Studio. The System Trace and Native Memory blog posts have more details on how to use them with your game or app.

You can sign up for developer previews of the Android Game Development Extension, and the Android GPU Inspector. The Android Game Development Extension helps with building multi-platform C/C++ games, while the GPU Inspector is used to profile and debug graphics. Stay tuned for the open beta of the Android GPU Inspector.

Reaching more devices and users with your game

We took a deep dive into the Android Performance Tuner, explaining annotations, quality levels, and fidelity parameters along with some best practices on how to use them. We also covered how to use the new insights and analysis you'll get within Android Vitals once you've implemented it.

We showed how Google Play Asset Delivery brings the benefits of app bundles to games with large asset sizes, flexible delivery modes, auto-updates, compression, and delta patching. Texture compression format targeting is coming very soon, letting you tap into modern texture compression such as ASTC (now supported on over 50% of devices) and considerably cut your game size and in-memory footprint.
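As a rough sketch of the runtime side with the Play Core library (the pack name "level_one_assets" and loadLevelAssets() are hypothetical, and an on-demand or fast-follow pack is assumed), a game can check whether a pack is already on the device before requesting it:

val assetPackManager = AssetPackManagerFactory.getInstance(context)

val location = assetPackManager.getPackLocation("level_one_assets")
if (location != null) {
    // The pack has already been delivered; load assets straight from its path.
    loadLevelAssets(location.assetsPath())
} else {
    // Not on the device yet: request the download. Progress can be observed by
    // registering an AssetPackStateUpdateListener before calling fetch().
    assetPackManager.fetch(listOf("level_one_assets"))
}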

We published new codelabs to help you integrate Android Performance Tuner and Google Play Asset Delivery into your Unity or native C/C++ game.

We explained how we can help protect your game, players, and business by fighting monetization and distribution abuse.

Boost your games' go-to-market

We launched the open beta of Play Games Services - Friends to help you bootstrap and enhance your in-game friend networks while having your games surfaced in new clusters in the Play Games app.

We demonstrated the new release management experience in the Google Play Console beta and showed how it can help your testing and publishing workflow.

Day 1 auto-install is a new Google Play feature that allows users to request the automatic installation of your game during pre-registration. Early experiments show a 20% increase in day 1 installs when using this feature. The new pre-registration menu in the Google Play Console beta makes it easier than ever to access this feature.

We showed how to optimize your store listing page to take advantage of the greatly improved games visual experience within Google Play, showcasing rich game graphics and engaging videos.

The new in-app review API lets you choose when to prompt users to write reviews from within your game, without heading back to the app details page. This API supports both public and private reviews for when your app is in beta.
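As a minimal sketch of that flow using the Play Core ReviewManager (continueGame() is a hypothetical placeholder for your own logic), you request a ReviewInfo object and then launch the review dialog at a moment of your choosing:

val reviewManager = ReviewManagerFactory.create(context)

reviewManager.requestReviewFlow().addOnCompleteListener { request ->
    if (request.isSuccessful) {
        // Launch the in-app review dialog at a natural break in gameplay.
        reviewManager.launchReviewFlow(activity, request.result)
            .addOnCompleteListener {
                // The flow has finished; the API doesn't report whether a review was written.
                continueGame()
            }
    } else {
        // Ignore the error and carry on; reviews are best-effort.
        continueGame()
    }
}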

Learning path

If you’re looking for an easy way to pick up the highlights of this week, check out the Games, media, and 5G pathway. A pathway is an ordered tutorial that allows users to complete a pre-defined module that culminates in a quiz. It includes videos and blog posts. A virtual badge is awarded to each user who passes the quiz. Test your knowledge of key takeaways about Android game development, media, and 5G to earn a limited edition badge.

Key takeaways

Thank you for tuning in and learning about the latest in Android game, media, and 5G development.

  • Seamless media transfer and media resumption
      • MediaRouter API (UAMP Sample)
  • 5G
      • Bandwidth estimation API
      • 5G detection (Android Emulator)
      • Meteredness flag
  • Features found in Android Studio 4.1 (Beta Channel)
      • System Trace in the Android Studio CPU Profiler
      • Android Studio Native Memory Profiler
  • Pre-release standalone tools
      • Android Game Development Extension
      • Android GPU Inspector
  • Features in the Android Game SDK
      • Android Frame Pacing Library
      • Android Performance Tuner (C/C++ Codelab) (Unity Codelab)
  • Google Play features
      • Play Asset Delivery (C/C++ Codelab) (Unity Codelab)
      • In-app review API
      • App Licensing
      • SafetyNet Attestation
      • Pre-registration
  • Google Play Games Services
      • Play Games Services Friends beta

You can find the entire playlist of #11WeeksOfAndroid video content here, and learn more about each week here. We’ll continue to spotlight new areas each week, so keep an eye out and follow us on Twitter and YouTube. Thanks so much for letting us be a part of this experience with you!

Playing nicely with media controls

Posted by Don Turner - Developer Advocate - Android Media

In Android 11 we've made it easier than ever for users to control media playback. This is achieved with three related features: media controls, playback resumption and seamless transfer.

This article will explain what these features are, how they work together and how you can take advantage of them in your apps.

Media Controls

Android 11's media controls are found below the Quick Settings panel and represent a dedicated persistent space for controlling media playback.

Media controls in Android 11

Part of the motivation for media controls is that users often have multiple media apps (music player, podcasts, video player etc) and regularly switch between them. Media controls display up to five current and recent media sessions in a carousel allowing the user to swipe between them.

On Android 10 and earlier, media notifications for multiple apps can occupy most of the notification area. All those control buttons can also be confusing. Moving the controls into a dedicated space means that there's more room for other notifications, and provides a more consistent user experience for controlling media apps.

Here's the comparison:

Android 10 media notifications (left) Android 11 media controls (right)

Displaying media controls for your app

Now, the really good news. As long as you're using MediaStyle with a valid MediaSession token (both available since Android 5.0, API level 21), media controls will be displayed for your app automatically - no extra work for you!

In case you're not using a MediaStyle and MediaSession, here's a quick recap in code:

// Create a media session.
// PlayerService is your own Service or Activity responsible for media playback.
val mediaSession = MediaSessionCompat(this, "PlayerService")

// Create a MediaStyle object and supply your media session token to it.
val mediaStyle = androidx.media.app.NotificationCompat.MediaStyle()
            .setMediaSession(mediaSession.sessionToken)

// Specify any actions which your users can perform, such as pausing and skipping to the next track.
val pauseAction = NotificationCompat.Action.Builder(
            pauseIcon, "Pause", pauseIntent
        ).build()

// Create a Notification which is styled by your MediaStyle object.
// This connects your media session to the media controls.
// Don't forget to include a small icon and to add your actions to the builder.
val notification = NotificationCompat.Builder(this@PlayerService, CHANNEL_ID)
            .setStyle(mediaStyle)
            .setSmallIcon(R.drawable.ic_app_logo)
            .addAction(pauseAction)
            .build()

The small icon and app name are shown in the upper left of the media controls. The actions are shown in the bottom center.

Media controls UI and corresponding Notification fields

The remaining UI fields, such as track title and playback position, are obtained from the media session's metadata and playback state.

Here's how the metadata fields map to the UI.

mediaSession.setMetadata(
    MediaMetadataCompat.Builder()

        // Title.
        .putString(MediaMetadata.METADATA_KEY_TITLE, currentTrack.title)

        // Artist.
        // Could also be the channel name or TV series.
        .putString(MediaMetadata.METADATA_KEY_ARTIST, currentTrack.artist)

        // Album art.
        // Could also be a screenshot or hero image for video content.
        // The URI scheme needs to be "content", "file", or "android.resource".
        .putString(
            MediaMetadata.METADATA_KEY_ALBUM_ART_URI, currentTrack.albumArtUri)

        // Duration.
        // If duration isn't set, such as for live broadcasts, then the progress
        // indicator won't be shown on the seekbar.
        .putLong(MediaMetadata.METADATA_KEY_DURATION, currentTrack.duration)

        .build()
)

This screenshot shows how these metadata fields are displayed in the media controls.

Media controls UI and corresponding metadata fields

The seek bar is updated using the media session's playback state in a similar way:

mediaSession.setPlaybackState(
    PlaybackStateCompat.Builder()
        .setState(
            PlaybackStateCompat.STATE_PLAYING,
                        
            // Playback position.
            // Used to update the elapsed time and the progress bar. 
            mediaPlayer.currentPosition.toLong(), 
                        
            // Playback speed. 
            // Determines the rate at which the elapsed time changes. 
            playbackSpeed
        )

        // isSeekable. 
        // Adding the SEEK_TO action indicates that seeking is supported 
        // and makes the seekbar position marker draggable. If this is not 
        // supplied seek will be disabled but progress will still be shown.
        .setActions(PlaybackStateCompat.ACTION_SEEK_TO)
        .build()
)

This screenshot shows how these playback state fields are displayed in the media controls.

Media controls UI and corresponding playback state fields

Your media controls should now look and function perfectly!

Media resumption

Ever wanted to continue listening to a podcast, TV episode or DJ set but couldn't remember where you left off, or even the app that was playing it? Media resumption solves this problem.

There are two stages to media resumption: discovering recent media apps and resuming playback.

Discovering recent media apps

After booting, Android will look for recent media apps and ask them what their most recently played content was. It will then create media controls for that content.

To be discoverable, your app must provide a MediaBrowserService, typically by extending MediaBrowserServiceCompat from Android Jetpack.

On boot, Android will call your MediaBrowserServiceCompat's onGetRoot method, so it's imperative that you return quickly. Usually you would return the root of your media tree from this method but the system also specifies the EXTRA_RECENT hint.

You should treat EXTRA_RECENT as a special case and instead return the root of a media tree that contains the most recently played media item as the first element.

The system will call your onLoadChildren method to obtain this media tree, which is a list of MediaItem objects.
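Putting those pieces together, here's a hedged sketch of an onGetRoot implementation that honors the hint; the RECENT_ROOT and BROWSABLE_ROOT IDs are hypothetical, and onLoadChildren is assumed to return the most recently played item first for the recent root.

override fun onGetRoot(
    clientPackageName: String,
    clientUid: Int,
    rootHints: Bundle?
): BrowserRoot? {
    // Called at boot when the system looks for recent media apps, so return quickly
    // and defer the real work to onLoadChildren().
    val wantsRecent = rootHints?.getBoolean(BrowserRoot.EXTRA_RECENT) == true
    return if (wantsRecent) {
        BrowserRoot(RECENT_ROOT, rootHints)   // Children: most recently played item first.
    } else {
        BrowserRoot(BROWSABLE_ROOT, null)     // Normal browsing root for other clients.
    }
}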

Here's a diagram showing how the system and a media app interact to retrieve the most recently played item.

How the system retrieves the most recently played item from the MediaBrowserService

For the first playable media item in this list the system will create static media controls with just a play button.

Static media controls

At this point no media session has been created. This is to save resources - the system doesn't know whether the user actually wants to resume playing that content yet.

Resuming playback

If the user taps the play button the system will make another call to onGetRoot with the EXTRA_RECENT hint. This is so you can prepare your previously played content as before, just in case anything has changed since the static controls were created.

Android will then connect to your media session and issue a play command to it. You should override the media session onPlay callback in order to start playback of your media content and create your MediaStyle notification.
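A minimal sketch of that callback (player, NOTIFICATION_ID, and buildMediaStyleNotification() are placeholders for your own playback and notification code):

private inner class SessionCallback : MediaSessionCompat.Callback() {
    override fun onPlay() {
        // Start playback of the previously prepared recent item...
        player.play()
        // ...then post the MediaStyle notification as a foreground service notification.
        startForeground(NOTIFICATION_ID, buildMediaStyleNotification())
    }
}

// Register the callback when the media session is created.
mediaSession.setCallback(SessionCallback())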

Once you post the notification the static media controls will be swapped with the media controls created from your notification.

Figure 7: Diagram showing interaction between System UI and media app when resuming playback

Seamless media transfer

As well as being able to resume media sessions, Android 11 also allows you to easily transfer playback from one device to another. This is known as "seamless media transfer", and is done through the output switcher. The output switcher is shown in the upper right corner of the media notification that appears below.

Output Switcher (upper right corner)

The output switcher shows the name and icon of the current media route. When you tap on it, you'll see a list of media routes which this app supports.

Media routes available to the current app

By default, only local media routes are shown. If your app supports other media routes, such as remote playback, you'll need to let the system know.

To do this, add the MediaRouter Jetpack library to your app.

dependencies {
    implementation 'androidx.mediarouter:mediarouter:1.2.0-alpha02'
}

Seamless media transfer is supported from 1.2.0-alpha02.

Then, add the MediaTransferReceiver class to your Android manifest.

<receiver android:name="androidx.mediarouter.media.MediaTransferReceiver" />

Now in your app, obtain the MediaRouter singleton - this is an object that maintains the state of all currently available media routes.

router = MediaRouter.getInstance(this)

Create a MediaRouteSelector and specify the route categories which your app supports. The categories you define here determine the routes which are displayed in the output switcher.

Here we'll just specify the "remote playback" category which is used for Cast devices.

routeSelector = MediaRouteSelector.Builder() // Add control categories that this media app is interested in.
            .addControlCategory(MediaControlIntent.CATEGORY_REMOTE_PLAYBACK)
            .build()

If you want to support transfer from remote to local devices, you need to explicitly enable this using setTransferToLocalEnabled:

router.routerParams = MediaRouterParams.Builder().setTransferToLocalEnabled(true).build()

We can now use our selector when adding a media router callback.

router.addCallback(routeSelector, MediaRouterCallback(),
             MediaRouter.CALLBACK_FLAG_REQUEST_DISCOVERY)

We also supply a callback object so we can be informed of changes to media routes.

Here's the callback class:

    private class MediaRouterCallback : MediaRouter.Callback() {
        override fun onRouteSelected(
            router: MediaRouter,
            route: MediaRouter.RouteInfo,
            reason: Int
        ) {
            if (reason == MediaRouter.UNSELECT_REASON_ROUTE_CHANGED) {
                Timber.d("Unselected because route changed, continue playback")
            } else if (reason == MediaRouter.UNSELECT_REASON_STOPPED) {
                Timber.d("Unselected because route was stopped, stop playback")
            }
        }
    }

The method we override is onRouteSelected which will be called whenever a new media route has been selected.

When this happens we need to take into account the reason why it was selected.

If the existing route was disconnected (for example, Bluetooth headphones were switched off), we should pause or stop playback.

If the route was actively changed, for example when switching from a phone to a Cast device, then we should continue playing the media from its previous playback position - this is the "seamless" part of "seamless media transfer" :)

Get started

To get started with media controls and related features on Android 11 take a look at the official documentation. Also be sure to check out UAMP which contains a reference implementation for many of the features mentioned in this article.

Good luck and remember to play nicely!

Verifying your Google Assistant media action integrations on Android

Posted by Nevin Mital, Partner Developer Relations

The Media Controller Test (MCT) app is a powerful tool that allows you to test the intricacies of media playback on Android, and it's just gotten even more useful. Media experiences, including voice interactions via the Google Assistant on Android phones, cars, TVs, and headphones, are powered by Android MediaSession APIs. This tool will help you verify your integrations. We've now added a new verification testing framework that can be used to help automate your QA testing.

The MCT is meant to be used in conjunction with an app that implements media APIs, such as the Universal Android Music Player. The MCT surfaces information about the media app's MediaController, such as the PlaybackState and Metadata, and can be used to test inter-app media controls.

The Media Action Lifecycle can be complex to follow; even in a simple Play From Search request, there are many intermediate steps (simplified timeline depicted below) where something could go wrong. The MCT can be used to help highlight any inconsistencies in how your music app handles MediaController TransportControl requests.

Timeline of the interaction between the User, the Google Assistant, and the third party Android App for a Play From Search request.
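For reference, here's a simplified sketch of how an app might handle that transport control in its MediaSessionCompat.Callback; library and player are hypothetical helpers.

override fun onPlayFromSearch(query: String?, extras: Bundle?) {
    // An empty query means "play something", e.g. "Ok Google, play music".
    val track = if (query.isNullOrBlank()) library.recommendedTrack() else library.search(query)

    player.play(track)
    mediaSession.setPlaybackState(
        PlaybackStateCompat.Builder()
            .setState(PlaybackStateCompat.STATE_PLAYING, 0L, 1.0f)
            .build()
    )
}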

Previously, using the MCT required a lot of manual interaction and monitoring. The new verification testing framework offers one-click tests that you can run to ensure that your media app responds correctly to a playback request.

Running a verification test

To access the new verification tests in the MCT, click the Test button next to your desired media app.

MCT Screenshot of launch screen; contains a list of installed media apps, with an option to go to either the Control or Test view for each.

The next screen shows you detailed information about the MediaController, for example the PlaybackState, Metadata, and Queue. There are two buttons on the toolbar in the top right: the button on the left toggles between parsable and formatted logs, and the button on the right refreshes this view to display the most current information.

MCT Screenshot of the left screen in the Testing view for UAMP; contains information about the Media Controller's Playback State, Metadata, Repeat Mode, Shuffle Mode, and Queue.

By swiping to the left, you arrive at the verification tests view, where you can see a scrollable list of defined tests, a text field to enter a query for tests that require one, and a section to display the results of the test.

MCT Screenshot of the right screen in the Testing view for UAMP; contains a list of tests, a query text field, and a results display section.

As an example, to run the Play From Search Test, you can enter a search query into the text field then hit the Run Test button. Looks like the test succeeded!

MCT Screenshot of the right screen in the Testing view for UAMP; the Play From Search test was run with the query 'Memories' and ended successfully.

Below are examples of the Pause Test (left) and Seek To test (right).

MCT Screenshot of the right screen in the Testing view for UAMP; a Pause test was run successfully. MCT Screenshot of the right screen in the Testing view for UAMP; a Seek To test was run successfully.

Android TV

The MCT now also works on Android TV! For your media app to work with the Android TV version of the MCT, your media app must have a MediaBrowserService implementation. Please see here for more details on how to do this.

On launching the MCT on Android TV, you will see a list of installed media apps. Note that an app will only appear in this list if it implements the MediaBrowserService.

Android TV MCT Screenshot of the launch screen; contains a list of installed media apps that implement the MediaBrowserService.

Selecting an app will take you to the testing screen, which will display a list of verification tests on the right.

Android TV MCT Screenshot of the testing screen; contains a list of tests on the right side.

Running a test will populate the left side of the screen with selected MediaController information. For more details, please check the MCT logs in Logcat.

Android TV MCT Screenshot of the testing screen; the Pause test was run successfully and the left side of the screen now displays selected MediaController information.

Tests that require a query are marked with a keyboard icon. Clicking on one of these tests will open an input field for the query. Upon hitting Enter, the test will run.

Android TV MCT Screenshot of the testing screen; clicking on the Seek To test opened an input field for the query.

To make text input easier, you can also use the ADB command:

adb shell input text [query]

Note that '%s' will add a space between words. For example, the command adb shell input text hello%sworld will add the text "hello world" to the input field.

What's next

The MCT currently includes simple single-media-action tests for the following requests:

  • Play
  • Play From Search
  • Play From Media ID
  • Play From URI
  • Pause
  • Stop
  • Skip To Next
  • Skip To Previous
  • Skip To Queue Item
  • Seek To

For a technical deep dive on how the tests are structured and how to add more tests, visit the MCT GitHub Wiki. We'd love for you to submit pull requests with more tests that you think are useful to have and for any bug fixes. Please make sure to review the contributions process for more information.

Check out the latest updates on GitHub!

Support for new ad formats in AdWords Scripts

AdWords scripts now fully support responsive ads, image ads, HTML5 ads, and multiple Gmail ad formats. See our guide on ad types and related code snippets to learn more about using these ad formats in scripts.

This update also introduces a media service which can be used to upload and query media for use in ads. See our ad media guide for a more detailed overview of media support.

If you have any questions about these changes or AdWords scripts in general, you can post them on our developer forum.