
Apps adopt Transformer to support more reliable and performant media editing use cases

Posted by Caren Chang – Developer Relations Engineer

The Jetpack Media3 library enables Android developers to build high-quality media apps. As part of the Media3 library, the Transformer module aims to provide easy-to-use, reliable, and performant APIs for transcoding and editing media.

For example, apps can use Transformer to apply editing operations such as trimming a long media file or applying effects to video tracks. Transformer can also convert media files from one format to another, for example by adjusting the resolution or encoding of the media file.
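To make this concrete, here is a minimal sketch of a trim-and-transcode export with Transformer in Kotlin, assuming a recent Media3 release; the input URI, output path, and listener bodies are placeholders rather than a prescribed implementation:

import android.content.Context
import android.net.Uri
import androidx.media3.common.MediaItem
import androidx.media3.common.MimeTypes
import androidx.media3.common.util.UnstableApi
import androidx.media3.transformer.Composition
import androidx.media3.transformer.EditedMediaItem
import androidx.media3.transformer.ExportException
import androidx.media3.transformer.ExportResult
import androidx.media3.transformer.Transformer

// Sketch only: trims the input to its first 10 seconds and transcodes the
// video track to H.265. inputUri and outputFilePath are placeholder values.
@androidx.annotation.OptIn(UnstableApi::class)
fun exportTrimmedClip(context: Context, inputUri: Uri, outputFilePath: String) {
    val mediaItem = MediaItem.Builder()
        .setUri(inputUri)
        .setClippingConfiguration(
            MediaItem.ClippingConfiguration.Builder()
                .setStartPositionMs(0)
                .setEndPositionMs(10_000)
                .build()
        )
        .build()

    val transformer = Transformer.Builder(context)
        .setVideoMimeType(MimeTypes.VIDEO_H265)
        .addListener(object : Transformer.Listener {
            override fun onCompleted(composition: Composition, exportResult: ExportResult) {
                // The exported file at outputFilePath is ready to use.
            }

            override fun onError(
                composition: Composition,
                exportResult: ExportResult,
                exportException: ExportException
            ) {
                // Surface the failure to the user or retry.
            }
        })
        .build()

    transformer.start(EditedMediaItem.Builder(mediaItem).build(), outputFilePath)
}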

Developing Transformer APIs

As part of the process of introducing new APIs, our engineering team works closely with Google apps such as Google Photos to test and experiment with the new APIs. Experimental flags are first introduced to enable performance improvements. Once the results are successful and conclusive, these experimental features are built into the default API implementations or promoted to public APIs for all apps to use. This approach allows Transformer APIs to be tested on a wide variety of devices.

Transformer Adoption in apps

Apps that have been using Transformer in production have observed in-app performance improvements, less code to maintain, and a better developer experience. Let’s take a closer look at how Transformer has helped apps with their media-editing use cases.

One of users’ favorite features in Google Photos is memory sharing, where snippets of your life story that are curated and presented as Google Photos memories can now be shared as videos to social media and chat apps. However, the process of combining media items to create a video on device is resource intensive and subject to significant latency, especially on low-end devices. To reduce this latency and enable the feature on a wider range of devices, Photos adopted Transformer in their media creation pipeline. Along with other improvements made, the team found that Transformer played a part in reducing the median user latency for creating memory videos by 41% on high-end devices and 27% on mid-range devices.

The Photos app also enables users to perform media edits such as trimming or rotating a video. By adopting Transformer APIs for rotating videos, median save latency was reduced by 79% for applicable videos. The app also adopted Transformer’s API for optimizing video trimming, and observed video save latency decrease by 64%.
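As a reference for what these two operations can look like, here is a hedged sketch against a recent Media3 release; note that trim optimization is exposed as an experimental builder option, and the URIs, paths, and trim points below are placeholder values:

import android.content.Context
import android.net.Uri
import androidx.media3.common.MediaItem
import androidx.media3.common.util.UnstableApi
import androidx.media3.effect.ScaleAndRotateTransformation
import androidx.media3.transformer.EditedMediaItem
import androidx.media3.transformer.Effects
import androidx.media3.transformer.Transformer

// Sketch only: rotates a video by 90 degrees by applying a
// ScaleAndRotateTransformation to the video track.
@androidx.annotation.OptIn(UnstableApi::class)
fun rotateVideo(context: Context, inputUri: Uri, outputFilePath: String) {
    val rotate = ScaleAndRotateTransformation.Builder()
        .setRotationDegrees(90f)
        .build()
    val editedMediaItem = EditedMediaItem.Builder(MediaItem.fromUri(inputUri))
        .setEffects(Effects(/* audioProcessors= */ emptyList(), listOf(rotate)))
        .build()
    Transformer.Builder(context).build().start(editedMediaItem, outputFilePath)
}

// Sketch only: trims a clip and enables the experimental trim optimization,
// which re-encodes only around the trim points and copies the remaining
// samples unchanged, reducing save latency for applicable videos.
@androidx.annotation.OptIn(UnstableApi::class)
fun trimVideo(context: Context, inputUri: Uri, outputFilePath: String) {
    val trimmedItem = MediaItem.Builder()
        .setUri(inputUri)
        .setClippingConfiguration(
            MediaItem.ClippingConfiguration.Builder()
                .setStartPositionMs(5_000)
                .setEndPositionMs(20_000)
                .build()
        )
        .build()
    Transformer.Builder(context)
        .experimentalSetTrimOptimizationEnabled(true)
        .build()
        .start(EditedMediaItem.Builder(trimmedItem).build(), outputFilePath)
}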

1 Second Everyday is a personal video journal that helps you create captivating montages and timelapses. One of the app’s main user journeys is sequentially combining short videos to create a meaningful movie. After adopting Transformer for this use case, the app observed that video encoding performance was up to 5x faster, allowing them to explore enabling 4k and HDR support. The Transformer adoption also helped decrease relevant code by 30%, making it easier for the developers to maintain the code base.
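As a rough illustration of this kind of sequential composition (a hedged sketch, not 1 Second Everyday’s actual pipeline), short clips can be placed into an EditedMediaItemSequence and exported as a single Composition:

import android.content.Context
import android.net.Uri
import androidx.media3.common.MediaItem
import androidx.media3.common.util.UnstableApi
import androidx.media3.transformer.Composition
import androidx.media3.transformer.EditedMediaItem
import androidx.media3.transformer.EditedMediaItemSequence
import androidx.media3.transformer.Transformer

// Sketch only: places each short clip into a single sequence and exports the
// sequence as one continuous movie. clipUris and outputFilePath are placeholders.
@androidx.annotation.OptIn(UnstableApi::class)
fun exportMovie(context: Context, clipUris: List<Uri>, outputFilePath: String) {
    val clips = clipUris.map { uri -> EditedMediaItem.Builder(MediaItem.fromUri(uri)).build() }
    val composition = Composition.Builder(EditedMediaItemSequence(clips)).build()
    Transformer.Builder(context).build().start(composition, outputFilePath)
}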

BandLab is the next-generation music creation platform used by millions around the world to make and share their music. The app originally used MediaCodec directly for its video creation use cases, but found that the low-level implementation resulted in native crashes that were difficult to debug. After researching Transformer further, the team decided to migrate from MediaCodec to Transformer. The migration took the team only 12 working days and resulted in a simpler codebase and a more maintainable pipeline for their media creation use cases. In addition, the previously observed native crashes no longer occurred.

What’s next for Transformer?

We’re excited to see Transformer’s adoption in the developer community, and will continue adding new features to support more media-editing use cases for the Android ecosystem, including:

    • Better support for previewing media edits
    • Improving the performance and developer experience for video frame extraction
    • Easier integration with AI effects
    • and much more

Keep an eye on what we’re working on in the Media3 GitHub, and file feature requests to help shape the future of Transformer!

Spotlight Week: Android Camera and Media

Posted by Caren Chang – Android Developer Relations Engineer

Android offers Camera and Media APIs to help you build apps that can capture, edit, share, and play media. To help you make your Android camera and media experiences even more delightful for your users, this week we’re kicking off the Camera and Media Spotlight Week.

This Spotlight Week will provide resources—blog posts, videos, sample code, and more—all designed to help you uplevel the media experiences in your app. Check out highlights from the latest releases in Camera and Media APIs, including better Jetpack Compose support in CameraX, motion photo support in Media3 Transformer, simpler ExoPlayer setup, and much more! We’ll also bring in developers from the community to talk about their experiences building Android camera and media apps.


Here’s what we’re covering during Camera and Media Spotlight week:

What’s new in camera and media

Tuesday, January 7

Check out what’s new in the latest CameraX and Media3 releases, including how to get started with building Camera apps with Compose.

Creating delightful and premium experiences

Wednesday, January 8

Building delightful and premium experiences for your users is what can help your app really stand out. Learn about different ways to achieve this, such as utilizing the Media Performance Class or enabling HDR video capture in your app. Learn from developers, too: how Google Drive enabled Ultra HDR images in its Android app, and how Instagram improved the in-app image capture experience by implementing Night Mode.

Adaptive for camera and media, for large screens and now XR!

Thursday, January 9

Thinking adaptive is important so that your app works just as well on phones as it does on large screens, like foldables, tablets, ChromeOS, cars, and the new Android XR platform! On Thursday, we’ll be diving into the media experience on large screen devices, and how you can build in a smooth tabletop mode for your camera applications. Prepare your apps for XR devices by considering Spatial Audio and Video.

Media creation

Friday, January 10

Capturing, editing, and processing media content are fundamental features of the Android ecosystem. Learn how Media3’s Transformer module can help your app’s media processing use cases, and see case studies of apps that are using Transformer in production. Listen in to how the 1 Second Everyday Android app approaches media use cases, and check out a new API that allows apps to capture concurrent camera streams. Learn from Android Google Developer Tom Colvin how he experimented with building an AI-powered camera app.


These are just some of the things to think about when building camera and media experiences in your app. Keep checking this blog post for updates; we’ll be adding links and more throughout the week.