Tag Archives: Android

Use ad inspector to debug your mobile applications

Ad inspector is an in-app overlay that enables authorized devices to perform real-time analysis of Google Mobile Ads SDK test ad requests directly within your mobile app. It is included with the Google Mobile Ads SDK and you can enable it with no coding required.

Ad inspector empowers you to thoroughly test all your ad sources and verify everything is working properly before releasing changes to your users. To help you understand and utilize ad inspector effectively, we published a 7-part ad inspector video series on our Google AdMob YouTube channel.

Each video focuses on a specific challenge in testing your ad integration, offering in-depth tutorials and demonstrations.

Check out our ad inspector documentation (Android, iOS, Unity, Flutter) to learn more. If you have questions, comments, or general feedback about ad inspector, contact us in the developer forum. And remember to subscribe to our Google AdMob YouTube channel for more technical content.


Celebrating Another Year of #WeArePlay

Posted by Robbie McLachlan – Developer Marketing

This year #WeArePlay took us on a journey across the globe, spotlighting 300 people behind apps and games on Google Play. From a founder whose app uses AI to assist visually impaired people to a game where nimble-fingered players slice flying fruits and use special combos to beat their own high score, we met founders transforming ideas into thriving businesses.

Let’s start by taking a look back at the people featured in our global film series. From a mother and son duo preserving African languages, to a founder whose app helps kids become published authors - check out the full playlist.


We also continued our global tour around the world.

And we released global collections of 36 stories, each with a theme reflecting the diversity of the app and game community on Google Play.


To the global community of app and game founders, thank you for sharing your inspiring journey. As we enter 2025, we look forward to discovering even more stories of the people behind games and apps businesses on Google Play.




The Second Developer Preview of Android 16

Posted by Matthew McCullough – VP of Product Management, Android Developer


The second developer preview of Android 16 is now available to test with your apps. This build includes changes designed to enhance the app experience, improve battery life, and boost performance while minimizing incompatibilities, and your feedback is critical in helping us understand the full impact of this work.

System triggered profiling

ProfilingManager was added in Android 15, giving apps the ability to request profiling data collection using Perfetto on public devices in the field. To help capture challenging trace scenarios such as startups or ANRs, ProfilingManager now includes System Triggered Profiling. Apps can use ProfilingManager#addProfilingTriggers() to register interest in receiving information about these flows. Flows covered in this release include onFullyDrawn for activity-based cold starts, and ANRs.

val anrTrigger = ProfilingTrigger.Builder(ProfilingTrigger.TRIGGER_TYPE_ANR)
    .setRateLimitingPeriodHours(1)
    .build()

val startupTrigger: ProfilingTrigger =  //...

mProfilingManager.addProfilingTriggers(listOf(anrTrigger, startupTrigger))

Start component in ApplicationStartInfo

ApplicationStartInfo was added in Android 15, allowing an app to see reasons for process start, start type, start times, throttling, and other useful diagnostic data. Android 16 adds getStartComponent() to distinguish what component type triggered the start, which can be helpful for optimizing the startup flow of your app.
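
To see which component type kicked off a given start, you might combine it with the Android 15 entry point like this (a minimal sketch; the log tag and record count are illustrative):

import android.app.ActivityManager
import android.content.Context
import android.util.Log

// Sketch: log the reason and triggering component type for recent
// process starts. getHistoricalProcessStartReasons() is from Android 15;
// getStartComponent() is the new Android 16 addition.
fun logRecentStarts(context: Context) {
    val am = context.getSystemService(ActivityManager::class.java)
    am.getHistoricalProcessStartReasons(/* maxNum = */ 5).forEach { info ->
        Log.d("StartInfo", "reason=${info.reason} component=${info.startComponent}")
    }
}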

Richer Haptics

Android has exposed limited control over the haptic actuator since its inception.

Android 11 added support for more complex haptic effects that more advanced actuators can support through VibrationEffect.Compositions of device-defined semantic primitives.
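
For reference, composing those device-defined primitives looks roughly like this (a sketch against the Android 11-era API; which primitives are available depends on the device's actuator):

import android.os.VibrationEffect
import android.os.Vibrator

// Sketch: a full-strength click followed 100 ms later by a softer tick,
// built from device-defined semantic primitives (API 30+).
fun playComposedEffect(vibrator: Vibrator) {
    val effect = VibrationEffect.startComposition()
        .addPrimitive(VibrationEffect.Composition.PRIMITIVE_CLICK, /* scale = */ 1.0f)
        .addPrimitive(VibrationEffect.Composition.PRIMITIVE_TICK, /* scale = */ 0.7f, /* delay = */ 100)
        .compose()
    vibrator.vibrate(effect)
}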

Android 16 adds haptic APIs that let apps define the amplitude and frequency curves of a haptic effect while abstracting away differences between device capabilities.

Better job introspection

Android 16 introduces JobScheduler#getPendingJobReasons(int jobId) which can return multiple reasons why a job is pending, due to both explicit constraints set by the developer and implicit constraints set by the system.

We're also introducing JobScheduler#getPendingJobReasonsHistory(int jobId), which returns a list of the most recent constraint changes.

The API can help you debug why your jobs may not be executing, especially if you're seeing reduced success rates for certain tasks or increased latency in job completion. It can also help you understand whether jobs are failing to complete due to system-defined constraints rather than constraints you set explicitly.
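
A sketch of what that debugging could look like (the job id and log tag are hypothetical, and the int-array return shape is an assumption based on the new API's description):

import android.app.job.JobScheduler
import android.content.Context
import android.util.Log

// Sketch: ask the system why a previously scheduled job hasn't run yet.
fun logPendingJobReasons(context: Context, jobId: Int) {
    val scheduler = context.getSystemService(JobScheduler::class.java)
    // Each entry is a PENDING_JOB_REASON_* constant covering an explicit
    // constraint you set or an implicit one imposed by the system.
    val reasons = scheduler.getPendingJobReasons(jobId)
    Log.d("JobDebug", "Job $jobId pending because: ${reasons.joinToString()}")
}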

Adaptive refresh rate

Adaptive refresh rate (ARR), introduced in Android 15, enables the display refresh rate on supported hardware to adapt to the content frame rate using discrete VSync steps. This reduces power consumption while eliminating the need for potentially jank-inducing mode-switching.

Android 16 DP2 introduces hasArrSupport() and getSuggestedFrameRate(int) while restoring getSupportedRefreshRates() to make it easier for your apps to take advantage of ARR.

RecyclerView 1.4 internally supports ARR when it is settling from a fling or smooth scroll, and we're continuing our work to add ARR support into more Jetpack libraries. This frame rate article covers many of the APIs you can use to set the frame rate so that your app can directly leverage ARR.
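
As one example of those frame rate APIs, an app rendering to a Surface can hint its preferred content frame rate directly (a sketch using the long-standing Surface API, not an ARR-specific call):

import android.view.Surface

// Sketch: hint the content frame rate so an ARR-capable display can
// settle on a matching refresh rate. The seamless strategy asks the
// system to switch rates only when it won't cause a visible jump.
fun hintFrameRate(surface: Surface, fps: Float) {
    surface.setFrameRate(
        fps,
        Surface.FRAME_RATE_COMPATIBILITY_DEFAULT,
        Surface.CHANGE_FRAME_RATE_ONLY_IF_SEAMLESS
    )
}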

Job execution optimizations

Starting in Android 16, we're adjusting regular and expedited job execution runtime quota based on the following factors:

    • Which app standby bucket the application is in; apps in the active standby bucket are given a generous runtime quota.
    • Jobs started while the app is visible to the user that continue after the app becomes invisible will adhere to the job runtime quota.
    • Jobs that are executing concurrently with a foreground service will adhere to the job runtime quota. If you need to perform a data transfer that may take a long time, consider using a user-initiated data transfer.
Note: To understand how to further debug and test the behavior change, read more about JobScheduler quota optimizations.

Fully deprecating JobInfo#setImportantWhileForeground

The JobInfo.Builder#setImportantWhileForeground(boolean) method indicates the importance of a job while the scheduling app is in the foreground or when temporarily exempted from background restrictions.

This method has been deprecated since Android 12 (API 31). Starting in Android 16, it no longer functions, and calls to it are ignored.

This removal of functionality also applies to JobInfo#isImportantWhileForeground(). Starting in Android 16, the method always returns false.

Deprecated Disruptive Accessibility Announcements

Android 16 DP2 deprecates accessibility announcements, characterized by the use of announceForAccessibility or the dispatch of TYPE_ANNOUNCEMENT AccessibilityEvents. They can create inconsistent user experiences for users of TalkBack, Android's screen reader, and alternatives better serve a broader range of user needs across a variety of Android's assistive technologies.

Alternatives include accessibility pane titles for significant UI changes, live regions for dynamic content updates, and error semantics for input validation. The documentation for the deprecated announceForAccessibility API includes more detail on suggested alternatives.

Cloud search in photo picker

The photo picker provides a safe, built-in way for users to grant your app access to selected images and videos from both local and cloud storage, instead of their entire media library. Using a combination of Modular System Components through Google System Updates and Google Play services, it's supported back to Android 4.4 (API level 19). Integration requires just a few lines of code with the associated Android Jetpack library.
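
Those few lines look roughly like this with the Jetpack ActivityResult contract (a minimal sketch inside a ComponentActivity; the class and tag names are illustrative):

import android.net.Uri
import android.util.Log
import androidx.activity.ComponentActivity
import androidx.activity.result.PickVisualMediaRequest
import androidx.activity.result.contract.ActivityResultContracts.PickVisualMedia

class PickerActivity : ComponentActivity() {
    // The callback receives the selected item's Uri, or null if the
    // user cancelled; the app gets read access to just this item.
    private val pickMedia = registerForActivityResult(PickVisualMedia()) { uri: Uri? ->
        if (uri != null) Log.d("PhotoPicker", "Selected: $uri")
    }

    fun launchPhotoPicker() {
        // Restrict the picker to images; use ImageAndVideo for both.
        pickMedia.launch(PickVisualMediaRequest(PickVisualMedia.ImageOnly))
    }
}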

The developer preview includes new APIs to enable searching from the cloud media provider for the Android photo picker. Search functionality in the photo picker is coming soon.

Ranging with enhanced security

Android 16 adds support for robust security features in WiFi location on supported devices with WiFi 6's 802.11az, allowing apps to combine the higher accuracy, greater scalability, and dynamic scheduling of the protocol with security enhancements including AES-256-based encryption and protection against MITM attacks. This allows it to be used more safely in proximity use cases, such as unlocking a laptop or a vehicle door. 802.11az is integrated with the Wi-Fi 6 standard, leveraging its infrastructure and capabilities for wider adoption and easier deployment.

Health Connect updates

Health Connect in the developer preview adds ACTIVITY_INTENSITY, a new datatype defined according to World Health Organization guidelines around moderate and vigorous activity. Each record requires the start time, the end time and whether the activity intensity is moderate or vigorous.

Health Connect also contains updated APIs supporting health records. This allows apps to read and write medical records in FHIR format with explicit user consent. These APIs are currently in an early access program; sign up if you'd like to take part.

Predictive back additions

Android 16 adds new APIs to help you enable predictive back system animations in gesture navigation such as the back-to-home animation. Registering the onBackInvokedCallback with the new PRIORITY_SYSTEM_NAVIGATION_OBSERVER allows your app to receive the regular onBackInvoked call whenever the system handles a back navigation without impacting the normal back navigation flow.
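
Registering such an observer might look like this (a sketch inside an Activity running on Android 16):

import android.window.OnBackInvokedCallback
import android.window.OnBackInvokedDispatcher

// Sketch: observe back navigations the system handles without consuming
// them. PRIORITY_SYSTEM_NAVIGATION_OBSERVER is new in Android 16.
val backObserver = OnBackInvokedCallback {
    // A system-handled back navigation just happened; react without
    // interfering with the normal back flow.
}

fun registerBackObserver(dispatcher: OnBackInvokedDispatcher) {
    dispatcher.registerOnBackInvokedCallback(
        OnBackInvokedDispatcher.PRIORITY_SYSTEM_NAVIGATION_OBSERVER,
        backObserver
    )
}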

Android 16 additionally adds the finishAndRemoveTaskCallback() and moveTaskToBackCallback(). By registering these callbacks with the OnBackInvokedDispatcher, the system can trigger specific behaviors and play corresponding ahead-of-time animations when the back gesture is invoked.

Two Android API releases in 2025

This preview is for the next major release of Android with a planned launch in Q2 of 2025 and we plan to have another release with new developer APIs in Q4. The Q2 major release will be the only release in 2025 to include planned behavior changes that could affect apps. The Q4 minor release will pick up feature updates, optimizations, and bug fixes; it will not include any app-impacting behavior changes.

2025 SDK release timeline showing a features only update in Q1 and Q3, a major SDK release with behavior changes, APIs, and features in Q2, and a minor SDK release with APIs and features in Q4

We'll continue to have quarterly Android releases. The Q1 and Q3 updates in-between the API releases will provide incremental updates to help ensure continuous quality. We’re actively working with our device partners to bring the Q2 release to as many devices as possible.

There’s no change to the target API level requirements and the associated dates for apps in Google Play; our plans are for one annual requirement each year, and that will be tied to the major API level.

How to get ready

In addition to performing compatibility testing on the next major release, make sure that you're compiling your apps against the new SDK, and use the compatibility framework to enable targetSdkVersion-gated behavior changes as they become available for early testing.
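
Compiling against the preview SDK is a small build-file change; here is a Kotlin DSL sketch, assuming an AGP version with preview SDK support ("Baklava" is the Android 16 preview codename):

// build.gradle.kts (module)
android {
    // Compile against the Android 16 preview SDK.
    compileSdkPreview = "Baklava"

    defaultConfig {
        // Opt in to preview behavior changes for early testing.
        targetSdkPreview = "Baklava"
    }
}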

App compatibility

The Android 16 production timeline shows the release stages, highlighting 'Beta Releases' and 'Platform Stability' in blue and green, respectively, from December to the final release.

The Android 16 Preview program runs from November 2024 until the final public release next year. At key development milestones, we'll deliver updates for your development and testing environments. Each update includes SDK tools, system images, emulators, API reference, and API diffs. We'll highlight critical APIs as they are ready to test in the preview program in blogs and on the Android 16 developer website.

We’re targeting Late Q1 of 2025 for our Platform Stability milestone. At this milestone, we’ll deliver final SDK/NDK APIs and also final internal APIs and app-facing system behaviors. We’re expecting to reach Platform Stability in March 2025, and from that time you’ll have several months before the official release to do your final testing. Learn more in the release timeline details.

Get started with Android 16

You can get started today with Developer Preview 2 by flashing a system image and updating the tools. If you are currently on Developer Preview 1, you will automatically get an over-the-air update to Developer Preview 2. We're looking for your feedback so please report issues and submit feature requests on the feedback page. The earlier we get your feedback, the more we can include in the final release.

For the best development experience with Android 16, we recommend that you use the latest preview of the Android Studio Ladybug feature drop. Once you’re set up, here are some of the things you should do:

    • Compile against the new SDK, test in CI environments, and report any issues in our tracker on the feedback page.
    • Test your current app for compatibility, learn whether your app is affected by changes in Android 16, and install your app onto a device or emulator running Android 16 and extensively test it.

We’ll update the preview system images and SDK regularly throughout the Android 16 release cycle. This preview release is for developers only and not intended for daily consumer use. We're making it available by manual download. Once you’ve manually installed a preview build, you’ll automatically get future updates over-the-air for all later previews and Betas.

If you've already installed Android 15 QPR Beta 2 and would like to flash Android 16 Developer Preview 2, you can do so without first having to wipe your device.

As we reach our Beta releases, we'll be inviting consumers to try Android 16 as well, and we'll open up enrollment for Android 16 in the Android Beta program at that time.

For complete information, visit the Android 16 developer site.

How Instagram enabled users to take stunning Low Light Photos

Posted by Donovan McMurray – Developer Relations Engineer

Instagram, the popular photo and video sharing social networking service, is constantly delighting users with a best-in-class camera experience. Recently, Instagram launched another improvement on Android with their Night Mode implementation.

As devices and their cameras become more and more capable, users expect better quality images in a wider variety of settings. Whether it’s a night out with friends or the calmness right after you get your baby to fall asleep, the special moments users want to capture often don’t have ideal lighting conditions.

Now, when Instagram users on Android take a photo in low light environments, they'll see a moon icon that allows them to activate Night Mode for better image quality. This feature is currently available to users with any Pixel device from the 6 series and up, a Samsung Galaxy S24 Ultra, or a Samsung Galaxy Z Flip6 or Z Fold6, with more devices to follow.

Moving image showing the user experience of taking a photo of a shelf with plants, oranges, and decorative items in low light

Leveraging Device-specific Camera Technologies

Android enables apps to take advantage of device-specific camera features through the Camera Extensions API. The Extensions framework currently provides functionality like Night Mode for low-light image captures, Bokeh for applying portrait-style background blur, and Face Retouch for beauty filters. All of these features are implemented by the Original Equipment Manufacturers (OEMs) in order to maximize the quality of each feature on the hardware it's running on.

Furthermore, exposing this OEM-specific functionality through the Extensions API allows developers to use a consistent implementation across all of these devices, getting the best of both worlds: implementations that are tuned to a wide-range of devices with a unified API surface. According to Nilesh Patel, a Software Engineer at Instagram, “for Meta’s billions of users, having to write custom code for each new device is simply not scalable. It would also add unnecessary app size when Meta users download the app. Hence our guideline is ‘write once to scale to billions’, favoring platform APIs.”

More and more OEMs are supporting Extensions, too! There are already over 120 different devices that support the Camera Extensions, representing over 75 million monthly active users. There’s never been a better time to integrate Extensions into your Android app to give your users the best possible camera experience.

Impact on Instagram

The results of adding Night Mode to Instagram have been very positive for Instagram users. Jin Cui, a Partner Engineer on Instagram, said “Night Mode has increased the number of photos captured and shared with the Instagram camera, since the quality of the photos are now visibly better in low-light scenes.”

Compare the following photos to see just how big of a difference Night Mode makes. The first photo is taken in Instagram with Night Mode off, the second photo is taken in Instagram with Night Mode on, and the third photo is taken with the native camera app with the device’s own low-light processing enabled.

A 3x3 grid of photos compares low-light performance across different smartphone cameras and Instagram's night mode. The photos show a shelf with plants, oranges, and decorative items, taken with a Pixel 9 Pro, Samsung Galaxy S24 Ultra, and Pixel 6 Pro, both with and without night mode enabled.

Ensuring Quality through Image Test Suite (ITS)

The Android Camera Image Test Suite (ITS) is a framework for testing images from Android cameras. ITS tests configure the camera and capture shots to verify expected image data. These tests are functional and ensure advertised camera features work as expected. A tablet mounted on one side of the ITS box displays the test chart. The device under test is mounted on the opposite side of the ITS box.

Devices must pass the ITS tests for any feature that the device claims to support for apps to use, including the tests we have for the Night Mode Camera Extension.

Regular field-of-view (RFoV) ITS box Rev1b showing the device mounting brackets

The Android Camera team faced the challenge of ensuring the Night Mode Camera Extension feature functioned consistently across all devices in a scalable way. This required creating a testing environment with very low light and a wide dynamic range. This configuration was necessary to simulate real-world lighting scenarios, such as a city at night with varying levels of brightness and shadow, or the atmospheric lighting of a restaurant.

The first step to designing the test was to define the specific lighting conditions to simulate. Field testing with a light meter in various locations and lighting conditions was conducted to determine the target lux level. The goal was to ensure the camera could capture clear images in low-light conditions, which led to the establishment of 3 lux as the target lux level. The figure below shows various lighting conditions and their respective lux values.

Evaluation of scenes of varying lighting conditions measured with a Light Meter

The next step was to develop a test chart to accurately measure a wide dynamic range in a low light environment. The team developed and iterated on several test charts and arrived at the design shown below. The chart arranges a grid of squares in varying shades of grey, with a red outline defining the test area for cropping so that darker external regions are excluded. The grid follows a Hilbert curve pattern to minimize abrupt light or dark transitions. The design allows for both quantitative measurements and simulation of a broad range of light conditions.

Low Light test chart displayed on tablet in ITS box

The test captures an image of the chart using the Night Mode Camera Extension in low light conditions. The image is used to evaluate the improvement in the shadows and midtones while ensuring the highlights aren't saturated. The evaluation involves two criteria: the average luma value of the six darkest boxes must be at least 85, and the average luma contrast between these boxes must be at least 17. The figure below shows the test capture and chart results.

Night Mode Camera Extension capture and test chart result
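
Expressed as a quick check, the two pass criteria amount to something like the following (an illustrative helper only; the real evaluation lives in the ITS tooling, and "contrast" is modeled here as the average difference between successive boxes):

import kotlin.math.abs

// Illustrative: verify the luma criteria for the six darkest boxes.
fun passesLowLightCriteria(sixDarkestBoxLumas: List<Double>): Boolean {
    val averageLuma = sixDarkestBoxLumas.average()
    val averageContrast =
        sixDarkestBoxLumas.zipWithNext { a, b -> abs(a - b) }.average()
    return averageLuma >= 85.0 && averageContrast >= 17.0
}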

By leveraging the existing ITS infrastructure, the Android Camera team was able to provide consistent, high quality Night Mode Camera Extension captures. This gives application developers the confidence to integrate and enable Night Mode captures for their users. It also allows OEMs to validate their implementations and ensure users get the best quality capture.

How to Implement Night Mode with Camera Extensions

Camera Extensions are available to apps built with Camera2 or CameraX. In this section, we’ll walk through each of the features Instagram implemented. The code examples will use CameraX, but you’ll find links to the Camera2 documentation at each step.

Enabling Night Mode Extension

Night Mode involves combining multiple exposures into a single still photo for better quality shots in low-light environments. So first, you’ll need to check for Night Mode availability, and tell the camera system to start a Camera Extension session. With CameraX, this is done with an ExtensionsManager instead of the standard CameraManager.

private suspend fun setUpCamera() {
  // Obtain an instance of a process camera provider. The camera provider
  // provides access to the set of cameras associated with the device.
  // The camera obtained from the provider will be bound to the activity lifecycle.
  val cameraProvider = ProcessCameraProvider.getInstance(application).await()

  // Obtain an instance of the extensions manager. The extensions manager 
  // enables a camera to use extension capabilities available on the device.
  val extensionsManager = ExtensionsManager.getInstanceAsync(
    application, cameraProvider).await()

  // Select the camera.
  val cameraSelector = CameraSelector.DEFAULT_BACK_CAMERA

  // Query if extension is available. Not all devices will support 
  // extensions or might only support a subset of extensions.
  if (extensionsManager.isExtensionAvailable(cameraSelector, ExtensionMode.NIGHT)) {
    // Unbind all use cases before enabling different extension modes.
    try {
      cameraProvider.unbindAll()

      // Retrieve a night extension enabled camera selector
      val nightCameraSelector = extensionsManager.getExtensionEnabledCameraSelector(
        cameraSelector,
        ExtensionMode.NIGHT
      )

      // Bind image capture and preview use cases with the extension enabled camera
      // selector.
      val imageCapture = ImageCapture.Builder().build()
      val preview = Preview.Builder().build()
        
      // Connect the preview to receive the surface the camera outputs the frames
      // to. This will allow displaying the camera frames in either a TextureView
      // or SurfaceView. The SurfaceProvider can be obtained from the PreviewView.
      preview.setSurfaceProvider(surfaceProvider)

      // Returns an instance of the camera bound to the lifecycle
      // Use this camera object to control various operations with the camera
      // Example: flash, zoom, focus metering etc.
      val camera = cameraProvider.bindToLifecycle(
        lifecycleOwner,
        nightCameraSelector,
        imageCapture,
        preview
      )
    } catch (e: Exception) {
      Log.e(TAG, "Use case binding failed", e)
    }
  } else {
    // In the case where the extension isn't available, you should set up
    // CameraX normally with non-extension-enabled CameraSelector.
  }
}

To do this in Camera2, see the Create a CameraExtensionSession with the Camera2 Extensions API guide.

Implementing the Progress Bar and PostView Image

For an even more elevated user experience, you can provide feedback while the Night Mode capture is processing. In Android 14, we added callbacks for the progress and for post view, which is a temporary image capture before the Night Mode processing is complete. The below code shows how to use these callbacks in the takePicture() method. The actual implementation to update the UI is very app-dependent, so we’ll leave the actual UI updating code to you.

// When setting up the ImageCapture.Builder, set postviewEnabled and
// postviewResolutionSelector in order to get a PostView bitmap in the
// onPostviewBitmapAvailable callback when takePicture() is called.
val cameraInfo = cameraProvider.getCameraInfo(cameraSelector)
val isPostviewSupported =
  ImageCapture.getImageCaptureCapabilities(cameraInfo).isPostviewSupported

val postviewResolutionSelector = ResolutionSelector.Builder()
  .setAspectRatioStrategy(AspectRatioStrategy(
    AspectRatioStrategy.RATIO_16_9_FALLBACK_AUTO_STRATEGY, 
    AspectRatioStrategy.FALLBACK_RULE_AUTO))
  .setResolutionStrategy(ResolutionStrategy(
    previewSize, 
    ResolutionStrategy.FALLBACK_RULE_CLOSEST_LOWER_THEN_HIGHER
  ))
  .build()

imageCapture = ImageCapture.Builder()
  .setTargetAspectRatio(AspectRatio.RATIO_16_9)
  .setPostviewEnabled(isPostviewSupported)
  .setPostviewResolutionSelector(postviewResolutionSelector)
  .build()

// When the Night Mode photo is being taken, define these additional callbacks
// to implement PostView and a progress indicator in your app.
imageCapture.takePicture(
  outputFileOptions,
  Dispatchers.Default.asExecutor(),
  object : ImageCapture.OnImageSavedCallback {
    override fun onImageSaved(outputFileResults: ImageCapture.OutputFileResults) {
      // The final Night Mode image is saved; replace the PostView placeholder
    }

    override fun onError(exception: ImageCaptureException) {
      // Handle the capture failure
    }

    override fun onPostviewBitmapAvailable(bitmap: Bitmap) {
      // Add the Bitmap to your UI as a placeholder while the final result is processed
    }

    override fun onCaptureProcessProgressed(progress: Int) {
      // Use the progress value to update your UI; values go from 0 to 100.
    }
  }
)

To accomplish this in Camera2, see the CameraFragment.kt file in the Camera2Extensions sample app.

Implementing the Moon Icon Indicator

Another user-focused design touch is showing the moon icon to let the user know that a Night Mode capture will happen. It's also a good idea to let the user tap the moon icon to disable Night Mode capture. An upcoming API in Android 16 will let you know when the device is in a low-light environment.

Here are the possible values for the Night Mode Indicator API:

    • UNKNOWN: The camera is unable to reliably detect the lighting conditions of the current scene to determine if a photo will benefit from a Night Mode Camera Extension capture.

    • OFF: The camera has detected lighting conditions that are sufficiently bright. Night Mode Camera Extension is available but may not be able to optimize the camera settings to take a higher quality photo.

    • ON: The camera has detected low-light conditions. It is recommended to use Night Mode Camera Extension to optimize the camera settings to take a high-quality photo in the dark.

Next Steps

Read more about Android’s camera APIs in the Camera2 guides and the CameraX guides. Once you’ve got the basics down, check out the Android Camera and Media Dev Center to take your camera app development to the next level. For more details on upcoming Android features, like the Night Mode Indicator API, get started with the Android 16 Preview program.

What’s new in CameraX 1.4.0 and a sneak peek of Jetpack Compose support

Posted by Scott Nien – Software Engineer (scottnien@)

Get ready to level up your Android camera apps! CameraX 1.4.0 just dropped with a load of awesome new features and improvements: expanded HDR capabilities, preview stabilization, a versatile effect framework, and more. We'll also get a sneak peek of how to seamlessly integrate CameraX with Jetpack Compose! Let's dive in and see how these enhancements can take your camera app to the next level.

HDR preview and Ultra HDR

A split-screen image compares Standard Dynamic Range (SDR) and High Dynamic Range (HDR) image quality side-by-side using a singular image of a detailed landscape. The HDR side is more vivid and vibrant.

High Dynamic Range (HDR) is a game-changer for photography, capturing a wider range of light and detail to create stunningly realistic images. With CameraX 1.3.0, we brought you HDR video recording capabilities, and now in 1.4.0, we're taking it even further! Get ready for HDR Preview and Ultra HDR. These exciting additions empower you to deliver an even richer visual experience to your users.

HDR Preview

This new feature allows you to enable HDR on Preview without needing to bind a VideoCapture use case. This is especially useful for apps that use a single preview stream, processed by an OpenGL pipeline, both for on-screen display and for video recording.

To fully enable HDR, ensure your OpenGL pipeline can process the specific dynamic range format, then check the camera's capabilities.

The following code snippet enables HLG10, the baseline HDR standard that device makers must support on cameras with 10-bit output.

// Declare your OpenGL pipeline supported dynamic range format.
val openGLPipelineSupportedDynamicRange = setOf(
    DynamicRange.SDR,
    DynamicRange.HLG_10_BIT
)

// Check camera dynamic range capabilities.
val isHlg10Supported =
    cameraProvider.getCameraInfo(cameraSelector)
        .querySupportedDynamicRanges(openGLPipelineSupportedDynamicRange)
        .contains(DynamicRange.HLG_10_BIT)

val preview = Preview.Builder().apply {
    if (isHlg10Supported) {
        setDynamicRange(DynamicRange.HLG_10_BIT)
    }
}.build()

Ultra HDR

Introducing Ultra HDR, a new format in Android 14 that lets users capture stunningly realistic photos with incredible dynamic range. And the best part? CameraX 1.4.0 makes it incredibly easy to add Ultra HDR capture to your app with just a few lines of code:

val cameraSelector = CameraSelector.DEFAULT_BACK_CAMERA
val cameraInfo = cameraProvider.getCameraInfo(cameraSelector)
val isUltraHdrSupported = 
      ImageCapture.getImageCaptureCapabilities(cameraInfo)
                  .supportedOutputFormats
                  .contains(ImageCapture.OUTPUT_FORMAT_JPEG_ULTRA_HDR)

val imageCapture = ImageCapture.Builder().apply {
    if (isUltraHdrSupported) {
        setOutputFormat(ImageCapture.OUTPUT_FORMAT_JPEG_ULTRA_HDR)
    }
}.build()

Jetpack Compose support

While this post focuses on 1.4.0, we're excited to announce the Jetpack Compose support in CameraX 1.5.0 alpha. We’re adding support for a Composable Viewfinder built on top of AndroidExternalSurface and AndroidEmbeddedExternalSurface. The CameraXViewfinder Composable hooks up a display surface to a CameraX Preview use case, handling the complexities of rotation, scaling and Surface lifecycle so you don’t need to.

// in build.gradle 
implementation ("androidx.camera:camera-compose:1.5.0-alpha03")


class PreviewViewModel : ViewModel() {
    private val _surfaceRequests = MutableStateFlow<SurfaceRequest?>(null)

    val surfaceRequests: StateFlow<SurfaceRequest?>
        get() = _surfaceRequests.asStateFlow()

    private fun produceSurfaceRequests(previewUseCase: Preview) {
        // Always publish new SurfaceRequests from Preview
        previewUseCase.setSurfaceProvider { newSurfaceRequest ->
            _surfaceRequests.value = newSurfaceRequest
        }
    }

    // ...
}

@Composable
fun MyCameraViewfinder(
    viewModel: PreviewViewModel,
    modifier: Modifier = Modifier
) {
    val currentSurfaceRequest: SurfaceRequest? by
        viewModel.surfaceRequests.collectAsState()

    currentSurfaceRequest?.let { surfaceRequest ->
        CameraXViewfinder(
            surfaceRequest = surfaceRequest,
            implementationMode = ImplementationMode.EXTERNAL, // Or EMBEDDED
            modifier = modifier        
        )
    }
}

Kotlin-friendly APIs

CameraX is getting even more Kotlin-friendly! In 1.4.0, we've introduced two new suspend functions to streamline camera initialization and image capture.

// CameraX initialization
val cameraProvider = ProcessCameraProvider.awaitInstance(context)

// Suspends until the capture completes, then returns the result
val imageProxy = imageCapture.takePicture()
// Process the imageProxy, then release it
imageProxy.close()

Preview Stabilization and Mirror mode

Preview Stabilization

Preview stabilization mode was added in Android 13 to enable the stabilization on all non-RAW streams, including previews and MediaCodec input surfaces. Compared to the previous video stabilization mode, which may have inconsistent FoV (Field of View) between the preview and recorded video, this new preview stabilization mode ensures consistency and thus provides a better user experience. For apps that record the preview directly for video recording, this mode is also the only way to enable stabilization.

Follow the code below to enable preview stabilization. Note that once preview stabilization is turned on, it is applied not only to the Preview but also to the VideoCapture if one is bound.

val isPreviewStabilizationSupported =  
    Preview.getPreviewCapabilities(cameraProvider.getCameraInfo(cameraSelector))
        .isStabilizationSupported
val preview = Preview.Builder().apply {
    if (isPreviewStabilizationSupported) {
      setPreviewStabilizationEnabled(true)
    }
}.build()

MirrorMode

While CameraX 1.3.0 introduced mirror mode for VideoCapture, we've now brought this handy feature to Preview in 1.4.0. This is especially useful for devices with outer displays, allowing you to create a more natural selfie experience when using the rear camera.

To enable the mirror mode, simply call the Preview.Builder.setMirrorMode API. This feature is supported on Android 13 and above.
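
For example (a minimal sketch):

import androidx.camera.core.MirrorMode
import androidx.camera.core.Preview

// Sketch: mirror the preview stream, e.g. for an outer-display selfie
// experience with the rear camera.
val preview = Preview.Builder()
    .setMirrorMode(MirrorMode.MIRROR_MODE_ON)
    .build()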

Real-time Effect

CameraX 1.3.0 introduced the CameraEffect framework, giving you the power to customize your camera output with OpenGL. Now, in 1.4.0, we're taking it a step further. In addition to applying your own custom effects, you can now leverage a set of pre-built effects provided by CameraX and Media3, making it easier than ever to enhance your app's camera features.

Overlay Effect

The new camera-effects artifact aims to provide ready-to-use effect implementations, starting with the OverlayEffect. This effect lets you draw overlays on top of camera frames using the familiar Canvas API.

The following sample code shows how to detect the QR code and draw the shape of the QR code once it is detected.

By default, drawing is performed in surface frame coordinates. But what if you need to use camera sensor coordinates? No problem! OverlayEffect provides the Frame#getSensorToBufferTransform function, allowing you to apply the necessary transformation matrix to your overlayCanvas.

In this example, we use CameraX's MLKit Vision APIs (MlKitAnalyzer) and specify COORDINATE_SYSTEM_SENSOR to obtain QR code corner points in sensor coordinates. This ensures accurate overlay placement regardless of device orientation or screen aspect ratio.

// in build.gradle
implementation ("androidx.camera:camera-effects:1.4.1")
implementation ("androidx.camera:camera-mlkit-vision:1.4.1")

var qrcodePoints: Array<Point>? = null
val paint = Paint().apply {
    style = Paint.Style.STROKE
    color = Color.GREEN
    strokeWidth = 5f
}
val qrcodeBoxEffect =
    OverlayEffect(
        PREVIEW /* applied on the preview only */,
        0 /* queueDepth */,
        Handler(Looper.getMainLooper()), {}
    )

fun initCamera() {
    qrcodeBoxEffect.setOnDrawListener { frame ->
        frame.overlayCanvas.drawColor(Color.TRANSPARENT, PorterDuff.Mode.CLEAR)
        qrcodePoints?.let {
            // Using sensor coordinates to draw.
            frame.overlayCanvas.setMatrix(frame.sensorToBufferTransform)
            val path = android.graphics.Path().apply {
                it.forEachIndexed { index, point ->
                    if (index == 0) {
                        moveTo(point.x.toFloat(), point.y.toFloat())
                    } else {
                        lineTo(point.x.toFloat(), point.y.toFloat())
                    }
                 }
                 lineTo(it[0].x.toFloat(), it[0].y.toFloat())
            }
            frame.overlayCanvas.drawPath(path, paint)
        }
        true
    }

    val imageAnalysis = ImageAnalysis.Builder()
        .build()
        .apply {
            setAnalyzer(executor,
                MlKitAnalyzer(
                    listOf(barcodeScanner!!),
                    COORDINATE_SYSTEM_SENSOR,
                    executor
                ) { result ->
                    val barcodes = result.getValue(barcodeScanner!!)
                    qrcodePoints = 
                        barcodes?.takeIf { it.size > 0}?.get(0)?.cornerPoints
                }
            )
        }

    val useCaseGroup = UseCaseGroup.Builder()
          .addUseCase(preview)
          .addUseCase(imageAnalysis)
          .addEffect(qrcodeBoxEffect)
          .build()

    cameraProvider.bindToLifecycle(
        lifecycleOwner, cameraSelector, useCaseGroup)
  }

Media3 Effect

Want to add stunning camera effects to your CameraX app? Now you can tap into the power of Media3's rich effects framework! This exciting integration allows you to apply Media3 effects to your CameraX output, including Preview, VideoCapture, and ImageCapture.

This means you can easily enhance your app with a wide range of professional-grade effects, from blurs and color filters to transitions and more. To get started, simply use the new androidx.camera:media3:media3-effect artifact.

Here's a quick example of how to apply a grayscale filter to your camera output:

// in build.gradle 
implementation ("androidx.camera.media3:media3-effect:1.0.0-alpha01")
implementation ("androidx.media3:media3-effect:1.5.0")

import androidx.camera.media3.effect.Media3Effect

val media3Effect = Media3Effect(
    requireContext(), PREVIEW or VIDEO_CAPTURE or IMAGE_CAPTURE,
    mainThreadExecutor(), {}
)
// use the grayscale effect
media3Effect.setEffects(listOf(RgbFilter.createGrayscaleFilter()))
cameraController.setEffects(setOf(media3Effect)) // or use the UseCaseGroup API

Here is what the effect looks like:

A black and white view from inside a coffee shop looking out at a city street.  The bottom of the photo shows the edge of a table with a laptop and two buttons labeled 'BACK' and 'RECORD'

Screen Flash

Taking selfies in low light just got easier with CameraX 1.4.0! This release introduces a powerful new feature: screen flash. Instead of relying on a traditional LED flash, which most selfie cameras don't have, screen flash cleverly utilizes your phone's display. By momentarily turning the screen bright white, it provides a burst of illumination that helps capture clear and vibrant selfies even in challenging lighting conditions.

Integrating screen flash into your CameraX app is flexible and straightforward. You have two main options:

      1. Implement the ScreenFlash interface: This gives you full control over the screen flash behavior. You can customize the color, intensity, duration, and any other aspect of the flash. This is ideal if you need a highly tailored solution.

      2. Use the built-in implementation: For a quick and easy solution, leverage the pre-built screen flash functionality in ScreenFlashView or PreviewView. This implementation handles all the heavy lifting for you.

If you're already using PreviewView in your app, enabling screen flash is incredibly simple. Just enable it directly on the PreviewView instance. If you need more control or aren't using PreviewView, you can use ScreenFlashView directly.

Here's a code example demonstrating how to enable screen flash:

// case 1: PreviewView + CameraX core API
previewView.setScreenFlashWindow(activity.window)
imageCapture.screenFlash = previewView.screenFlash
imageCapture.setFlashMode(ImageCapture.FLASH_MODE_SCREEN)

// case 2: PreviewView + CameraController
previewView.setScreenFlashWindow(activity.window)
cameraController.setImageCaptureFlashMode(ImageCapture.FLASH_MODE_SCREEN)

// case 3: use ScreenFlashView
screenFlashView.setScreenFlashWindow(activity.window)
imageCapture.setScreenFlash(screenFlashView.screenFlash)
imageCapture.setFlashMode(ImageCapture.FLASH_MODE_SCREEN)

Camera Extensions new features

Camera Extensions APIs aim to help apps access the cutting-edge capabilities previously available only in built-in camera apps. And the ecosystem is growing rapidly! In 2024, we've seen major players like Pixel, Samsung, Xiaomi, Oppo, OnePlus, Vivo, and Honor all embrace Camera Extensions, particularly for Night Mode and Bokeh Mode. CameraX 1.4.0 takes this even further by adding support for brand-new Android 15 Camera Extensions features, including:

    • Postview: Provides a preview of the captured image almost instantly before the long-exposure shots are completed
    • Capture Process Progress: Displays a progress indicator so users know how long capturing and processing will take, improving the experience for features like Night Mode
    • Extensions Strength: Allows users to fine-tune the intensity of the applied effect

Below is an example of the improved UX that uses postview and capture process progress features on Samsung S24 Ultra.

moving image capturing process progress features on Samsung S24 Ultra

Interested to know how this can be implemented? See the sample code below:

val extensionsCameraSelector =
    extensionsManager
        .getExtensionEnabledCameraSelector(DEFAULT_BACK_CAMERA, extensionMode)
val isPostviewSupported = ImageCapture.getImageCaptureCapabilities(
    cameraProvider.getCameraInfo(extensionsCameraSelector)
).isPostviewSupported
val imageCapture = ImageCapture.Builder().apply {
    setPostviewEnabled(isPostviewSupported)
}.build()

imageCapture.takePicture(outputFileOptions, executor,
    object : OnImageSavedCallback {
        override fun onImageSaved(outputFileResults: OutputFileResults) {
            // final image saved
        }
        override fun onError(exception: ImageCaptureException) {
            // handle the capture failure
        }
        override fun onPostviewBitmapAvailable(bitmap: Bitmap) {
            // Postview bitmap is available
        }
        override fun onCaptureProcessProgressed(progress: Int) {
            // capture process progress update
        }
    }
)

Important: If your app ran into the CameraX Extensions issue on Pixel 9 series devices, please use CameraX 1.4.1 instead. This release fixes a critical issue that prevented Night Mode from working correctly with takePicture.

What's Next

We hope you enjoy this new release. Our mission is to make camera development a joy, removing the friction and pain points so you can focus on innovation. With CameraX, you can easily harness the power of Android's camera capabilities and build truly amazing app experiences.

Have questions or want to connect with the CameraX team? Join the CameraX developer discussion group or file a bug report.

We can’t wait to see what you create!

Get your apps ready for 16 KB page size devices

Posted by Yacine Rezgui – Developer Relations Engineer, Steven Moreland – Staff Software Engineer

Android is evolving to deliver even faster, more performant experiences. One key improvement is the adoption of a 16 KB memory page size. This change enables the operating system to manage memory more efficiently, leading to noticeable performance gains (5-10%) in both apps and games. We provided an in-depth technical explanation and highlighted the performance improvements in Adding 16 KB Page Size to Android.

To help you test your app on 16 KB devices, this functionality is available as a developer option on Google Pixel 8 and 9 devices, and Samsung, Xiaomi, vivo, and other Android OEMs will soon offer similar support.

To ensure compatibility with 16 KB devices, apps that utilize native code, either directly or through libraries or SDKs, might require rebuilding. However, the transition is significantly easier than the previous shift from 32-bit to 64-bit architecture. This article will guide you through the necessary steps to prepare your apps for the upcoming devices. The next generation of devices is on its way, with the first models supporting 16 KB page sizes expected to arrive in a couple of years.

Getting ready for 16 KB: SDK developers

If you develop your own SDKs and libraries, we encourage you to update them to be 16 KB page size compatible and test them on 16 KB devices as soon as possible. This will give app developers ample time to incorporate the necessary changes. Registering with Play SDK Console is a great way to ensure you receive advanced notices like these in the future and in a timely manner.

Getting ready for 16 KB: app developers with no native code

Apps written entirely in Kotlin or the Java programming language, with all dependencies likewise free of native code, will work as-is!

Getting ready for 16 KB: app developers with native code

To check if your app has native code, you can utilize tools like APK Analyzer in Android Studio. However, the only way to ensure app compatibility is to test.

Rebuild your app

To ensure your app works on devices with a 16 KB page size, follow these steps:

      1. Upgrade your tools: Start by upgrading to Android Gradle Plugin (AGP) 8.5.1 or higher. These updated tools incorporate the necessary 16 KB page size configuration for your App Bundle and the APKs generated from it using bundletool.

      2. Align your native code: If your app includes native code, use NDK version r28 or higher, or rebuild it with 16 KB page size alignment. You should also ensure that your native code does not rely on or hardcode the value of PAGE_SIZE; a runtime sanity check is sketched after this list.

      3. Update SDKs and libraries: Confirm that all SDKs and libraries used in your app are compatible with 16 KB page size. If necessary, contact the SDK or library developers for updated versions.
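
As a runtime sanity check for step 2, you can log the page size the device actually uses instead of assuming 4096 bytes (a Kotlin sketch using the platform Os API; the log tag is illustrative):

import android.system.Os
import android.system.OsConstants
import android.util.Log

// Sketch: query the real page size at runtime rather than hardcoding it.
// On a 16 KB device or emulator this logs 16384 instead of 4096.
fun logPageSize() {
    val pageSize = Os.sysconf(OsConstants._SC_PAGESIZE)
    Log.d("PageSize", "System page size: $pageSize bytes")
}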

Test your app in 16 KB mode

To make sure your application does not assume a 4 KB page size anywhere, test it with a 16 KB page size emulator or virtual device in addition to your existing testing on 4 KB page sizes. This helps identify and resolve any compatibility issues from the move to 16 KB page sizes. You can also test on physical devices with the developer option, available on Pixel 8, 8a, and 8 Pro starting with Android 15 QPR1, and on Pixel 9, 9 Pro, and 9 Pro XL starting with the Android 15 QPR2 Beta 2, with more devices on the way.

The Future is Faster and More Efficient

The move to 16 KB page size benefits the Android ecosystem. It unlocks performance improvements, paves the way for future innovations, and provides users with smoother and richer app experiences.

We'll continue to provide updates and resources to help you through this transition. Start preparing your apps today to ensure you're ready for the future of Android!

#WeArePlay | Meet the people building sport apps and games

Posted by Robbie McLachlan – Developer Marketing

In a year filled with iconic sports moments—from the Olympic and Paralympic Games in Paris to the UEFA Euro Cup in Germany—our celebration of app and game businesses continues with nine new #WeArePlay stories. These founders are building sports apps and games that unite players, fans, and communities—from immersive sports simulations to apps that motivate runners with rewards like vouchers and free gifts.

Let’s take a look at some of my favourites.

Immerse yourself into your favourite sport with Hao, Yukun, and Mingming's simulator games

Hao, Yukun, and Mingming, founders of Feamber Games pose for a photo - Chengdu, China
Hao, Yukun and Mingming, co-founders of Feamber Games
Chengdu, China

Hao always dreamed of creating video games. After studying computer science, he joined a gaming company where he met Yukun and Mingming. Their shared passion for game design and long conversations about graphics, movie scenes, and nostalgic childhood games inspired them to start Feamber Games. Specializing in realistic 3D sports simulations like pool and archery, they’ve added competitive elements to enhance the experience. Recently, they’ve expanded into immersive games that let players build business empires and manage hotels. Now, the trio is focused on growing their global audience.

Anna’s boxing fitness app is a knockout, with tailored training and on-demand classes

Anna, founder of Boxx, sits cross-legged and smiles at the camera - London, UK
Anna, founder of Boxx
London, UK

Anna discovered her love for boxing at 11, staying dedicated to non-contact training throughout adulthood. After a career in accounting and becoming a mother, she struggled to attend classes, inspiring her to create Boxx – an app that brings boxing training to any location. Collaborating with fitness instructors, she developed personalized sessions, hybrid workouts, expert-led on-demand classes, and progress tracking. With hands-free guided audio and community features coming soon, Anna is regularly reviewing feedback to find innovative approaches to improve boxers’ experiences.

Get active and track your progress with Yi Hern, Dana, and Pearl's running app

Yi Hern, co-founder of JomRun, smiling at the camera - Cyberjaya, Malaysia
Yi Hern, co-founder of JomRun
Cyberjaya, Malaysia

After creating a successful augmented reality game, childhood friends Yi Hern, Dana, and Pearl decided to inspire people to stay active. Combining Yi Hern's engineering skills, Dana's visual arts expertise, and Pearl's scientific background, they developed JomRun – Let’s Run. The app allows runners to track their progress, earn rewards like vouchers and free gifts, and easily join marathons. With teams in Malaysia and Singapore, and plans to introduce new features, the trio is gearing up to expand across Southeast Asia.

Ohjun and Jaeho’s volleyball game gets high scores from players worldwide

Ohjun and Jaeho, co-founders of SUNCYAN - Seoul, South Korea
Ohjun and Jaeho, co-founders of SUNCYAN
Seoul, South Korea

Ohjun and Jaeho, childhood friends from an online game development community, combined their love for game building and volleyball to create The Spike - Volleyball Story. After a successful test release on Google Play, the game gained popularity in South Korea, inspiring them to improve it and reach a global audience. They added new features like story and tournament modes, plus a complete UX overhaul, all to recreate the excitement of real-life volleyball. Now, they’re focused on creating even more thrilling sports games.



