
Spotlight Week: Android Camera and Media

Posted by Caren Chang – Android Developer Relations Engineer

Android offers Camera and Media APIs to help you build apps that can capture, edit, share, and play media. To help you make your app's camera and media experiences even more delightful for your users, this week we're kicking off Camera and Media Spotlight Week.

This Spotlight Week will provide resources—blog posts, videos, sample code, and more—all designed to help you uplevel the media experiences in your app. Check out highlights from the latest releases in Camera and Media APIs, including better Jetpack Compose support in CameraX, motion photo support in Media3 Transformer, simpler ExoPlayer setup, and much more! We’ll also bring in developers from the community to talk about their experiences building Android camera and media apps.


Here’s what we’re covering during Camera and Media Spotlight week:

What’s new in camera and media

Tuesday, January 7

Check out what’s new in the latest CameraX and Media3 releases, including how to get started with building Camera apps with Compose.

Creating delightful and premium experiences

Wednesday, January 8

Building delightful and premium experiences for your users is what can help your app really stand out. Learn about different ways to achieve this, such as utilizing the Media Performance Class or enabling HDR video capture in your app. Learn from developers, such as how Google Drive enabled Ultra HDR images in their Android app and how Instagram improved the in-app image capture experience by implementing Night Mode.

Adaptive for camera and media, for large screens and now XR!

Thursday, January 9

Thinking adaptive is important, so your app works just as well on phones as it does on large screens, like foldables, tablets, ChromeOS, cars, and the new Android XR platform! On Thursday, we'll be diving into the media experience on large screen devices, and how you can build a smooth tabletop mode for your camera applications. Prepare your apps for XR devices by considering Spatial Audio and Video.

Media creation

Friday, January 10

Capturing, editing, and processing media content are fundamental features of the Android ecosystem. Learn how Media3's Transformer module can help with your app's media processing use cases, and see case studies of apps that are using Transformer in production. Hear how the 1 Second Everyday Android app approaches media use cases, and check out a new API that allows apps to capture concurrent camera streams. Learn from Android Google Developer Tom Colvin how he experimented with building an AI-powered camera app.


These are just some of the things to think about when building camera and media experiences in your app. Keep checking this blog post for updates; we’ll be adding links and more throughout the week.

How Instagram enabled users to take stunning Low Light Photos

Posted by Donovan McMurray – Developer Relations Engineer

Instagram, the popular photo and video sharing social networking service, is constantly delighting users with a best-in-class camera experience. Recently, Instagram launched another improvement on Android with their Night Mode implementation.

As devices and their cameras become more and more capable, users expect better quality images in a wider variety of settings. Whether it’s a night out with friends or the calmness right after you get your baby to fall asleep, the special moments users want to capture often don’t have ideal lighting conditions.

Now, when Instagram users on Android take a photo in low light environments, they’ll see a moon icon that allows them to activate Night Mode for better image quality. This feature is currently available to users with any Pixel device from the 6 series and up, a Samsung Galaxy S24 Ultra, or a Samsung Flip6 or Fold6, with more devices to follow.

Moving image showing the user experience of taking a photo of a shelf with plants, oranges, and decorative items in low light

Leveraging Device-specific Camera Technologies

Android enables apps to take advantage of device-specific camera features through the Camera Extensions API. The Extensions framework currently provides functionality like Night Mode for low-light image captures, Bokeh for applying portrait-style background blur, and Face Retouch for beauty filters. All of these features are implemented by the Original Equipment Manufacturers (OEMs) in order to maximize the quality of each feature on the hardware it's running on.


Furthermore, exposing this OEM-specific functionality through the Extensions API allows developers to use a consistent implementation across all of these devices, getting the best of both worlds: implementations that are tuned to a wide range of devices with a unified API surface. According to Nilesh Patel, a Software Engineer at Instagram, “for Meta’s billions of users, having to write custom code for each new device is simply not scalable. It would also add unnecessary app size when Meta users download the app. Hence our guideline is ‘write once to scale to billions’, favoring platform APIs.”

More and more OEMs are supporting Extensions, too! There are already over 120 different devices that support the Camera Extensions, representing over 75 million monthly active users. There’s never been a better time to integrate Extensions into your Android app to give your users the best possible camera experience.

Impact on Instagram

The results of adding Night Mode have been very positive for Instagram users. Jin Cui, a Partner Engineer on Instagram, said “Night Mode has increased the number of photos captured and shared with the Instagram camera, since the quality of the photos are now visibly better in low-light scenes.”


Compare the following photos to see just how big of a difference Night Mode makes. The first photo is taken in Instagram with Night Mode off, the second photo is taken in Instagram with Night Mode on, and the third photo is taken with the native camera app with the device’s own low-light processing enabled.

A 3x3 grid of photos compares low-light performance across different smartphone cameras and Instagram's night mode. The photos show a shelf with plants, oranges, and decorative items, taken with a Pixel 9 Pro, Samsung Galaxy S24 Ultra, and Pixel 6 Pro, both with and without night mode enabled.

Ensuring Quality through Image Test Suite (ITS)

The Android Camera Image Test Suite (ITS) is a framework for testing images from Android cameras. ITS tests configure the camera and capture shots to verify expected image data. These tests are functional and ensure advertised camera features work as expected. A tablet mounted on one side of the ITS box displays the test chart. The device under test is mounted on the opposite side of the ITS box.

Devices must pass the ITS tests for any feature that the device claims to support for apps to use, including the tests we have for the Night Mode Camera Extension.

Regular field-of-view (RFoV) ITS box Rev1b showing the device mounting brackets

The Android Camera team faced the challenge of ensuring the Night Mode Camera Extension feature functioned consistently across all devices in a scalable way. This required creating a testing environment with very low light and a wide dynamic range. This configuration was necessary to simulate real-world lighting scenarios, such as a city at night with varying levels of brightness and shadow, or the atmospheric lighting of a restaurant.

The first step in designing the test was to define the specific lighting conditions to simulate. Field testing with a light meter in various locations and lighting conditions was conducted to determine the target lux level. The goal was to ensure the camera could capture clear images in low-light conditions, which led to establishing 3 lux as the target level. The figure below shows various lighting conditions and their respective lux values.

Evaluation of scenes of varying lighting conditions measured with a Light Meter

The next step was to develop a test chart to accurately measure a wide dynamic range in a low light environment. The team developed and iterated on several test charts and arrived at the following test chart shown below. This chart arranges a grid of squares in varying shades of grey. A red outline defines the test area for cropping. This enables excluding darker external regions. The grid follows a Hilbert curve pattern to minimize abrupt light or dark transitions. The design allows for both quantitative measurements and simulation of a broad range of light conditions.

Low Light test chart displayed on tablet in ITS box

The test captures an image of the chart using the Night Mode Camera Extension in low light conditions. The image is used to evaluate the improvement in the shadows and midtones while ensuring the highlights aren’t saturated. This evaluation involves two criteria: the average luma value of the six darkest boxes must be at least 85, and the average luma contrast between these boxes must be at least 17. The figure below shows the test capture and chart results.

Night Mode Camera Extension capture and test chart result
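As a rough illustration (not the actual ITS implementation), the pass criteria described above could be expressed as a small helper, assuming the average luma of each grey box has already been measured from the capture:

```kotlin
// Hypothetical sketch of the Night Mode ITS pass criteria described above.
// `boxLumas` holds the measured average luma (0-255) of each grey box.
fun passesNightModeCriteria(boxLumas: List<Double>): Boolean {
    // Take the six darkest boxes in the chart.
    val sixDarkest = boxLumas.sorted().take(6)

    // Criterion 1: their average luma must be at least 85.
    val avgLuma = sixDarkest.average()

    // Criterion 2: the average luma step between adjacent boxes
    // (the contrast among the six darkest) must be at least 17.
    val avgContrast = sixDarkest.zipWithNext { a, b -> b - a }.average()

    return avgLuma >= 85.0 && avgContrast >= 17.0
}
```

For example, boxes measured at 90, 110, 130, 150, 170, and 190 luma would pass both criteria, while a capture whose shadows stay crushed near 50 luma would fail the first one.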

By leveraging the existing ITS infrastructure, the Android Camera team was able to provide consistent, high quality Night Mode Camera Extension captures. This gives application developers the confidence to integrate and enable Night Mode captures for their users. It also allows OEMs to validate their implementations and ensure users get the best quality capture.

How to Implement Night Mode with Camera Extensions

Camera Extensions are available to apps built with Camera2 or CameraX. In this section, we’ll walk through each of the features Instagram implemented. The code examples will use CameraX, but you’ll find links to the Camera2 documentation at each step.

Enabling Night Mode Extension

Night Mode involves combining multiple exposures into a single still photo for better quality shots in low-light environments. So first, you’ll need to check for Night Mode availability, and tell the camera system to start a Camera Extension session. With CameraX, this is done with an ExtensionsManager instead of the standard CameraManager.

private suspend fun setUpCamera() {
  // Obtain an instance of a process camera provider. The camera provider
  // provides access to the set of cameras associated with the device.
  // The camera obtained from the provider will be bound to the activity lifecycle.
  val cameraProvider = ProcessCameraProvider.getInstance(application).await()

  // Obtain an instance of the extensions manager. The extensions manager 
  // enables a camera to use extension capabilities available on the device.
  val extensionsManager = ExtensionsManager.getInstanceAsync(
    application, cameraProvider).await()

  // Select the camera.
  val cameraSelector = CameraSelector.DEFAULT_BACK_CAMERA

  // Query if extension is available. Not all devices will support 
  // extensions or might only support a subset of extensions.
  if (extensionsManager.isExtensionAvailable(cameraSelector, ExtensionMode.NIGHT)) {
    // Unbind all use cases before enabling different extension modes.
    try {
      cameraProvider.unbindAll()

      // Retrieve a night extension enabled camera selector
      val nightCameraSelector = extensionsManager.getExtensionEnabledCameraSelector(
        cameraSelector,
        ExtensionMode.NIGHT
      )

      // Bind image capture and preview use cases with the extension enabled camera
      // selector.
      val imageCapture = ImageCapture.Builder().build()
      val preview = Preview.Builder().build()
        
      // Connect the preview to receive the surface the camera outputs the frames
      // to. This will allow displaying the camera frames in either a TextureView
      // or SurfaceView. The SurfaceProvider can be obtained from the PreviewView.
      preview.setSurfaceProvider(surfaceProvider)

      // Returns an instance of the camera bound to the lifecycle
      // Use this camera object to control various operations with the camera
      // Example: flash, zoom, focus metering etc.
      val camera = cameraProvider.bindToLifecycle(
        lifecycleOwner,
        nightCameraSelector,
        imageCapture,
        preview
      )
    } catch (e: Exception) {
      Log.e(TAG, "Use case binding failed", e)
    }
  } else {
    // In the case where the extension isn't available, you should set up
    // CameraX normally with non-extension-enabled CameraSelector.
  }
}

To do this in Camera2, see the Create a CameraExtensionSession with the Camera2 Extensions API guide.

Implementing the Progress Bar and PostView Image

For an even more elevated user experience, you can provide feedback while the Night Mode capture is processing. In Android 14, we added callbacks for progress and for postview, which is a temporary image captured before the Night Mode processing is complete. The code below shows how to use these callbacks in the takePicture() method. The actual implementation to update the UI is very app-dependent, so we’ll leave the UI updating code to you.

// When setting up the ImageCapture.Builder, set postviewEnabled and
// postviewResolutionSelector in order to get a PostView bitmap in the
// onPostviewBitmapAvailable callback when takePicture() is called.
val cameraInfo = cameraProvider.getCameraInfo(cameraSelector)
val isPostviewSupported =
  ImageCapture.getImageCaptureCapabilities(cameraInfo).isPostviewSupported

val postviewResolutionSelector = ResolutionSelector.Builder()
  .setAspectRatioStrategy(AspectRatioStrategy(
    AspectRatioStrategy.RATIO_16_9_FALLBACK_AUTO_STRATEGY, 
    AspectRatioStrategy.FALLBACK_RULE_AUTO))
  .setResolutionStrategy(ResolutionStrategy(
    previewSize, 
    ResolutionStrategy.FALLBACK_RULE_CLOSEST_LOWER_THEN_HIGHER
  ))
  .build()

imageCapture = ImageCapture.Builder()
  .setTargetAspectRatio(AspectRatio.RATIO_16_9)
  .setPostviewEnabled(isPostviewSupported)
  .setPostviewResolutionSelector(postviewResolutionSelector)
  .build()

// When the Night Mode photo is being taken, define these additional callbacks
// to implement PostView and a progress indicator in your app.
imageCapture.takePicture(
  outputFileOptions,
  Dispatchers.Default.asExecutor(),
  object : ImageCapture.OnImageSavedCallback {
    override fun onImageSaved(outputFileResults: ImageCapture.OutputFileResults) {
      // The final processed image has been saved
    }

    override fun onError(exception: ImageCaptureException) {
      Log.e(TAG, "Image capture failed", exception)
    }

    override fun onPostviewBitmapAvailable(bitmap: Bitmap) {
      // Add the Bitmap to your UI as a placeholder while the final result is processed
    }

    override fun onCaptureProcessProgressed(progress: Int) {
      // Use the progress value to update your UI; values go from 0 to 100.
    }
  }
)

To accomplish this in Camera2, see the CameraFragment.kt file in the Camera2Extensions sample app.

Implementing the Moon Icon Indicator

Another user-focused design touch is showing a moon icon to let the user know that a Night Mode capture will happen. It’s also a good idea to let the user tap the moon icon to disable Night Mode capture. An upcoming API in Android 16 will let you know when the device is in a low-light environment.

Here are the possible values for the Night Mode Indicator API:

      UNKNOWN

      • The camera is unable to reliably detect the lighting conditions of the current scene to determine if a photo will benefit from a Night Mode Camera Extension capture.

      OFF

      • The camera has detected lighting conditions that are sufficiently bright. Night Mode Camera Extension is available but may not be able to optimize the camera settings to take a higher quality photo.

      ON

      • The camera has detected low-light conditions. It is recommended to use Night Mode Camera Extension to optimize the camera settings to take a high-quality photo in the dark.

Next Steps

Read more about Android’s camera APIs in the Camera2 guides and the CameraX guides. Once you’ve got the basics down, check out the Android Camera and Media Dev Center to take your camera app development to the next level. For more details on upcoming Android features, like the Night Mode Indicator API, get started with the Android 16 Preview program.

What’s new in CameraX 1.4.0 and a sneak peek of Jetpack Compose support

Posted by Scott Nien – Software Engineer (scottnien@)

Get ready to level up your Android camera apps! CameraX 1.4.0 just dropped with a load of awesome new features and improvements. We're talking expanded HDR capabilities, preview stabilization and the versatile effect framework, and a whole lot of cool stuff to explore. We will also explore how to seamlessly integrate CameraX with Jetpack Compose! Let's dive in and see how these enhancements can take your camera app to the next level.

HDR preview and Ultra HDR

A split-screen image compares Standard Dynamic Range (SDR) and High Dynamic Range (HDR) image quality side-by-side using a singular image of a detailed landscape. The HDR side is more vivid and vibrant.

High Dynamic Range (HDR) is a game-changer for photography, capturing a wider range of light and detail to create stunningly realistic images. With CameraX 1.3.0, we brought you HDR video recording capabilities, and now in 1.4.0, we're taking it even further! Get ready for HDR Preview and Ultra HDR. These exciting additions empower you to deliver an even richer visual experience to your users.

HDR Preview

This new feature allows you to enable HDR on Preview without needing to bind a VideoCapture use case. This is especially useful for apps that use a single preview stream for both showing preview on display and video recording with an OpenGL pipeline.

To fully enable HDR, you need to ensure your OpenGL pipeline can process the specific dynamic range format, and then check the camera’s capabilities.

See the following code snippet for an example of enabling HLG10, the baseline HDR standard that device makers must support on cameras with 10-bit output.

// Declare your OpenGL pipeline supported dynamic range format. 
val openGLPipelineSupportedDynamicRange = setOf(
     DynamicRange.SDR, 
     DynamicRange.HLG_10_BIT
)
// Check camera dynamic range capabilities. 
val isHlg10Supported =  
     cameraProvider.getCameraInfo(cameraSelector)
           .querySupportedDynamicRanges(openGLPipelineSupportedDynamicRange)
           .contains(DynamicRange.HLG_10_BIT)

val preview = Preview.Builder().apply {
     if (isHlg10Supported) {
        setDynamicRange(DynamicRange.HLG_10_BIT)
     }
}.build()

Ultra HDR

Introducing Ultra HDR, a new format in Android 14 that lets users capture stunningly realistic photos with incredible dynamic range. And the best part? CameraX 1.4.0 makes it incredibly easy to add Ultra HDR capture to your app with just a few lines of code:

val cameraSelector = CameraSelector.DEFAULT_BACK_CAMERA
val cameraInfo = cameraProvider.getCameraInfo(cameraSelector)
val isUltraHdrSupported = 
      ImageCapture.getImageCaptureCapabilities(cameraInfo)
                  .supportedOutputFormats
                  .contains(ImageCapture.OUTPUT_FORMAT_JPEG_ULTRA_HDR)

val imageCapture = ImageCapture.Builder().apply {
    if (isUltraHdrSupported) {
        setOutputFormat(ImageCapture.OUTPUT_FORMAT_JPEG_ULTRA_HDR)
    }
}.build()

Jetpack Compose support

While this post focuses on 1.4.0, we're excited to announce the Jetpack Compose support in CameraX 1.5.0 alpha. We’re adding support for a Composable Viewfinder built on top of AndroidExternalSurface and AndroidEmbeddedExternalSurface. The CameraXViewfinder Composable hooks up a display surface to a CameraX Preview use case, handling the complexities of rotation, scaling and Surface lifecycle so you don’t need to.

// in build.gradle 
implementation ("androidx.camera:camera-compose:1.5.0-alpha03")


class PreviewViewModel : ViewModel() {
    private val _surfaceRequests = MutableStateFlow<SurfaceRequest?>(null)

    val surfaceRequests: StateFlow<SurfaceRequest?>
        get() = _surfaceRequests.asStateFlow()

    private fun produceSurfaceRequests(previewUseCase: Preview) {
        // Always publish new SurfaceRequests from Preview
        previewUseCase.setSurfaceProvider { newSurfaceRequest ->
            _surfaceRequests.value = newSurfaceRequest
        }
    }

    // ...
}

@Composable
fun MyCameraViewfinder(
    viewModel: PreviewViewModel,
    modifier: Modifier = Modifier
) {
    val currentSurfaceRequest: SurfaceRequest? by
        viewModel.surfaceRequests.collectAsState()

    currentSurfaceRequest?.let { surfaceRequest ->
        CameraXViewfinder(
            surfaceRequest = surfaceRequest,
            implementationMode = ImplementationMode.EXTERNAL, // Or EMBEDDED
            modifier = modifier        
        )
    }
}

Kotlin-friendly APIs

CameraX is getting even more Kotlin-friendly! In 1.4.0, we've introduced two new suspend functions to streamline camera initialization and image capture.

// CameraX initialization
val cameraProvider = ProcessCameraProvider.awaitInstance(context)

// Image capture; suspends until the captured ImageProxy is available
val imageProxy = imageCapture.takePicture()
// Process the imageProxy, then close it
imageProxy.close()

Preview Stabilization and Mirror mode

Preview Stabilization

Preview stabilization mode was added in Android 13 to enable the stabilization on all non-RAW streams, including previews and MediaCodec input surfaces. Compared to the previous video stabilization mode, which may have inconsistent FoV (Field of View) between the preview and recorded video, this new preview stabilization mode ensures consistency and thus provides a better user experience. For apps that record the preview directly for video recording, this mode is also the only way to enable stabilization.

Follow the code below to enable preview stabilization. Please note that once preview stabilization is turned on, it is not only applied to the Preview but also to the VideoCapture if it is bound as well.

val isPreviewStabilizationSupported =  
    Preview.getPreviewCapabilities(cameraProvider.getCameraInfo(cameraSelector))
        .isStabilizationSupported
val preview = Preview.Builder().apply {
    if (isPreviewStabilizationSupported) {
      setPreviewStabilizationEnabled(true)
    }
}.build()

MirrorMode

While CameraX 1.3.0 introduced mirror mode for VideoCapture, we've now brought this handy feature to Preview in 1.4.0. This is especially useful for devices with outer displays, allowing you to create a more natural selfie experience when using the rear camera.

To enable mirror mode, simply call the Preview.Builder.setMirrorMode API. This feature is supported on Android 13 and above.
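As a sketch, enabling it on the Preview use case might look like the following (MirrorMode.MIRROR_MODE_ON mirrors the stream unconditionally, which fits the rear-camera selfie case described above):

```kotlin
// Mirror the preview stream, e.g. for a rear-camera selfie shown
// on an outer display (CameraX 1.4.0+, Android 13+).
val preview = Preview.Builder()
    .setMirrorMode(MirrorMode.MIRROR_MODE_ON)
    .build()
```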

Real-time Effect

CameraX 1.3.0 introduced the CameraEffect framework, giving you the power to customize your camera output with OpenGL. Now, in 1.4.0, we're taking it a step further. In addition to applying your own custom effects, you can now leverage a set of pre-built effects provided by CameraX and Media3, making it easier than ever to enhance your app's camera features.

Overlay Effect

The new camera-effects artifact aims to provide ready-to-use effect implementations, starting with the OverlayEffect. This effect lets you draw overlays on top of camera frames using the familiar Canvas API.

The following sample code shows how to detect the QR code and draw the shape of the QR code once it is detected.

By default, drawing is performed in surface frame coordinates. But what if you need to use camera sensor coordinates? No problem! OverlayEffect provides the Frame#getSensorToBufferTransform function, allowing you to apply the necessary transformation matrix to your overlayCanvas.

In this example, we use CameraX's MLKit Vision APIs (MlKitAnalyzer) and specify COORDINATE_SYSTEM_SENSOR to obtain QR code corner points in sensor coordinates. This ensures accurate overlay placement regardless of device orientation or screen aspect ratio.

// in build.gradle
implementation ("androidx.camera:camera-effects:1.4.1")
implementation ("androidx.camera:camera-mlkit-vision:1.4.1")

var qrcodePoints: Array<Point>? = null
// Paint used to draw the detected QR code outline.
val paint = Paint().apply {
    style = Paint.Style.STROKE
    color = Color.GREEN
    strokeWidth = 5f
}
val qrcodeBoxEffect = OverlayEffect(
    PREVIEW /* applied on the preview only */,
    0 /* queueDepth */,
    Handler(Looper.getMainLooper()),
    {} /* error listener */
)

fun initCamera() {
    qrcodeBoxEffect.setOnDrawListener { frame ->
        frame.overlayCanvas.drawColor(Color.TRANSPARENT, PorterDuff.Mode.CLEAR)
        qrcodePoints?.let {
            // Using sensor coordinates to draw.
            frame.overlayCanvas.setMatrix(frame.sensorToBufferTransform)
            val path = android.graphics.Path().apply {
                it.forEachIndexed { index, point ->
                    if (index == 0) {
                        moveTo(point.x.toFloat(), point.y.toFloat())
                    } else {
                        lineTo(point.x.toFloat(), point.y.toFloat())
                    }
                 }
                 lineTo(it[0].x.toFloat(), it[0].y.toFloat())
            }
            frame.overlayCanvas.drawPath(path, paint)
        }
        true
    }

    val imageAnalysis = ImageAnalysis.Builder()
        .build()
        .apply {
            setAnalyzer(executor,
                MlKitAnalyzer(
                    listOf(barcodeScanner!!),
                    COORDINATE_SYSTEM_SENSOR,
                    executor
                ) { result ->
                    val barcodes = result.getValue(barcodeScanner!!)
                    qrcodePoints =
                        barcodes?.takeIf { it.isNotEmpty() }?.get(0)?.cornerPoints
                }
            )
        }

    val useCaseGroup = UseCaseGroup.Builder()
          .addUseCase(preview)
          .addUseCase(imageAnalysis)
          .addEffect(qrcodeBoxEffect)
          .build()

    cameraProvider.bindToLifecycle(
        lifecycleOwner, cameraSelector, useCaseGroup)
  }

Media3 Effect

Want to add stunning camera effects to your CameraX app? Now you can tap into the power of Media3's rich effects framework! This exciting integration allows you to apply Media3 effects to your CameraX output, including Preview, VideoCapture, and ImageCapture.

This means you can easily enhance your app with a wide range of professional-grade effects, from blurs and color filters to transitions and more. To get started, simply use the new androidx.camera.media3:media3-effect artifact.

Here's a quick example of how to apply a grayscale effect to your camera output:

// in build.gradle 
implementation ("androidx.camera.media3:media3-effect:1.0.0-alpha01")
implementation ("androidx.media3:media3-effect:1.5.0")

import androidx.camera.media3.effect.Media3Effect

val media3Effect = Media3Effect(
    requireContext(), PREVIEW or VIDEO_CAPTURE or IMAGE_CAPTURE,
    mainThreadExecutor(), {}
)
// Use a grayscale effect
media3Effect.setEffects(listOf(RgbFilter.createGrayscaleFilter()))
cameraController.setEffects(setOf(media3Effect)) // or use the UseCaseGroup API

Here is what the effect looks like:

A black and white view from inside a coffee shop looking out at a city street.  The bottom of the photo shows the edge of a table with a laptop and two buttons labeled 'BACK' and 'RECORD'

Screen Flash

Taking selfies in low light just got easier with CameraX 1.4.0! This release introduces a powerful new feature: screen flash. Instead of relying on a traditional LED flash which most selfie cameras don’t have, screen flash cleverly utilizes your phone's display. By momentarily turning the screen bright white, it provides a burst of illumination that helps capture clear and vibrant selfies even in challenging lighting conditions.

Integrating screen flash into your CameraX app is flexible and straightforward. You have two main options:

      1. Implement the ScreenFlash interface: This gives you full control over the screen flash behavior. You can customize the color, intensity, duration, and any other aspect of the flash. This is ideal if you need a highly tailored solution.

      2. Use the built-in implementation: For a quick and easy solution, leverage the pre-built screen flash functionality in ScreenFlashView or PreviewView. This implementation handles all the heavy lifting for you.

If you're already using PreviewView in your app, enabling screen flash is incredibly simple. Just enable it directly on the PreviewView instance. If you need more control or aren't using PreviewView, you can use ScreenFlashView directly.

Here's a code example demonstrating how to enable screen flash:

// Case 1: PreviewView + CameraX core API
previewView.setScreenFlashWindow(activity.window)
imageCapture.screenFlash = previewView.screenFlash
imageCapture.flashMode = ImageCapture.FLASH_MODE_SCREEN

// Case 2: PreviewView + CameraController
previewView.setScreenFlashWindow(activity.window)
cameraController.imageCaptureFlashMode = ImageCapture.FLASH_MODE_SCREEN

// Case 3: ScreenFlashView
screenFlashView.setScreenFlashWindow(activity.window)
imageCapture.screenFlash = screenFlashView.screenFlash
imageCapture.flashMode = ImageCapture.FLASH_MODE_SCREEN

Camera Extensions new features

Camera Extensions APIs aim to help apps access the cutting-edge capabilities previously available only on built-in camera apps. And the ecosystem is growing rapidly! In 2024, we've seen major players like Pixel, Samsung, Xiaomi, Oppo, OnePlus, Vivo, and Honor all embrace Camera Extensions, particularly for Night Mode and Bokeh Mode. CameraX 1.4.0 takes this even further by adding support for brand-new Android 15 Camera Extensions features, including:

    • Postview: Provides a preview of the captured image almost instantly before the long-exposure shots are completed
    • Capture Process Progress: Displays a progress indicator so users know how long capturing and processing will take, improving the experience for features like Night Mode
    • Extensions Strength: Allows users to fine-tune the intensity of the applied effect

Below is an example of the improved UX that uses the postview and capture process progress features on a Samsung S24 Ultra.

Moving image showing the postview and capture process progress features on a Samsung S24 Ultra

Interested to know how this can be implemented? See the sample code below:

val extensionsCameraSelector =
    extensionsManager
        .getExtensionEnabledCameraSelector(DEFAULT_BACK_CAMERA, extensionMode)
val isPostviewSupported = ImageCapture.getImageCaptureCapabilities(
    cameraProvider.getCameraInfo(extensionsCameraSelector)
).isPostviewSupported
val imageCapture = ImageCapture.Builder().apply {
    setPostviewEnabled(isPostviewSupported)
}.build()

imageCapture.takePicture(outputFileOptions, executor,
    object : OnImageSavedCallback {
        override fun onImageSaved(outputFileResults: OutputFileResults) {
            // Final image saved.
        }
        override fun onError(exception: ImageCaptureException) {
            // Handle the capture error.
        }
        override fun onPostviewBitmapAvailable(bitmap: Bitmap) {
            // Postview bitmap is available.
        }
        override fun onCaptureProcessProgressed(progress: Int) {
            // Capture process progress update.
        }
    }
)

Important: If your app ran into the CameraX Extensions issue on Pixel 9 series devices, please use CameraX 1.4.1 instead. This release fixes a critical issue that prevented Night Mode from working correctly with takePicture().

What's Next

We hope you enjoy this new release. Our mission is to make camera development a joy, removing the friction and pain points so you can focus on innovation. With CameraX, you can easily harness the power of Android's camera capabilities and build truly amazing app experiences.

Have questions or want to connect with the CameraX team? Join the CameraX developer discussion group or file a bug report.

We can’t wait to see what you create!