Snapchat integrated new camera features 50% faster with the Camera2 Extensions API

Posted by Fred Chung, Android Developer Relations

Snapchat is a visual messaging app that enhances Snapchatters’ relationships with friends, family, and the world. It opens to the camera and offers millions of augmented reality and AI-powered Lenses for self-expression, learning, and play. Ensuring Snapchatters can easily capture and share their lives with close friends and family is a priority for Snapchat, and they're always exploring new ways to improve the overall app experience.

As part of this, the Snapchat team added new camera features to the app using Android’s Camera2 Extensions API, which gives developers access to capabilities that OEMs have implemented on their devices, such as Night Mode and Bokeh. Thanks to Android’s intuitive API, the Snapchat team implemented new camera features 50% faster than before.

Camera2 Extensions API gives access to advanced features

The Snapchat team wanted to optimize the application for the expanding selection of Android devices, knowing many OEMs differentiate their devices with their respective camera technologies. As Snapchat is a primarily visual app that works with a device’s camera, the team optimizes the app to take full advantage of each device’s unique hardware.

“We wanted to leverage each OEM’s software to enhance the Snapchat experience on Android,” said Ye Tian, a software engineer at Snapchat. “This would help the app achieve higher-quality Snaps that are comparable to what a device's native camera offers.”

Snapchat developers enhanced the app’s zoom and night mode camera capabilities using the Camera2 Extensions API

What started as a goal to improve the app’s low-light capabilities led to much more. The Snapchat team worked on finding new ways to improve the app’s camera capabilities by implementing features like night mode, portrait mode, face retouch, tap-to-focus, zoom, and more.
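
As a rough illustration of what checking for these OEM capabilities can look like in an app, here is a minimal sketch (not Snapchat's code) that queries whether a camera supports the Night and Bokeh extensions; context and cameraId are placeholders for your own camera setup:

import android.content.Context
import android.hardware.camera2.CameraExtensionCharacteristics
import android.hardware.cam2.CameraManager
import android.util.Log

// Minimal sketch: check which vendor extensions a camera supports (API 31+).
fun logNightAndBokehSupport(context: Context, cameraId: String) {
    val cameraManager =
        context.getSystemService(Context.CAMERA_SERVICE) as CameraManager

    // CameraExtensionCharacteristics describes the extensions the OEM implements.
    val extensionCharacteristics =
        cameraManager.getCameraExtensionCharacteristics(cameraId)
    val supportedExtensions = extensionCharacteristics.supportedExtensions

    val supportsNight =
        supportedExtensions.contains(CameraExtensionCharacteristics.EXTENSION_NIGHT)
    val supportsBokeh =
        supportedExtensions.contains(CameraExtensionCharacteristics.EXTENSION_BOKEH)
    Log.d("Extensions", "Night supported: $supportsNight, Bokeh supported: $supportsBokeh")
}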

“Our collaboration with Google Pixel paved the way for collaboration with other OEMs to implement night mode and super-night mode in their devices with very minimal code changes,” said Ye. “The Camera2 Extensions API is flexible and extensive. Snapchat can now use it to build full-fledged applications on demand without negatively impacting performance and stability.”

The implementation via the Camera2 Extensions API made it easy for Snapchat developers to add more camera features to the app. And using the extensions made available with Android’s camera API, Snapchat integrated new camera features 50% faster than with the industry-standard approaches it had used in the past.
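
To give a sense of what such an integration involves, here is a hypothetical sketch (not Snapchat's implementation) of opening an OEM-provided Night mode session with the Camera2 Extensions API; cameraDevice, previewSurface, and executor are assumed to come from your existing Camera2 setup:

import android.hardware.camera2.CameraDevice
import android.hardware.camera2.CameraExtensionCharacteristics
import android.hardware.camera2.CameraExtensionSession
import android.hardware.camera2.params.ExtensionSessionConfiguration
import android.hardware.camera2.params.OutputConfiguration
import android.view.Surface
import java.util.concurrent.Executor

// Minimal sketch: create an extension session for the OEM's Night mode (API 31+).
fun startNightSession(cameraDevice: CameraDevice, previewSurface: Surface, executor: Executor) {
    val config = ExtensionSessionConfiguration(
        CameraExtensionCharacteristics.EXTENSION_NIGHT, // the OEM extension to enable
        listOf(OutputConfiguration(previewSurface)),     // where the frames should go
        executor,
        object : CameraExtensionSession.StateCallback() {
            override fun onConfigured(session: CameraExtensionSession) {
                // Start issuing capture requests against the extension session.
            }
            override fun onConfigureFailed(session: CameraExtensionSession) {
                // Fall back to a regular CameraCaptureSession.
            }
        }
    )
    cameraDevice.createExtensionSession(config)
}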

“The Camera2 Extensions API is flexible and extensive. Snapchat can now use it to build full-fledged applications on demand without negatively impacting performance and stability.” — Ye Tian, Software Engineer at Snapchat

More opportunities on more devices

The Snapchat team was happy to give its users a more cohesive experience using the Camera2 Extensions API. Thanks to the extensions the API provides, the developers improved the app’s camera across a range of manufacturers’ devices on the Android platform, and much faster than before.

“I enjoy the diversity of the Android platform and utilizing the unique advantages of each mobile phone manufacturer’s devices,” said Ye. “It helps us bring their cutting-edge innovations into the Snapchat app, allowing Snapchatters to better capture their life moments.”

Snapchat’s team looks forward to working with more OEMs to further improve the app’s processing capabilities across devices using the Camera2 Extensions API. They’re also looking forward to improving the app’s backward compatibility using the new API, which will allow even more users to benefit from the extensions.

“I would recommend using the Camera2 Extensions API. It provides extensive functionality and stable performance to improve the velocity at which developers can deliver features,” said Ye.

Get started

Learn how to increase your app’s camera capabilities with the Camera2 Extensions API.

Introducing Camera Viewfinder

Posted by Francesco Romano, Developer Relations Engineer, Android

Over the years, Android devices have evolved to include a variety of sizes, shapes, and displays, among other features. Since the beginning, however, taking pictures with your phone has been one of the most important use cases. Today, camera capabilities are still one of the top reasons consumers purchase a phone.

As a developer, you want to leverage camera capabilities in your app, so you decide to adopt the Android Camera Framework. The first use case you want to implement is the Preview use case, which shows the output of the camera sensor on the screen.

So you go ahead and create a CaptureSession using a surface as big as the screen size. As long as the screen has the same aspect ratio as the camera sensor output and the device stays in its natural portrait orientation, everything should be fine.

But what happens when you resize the window, unfold your device, or change the display or orientation? Well, in most cases, the preview may appear stretched, upside down, or incorrectly rotated. And if you are in a multi-window environment, your app may even crash.

Why does this happen? Because of the implicit assumptions you made while creating the CaptureSession.

Historically, your app could live in the same window for its whole life cycle, but with the availability of new form factors such as foldable devices, and new display modes such as multi-window and multi-display, you can't assume this will be true anymore.

Let's examine some common pitfalls to avoid when developing an app targeting various form factors:

  • Don't assume your app will live in a portrait-shaped window. Requesting a fixed orientation is still supported in Android 13, but now device manufacturers may have the option of overriding an app request for a preferred orientation.
  • Don't assume any fixed dimension or aspect ratio for your app. Even if you set resizableActivity = "false", your app could still be used in multi-window mode on large screens (>=600dp).
  • Don't assume a fixed relationship between the orientation of the screen and the camera. The Android Compatibility Definition Document specifies that a camera image sensor "MUST be oriented so that the long dimension of the camera aligns with the screen's long dimension." Starting with API level 32, camera clients that query the orientation on foldable devices can receive a value that dynamically changes depending on the device/fold state (see the query sketch after this list).
  • Don't assume the size of the inset can't change. The new taskbar is reported to applications as an inset, and when used with gesture navigation, the taskbar can be hidden and shown dynamically.
  • Don't assume your app has exclusive access to the camera. While your app is in a multi-window state, other apps can obtain access to shared resources like camera and microphone.
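
For the sensor-orientation point above, here is a minimal sketch of querying the value at runtime instead of hard-coding it; cameraManager and cameraId are assumed to come from your own setup, and on foldables running API level 32 or later the value should be re-queried when the fold state changes:

import android.hardware.camera2.CameraCharacteristics
import android.hardware.camera2.CameraManager

// Minimal sketch: read the sensor orientation instead of assuming a fixed value.
// On foldables (API 32+) this can change with the device/fold state.
fun querySensorOrientation(cameraManager: CameraManager, cameraId: String): Int {
    val characteristics = cameraManager.getCameraCharacteristics(cameraId)
    return characteristics.get(CameraCharacteristics.SENSOR_ORIENTATION) ?: 0
}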

While CameraX already handles most of the cases above, implementing a preview that works in different scenarios with Camera2 APIs can be complex, as we describe in the Support resizable surfaces in your camera app Codelab.

Wouldn’t it be great to have a simple component that takes care of those details and lets you focus on your specific app logic?

Say no more…

Introducing CameraViewfinder

CameraViewfinder is a new artifact from the Jetpack library that allows you to quickly implement camera previews with minimal effort. It internally uses either a TextureView or SurfaceView to display the camera feed, and applies the required transformations on them to correctly display the viewfinder. This involves correcting their aspect ratio, scale, and rotation. It is fully compatible with your existing Camera2 codebase and continuously tested on several devices.

Let’s see how to use it.

First, add the dependency in your app-level build.gradle file:

implementation "androidx.camera:camera-viewfinder:1.3.0-alpha01"

Sync your project. Now you should be able to directly use the CameraViewfinder as any other View. For example, you can add it to your layout file:

<androidx.camera.viewfinder.CameraViewfinder
  android:id="@+id/view_finder"
  app:scaleType="fitCenter"
  app:implementationMode="performance"
  android:layout_width="match_parent"
  android:layout_height="match_parent"/>

As you can see, CameraViewfinder has the same controls available on PreviewView, so you can choose different implementation modes and scale types.

Now that the component is part of the layout, you can still create a CameraCaptureSession, but instead of providing a TextureView or SurfaceView as target surfaces, use the result of requestSurfaceAsync().

fun startCamera() {
    // The desired preview resolution; width, height, and the CameraCharacteristics
    // come from your own camera setup.
    val previewResolution = Size(width, height)
    val viewfinderSurfaceRequest =
        ViewfinderSurfaceRequest(previewResolution, characteristics)

    // Ask the CameraViewfinder for a Surface configured for this request.
    val surfaceListenableFuture =
        cameraViewfinder.requestSurfaceAsync(viewfinderSurfaceRequest)

    Futures.addCallback(surfaceListenableFuture, object : FutureCallback<Surface> {
        override fun onSuccess(surface: Surface) {
            // Create a CaptureSession using this surface as usual.
        }

        override fun onFailure(t: Throwable) { /* something went wrong */ }
    }, ContextCompat.getMainExecutor(context))
}
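
Inside onSuccess, the returned Surface behaves like any other Camera2 target. As a minimal sketch of what "as usual" can look like (assuming cameraDevice and executor come from your existing Camera2 setup):

import android.hardware.camera2.CameraCaptureSession
import android.hardware.camera2.CameraDevice
import android.hardware.camera2.params.OutputConfiguration
import android.hardware.camera2.params.SessionConfiguration
import android.view.Surface
import java.util.concurrent.Executor

// Minimal sketch: build a regular capture session around the viewfinder surface.
fun createPreviewSession(cameraDevice: CameraDevice, surface: Surface, executor: Executor) {
    val sessionConfiguration = SessionConfiguration(
        SessionConfiguration.SESSION_REGULAR,
        listOf(OutputConfiguration(surface)),
        executor,
        object : CameraCaptureSession.StateCallback() {
            override fun onConfigured(session: CameraCaptureSession) {
                // Send a repeating preview request to the viewfinder surface.
                val request = cameraDevice
                    .createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW)
                    .apply { addTarget(surface) }
                    .build()
                session.setRepeatingRequest(request, /* listener = */ null, /* handler = */ null)
            }
            override fun onConfigureFailed(session: CameraCaptureSession) {
                // The device rejected the configuration; handle the error.
            }
        }
    )
    cameraDevice.createCaptureSession(sessionConfiguration)
}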


Bonus: optimized layouts for foldable devices

CameraViewfinder is ready to use across resizable surfaces, configuration changes, rotations, and multi-window modes, and it has been tested on many foldable devices.

But if you want to implement optimized layouts for foldable and dual-screen devices, you can combine CameraViewfinder with the Jetpack WindowManager library to provide unique experiences for your users.

For example, you can choose to avoid showing full screen preview if there is a hinge in the middle of the screen, or if the device is in “book” or “tabletop” mode. In those scenarios, you can have the viewfinder in one portion of the screen and the controls on the other side, or you can use part of the screen to show the last pictures taken. Imagination is the limit!
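
As a rough sketch of detecting the tabletop posture with Jetpack WindowManager (assuming the androidx.window and lifecycle-runtime-ktx dependencies are already in your project; how you rearrange the viewfinder and controls is up to your app):

import androidx.activity.ComponentActivity
import androidx.lifecycle.lifecycleScope
import androidx.window.layout.FoldingFeature
import androidx.window.layout.WindowInfoTracker
import kotlinx.coroutines.launch

// Minimal sketch: observe folding features and report when the device is in
// "tabletop" posture (half-opened with a horizontal fold).
fun ComponentActivity.observeTabletopPosture(onTabletopChanged: (Boolean) -> Unit) {
    lifecycleScope.launch {
        WindowInfoTracker.getOrCreate(this@observeTabletopPosture)
            .windowLayoutInfo(this@observeTabletopPosture)
            .collect { layoutInfo ->
                val fold = layoutInfo.displayFeatures
                    .filterIsInstance<FoldingFeature>()
                    .firstOrNull()
                val isTabletop = fold != null &&
                        fold.state == FoldingFeature.State.HALF_OPENED &&
                        fold.orientation == FoldingFeature.Orientation.HORIZONTAL
                // For example, move the viewfinder to one half of the screen
                // and the controls to the other when isTabletop is true.
                onTabletopChanged(isTabletop)
            }
    }
}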

The sample app is already optimized for foldable devices and you can find the code to handle posture changes here. Have a look!