
CameraX update makes dual concurrent camera even easier

Posted by Donovan McMurray – Developer Relations Engineer

CameraX, Android's Jetpack camera library, is getting an exciting update to its Dual Concurrent Camera feature, making it even easier to integrate into your app. Dual Concurrent Camera lets you stream from two different cameras at the same time. The original version, released in CameraX 1.3.0, was already a huge leap in making this feature easier to implement.

Starting with 1.5.0-alpha01, CameraX can also handle the composition of the two camera streams. This update adds functionality without removing anything, and it is not a breaking change to your existing Dual Concurrent Camera code. To tell CameraX to handle the composition, use the new SingleCameraConfig constructor, which takes an additional CompositionSettings parameter. Since you'll be creating two SingleCameraConfigs, use the same constructor for both.

The way you check for concurrent camera support has not changed from the prior version of this feature. As a reminder, here is what that code looks like:

// Set up primary and secondary camera selectors if supported on device.
var primaryCameraSelector: CameraSelector? = null
var secondaryCameraSelector: CameraSelector? = null

for (cameraInfos in cameraProvider.availableConcurrentCameraInfos) {
    primaryCameraSelector = cameraInfos.first {
        it.lensFacing == CameraSelector.LENS_FACING_FRONT
    }.cameraSelector
    secondaryCameraSelector = cameraInfos.first {
        it.lensFacing == CameraSelector.LENS_FACING_BACK
    }.cameraSelector

    if (primaryCameraSelector == null || secondaryCameraSelector == null) {
        // If either a primary or secondary selector wasn't found, reset both
        // to move on to the next list of CameraInfos.
        primaryCameraSelector = null
        secondaryCameraSelector = null
    } else {
        // If both primary and secondary camera selectors were found, we can
        // conclude the search.
        break
    }
}

if (primaryCameraSelector == null || secondaryCameraSelector == null) {
    // Front and back concurrent camera not available. Handle accordingly.
}

Here’s the updated code snippet showing how to implement picture-in-picture, with the front camera stream scaled down to fit into the lower right corner. In this example, CameraX handles the composition of the camera streams.

// If 2 concurrent camera selectors were found, create 2 SingleCameraConfigs
// and compose them in a picture-in-picture layout.
val primary = SingleCameraConfig(
    cameraSelectorPrimary,
    useCaseGroup,
    CompositionSettings.Builder()
        .setAlpha(1.0f)
        .setOffset(0.0f, 0.0f)
        .setScale(1.0f, 1.0f)
        .build(),
    lifecycleOwner)
val secondary = SingleCameraConfig(
    cameraSelectorSecondary,
    useCaseGroup,
    CompositionSettings.Builder()
        .setAlpha(1.0f)
        .setOffset(2 / 3f - 0.1f, -2 / 3f + 0.1f)
        .setScale(1 / 3f, 1 / 3f)
        .build(),
    lifecycleOwner)

// Bind to lifecycle
val concurrentCamera = cameraProvider.bindToLifecycle(listOf(primary, secondary))

You are not constrained to a picture-in-picture layout. For instance, you could define a side-by-side layout by setting the offsets and scaling factors accordingly. You want to keep both dimensions scaled by the same amount to avoid a stretched preview. Here’s how that might look.

// If 2 concurrent camera selectors were found, create 2 SingleCameraConfigs
// and compose them in a side-by-side layout.
val primary = SingleCameraConfig(
    cameraSelectorPrimary,
    useCaseGroup,
    CompositionSettings.Builder()
        .setAlpha(1.0f)
        .setOffset(0.0f, 0.25f)
        .setScale(0.5f, 0.5f)
        .build(),
    lifecycleOwner)
val secondary = SingleCameraConfig(
    cameraSelectorSecondary,
    useCaseGroup,
    CompositionSettings.Builder()
        .setAlpha(1.0f)
        .setOffset(0.5f, 0.25f)
        .setScale(0.5f, 0.5f)
        .build(),
    lifecycleOwner)

// Bind to lifecycle
val concurrentCamera = cameraProvider.bindToLifecycle(listOf(primary, secondary))

We’re excited to offer this improvement to an already developer-friendly feature. Truly the CameraX way! CompositionSettings in Dual Concurrent Camera is currently in alpha, so if you have feature requests to improve upon it before the API is locked in, please give us feedback in the CameraX Discussion Group. And check out the full CameraX 1.5.0-alpha01 release notes to see what else is new in CameraX.

Better Device Compatibility with CameraX

Posted by The Android Team

CameraX is an Android Jetpack library that makes it easy to incorporate camera functionality directly in your Android app. That’s why we focus heavily on device compatibility out-of-the-box, so you can focus on what makes your app unique.

In this post, we’ll look at three ways CameraX makes developers’ lives easier when it comes to device compatibility. First, we’ll take a peek into our CameraX Test Lab where we test over 150 physical phones every day. Second, we’ll look at Quirks, the mechanism CameraX uses to automatically handle device inconsistencies. Third, we’ll discuss the ways CameraX makes it easier to develop apps for foldable phones.


CameraX Test Lab

(Left) A single rack in our CameraX Test Lab. Each test enclosure contains two identical Android phones for testing front and back cameras. (Right) A GIF showing the inside of a test enclosure, with a rotating phone mount (for testing portrait and landscape orientations) and a high-resolution test chart (not pictured).

We built the CameraX Test Lab to ensure CameraX works on the Android devices most people have in their pockets. The Test Lab opened in 2019 with 52 phone models. Today, the Test Lab has 150 phone models. We prioritize devices with the most daily active users over the past 28 days (28DAUs) and devices that leverage a diverse range of systems on a chip (SoCs). The Test Lab currently covers over 750 million 28DAUs. We also test many different Android versions, going back to Android 5.1 (Lollipop).

To generate reliable test results, each phone model has its own test enclosure to control for light and other environmental factors. Each enclosure contains two phones of the same model to simplify testing the front and back cameras. On the opposite side of the test enclosure from the phones, there’s a high-resolution test chart. This chart has many industry-standard tests for camera attributes like color correctness, resolution, sharpness, and dynamic range. The chart also has some specific elements for functional tests like face detection.

When you adopt CameraX in your app, you get the assurance of this continuous testing across many devices and API levels. Additionally, we’re continuously making improvements to the Test Lab, including adding new phones based on market trends to ensure that the majority of your users are well represented. See our current test device list for the latest inventory in our Test Lab.

Quirks

Google provides a Camera Image Test Suite so that OEMs’ cameras meet a baseline of consistency. Still, when dealing with the wide range of devices that run Android, there can be differences in the end user camera experience. CameraX includes an abstraction layer, called Quirks, to remove these variations in behavior so that CameraX behaves consistently across all devices with no effort from app developers.

We find these quirks based on our own manual testing, the Test Lab’s automatic testing, and bug reports filed in our public CameraX issue tracker. As of today, CameraX has over 30 Quirks that automatically fix behavior inconsistencies for developers. Here are a few examples:

  • OnePixelShiftQuirk: Some phones shift a column of pixels when converting YUV data to RGB. CameraX automatically corrects for this on those devices.
  • ExtensionDisableQuirk: For phones that don’t support extensions or have broken behavior with extensions, CameraX disables certain extensions.
  • CameraUseInconsistentTimebaseQuirk: Some phones do not properly timestamp video and audio. CameraX fixes the timestamps so that the video and audio align properly.

These are just a few examples of how CameraX automatically handles quirky device behavior. We will continue to add more corrections as we find them, so app developers won’t have to deal with these one-offs on their own. If you find inconsistent behavior on a device you’re testing, you can file an issue in the CameraX component detailing the behavior and the device it’s happening on.

Foldable phones

Foldables continue to be the fastest growing smartphone form factor. Their flexibility in screen size adds complexity to camera development. Here are a few ways that CameraX simplifies the development of camera apps on foldables.

CameraX’s Preview use case handles differences between the aspect ratio of the camera and the aspect ratio of the screen. With traditional phone and tablet form factors, this difference should be small because Section 7.5.5 of the Android Compatibility Definition Document requires that the “long dimension of the camera aligns with the screen’s long dimension.” However, with foldable devices the screen aspect ratio can change, so this relationship might not always hold. With CameraX you can always preserve aspect ratio by filling the PreviewView (which may crop the preview image) or fitting the image into the PreviewView (which may result in letterboxing or pillarboxing). Set PreviewView.ScaleType to specify which method to use.
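
For example, here’s a minimal sketch, assuming your layout contains a PreviewView with a hypothetical preview_view ID, that fits the camera image inside the view instead of cropping it:

// Fit the camera image inside the PreviewView, preserving aspect ratio
// (letterboxing or pillarboxing as needed). FILL_CENTER would crop instead.
val previewView: PreviewView = findViewById(R.id.preview_view)
previewView.scaleType = PreviewView.ScaleType.FIT_CENTER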

The increase in foldable devices also increases the possibility that your app may be used in a multi-window environment. CameraX is set up for multi-window support out-of-the-box. CameraX handles all aspects of lifecycle management for you, including the multi-window case where other apps can take priority access of singleton resources, such as the microphone or camera. This means no additional effort is required from app developers when using CameraX in a multi-window environment.

We’re always looking for more ways to improve CameraX to make it even easier to use. With respect to foldables, for example, we’re exploring ways to let developers call setTargetResolution() without having to take into account the different configurations a foldable device can be in. Keep an eye on this blog and our CameraX release notes for updates on new features!

Getting started with CameraX

We have a number of resources to help you get started with CameraX. The best starting place is our CameraX codelab. If you want to dig a bit deeper with CameraX, check out our camera code samples, ranging from a basic app to more advanced features like camera extensions. For an overview of everything CameraX has to offer, see our CameraX documentation. If you have any questions, feel free to reach out to us on our CameraX discussion group.

What’s new in Jetpack

Posted by Florina Muntenescu, Android Developer Advocate


Android Jetpack is a suite of libraries, tools, and guidance to help developers follow best practices, reduce boilerplate code, and write code that works consistently across Android versions and devices. Today, 84% of the top 1000 apps on Google Play rely on Jetpack.

Here’s a round-up of the latest updates in Jetpack - an extended version of our What’s new in Jetpack talk!

New in Stable

CameraX

The CameraX library provides a unified API surface for accessing camera functionality across OS versions, including device-specific compatibility fixes and workarounds. Some of the latest improvements to the library address common feature requests, including support for adjusting exposure compensation and access to more detailed information about camera state and features. Additionally, camera settings like FPS range can now be changed via Camera2Interop while the camera is running. The library also brings support for the latest device and OS features, including high-dynamic-range preview, zoom ratio controls, and support for Android’s Do Not Disturb mode. Perhaps most importantly, though, the library has continued to address performance, resulting in faster image capture and faster initialization, especially on older devices.
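
As a hedged illustration of the exposure compensation API, here’s a small sketch; camera is assumed to be the Camera instance returned by bindToLifecycle():

// Check whether exposure compensation is supported, then step exposure up by
// one index, clamped to the supported range.
val exposureState = camera.cameraInfo.exposureState
if (exposureState.isExposureCompensationSupported) {
    val newIndex = (exposureState.exposureCompensationIndex + 1)
        .coerceAtMost(exposureState.exposureCompensationRange.upper)
    camera.cameraControl.setExposureCompensationIndex(newIndex)
}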

Hilt

Hilt is Jetpack’s recommended dependency injection solution built on top of Dagger. As part of the transition to stable, Hilt’s ViewModel support has moved up into the core Hilt Android APIs and SavedStateHandle has been added as a default dependency available in the ViewModelComponent. Also, Hilt is now integrated with Navigation and Compose: you can obtain an annotated Hilt ViewModel that is scoped to a destination or the navigation graph itself. Developers have already started using Hilt in their apps. Read about their experience in this blog post.
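
For example, here’s a minimal sketch of a Hilt ViewModel that receives SavedStateHandle by default (OrderViewModel and OrderRepository are hypothetical names):

@HiltViewModel
class OrderViewModel @Inject constructor(
    private val savedStateHandle: SavedStateHandle,  // available by default in ViewModelComponent
    private val repository: OrderRepository          // hypothetical binding from your Hilt modules
) : ViewModel()

// In a Fragment or Activity annotated with @AndroidEntryPoint:
// private val viewModel: OrderViewModel by viewModels()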

Paging 3.0

The Paging library allows you to load and display small chunks of data to improve network and system resource consumption. This release features a complete rewrite in Kotlin with first-class support for coroutines and Flow, asynchronous loading with RxJava and Guava primitives, and overall improvements to the repository and presentation layers.

The 3.0 release is a substantial improvement in usability over Paging 2, and the rewrite was planned with partial and staged migrations in mind so that developers can transition on their own schedules. Check out the Paging 3.0 documentation and the Paging 3.0 codelab for details and hands-on experience.
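
As a hedged sketch of the new Kotlin-first API, here’s how a ViewModel might expose a cached Flow of paged data (Article, ArticlePagingSource, and api are hypothetical):

// Build a Pager from a PagingConfig and a PagingSource factory, expose it as a
// Flow, and cache it in the ViewModel's scope so it survives configuration changes.
val articles: Flow<PagingData<Article>> =
    Pager(PagingConfig(pageSize = 20)) { ArticlePagingSource(api) }
        .flow
        .cachedIn(viewModelScope)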

ConstraintLayout and MotionLayout

ConstraintLayout, Jetpack’s flexible system for designing layouts, and MotionLayout, an API aimed at managing motion and widget animation, are now stable. MotionLayout now includes support for foldable devices, image filters, and motion effects. To find out more about what’s new in design tools, check out this Google I/O talk.

Security Crypto

The Security Crypto library allows you to safely and easily encrypt files and SharedPreferences. To encrypt SharedPreferences, create an EncryptedSharedPreferences object with the appropriate key and scheme and then use it like a standard SharedPreferences object.

// mainKey is a MasterKey, e.g. built with
// MasterKey.Builder(context).setKeyScheme(MasterKey.KeyScheme.AES256_GCM).build()
val prefs: SharedPreferences = EncryptedSharedPreferences.create(
    context,
    "prefs_file_name",
    mainKey,
    EncryptedSharedPreferences.PrefKeyEncryptionScheme.AES256_SIV,
    EncryptedSharedPreferences.PrefValueEncryptionScheme.AES256_GCM
)
// Use the resulting SharedPreferences object as usual.
prefs.edit()
    .putBoolean("show_completed", true)
    .apply()

Fragment

Over the past year, the Fragment library has undergone a major effort to clean up its internal implementation and reduce undocumented behavior, making it easier for developers to follow best practices in their apps and write reliable tests. This lays the groundwork for future improvements to the library, like supporting multiple back stacks in Navigation, and it may require some work to accommodate strict enforcement of API contracts. In practice, you should pay careful attention to your tests after updating the library. Check out the Fragment release notes to see specific cases to watch out for.

Recent releases have also introduced ActivityResult integration, making it possible to register for Activity results from a fragment. Fragment has also added a new FragmentOnAttachListener interface to replace the less-flexible onAttachFragment method. Existing code that overrides this method in Fragment or FragmentActivity will still work, but we’ve deprecated onAttachFragment to help prevent new code from accidentally adopting a less-flexible approach.

// Obtain the FragmentManager. Use childFragmentManager from within a fragment
// to observe child fragment attachment.
val fm = supportFragmentManager

val listener = FragmentOnAttachListener { fragmentManager, fragment ->
    // Respond to the fragment being attached.
}

fm.addFragmentOnAttachListener(listener)
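
And here’s a minimal sketch of the ActivityResult integration mentioned above, registering for a content-picking result directly from a fragment (the contract choice is illustrative):

// Register the launcher as a field so registration happens before the fragment starts.
val getContent = registerForActivityResult(ActivityResultContracts.GetContent()) { uri ->
    // Respond to the returned content Uri (null if the user cancelled).
}

// Later, e.g. from a click listener:
// getContent.launch("image/*")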

New in Beta

Once a library is feature complete it moves to Beta for stabilization. At this moment, the APIs change only in response to critical issues or community feedback.

DataStore

DataStore provides a robust data storage solution that addresses the shortcomings of SharedPreferences while maintaining a simple, highly usable API surface. DataStore brings support for best practices like Kotlin coroutines with Flow and RxJava. DataStore allows you to store key-value pairs via Preferences DataStore, or typed objects backed by protocol buffers via Proto DataStore. You can also plug in your own serialization solution, like Kotlin Serialization.
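
Here’s a hedged Preferences DataStore sketch; the "settings" file name and "show_completed" key are illustrative:

// Declare the DataStore as a top-level property so there is a single instance.
val Context.dataStore by preferencesDataStore(name = "settings")
val SHOW_COMPLETED = booleanPreferencesKey("show_completed")

// Read the value as a Flow.
fun showCompleted(context: Context): Flow<Boolean> =
    context.dataStore.data.map { prefs -> prefs[SHOW_COMPLETED] ?: false }

// Write the value from a coroutine.
suspend fun setShowCompleted(context: Context, value: Boolean) {
    context.dataStore.edit { prefs -> prefs[SHOW_COMPLETED] = value }
}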

New in Alpha

Alpha libraries are libraries under active development—APIs may be added, changed, or removed, but what’s in the library is tested and should be highly functional.

AppSearch

AppSearch is a new on-device search library which provides high performance and feature-rich full-text search functionality. Compared to SQLite, AppSearch supports multiple world languages, simplifies ranking query results, and offers lower latency for indexing and searching over large datasets.

AppSearch 1.0.0-alpha01 is released with LocalStorage support, which allows your application to manage structured data, called “documents”, and then query over it. Your application defines what the structure looks like using “schema types”. For instance, you can model a Message as a schema type with data such as subject, body, and sender.

Use builders to create documents of a schema type and then add them to storage. Querying for “body:fruit” will retrieve all documents with the term “fruit” in the body of the Message.

In Android S, AppSearch will also offer PlatformStorage so you can share your application’s data with other applications securely, and reduce your application’s binary size by not having to link additional native libraries. This is currently not available in Jetpack because the library doesn’t target the Android S SDK yet.

Centralized storage on Android S+ for integrating into device-wide search

Room

Room is the recommended data persistence layer, providing increased usability and safety over the platform.

Room 2.4.0-alpha brings support for auto-migrations. When your database schema changes, you now declare an @AutoMigration and indicate from which version to which version you want to migrate, and Room generates the migrations for you. For more complex migrations, you can still use the Migration class:

@Database(
-   version = 1,
+   version = 2,
    entities = { Doggos.class },
+   autoMigrations = {
+         @AutoMigration (from = 1, to = 2)
+     }
  )
public abstract class DoggosDatabase extends RoomDatabase { }

The stable Room 2.3.0 release brings experimental support for Kotlin Symbol Processing, which showed a 2x speed improvement over KAPT in our benchmarks of Kotlin code, as well as built-in support for enums and RxJava3.

Room has also introduced a QueryCallback class—which provides a callback when SQLite statements are executed, to simplify tasks like logging—as well as the new @ProvidedTypeConverter annotation, which allows more flexibility when creating type converters.
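
For instance, here’s a hedged sketch that logs every SQL statement executed by the DoggosDatabase from the snippet above:

// Attach a QueryCallback on its own executor so logging doesn't block queries.
val db = Room.databaseBuilder(context, DoggosDatabase::class.java, "doggos.db")
    .setQueryCallback({ sqlQuery, bindArgs ->
        Log.d("RoomQuery", "SQL: $sqlQuery  args: $bindArgs")
    }, Executors.newSingleThreadExecutor())
    .build()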

WorkManager

The WorkManager library—Android’s recommended way to schedule deferrable, asynchronous tasks that run even if the app exits or the device restarts—has made improvements to reliability with task reconciliation, ensuring all tasks are executed, and a variety of workarounds for specific Android OS versions.

The latest versions of WorkManager feature improved support for multi-process apps, including performance benefits from unifying work request scheduling to a single process and limiting database growth when scheduling many requests.

Version 2.7, now in alpha and targeting the Android S SDK, provides additional support for the platform’s new foreground restrictions. See the Effective Background Tasks on Android talk for more details.
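
As a hedged sketch of the new foreground-restriction support in 2.7, expedited work can be requested like this (SyncWorker is a hypothetical worker that also implements getForegroundInfo() for older OS versions):

// Ask WorkManager to run this work expedited, falling back to regular work
// if the app has exhausted its expedited quota.
val request = OneTimeWorkRequestBuilder<SyncWorker>()
    .setExpedited(OutOfQuotaPolicy.RUN_AS_NON_EXPEDITED_WORK_REQUEST)
    .build()
WorkManager.getInstance(context).enqueue(request)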

The Background Tasks Inspector is available in Android Studio Arctic Fox, allowing you to easily view and debug WorkManager jobs when using the latest versions of the library:


Background Tasks Inspector

Navigation

The Navigation library, Jetpack’s framework for moving between destinations in an app, now provides support for multiple back stacks and simplifies cases where destinations sit at the same depth, such as with a bottom navigation bar.
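
As a hedged sketch, when handling bottom navigation selections yourself, you can save and restore each tab’s back stack with NavOptions (itemId is assumed to be the selected menu item’s destination ID; recent NavigationUI versions do this for you):

// Pop to the graph's start destination, saving the current tab's back stack,
// and restore any previously saved state for the destination being selected.
navController.navigate(itemId, null, navOptions {
    popUpTo(navController.graph.startDestinationId) { saveState = true }
    launchSingleTop = true
    restoreState = true
})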

Macrobenchmark

The Macrobenchmark library extends Jetpack’s benchmarking coverage to app startup and integrated behaviors like scrolling performance. The library can be used remotely to track metrics in continuous integration testing or locally with profiling results viewable from Android Studio. Check out the Google I/O talk for all the details.

Google Shortcuts library

For developers who’d like to integrate more closely with Google Assistant, the Google Shortcuts library provides a way to expose actions to Google Assistant and other Google services through the existing ShortcutInfo class.

You can send up to fifteen shortcuts at a time through the ShortcutManager to be shown on Google Assistant, among other services, making them available for voice and other interactions.

To implement this, define a shortcut with an Intent and a capability binding; this binding provides semantically-meaningful information that will help Google services figure out the best way to surface it to users.

// Expose a "Cappuccino" action to Google Assistant and other services.
val siCompat = ShortcutInfoCompat.Builder(ctx, "id_cappuccino")
    .setShortLabel("Cappuccino")
    .setIntent(Intent(ctx, OrderCappuccino::class.java))
    .addCapabilityBinding(
        "actions.intent.ORDER_MENU_ITEM",
        "menuItem.name",
        listOf("cappuccino")
    )
    .build()

ShortcutManagerCompat.pushDynamicShortcut(ctx, siCompat)

EmojiCompat

All user-generated content in your app contains emoji, and supporting modern emoji is a key part of making your app ✨! The EmojiCompat library, which supports modern emoji on API 19 and higher, has moved to a new artifact, :emoji2:emoji2, which replaces the previous :emoji:emoji artifact. The new emoji2 library adds automatic configuration using the App Startup library, so you don’t have to add any code to display modern emoji!

AppCompat adds emoji2 starting with AppCompat 1.4. If your app uses AppCompat, users will see modern emoji ⭐ without any further configuration. Apps that aren't using AppCompat can add :emoji2:emoji2-views. For custom TextViews, you can support modern emoji by using the helpers in :emoji2:emoji2-views-helpers or by subclassing AppCompat views.

Jetpack Compose

Jetpack Compose is Android’s modern toolkit for building native UI. It simplifies and accelerates UI development on Android. Jetpack Compose is currently in beta, and planned to go stable in July. Many of the libraries listed here, as well as others that you might already be using, have introduced features specifically for integration with Jetpack Compose. Ranging from Activity to ViewModel, Navigation, or Hilt, all of these libraries can make adopting Compose in your app smoother. Find out more about how to use them from this Google I/O talk.

Form factors

Jetpack makes it easier to work with different form factors, including foldables, large screen devices, and Wear devices. We've introduced new guidelines for large screen development along with improvements to Jetpack libraries such as WindowManager and SlidingPaneLayout. Read all the details in this blog post.

Conclusion

This was a (relatively) quick overview of what’s new in Jetpack. Check out the AndroidX release notes for all the update details of each library and the Google I/O talks for more information on some of them.