Debugging your application becomes easier with the inspectors provided in Android Studio Arctic Fox: use the Background Task Inspector to understand the status of your WorkManager workers, the Layout Inspector for UI (both Android Views and Compose), and the Database Inspector for database debugging.
To see the inspectors in action, check out What’s new in Android development tools.
#3: New features in Kotlin
We keep improving Kotlin on Android at all levels, from tools to APIs, and giving you different ways to learn. Kotlin Symbol Processing (KSP), now in alpha, provides a simplified compiler plugin API that can run up to 2 times faster than KAPT. Together with JetBrains, we’re addressing performance issues in the IDE and we’re seeing up to 20x faster auto-import suggestions. We added StateFlow support to DataBinding and new APIs for observing Flows in the UI without DataBinding. To learn about all the improvements we’ve made for Kotlin, check out the State of Kotlin on Android talk:
You can find all of this year’s Google I/O talks covering Modern Android Development in this playlist:
Posted by Greg Hartrell, Head of Product Management, Games on Android & Google Play
With a surge of new gamers and an increase in time spent playing games in the last year, it’s more important than ever for game developers to delight and engage players. To help developers with this opportunity, the games teams at Google are back to announce the return of the Google for Games Developer Summit 2021 on July 12th-13th.
Hear from experts across Google about new game solutions they’re building to make it easier for you to continue creating great games, connecting with players, and scaling your business. Registration is free and open to all game developers.
Register for the free online event at g.co/gamedevsummit to get more details in the coming weeks. We can’t wait to share our latest innovations with the developer community.
Posted by Oscar Wahltinez, Developer Relations Engineer, Google
Users are seeing more value in larger screens, and the benefits of doing more
with a single device. Apps designed for large screen devices increase those
benefits even further.
The ability to fold a screen offers better ergonomics for large devices. When
folded, you can fit a tablet-sized screen in your pocket — unlocking
utility that was previously unavailable on a portable device. Thinking about
our app ecosystem, we’re excited because this is a hardware shift that
is driving new expectations around what you can do from a handheld device. We
see the demand for larger screens extending to tablets too, which have greatly
increased in popularity, given the similar app experience.
Technological breakthroughs and our understanding of ergonomics have played a role in device form factors.
In this blog post, we'll explain what you should do to prepare your apps for
large screens, and how recent updates have made developing your app easier.
But first, let’s talk about what we're seeing with large screens —
and why you should optimize your app.
Why large screens
There are many ways to use foldable devices, including a number of postures illustrated here.
Over the past year, we’ve seen device makers release exciting new
foldable and tablet devices. Demand has increased as users are doing more than
ever from these devices. Altogether, developers can reach more than 250
million active foldables, tablets, and Chromebooks by building for Android
large screen devices today. Sales of tablet devices grew 16% in 2020 with
analysts expecting more than 400 million Android tablets by 2023, and
foldables are redefining what’s possible on premium devices. Android
apps can also run on ChromeOS, which is now the
second most popular
desktop OS.
Large screen ready
Larger screens are changing how users interact with their device. These
devices allow you to edit slide decks while looking at notes, look up
restaurant recommendations while planning a night out, or watch a video while
chatting with friends. Let’s talk about base-level support —
features an app must support to be “large screen ready”. There are
three main areas of focus when it comes to large screen readiness:
The first step is to ensure that your app is designed for large screens. To
make this easier, we’ve defined specific window size breakpoints and
device classes for you to optimize for. Add tablet layouts for displays where
the shortest dimension is >600dp, and ensure your apps
go edge-to-edge. Developers should also plan for their app to be used in both portrait and
landscape modes, since larger screens are more likely to be used in landscape.
We’ll also be talking about Material adaptive components that help developers
make better use of the increased space.
Since foldable and large screen devices have a variable window size, adaptive
layouts work better than splitting experiences based on screen size.
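As a sketch of the breakpoint guidance above, the decision can be reduced to a single comparison against the window's shortest dimension. The `LayoutClass` names and the function are illustrative stand-ins, not a Jetpack API:

```kotlin
// Pick a layout bucket from the window's shortest dimension in dp.
// 600dp is the tablet breakpoint described above; the enum names
// are illustrative, not library types.
enum class LayoutClass { Compact, Expanded }

fun layoutClassFor(shortestDimensionDp: Int): LayoutClass =
    if (shortestDimensionDp > 600) LayoutClass.Expanded else LayoutClass.Compact

// Example: a 411dp-wide phone window is Compact, an 800dp
// tablet window is Expanded.
```

In a real app this value would come from the current window metrics rather than a hard-coded number, and the buckets would drive which layout resources are inflated.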
Multitasking
Split screen (or multi-window mode) and gestures like drag and drop are
becoming natural interactions that users expect to work seamlessly on their
large screen devices. Your app should handle multitasking seamlessly by being
resizable. Handling fold and unfold events and planning for your app to be in
multi-window mode prevents it from becoming letterboxed.
Drag and drop can be a natural interaction in large screen layouts, even within the same
app.
By enabling multiple instance support, users can run multiple copies of your
app side by side. This lets users compare two products, reference notes while
writing a document, or keep a calendar in view while planning an event.
Input modes
Since many people use larger screens for productivity, tablets should support
basic keyboard, mouse and stylus usage.
Users of Android apps on ChromeOS devices often have a keyboard; apps should ensure that standard keyboard navigation and shortcuts are available to provide improved accessibility.
Component updates
Several UI components across Jetpack and Material Design libraries have been
updated to help you build a flexible user experience to scale your phone's UI
to a larger screen.
SlidingPaneLayout
One of the most common adaptive layouts to optimize your app for large screens
is implementing a list-detail UI. For example, a messaging app that lists
messages on one side with the message detail on the other.
SlidingPaneLayout automatically adapts to configuration changes to provide a good user experience across different layout sizes.
UIs that would sit on top of each other on a smaller screen can now easily be
laid out side by side. For this, you can use the updated version of the SlidingPaneLayout library: updated to support a two-pane style layout, SlidingPaneLayout uses the width of the two panes to determine how to lay out the UI. It does
that by automatically determining whether the panes can be laid out
side by side based on the content width and available space. For example, if the list pane is measured
to have a minimum width of 200dp and the detail pane needs 400dp, then the SlidingPaneLayout automatically shows the two panes side by side if it has at least 600dp of
width available.
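The width arithmetic from that example can be sketched as a pure function. The function name is ours; SlidingPaneLayout performs an equivalent check internally during measurement:

```kotlin
// Two panes fit side by side when the available width covers both
// minimum pane widths; otherwise they overlap. This mirrors the
// 200dp + 400dp within 600dp example above.
fun fitsSideBySide(
    listMinWidthDp: Int,
    detailMinWidthDp: Int,
    availableWidthDp: Int
): Boolean = availableWidthDp >= listMinWidthDp + detailMinWidthDp

// Example: fitsSideBySide(200, 400, 600) is true;
// with only 599dp available the panes overlap instead.
```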
SlidingPaneLayout is used in our sample application
IOSched.
We have updated the library to recognize and adapt to folds and hinges. For
example, on a device with a hinge that blocks part of the screen, the library
automatically places your content on either side of it.
We have also introduced lock modes, which give you control over the swipe
behavior when the panes overlap (programmatic switching is also supported).
For example, to prevent users from swiping to an empty pane, you may want to
require a tap on a list item to load the detail pane, while still allowing a
swipe back to the list. On a foldable device or tablet that has room to show
both panes side by side, the lock modes are ignored.
NavRail
A
vertical Navigation Rail
is functionally equivalent to bottom navigation and provides a more ergonomic
navigation experience on larger screens. As you scale your UI, NavRail also
improves reachability: larger screens tend to be held by the sides, whereas
phones are usually held from the bottom.
NavRail automatically changes the location of the navigation menu depending on configuration changes.
For example, NavRail can help if vertical scrolling is key to your app. In
those cases, a bottom navigation bar decreases the amount of content
that’s visible, especially when tablet devices are being used in
landscape orientation.
Other Components
We've also made updates across multiple other components. One of the biggest
pitfalls when apps move to a larger screen is when UIs are stretched
edge-to-edge across the whole screen. To help prevent this, we’ve added
default Max Width values to certain Material Components where this commonly
happens, for example:
Buttons
TextFields
Sheets
We will add more components to this list in the future. These changes provide
opinionated defaults to help your apps adapt and look better out of the box on
large screen devices. Find more information about using size constraints with
components in
the Material Design guidelines.
Most foreground UI elements should have a maximum width value.
WindowManager Jetpack library
Beyond component updates to help you scale your UI, we also have the
WindowManager Jetpack library to help you build better experiences on these devices. This library is now
available in alpha and it provides a common API surface for supporting
different device types, starting with foldables and tablets.
You can use WindowManager to detect display features such as folds or hinges.
It also gives information about how the display feature affects your app, so
you can create an optimal experience. For example, reacting to the foldable
device state changes when the device is folded into tabletop mode while the
user is watching a video.
Applications should seamlessly adapt to a growing number of device configurations.
WindowManager also provides a couple of convenience methods to retrieve the
current and maximum
WindowMetrics
information in a backward compatible way, starting from API level 14.
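The tabletop reaction described above can be sketched with local stand-ins for the fold state. In the real WindowManager library this information comes from `FoldingFeature`; the enums and function here are ours, kept minimal so the posture check is visible:

```kotlin
// A half-opened fold along the horizontal axis is commonly treated as
// "tabletop" posture (screen half flat, half upright). These enums are
// local stand-ins for the WindowManager fold state and orientation.
enum class FoldState { FLAT, HALF_OPENED }
enum class FoldOrientation { HORIZONTAL, VERTICAL }

fun isTabletopPosture(state: FoldState, orientation: FoldOrientation): Boolean =
    state == FoldState.HALF_OPENED && orientation == FoldOrientation.HORIZONTAL

// Example: a video player could move playback controls to the lower
// half of the screen whenever isTabletopPosture(...) returns true.
```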
Platform changes
Display API deprecations
Your app needs to determine the screen or display size in order to render
content appropriately for each device. With the introduction of the
WindowMetrics API, a number of methods related to display size have been deprecated. For a
backwards-compatible replacement, you should use the
WindowManager Jetpack library.
Exclusive resources
Android 10 introduced the ability to have multiple resumed apps running at
the same time, with a single “top resumed” application. Most
applications benefit from this change without needing any updates. The most
notable exception is if your application uses an exclusive resource like the
microphone or the camera. See
our previous blog post
for more details.
Case studies
Optimizing your app for large screens can improve the experience for your
users, as well as deliver on business results. We’re seeing an increased
number of apps take advantage of the opportunities with large screens on
Google Play. As an example,
Google Duo
implemented tablet and foldable support to enhance their user experience, and
saw an increase in app ratings and user engagement.
Google Duo's optimized experience for foldable devices, such as the Samsung Galaxy Z Fold2
In addition to Google Duo's enhanced user experience, we've modernized many
additional apps to use adaptive layouts so they can take advantage of large
screens and foldable devices:
Chrome added improved tab navigation for larger screens
YouTube redesigned its UI to improve usability on foldable devices
Google Photos displays more UI elements, like a search bar, on larger
screens
Google Calendar provides a more ergonomic UI on larger screens
Learn more
To learn more about foldables and large screen devices, see the following
resources:
We’re announcing our biggest update yet to the Wear platform, with new features, APIs and tools to help developers create beautiful, high quality wearable experiences. In this blog post we highlight how we’re making it easier to build great apps for Wear, and how you can start working with pre-release versions of these APIs and tools to prepare your app for the new platform.
First things first: tools
The first thing you’ll want to do is download and install Android Studio Arctic Fox Beta, which includes a developer preview of the new Wear system image as well as improved tools for developing and testing Wear apps without a device:
Emulator with new Wear system image (preview) - A developer preview of the new Wear system image is now available so that you can use and play with the newest platform updates!
Wear app to phone pairing - We’ve made it much simpler to pair Wear emulators with your phone directly from Android Studio, so you can stay in the IDE to develop, test, and iterate. The new pairing assistant guides you step by step through pairing Wear emulators with physical or virtual phones directly in Android Studio! You can start by going to Device Dropdown > Wear OS emulator pairing assistant. Note that this will currently pair with the Wear OS 2 companion, and a Wear companion for the new release will be coming soon. Learn more.
Virtual Heart Rate Sensor - The emulator now has a virtual heart rate sensor, including support for the Heart Rate Sensor API, to help you create and test apps that respond differently to activity levels. Make sure you are running at least Android Emulator v30.4.5 downloaded via the Android Studio SDK Manager.
We also announced a new watch face design tool built by Samsung. This new tool will make it a breeze to develop watch faces for all devices running Wear, and is coming soon.
New developer documentation and design guidance
In preparation for the new version of Wear we’ve completely revamped our developer site with new API documentation, learning pathways, codelabs and samples. And with Wear soon to feature a completely new consumer experience based on the latest from Material Design, we’ve updated our design guidelines to cover the new design system, UI components, UX patterns, and styles. Learn more.
New Jetpack APIs
From new Jetpack APIs tailored for small (round or square) screens and designed to optimize battery life to the Jetpack Tiles API, we’re adding a number of new features to help you build great Wear experiences, reduce boilerplate code, and write code that works consistently across Wear versions and devices:
Tiles - Tiles give users fast, predictable access to the information and actions they rely on most. We’ve now opened up Tiles for developers, and we’ve already been working with several early access partners to add Tiles to their apps. Here are a few coming soon:
The Tiles API is in alpha and supported on devices running Wear OS 2 and up, so you can create Tiles for all the devices in the Wear ecosystem. Tiles will start to show up on consumer watches with the new platform update. Learn more
Task switching and Ongoing Activities - The new version of Wear makes it easy for users to switch back and forth between apps. With a minimal amount of code, you can use the new Ongoing Activities API to let your users return to your app after they’ve navigated away (to start some other task such as music playback) by tapping an activity indicator icon at the bottom of the watch face, double tapping on the side button, or via the Recents section of the global app launcher. The Ongoing Activities API is now in alpha. Learn more.
Health Services - We also announced today the beginning of a health and fitness platform, created in collaboration with Samsung. This platform provides fitness and health data generated from sensors, contextually-aware algorithms, and all-day health monitoring. You can use the APIs to create high quality, powerful fitness and health experiences for wearables with a simpler development experience. The platform handles all the work to manage your hardware and sensors for you, removing one of the biggest challenges in managing it yourself - knowing when to stop work so the battery doesn't drain. The alpha of this Health Services platform is available today. Learn more.
Other new APIs - We’ve released several other new APIs in Jetpack to make wearable app development easier, including support for curved text, input, watch faces, complications and remote interactions. You can learn more about these APIs here.
Google Play Store changes
We know that user engagement and app discovery are an important part of growing your business, so big updates coming to Google Play will soon make it much easier for users to discover great app experiences on the watch: using search to find watch apps, browsing the Wear category for app recommendations, and installing apps to the watch directly from the phone.
Learn more
We’re excited for the next generation of Wear. To learn more about developing apps for smartwatches, see d.android.com/wear. We’re excited to see what you build!
Android Jetpack is a suite of libraries, tools, and guidance to help developers follow best practices, reduce boilerplate code, and write code that works consistently across Android versions and devices. Today, 84% of the top 1000 apps on Google Play rely on Jetpack.
Here’s a round-up of the latest updates in Jetpack - an extended version of our What’s new in Jetpack talk!
New in Stable
CameraX
The CameraX library provides a unified API surface for accessing camera functionality across OS versions, including device-specific compatibility fixes and workarounds. Some of the latest improvements to the library address common feature requests, including support for adjusting exposure compensation and access to more detailed information about camera state and features. Additionally, camera settings like FPS range can now be changed via Camera2Interop while the camera is running. The library also brings support for the latest device and OS features, including high-dynamic-range preview, zoom ratio controls, and support for Android’s Do Not Disturb mode. Perhaps most importantly, though, the library has continued to address performance, resulting in faster image capture and faster initialization, especially on older devices.
Hilt
Hilt is Jetpack’s recommended dependency injection solution built on top of Dagger. As part of the transition to stable, Hilt’s ViewModel support has moved up into the core Hilt Android APIs and SavedStateHandle has been added as a default dependency available in the ViewModelComponent. Also, Hilt is now integrated with Navigation and Compose: you can obtain an annotated Hilt ViewModel that is scoped to a destination or the navigation graph itself. Developers have already started using Hilt in their apps. Read about their experience in this blog post.
Paging 3.0
The Paging library allows you to load and display small chunks of data to improve
network and system resource consumption. This release features a complete rewrite in Kotlin with first-class support for coroutines and Flow, asynchronous loading with RxJava and Guava primitives, and overall improvements to the repository and presentation layers.
The 3.0 release is a substantial improvement in usability over Paging 2, and the rewrite was planned with partial and staged migrations in mind so that developers can transition on their own schedules. Check out the Paging 3.0 documentation and the Paging 3.0 codelab for details and hands-on experience.
ConstraintLayout and MotionLayout
ConstraintLayout, Jetpack’s flexible system for designing layouts, and MotionLayout, an API aimed at managing motion and widget animation, are now stable. MotionLayout now includes support for foldable devices, image filters, and motion effects. To find out more about what’s new in design tools, check out this Google I/O talk.
Security Crypto
The Security Crypto library allows you to safely and easily encrypt files and SharedPreferences. To encrypt SharedPreferences, create an EncryptedSharedPreferences object with the appropriate key and scheme and then use it like a standard SharedPreferences object.
val prefs: SharedPreferences = EncryptedSharedPreferences.create(
    context,
    "prefs_file_name",
    mainKey,
    EncryptedSharedPreferences.PrefKeyEncryptionScheme.AES256_SIV,
    EncryptedSharedPreferences.PrefValueEncryptionScheme.AES256_GCM
)
// Use the resulting SharedPreferences object as usual.
prefs.edit()
    .putBoolean("show_completed", true)
    .apply()
Fragment
Over the past year, the Fragment library has undergone a major effort to clean up its internal implementation and reduce undocumented behavior, making it easier for developers to follow best practices in their apps and write reliable tests. This lays the groundwork for future improvements to the library, like supporting multiple back stacks in Navigation, and it may require some work to accommodate strict enforcement of API contracts. In practice, you should pay careful attention to your tests after updating the library. Check out the Fragment release notes to see specific cases to watch out for.
Recent releases have also introduced ActivityResult integration, making it possible to register for Activity results from a fragment. Fragment has also added a new FragmentOnAttachListener interface to replace the less-flexible onAttachFragment method. Existing code that overrides this method in Fragment or FragmentActivity will still work, but we’ve deprecated onAttachFragment to help prevent new code from accidentally adopting a less-flexible approach.
// Obtain the fragment manager. May be a childFragmentManager,
// if in a fragment, to observe child attachment.
val fm = supportFragmentManager
val listener = FragmentOnAttachListener { fragmentManager, fragment ->
    // Respond to the fragment being attached.
}
fm.addFragmentOnAttachListener(listener)
New in Beta
Once a library is feature complete, it moves to Beta for stabilization. At this stage, its APIs change only in response to critical issues or community feedback.
DataStore
DataStore provides a robust data storage solution that addresses the shortcomings of SharedPreferences while maintaining a simple, highly usable API surface. DataStore brings support for best practices like Kotlin coroutines with Flow and RxJava. DataStore allows you to store key-value pairs, via Preference DataStore or typed objects backed by protocol buffers, via Proto DataStore. You can also plug in your own serialization solution, like Kotlin Serialization.
New in Alpha
Alpha libraries are libraries under active development—APIs may be added, changed, or removed, but what’s in the library is tested and should be highly functional.
AppSearch
AppSearch is a new on-device search library which provides high performance and feature-rich full-text search functionality. Compared to SQLite, AppSearch supports multiple world languages, simplifies ranking query results, and offers lower latency for indexing and searching over large datasets.
AppSearch 1.0.0-alpha01 is released with LocalStorage support, which allows your application to manage structured data, called “documents”, and then query over it. Your application defines what the structure looks like using “schema types”. For instance, you can model a Message as a schema type with data such as subject, body, and sender.
Use builders to create documents of a schema type and then add them to storage. Querying for “body:fruit” will retrieve all documents with the term “fruit” in the body of the Message.
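To illustrate the semantics of a query like “body:fruit”, here is a plain-Kotlin stand-in; this is not the AppSearch API itself, just a minimal model of what such a query matches:

```kotlin
// A minimal stand-in for an AppSearch "Message" schema type, used only
// to illustrate what a "body:fruit" query retrieves.
data class Message(val subject: String, val body: String, val sender: String)

// Return documents whose body contains the given term (case-insensitive,
// whole-word match), mirroring the "body:fruit" example above.
fun queryBody(docs: List<Message>, term: String): List<Message> =
    docs.filter { msg ->
        msg.body.split(Regex("\\W+")).any { it.equals(term, ignoreCase = true) }
    }
```

With real AppSearch, indexing and ranking are handled by the library; this sketch only shows which documents a body-scoped term query would select.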
In Android S, AppSearch will also offer PlatformStorage so you can share your application’s data with other applications securely, and reduce your application’s binary size by not having to link additional native libraries. This is currently not available in Jetpack because the library doesn’t target the Android S SDK yet.
Centralized storage on Android S+ for integrating into device-wide search
Room
Room is the recommended data persistence layer, providing increased usability and safety over the platform.
Room 2.4.0-alpha brings support for auto-migrations. When your database schema changes, you now declare an @AutoMigration and indicate from which version to which version you want to migrate, and Room generates the migrations for you. For more complex migrations, you can still use the Migration class:
@Database(
- version = 1,
+ version = 2,
entities = { Doggos.class },
+ autoMigrations = {
+ @AutoMigration (from = 1, to = 2)
+ }
)
public abstract class DoggosDatabase extends RoomDatabase { }
The stable Room 2.3.0 release brings experimental support for Kotlin Symbol Processing, which in our benchmarks of Kotlin code showed a 2x speed improvement over KAPT, as well as built-in support for enums and RxJava3.
Room has also introduced a QueryCallback class—which provides a callback when SQLite statements are executed, to simplify tasks like logging—as well as the new @ProvidedTypeConverter annotation, which allows more flexibility when creating type converters.
WorkManager
The WorkManager library—Android’s recommended way to schedule deferrable, asynchronous tasks that run even if the app exits or the device restarts—has made improvements to reliability with task reconciliation, ensuring all tasks are executed, and a variety of workarounds for specific Android OS versions.
The latest versions of WorkManager feature improved support for multi-process apps, including performance benefits from unifying work request scheduling to a single process and limiting database growth when scheduling many requests.
Version 2.7, now in alpha and targeting the Android S SDK, provides additional support for the platform’s new foreground restrictions. See the Effective Background Tasks on Android talk for more details.
The Background Tasks Inspector is available in Android Studio Arctic Fox, allowing you to easily view and debug WorkManager jobs when using the latest versions of the library:
Background Tasks Inspector
Navigation
The Navigation library, Jetpack’s framework for moving between destinations in an app, now provides support for multiple backstacks and simplifies cases where destinations sit at the same depth, such as a bottom navigation bar.
Macrobenchmark
The Macrobenchmark library extends Jetpack’s benchmarking coverage to app startup and integrated behaviors like scrolling performance. The library can be used remotely to track metrics in continuous integration testing or locally with profiling results viewable from Android Studio. Check out the Google I/O talk on all the details:
For developers who’d like to integrate more closely with Google Assistant, the Google Shortcuts library provides a way to expose actions to Google Assistant and other Google Services through the existing ShortcutInfo class.
You can send up to fifteen shortcuts at a time through the ShortcutManager to be shown on Google Assistant, among other services, making them available for voice and other interactions.
To implement this, define a shortcut with an Intent and a capability binding; this binding provides semantically-meaningful information that will help Google services figure out the best way to surface it to users.
// expose a "Cappuccino" action to Google Assistant and other services
ShortcutInfoCompat siCompat =
ShortcutInfoCompat.Builder(ctx, "id_cappuccino")
.setShortLabel("Cappuccino")
.setIntent(Intent(ctx, OrderCappuccino::class.java))
.addCapabilityBinding(
"actions.intent.ORDER_MENU_ITEM",
"menuItem.name",
asList("cappuccino")
)
.build()
ShortcutManagerCompat.pushDynamicShortcut(ctx, siCompat)
EmojiCompat
All user-generated content in your app can contain emoji, and supporting modern emoji is a key part of making your app ✨! The EmojiCompat library, which supports modern emoji on API 19 and higher, has moved to a new artifact, :emoji2:emoji2, which replaces the previous :emoji:emoji artifact. The new emoji2 library adds automatic configuration using the AppStartup library, so you don’t have to add any code to display the latest emoji!
AppCompat adds emoji2 starting with AppCompat 1.4. If your app uses AppCompat, users will see modern emoji ⭐ without any further configuration. Apps that aren't using AppCompat can add :emoji2:emoji2-views. For custom TextViews, you can support modern emoji by using the helpers in :emoji2:emoji2-views-helpers or by subclassing AppCompat views.
Jetpack Compose
Jetpack Compose is Android’s modern toolkit for building native UI. It simplifies and accelerates UI development on Android. Jetpack Compose is currently in beta and planned to go stable in July. Many of the libraries listed here, as well as others that you might already be using, have introduced features specifically for integration with Jetpack Compose. Ranging from Activity to ViewModel, Navigation, and Hilt, all of these libraries can make adopting Compose in your app smoother. Find out more about how to use them from this Google I/O talk:
Form factors
Jetpack makes it easier to work with different form factors, including foldables, large screen devices, and Wear devices. We've introduced new guidelines for large screen development along with improvements to Jetpack libraries such as WindowManager and SlidingPaneLayout. Read all the details in this blog post.
Conclusion
This was a (relatively) quick overview of what’s new in Jetpack. Check out the AndroidX release notes for all the update details of each library and the Google I/O talks for more information on some of them.
Posted by Sara N-Marandi, Product Manager, Android Platform Product
People want an OS and apps that they can trust with their most personal and sensitive information. Privacy is core to Android’s product principles. As shared in the “What’s new in Android Privacy” session, Android 12 continues to expand on this existing foundation by making the platform even more private.
This release will give users more transparency around the data being accessed by apps while providing simple controls to make informed choices. Android is also investing in reducing the scope of permissions so that apps only have access to the data they need for the features they provide. Let’s look at some of these important changes we’ve made in Android 12 to protect user privacy.
Privacy dashboard: Users often tell us that they want to understand what data apps use. With the new Privacy Dashboard, users will have a simple and clear timeline view of the last 24 hours of accesses to location, microphone, and camera. You can also share more context about your app’s data usage with a new permission intent API in Android 12. The Privacy dashboard will be available to try in Beta 2.
We encourage all developers to review their code and understand its data access needs, including those in third-party SDKs, and to make sure all accesses have justifiable use cases. To help with that, in Android 11 we added the Data access auditing APIs, which make it easy to audit your current data access by tracking which parts of your code access private data.
Figure 1. Privacy dashboard and location access timeline in the past 24 hours.
Microphone and camera indicators: In Android 12 we’re adding transparency to microphone and camera access. Going forward, users will know in real time when an app accesses their microphone or camera feeds. By simply going into Quick Settings, users can view the apps accessing their data. If the access is unwarranted, users can quickly navigate to the app permission page to revoke permissions.
Developers should review their use of microphone and camera and proactively remove unexpected access. For example, you should ensure that your app does not access these sensors before the user clicks on a feature that needs access. The Microphone and camera indicators will be available to try in Beta 2.
Figure 2. Microphone and camera indicators and toggles.
Microphone and camera toggles: You may have seen people placing stickers on cameras or plugging audio blockers into their phones. In Android 12, we’re introducing two new controls that allow users to quickly and easily cut off apps’ access to the microphone and camera on the device. To ensure user safety, emergency calls will be exempted.
If an app with permissions attempts to access the microphone or camera but the user has the sensors turned off, the system will display a message to inform the user that they must turn the sensors back on in order to use the app’s features. If your app follows permissions best practices, then you don’t need to do anything different to incorporate the toggle state. The Microphone and camera toggles will be available to try in Beta 2.
Approximate location: Over the last two releases, we’ve made the location permission more fine-grained. First, we separated background and foreground access. Then, we added an “only this time” option to further restrict access to background location. We’re seeing users respond positively to these controls and choose them more often. When given the option, users elect to share less through foreground location access about 80% of the time.
In Android 12, we will give users more control over their location data. Users will have a clear choice regarding the precision of location provided to the app by selecting approximate location.
We encourage you to review your use case for location and request ACCESS_COARSE_LOCATION if your features don’t need the user’s precise location. You should also be prepared for users to reduce location precision. Please make sure your app still works when users select approximate location. Approximate location will be available to try in Beta 1.
Figure 3. Location permission request dialog with approximate and precise selection
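One way to handle this is to request both location permissions together so the Android 12 dialog can offer the precise/approximate choice; a sketch using the AndroidX Activity result API (the three handler functions are hypothetical, app-specific names used only for illustration):

```kotlin
// Sketch: requesting location so the user can choose precise or
// approximate (Android 12+). Requesting both permissions together
// lets the system show the precision selector.
import android.Manifest
import androidx.activity.result.contract.ActivityResultContracts
import androidx.appcompat.app.AppCompatActivity

class MapActivity : AppCompatActivity() {
    private val locationRequest = registerForActivityResult(
        ActivityResultContracts.RequestMultiplePermissions()
    ) { grants ->
        when {
            grants[Manifest.permission.ACCESS_FINE_LOCATION] == true ->
                startPreciseLocationUpdates()
            grants[Manifest.permission.ACCESS_COARSE_LOCATION] == true ->
                startApproximateLocationUpdates() // user chose approximate
            else -> showLocationUnavailableUi()
        }
    }

    fun requestLocation() {
        locationRequest.launch(arrayOf(
            Manifest.permission.ACCESS_FINE_LOCATION,
            Manifest.permission.ACCESS_COARSE_LOCATION,
        ))
    }

    // Hypothetical app-specific handlers, shown for illustration only.
    private fun startPreciseLocationUpdates() {}
    private fun startApproximateLocationUpdates() {}
    private fun showLocationUnavailableUi() {}
}
```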
Clipboard read notification: Content copied to the clipboard can contain sensitive information, as users frequently copy emails, addresses, and even passwords. Android 12 notifies users every time an app reads from their clipboard. Users will see a toast at the bottom of the screen each time an app calls getPrimaryClip(). The toast won’t appear if the clipboard data originates from the same app. You can minimize access by first checking getPrimaryClipDescription() to learn about the type of data in the clipboard. The recommended best practice is to only access the clipboard when the user understands why the access occurred. Clipboard read notification will be available to try in Beta 2.
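A minimal sketch of that best practice, checking the clip description before reading (ClipboardManager and ClipDescription are the standard framework APIs; the function name is illustrative):

```kotlin
// Sketch: inspect the clipboard's description first; only call
// getPrimaryClip() (which triggers the Android 12 toast) when the
// content type is actually useful to the feature.
import android.content.ClipDescription
import android.content.ClipboardManager
import android.content.Context

fun readClipboardTextIfPresent(context: Context): CharSequence? {
    val clipboard = context.getSystemService(ClipboardManager::class.java)
    val description = clipboard.primaryClipDescription ?: return null
    if (!description.hasMimeType(ClipDescription.MIMETYPE_TEXT_PLAIN)) {
        return null // nothing we can use; avoid the read notification
    }
    // This call is what surfaces the clipboard-access toast to the user.
    return clipboard.primaryClip?.getItemAt(0)?.coerceToText(context)
}
```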
Nearby device permissions: Android 12 minimizes data access by adding a new runtime permission for nearby experiences that do not use location. Up until now, apps such as watch and headphone companion apps required the location permission to scan for nearby Bluetooth devices for pairing. We heard from users and developers that this was confusing and led to granting the permission to access location data when it wasn’t needed. For apps targeting Android 12, you’ll have the option to decouple nearby device discovery from the fine location permission for use cases like pairing devices by using the new BLUETOOTH_SCAN permission and by declaring usesPermissionFlags=neverForLocation. Once the device is paired, apps can use the new BLUETOOTH_CONNECT permission to interact with it. Apps that use Bluetooth scanning for location must still have the location permission. Nearby device permissions will be available to try in Beta 1.
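A sketch of how an app targeting Android 12 might request the new runtime permissions before scanning (this assumes BLUETOOTH_SCAN is declared in the manifest with android:usesPermissionFlags="neverForLocation"; the activity and scan function names are illustrative):

```kotlin
// Sketch: requesting the Android 12 nearby-device permissions at
// runtime before starting a Bluetooth scan for pairing.
import android.Manifest
import androidx.activity.result.contract.ActivityResultContracts
import androidx.appcompat.app.AppCompatActivity

class PairingActivity : AppCompatActivity() {
    private val bluetoothPermissions = registerForActivityResult(
        ActivityResultContracts.RequestMultiplePermissions()
    ) { grants ->
        if (grants.values.all { it }) startDeviceScan()
    }

    fun onPairClicked() {
        bluetoothPermissions.launch(arrayOf(
            Manifest.permission.BLUETOOTH_SCAN,    // discover nearby devices
            Manifest.permission.BLUETOOTH_CONNECT, // talk to the paired device
        ))
    }

    private fun startDeviceScan() { /* hypothetical scan logic */ }
}
```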
App hibernation: Last year we launched permissions auto-reset. If an app isn’t used for an extended period of time, Android automatically revokes permissions for the app. In the last 14 days permissions were reset for 8.5M apps. This year we’re building on permissions auto-reset by intelligently hibernating apps that have gone unused for an extended period - optimizing for device storage, performance and safety. The system not only revokes permissions granted previously by the user, but it also force-stops the app and reclaims memory, storage and other temporary resources. Users can bring apps out of hibernation simply by launching the app. App hibernation will be available to try in Beta 1.
Android 12 is our most ambitious privacy release to date. Along the way, we have engaged closely with our developer community to build a platform that puts privacy at the forefront while taking into consideration the impact on developers. We thank you for your continued feedback and support in making our platform private and safe for everyone. Learn more about these changes on the developer site.
Note: As we announced late last year, we've changed our version numbering scheme to match the number for the IntelliJ IDE that Android Studio is based on, 2020.3, plus our own patch number, as well as a handy code name to make it easier to remember and refer to. We'll be using code names in alphabetical order; the first is Arctic Fox, now in beta, and the next is Bumblebee, now in canary.
Today, we are excited to unveil Android Studio Arctic Fox (2020.3.1) Beta ❄️: the latest release of the official Android IDE focuses on Design, Devices, and Developer Productivity. It is available for download now on the beta channel for you to try out all the new features launched this week during Google I/O 2021!
Inspired by developer communities around the world, who despite the challenges of this past year continue to create amazing and innovative apps, we have delivered updates to our suite of tools around three major themes:
Rapid UI design - with
Jetpack Compose, it's never been easier to create modern UIs, and we have tools to help
complete that journey: you can create previews in different configurations
and navigate your code with Compose Preview, test it in isolation with
Deploy Preview to Device, and inspect the full app with Layout inspector.
Throughout iterations, you can quickly edit strings and numbers and see
immediate updates. Moreover, with the Accessibility Scanner in Layout
Editor, your View based layouts are audited for accessibility problems.
New devices, both large and small - reimagine and extend your app
beyond phones--whether it's for Wear OS, Google TV, or Android Auto, we have
prepared new emulators and system images, and even authentic simulations for
different testing scenarios: pair your watch and phone emulators with Wear
OS Pairing, take a virtual run with Wear OS heart rate sensors, switch
channels with Google TV Remote Control, and drive with Automotive OS Sensor
Replay.
Developer productivity boost - we want to ensure your workspace and
environment are ready for the latest systems and optimized for speed and
quality. Now you can enjoy a whole slew of new features and improvements
that come with a major update to IntelliJ 2020.3, test your app with what
Android 12 has to offer, improve your app performance with the updated UI
for Memory Profiler, understand background task relationships with
WorkManager Inspector, and use Non-Transitive R classes IDE Refactoring to
increase build speed.
In short, this is an upgrade you do not want to miss! ✨ There are many more
features and improvements surrounding these themes you can find in this Beta
version, so read or watch below for further highlights. Or, skip the reading,
download Android Studio Arctic Fox (2020.3.1) Beta in the
beta channel
and try out the latest features yourselves today! Give us feedback and help us
to continue to focus on the areas you care about most in the next version of
Android Studio.
What's new in Android development tools (I/O 2021)
What’s in Android Studio Arctic Fox (2020.3.1) Beta
Below is a full list of new features in Android Studio Arctic Fox (2020.3.1)
Beta, organized by the three major themes:
Design
Compose Preview - You can create previews of your Compose UI with
Compose Preview! By using the @Preview annotation, Compose previews can be
made to visualize multiple components at once in different configurations
(e.g., themes, devices) as well as build a mental map to help you
navigate your code.
Compose Preview
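As a sketch of what multiple previews of one composable look like (the composable and preview names here are illustrative, not from a specific project):

```kotlin
// Sketch: two @Preview configurations of the same composable,
// rendered side by side in the Compose Preview panel.
import android.content.res.Configuration
import androidx.compose.material.MaterialTheme
import androidx.compose.material.Text
import androidx.compose.material.darkColors
import androidx.compose.runtime.Composable
import androidx.compose.ui.tooling.preview.Preview

@Composable
fun Greeting(name: String) {
    Text(text = "Hello, $name!")
}

@Preview(name = "Light", showBackground = true)
@Composable
fun GreetingPreviewLight() {
    MaterialTheme { Greeting("Android") }
}

@Preview(name = "Dark", showBackground = true,
         uiMode = Configuration.UI_MODE_NIGHT_YES)
@Composable
fun GreetingPreviewDark() {
    MaterialTheme(colors = darkColors()) { Greeting("Android") }
}
```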
Layout Inspector for Compose - You can now inspect layouts written in
Compose with Layout Inspector. Whether your app uses layouts
fully written in Compose or layouts that use a hybrid of Compose and Views,
the Layout Inspector helps you understand how your layouts are rendered on
your running device or emulator, obtain rich details (such as parameters and
modifiers passed to each composable), and debug issues that might arise. As
you interact with the app, you now also have the option to either enable
Live Updates to constantly stream data from your device, or reduce
performance impact on your device by disabling live updates and clicking the
Refresh action as needed.
Compose Layout Inspector
Deploy Preview to Device - Use this feature to deploy a snippet of
your UI to a device or emulator. This will help to test small parts of your
code on the device without having to start the full application. Your
preview will benefit from the same context (permissions, resources) as your
application. You can click the Deploy to device icon on the top of any
Compose preview or next to the @Preview annotation in the code editor gutter
and Android Studio will deploy that @Preview to your connected device or
emulator.
Using Deploy to device from preview and gutter icon
Live Edit of literals - Live Editing of literals allows developers
using Compose to quickly edit literals (strings, numbers, booleans) in their
code and see the results immediately without needing to wait for
compilation. The goal of the feature is to increase your productivity
by having code changes appear near instantaneously in the previews,
emulator, or physical device.
Editing numbers and strings update immediately in the preview and on device
Accessibility Scanner for Layout Editor - Android Studio now integrates with the Android Accessibility Test Framework to help you find accessibility issues in your layouts. When using the Layout Editor, click on the error report button to launch the panel. The tool will report accessibility related issues and also offers suggested fixes for some common problems (e.g. missing content descriptions, or low contrast)
Accessibility Test Framework Scanner in Layout Editor
Devices
Wear OS Pairing - We created a new Wear OS pairing assistant to
guide developers step by step through pairing Wear OS emulators with
physical or virtual phones directly in Android Studio! You can start by
going to the device dropdown > Wear OS emulator pairing assistant. Note that this will currently pair with the Wear OS 2 companion, and Wear OS 3 will be coming soon.
Learn more.
Wear OS emulator pairing assistant dialog
Phone + Watch emulators paired successful state
New Wear OS system images - a developer preview of the Wear OS 3 system image is now available so that you can use and play with the newest version of Wear OS!
Wear OS system image
Heart Rate Sensor for Wear OS Emulators - To help you test your Wear
OS apps, the Android Emulator now has support for the
Heart Rate Sensor API
when you run the Wear OS emulator.
Make sure you are running at least Android Emulator v30.4.5, downloaded via the Android Studio SDK Manager.
Heart Rate Sensor for Wear OS Emulators
Google TV Remote Control - On top of running the new Google TV UI, we now have an updated Remote control panel, which maps the new Google TV remote control features, such as user profiles and settings.
Google TV remote controls
New Google TV system images - We have updated the system images to reflect the new Google TV experience allowing you to freely explore the UI.
Google TV system image
Automotive OS Sensor Replay - You can now use the Android Automotive emulator to simulate driving scenarios, with the ability to replay car sensor data (e.g. speed, gear), completing your development and testing workflow.
Android Automotive OS Sensor replay
Developer Productivity
IntelliJ Platform Update - Android Studio Arctic Fox (2020.3.1) Beta
includes the IntelliJ 2020.3 platform release, which has many new
features such as Debugger interactive hints, new Welcome screen, and a ton
of new code editor enhancements to speed up your workflow.
Learn more.
Android 12 lint checks - We’ve added lint checks that are specific
to building your app for Android 12 so that you can get guidance in
context. To name a few -- we have built checks for custom declarations of
splash screens, coarse location permission for fine location usage, media
formats, and high sensor sampling rate permission.
Non-transitive R classes Refactoring - Using non-transitive R classes
with the Android Gradle Plugin can lead to faster builds for applications
with multiple modules. It prevents resource duplication by ensuring that
each module only contains references to its own resources, without pulling
references from dependencies. You can access this feature by going to
Refactor > Migrate to Non-transitive R Classes.
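If you prefer to opt in by hand rather than through the refactoring action, the same behavior can be enabled with a Gradle property (this is the property name used by recent Android Gradle Plugin versions; check your AGP release notes if it isn't recognized):

```properties
# gradle.properties — opt in to non-transitive R classes so each
# module's R class only contains that module's own resources
android.nonTransitiveRClass=true
```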
Apple Silicon Support Preview - For those using macOS on Apple
Silicon (arm64) hardware, Android Studio Arctic Fox provides preview support
for this new architecture. The arm64 platform support is still under
active development, but we wanted to provide an early release to get your
feedback. Since this is a preview release for the arm64 architecture, you
will have to separately download this version from the
Android Studio download archive
page and look for Mac (Apple Silicon).
Extended controls in the Emulator tool window - Developers now have access to all extended emulator controls when the
emulator is opened in a tool window. The extended controls will give
developers powerful tools for testing their apps such as navigation
playback, virtual sensors, and snapshots all within Android Studio. To
launch the Emulator within Android Studio go to Android Studio's
Preferences > Tools > Emulator and select “Launch in a tool window."
Extended controls in the Emulator tool window
Background Task Inspector - You can now utilize the Background
Task Inspector to visualize, monitor, and debug your app's background
workers when using
WorkManager library
2.5.0 or higher. You can access it by going to View > Tool Windows > App Inspection from the menu bar. When you deploy an app on a device running API level 26 or higher, you should see active workers in
the Background Task Inspector tab, as shown below.
Learn more.
Background Task Inspector
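The inspector depends on the WorkManager 2.5.0 artifact mentioned above; in a Gradle build, the dependency declaration typically looks like this (artifact coordinates are the standard AndroidX ones):

```groovy
// Module-level build.gradle — Background Task Inspector requires
// WorkManager 2.5.0 or higher.
dependencies {
    implementation "androidx.work:work-runtime-ktx:2.5.0"
}
```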
Parallel device testing with Test Matrix - Instrumentation tests can now be run across multiple devices in parallel and investigated using a new specialized instrumentation test results panel, called the Test Matrix, which streams the test results in real time.
Learn more
Test matrix running tests across multiple devices in parallel
Memory Profiler new recording UI - We have consolidated the Memory Profiler UI for different recording
activities, such as capturing a heap dump and recording Java, Kotlin,
and native memory allocations.
Memory Profiler: recorded Java / Kotlin Allocations
Updated system requirements - In order to ensure that we provide
the best experience for Android developers, we are updating the system
requirements when using Android Studio. These requirements also represent
the configurations we use to thoroughly test Android Studio to maintain
high quality and performance, and we plan to update them more frequently
going forward. So, while you’re still able to use systems that fall below
the requirements, we can’t guarantee compatibility or support when doing
so. You can see the
updated system requirements
on the official developer site.
To recap, Android Studio Arctic Fox (2020.3.1) Beta includes these new
enhancements & features:
Design
Compose Preview
Compose Layout Inspector
Deploy Preview to Device
Live Edit of literals
Accessibility Scanner in Layout Editor
Devices
Wear OS Pairing
Heart Rate Sensor
New Wear OS system images
Google TV Remote Control
Google TV system images
Automotive OS Sensor Replay
Productivity
IntelliJ 2020.3 platform update
Android 12 lint checks
Non-transitive R classes Refactoring
Apple Silicon Support Preview
Android Emulator Extended Controls
Background Task Inspector
Test matrix
Memory Profiler new recording UI
You might also have seen other new features at I/O which are not included in
the list above; they are included in Android Studio (2021.1.1) Bumblebee
Canary since these features were not quite ready for a beta channel
release:
Design
Interactive Compose preview
Compose Animation preview
Preview Configuration Picker
Animated vector drawable preview
Compose Blueprint Mode
Compose Constraints Preview for ConstraintLayout
Devices
Automotive OS USB Passthrough - Coming soon
Automotive OS Rotary Controls - Coming soon
Productivity
Kotlin Coroutines debugger
Device Manager
Gradle Instrumented Test Runner Integration in Android Studio
Gradle Managed Devices
Sessions at Google I/O 2021
With this exciting release, the Android Studio team also presented a series of
sessions about Android Studio. Watch the following videos to see the latest
features in action and to get tips & tricks on how to use Android Studio:
Android Studio Arctic Fox (2020.3.1) is a big release, and now is a good time
to
download
and check out the Beta release to incorporate the new features into your
workflow. The beta release is near stable release quality, but as with any
beta release, bugs may still exist, so if you do find an issue, let us know
so we can work to fix it. If you’re already using Android Studio, you can
check for updates on the Beta channel from the navigation menu
(Help > Check for Update [Windows/Linux], Android Studio > Check
for Updates [OS X]). When you update to beta, you will get access to the new version of Android
Studio and Android Emulator.
As always, we appreciate any feedback on things you like, and issues or
features you would like to see. If you find a bug or issue, please file an
issue. Follow us -- the Android Studio development team -- on
Twitter
and on
Medium.
Posted by Mickey Kataria, Director of Product Management
For over a decade, Google has been committed to automotive, with a vision of creating a safe and seamless connected experience in every car. Developers like all of you are a crucial part of helping people stay connected while on the go. We’re seeing strong momentum across our in-car experiences, Android Auto and Android Automotive OS, and today, we’re excited to share the latest updates and opportunities to reach users in the car.
Check out our I/O session: What's new with Android for Cars
Android Auto
Android Auto, which allows users to connect their phone to their car display, now has over 100 million compatible cars on the road and is supported by nearly every major car manufacturer. Porsche is our newest partner and they will begin shipping Android Auto on new cars, starting this summer with the Porsche 911.
We’ve been working closely with car manufacturers to build an even better Android Auto experience by enabling wireless projection in more vehicles, extending availability to more countries, and continuing to launch new features, like integration into the instrument cluster. To see some of the newest Android Auto technology in the BMW iX, check out the video below.
Android Auto projecting to the cluster display in a BMW iX.
Android Automotive OS
Our newest in-car experience, Android Automotive OS with Google apps and services built-in, also has strong momentum. With this experience, the entire infotainment system is powered by Android and users can access Google Assistant, Google Maps, and more apps from Google Play directly from the car screen without relying on a phone. Cars from Polestar and Volvo, like the Polestar 2 and the Volvo XC40 Recharge, are already available to customers. And by the end of 2021, this experience will be available to order in more than 10 car models from Volvo, General Motors and Renault. You can get a sneak peek of this customized experience in the new GMC HUMMER EV below.
The all-electric GMC HUMMER EV infotainment features Android Automotive OS with Google built-in. Preproduction model shown. Actual production models may vary. Initial availability Fall 2021.
Developing new apps for cars
To support this growing ecosystem, we recently made the Android for Cars App Library available as part of Jetpack. It allows developers of navigation, EV charging and parking apps to bring their apps to Android Auto compatible cars. Many of these developers have already published their Android Auto apps to the Play Store and we’re now extending this library to also support Android Automotive OS, making it easy for you to build once and generate apps that are compatible with both platforms. We’re already working with Early Access Partners — including Parkwhiz, Plugshare, Sygic, ChargePoint, Flitsmeister, SpotHero and others — to bring apps in these categories to cars powered by Android Automotive OS.
PlugShare, an app for finding EV chargers, has used the Android for Cars App Library and Google Assistant App Actions to build for Android Auto.
We plan to expand to more app categories in the future, so if you’re interested in joining our Early Access Program, please fill out this interest form. You can also get started with the Android for Cars App Library today, by visiting g.co/androidforcars. Lastly, you can always get help from the developer community at Stack Overflow using the android-automotive and android-auto tags. We can’t wait to see what you build next!
Posted by Ben Serridge, Director of Product Management - TV Platforms and Dan Aharon, Product Manager
Today at Google I/O 2021, we announced a significant milestone for our
team: we have over 80 million monthly active devices on Android TV OS, with
more than 80% growth in the US alone. We would not be here without the hard
work of the developer community, so a huge and heartfelt thank you to you
all.
Android TV OS is the operating system that powers a number of devices
around the world including the new Google TV experience launched last fall.
Google TV has generated a lot of excitement from consumers, developers, and
industry partners alike, offering a content forward TV experience that helps
the user discover more of the movies and shows they love. Google TV is
available on streaming devices like the Chromecast with Google TV, smart TVs from Sony (and soon TCL!), and as an app on Android devices. Check out this presentation
on how to get your app ready for Google TV.
Our goal is to always enable you to build better and more engaging
experiences on Android TV OS. One example of this is the widely utilized
Watch Next API which increases app re-engagement by ~30% in certain
cases¹. Well over 100 major media partners are already using the Watch Next API and
you can learn more about how to add your app here.
We are also announcing several new tools and helpful features to make
developing for Android TV OS easier and enable you to create engaging
experiences for your users. Some are already available and some will be
available soon:
Cast Connect with Stream Transfer and Stream Expansion: Cast Connect
allows users to cast from their phone/tablet or Chrome browser onto your
app on Android TV. Stream Transfer and Stream Expansion allow users to
transfer media to other devices and/or play audio on multiple
devices.
Emulator updates:
To help you make your app work better on Google TV without requiring new
hardware, we are now making our first Google TV Emulator available,
running on Android 11. There will also be an Android 11 image with the traditional Android TV
experience. You can now also use a remote that more closely mimics TV
remotes directly within the Emulator.
Firebase Test Lab:
Firebase Test Lab runs millions of tests every week on behalf of
developers. Following requests from developers, we are excited to share
that Firebase Test Lab is adding Android TV support. Firebase Test Lab
Virtual Devices run your app in the cloud on Android TV emulators and
allow you to scale your test across hundreds or thousands of virtual
devices. Physical Devices will be coming soon.
Android 12 Beta 1:
We are making the Android 12 Beta 1 available for TV on ADT-3 today. With
this release the developer community will be able to take advantage of
many of the changes and improvements coming with Android 12. We encourage
you to try it and provide us with feedback.
Thank you for your continued support of the Android TV OS platform. The
future of TV is bright and we can’t wait to see what you build
next!
¹ Average gain in number of days active in the app in a 28-day period
amongst app 28DAUs, based on 3 apps analyzed during the 11/2020 - 2/2021
period.
Posted by Chiko Shimizu, Partner Developer Advocate and Tamao Imura, Developer Marketing Manager
Mercari allows millions of people to shop and sell almost anything. The company was founded in 2013 in Japan, and it now is the largest smartphone-focused C2C marketplace in Japan. Mercari’s Client Architect Team started using Jetpack Compose in 2020 with the goal of using modern solutions and technologies that can scale for the long term to build their tech stack for new applications.
What they did
The Mercari team needed to implement a design system with complex state management and styling on Android Views — a very complex task. With Jetpack Compose, they were not only able to implement this complex system, but it also helped them spend less time developing each screen.
Jetpack Compose also helped the team write UI code for their new app utilizing the design system, making their UI code concise and easy to understand. As a result, the team can spend more time writing screens and business logic, such as practical support for the dark theme.
In addition, the Mercari team wrote a proof-of-concept tool for integrating Figma with the design system, which automatically generates UI code from the component designs. The team said that developing this tool was easier with Compose due to its declarative nature.
“Once Android developers get used to writing Jetpack Compose code, they wouldn’t wish to go back.” - Anthony Allan Conda, Android Tech Lead at Mercari
Results
Between Jetpack Compose and their new design system, Mercari was able to use far less code to write screens. On screens with infinitely-scrollable content — a common use case — they actually reduced their code by about 56%. As a result, they were able to write more screens in the same amount of time, giving them more time to write business logic and other parts of the code.
Also, they were able to do more with the UI itself, such as incorporating animations and using intuitive APIs such as AnimatedVisibility, Crossfade, and Animatable.
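For illustration, an AnimatedVisibility usage might look like this (a generic sketch, not Mercari's actual code; the composable name is hypothetical):

```kotlin
// Generic sketch: show or hide a label with a fade, driven by state.
import androidx.compose.animation.AnimatedVisibility
import androidx.compose.animation.fadeIn
import androidx.compose.animation.fadeOut
import androidx.compose.material.Text
import androidx.compose.runtime.Composable

@Composable
fun SoldBadge(visible: Boolean) {
    AnimatedVisibility(
        visible = visible,
        enter = fadeIn(),
        exit = fadeOut(),
    ) {
        Text("SOLD")
    }
}
```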
Mercari is planning to continue using Jetpack Compose in their new application until its release. Their design system, with the Android SDK written in Jetpack Compose, is also designed to work with multiple applications within Mercari.