
MAD Skills Kotlin and Jetpack: wrap-up

Posted by Florina Muntenescu, Developer Relations Engineer

Kotlin and Jetpack image

We just wrapped up another series of MAD Skills videos and articles - this time on Kotlin and Jetpack. We covered the different ways Kotlin makes Android code more expressive and concise, safer, and easier when running asynchronous code.

Check out the episodes below to level up your Kotlin and Jetpack knowledge! Each episode covers a specific set of APIs, discussing not only how to use the APIs but also how they work under the hood. All the episodes have accompanying blog posts, and most of them link to either a sample or a codelab to make it easier to follow and dig deeper into the content. We also had a live Q&A featuring Jetpack and Kotlin engineers.

Episode 1 - Using KTX libraries

In this episode we looked at how you can make your Android and Jetpack coding easy, pleasant and Kotlin-idiomatic with Jetpack KTX extensions. Currently, more than 20 libraries have a KTX version. This episode covers some of the most important ones: core-ktx that provides idiomatic Kotlin functionality for APIs coming from the Android platform, plus a few Jetpack KTX libraries that allow us to have a better user experience when working with APIs like LiveData and ViewModel.
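As a small taste of what the episode covers, here is a minimal sketch (assuming the androidx.core:core-ktx dependency is on the classpath) of the same SharedPreferences write with and without the core-ktx `edit` extension:

```kotlin
import android.content.SharedPreferences
import androidx.core.content.edit

fun persistToken(prefs: SharedPreferences, token: String) {
    // Without KTX: obtain an editor, mutate it, and remember to call apply().
    val editor = prefs.edit()
    editor.putString("token", token)
    editor.apply()

    // With core-ktx: the edit { } extension manages the editor and apply() for you.
    prefs.edit {
        putString("token", token)
    }
}
```

The two calls do the same work; the KTX version simply reads as idiomatic Kotlin.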

Check out the video or the article:

Episode 2 - Simplifying APIs with coroutines and Flow

Episode 2 covers how to simplify APIs using coroutines and Flow, as well as how to build your own adapter using the suspendCancellableCoroutine and callbackFlow APIs. To get hands-on with this topic, check out the Building a Kotlin extensions library codelab.
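To illustrate the callbackFlow pattern the episode describes, here is a minimal sketch (the LocationEngine and LocationListener interfaces are hypothetical stand-ins for any callback-based API; kotlinx-coroutines is assumed on the classpath):

```kotlin
import kotlinx.coroutines.channels.awaitClose
import kotlinx.coroutines.flow.Flow
import kotlinx.coroutines.flow.callbackFlow

// Hypothetical callback-based API, used only for illustration.
interface LocationListener { fun onLocation(lat: Double, lng: Double) }
interface LocationEngine {
    fun register(listener: LocationListener)
    fun unregister(listener: LocationListener)
}

// Adapter: expose the callback-based API as a cold Flow of coordinates.
fun LocationEngine.locations(): Flow<Pair<Double, Double>> = callbackFlow {
    val listener = object : LocationListener {
        override fun onLocation(lat: Double, lng: Double) {
            trySend(lat to lng) // forward each callback into the Flow
        }
    }
    register(listener)
    // Runs when the collector cancels: clean up the underlying registration.
    awaitClose { unregister(listener) }
}
```

The awaitClose block is the key piece: it ties the lifetime of the callback registration to the lifetime of the Flow collection.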

Watch the video or read the article:

Episode 3 - Using and testing Room Kotlin APIs

This episode opens the door to Room, peeking in to see how to create Room tables and databases in Kotlin, how to implement one-shot suspend operations like insert, and how to write observable queries using Flow. When using coroutines and Flow, Room moves all the database operations onto the background thread for you. Check out the video or blog post to find out how to implement and test Room queries. For more hands-on work, check out the Room with a View codelab.
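A minimal sketch of the two DAO shapes mentioned above, a one-shot suspend insert and an observable Flow query (the Song entity and SongDao names are hypothetical; the androidx.room dependencies are assumed):

```kotlin
import androidx.room.Dao
import androidx.room.Entity
import androidx.room.Insert
import androidx.room.PrimaryKey
import androidx.room.Query
import kotlinx.coroutines.flow.Flow

// Hypothetical table, used only for illustration.
@Entity(tableName = "songs")
data class Song(@PrimaryKey val id: Long, val title: String)

@Dao
interface SongDao {
    // One-shot suspend operation: Room runs it off the main thread for you.
    @Insert
    suspend fun insert(song: Song)

    // Observable query: emits a fresh list whenever the table changes.
    @Query("SELECT * FROM songs")
    fun songs(): Flow<List<Song>>
}
```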

Episode 4 - Using WorkManager Kotlin APIs

Episode 4 makes your job easier with WorkManager, for scheduling asynchronous tasks for immediate or deferred execution that are expected to run even if the app is closed or the device restarts. In this episode we go over the basics of WorkManager and look a bit more in depth at the Kotlin APIs, like CoroutineWorker.
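A minimal sketch of a CoroutineWorker, the Kotlin API highlighted in the episode: doWork() is a suspend function, so you can call other suspending APIs directly (UploadWorker and its uploadLogs() helper are hypothetical; the androidx.work dependency is assumed):

```kotlin
import android.content.Context
import androidx.work.CoroutineWorker
import androidx.work.WorkerParameters

class UploadWorker(context: Context, params: WorkerParameters) :
    CoroutineWorker(context, params) {

    override suspend fun doWork(): Result = try {
        uploadLogs()      // hypothetical suspending work
        Result.success()
    } catch (t: Throwable) {
        Result.retry()    // ask WorkManager to reschedule the task
    }

    private suspend fun uploadLogs() { /* ... */ }
}
```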

Find the video here and the article here, but nothing compares to practical experience so go through the WorkManager codelab.

Episode 5 - Community tip

Episode 5 is by Magda Miu - a Google Developer Expert on Android who shared her experience of leveraging foundational Kotlin APIs with CameraX. Check it out here:

Episode 6 - Live Q&A

In the final episode we launched into a live Q&A, hosted by Chet Haase, with guests Yigit Boyar - Architecture Components tech lead, David Winer - Kotlin product manager, and developer relations engineers Manuel Vivo and myself. We answered questions from you on YouTube, Twitter and elsewhere.

The best of Google, now in new devices

Wherever you are and whatever you're doing, technology should work for you. This week during a virtual CES and Galaxy Unpacked, we were introduced to a lineup of new products that do exactly that, all with the best of Google built in.


Whether you're heading out or staying in, there's something new for you to get excited about.


When you’re on the go

The new Galaxy S21 series comes with a more cohesive Android experience and updates that make it easier to stay in touch with friends and family. From your phone, you can now mirror Google Duo to your Samsung TV, so video calls feel a little more like the real thing. With the Messages app, you can use Rich Communication Services (RCS) to chat over Wi-Fi, know when messages are read, share reactions as well as high resolution videos, and enjoy a more dynamic communication experience with features such as Smart Actions and spam protection (varies by carrier and market availability). A single swipe from your home screen will give you the option for personalized content with Discover, while our screen reader TalkBack has been revamped so that people with blindness or trouble seeing their displays can use spoken feedback and gestures to navigate their phone without having to look at the screen (varies by carrier and market availability).


We’re also working with Samsung to make it easier to manage smart home products from your device. You can control Nest devices, like Nest thermostats, cameras and doorbells, from the SmartThings app on Galaxy smartphones and tablets. See all your connected devices on one screen by tapping on "Devices" in the Quick Panel of the Galaxy S21. Starting next week, SmartThings will also be available in Android Auto, so you can do things like turn off your kitchen lights from your car’s display as you pull out of the driveway.


For parents who want a productive tablet that can easily be shared with their kids, the new Lenovo Tab P11 comes with Kids Space, our new kids mode that features recommended apps, books and videos to help kids under 9 learn and have fun. 


And to get help from your wrist, new Wear OS by Google smartwatches keep you connected wherever you are. For Android phone users in the U.S., you can send texts and make calls on Fossil's Gen 5 LTE Touchscreen Smartwatch without your phone. And Michael Kors Access Gen 5E MKGO and Gen 5E Darci smartwatches are a fashionable option for keeping track of your health and wellness, staying in touch with friends and family and even making payments.


You’ll even be able to leave your phone in your pocket when you’re outside. New headphones, including the JBL Tour ONE and Tour Pro+ and Kenwood WS-A1G come with help from Google. Simply press the earbud to send a message, access your calendar or change songs.


Of course, for those times when you’re perfectly happy...


Hanging out at home

Google TV will be available on 2021 smart TVs from Sony and TCL. Google TV is a new entertainment experience that brings together movies, shows, live TV and more from across your apps and subscriptions and organizes them just for you. You can ask “Hey Google, find action movies” or “show me sci-fi adventure TV shows” and browse a wide selection of content with your voice. In fact, your voice can be used in all sorts of ways: Your LG TV (from 2019 models onward) can now be controlled by a Google-enabled smart speaker in 15 countries and six languages. You will also soon be able to control your Verizon Fios set top box by voice when connected with a Google-enabled smart speaker or Smart Display. And to do more on your TV, you can stay connected with your loved ones with one-on-one or group video calls with the Duo app on Samsung TV (an optional USB camera is needed).


New connected lights from LIFX, Nanoleaf and Yeelight now work with Hey Google. These new devices support Seamless setup, which makes it possible to connect compatible smart home devices directly through the Google Home app and a Nest speaker or Smart Display without the need for an additional hub or bridge. 


If you’re looking for an assist with cleaning up around the house, you can just say “Hey Google” to control MEDION’s smart vacuum cleaner MD 19601. To help more manufacturers bring voice capabilities like this to their smart home devices, we also recently launched the Authorized Solution Provider program. Our certified partners Tuya and CoolKit can now help manufacturers build smart home Actions for Google Assistant.


Whatever new device you pick out, Google will be there to help you get things done and get the most out of your tech.



Treble Plus One Equals Four

Posted by Iliyan Malchev (Project Treble Architect), Amith Dsouza (Technical Account Manager), and Veerendra Bhora (Strategic Partnerships Manager)

Illustration of phone with settings logo in the screen

Extending Android updates on Qualcomm’s Mobile Platforms

In the past few years, the latest Android OS has been adopted earlier by OEMs and deployed in larger numbers to our users. The growth in adoption has been driven by OEMs delivering faster OS updates, taking advantage of the architecture introduced by Project Treble.

At the time Android 11 launched there were 667M active users on Android 10, 82% of whom got their Android 10 build via an over the air (OTA) update. Despite the events throughout 2020, there is a continued momentum among our partners to either launch their devices on Android 11 or offer Android 11 OTAs on their devices earlier.

Line graph comparing Android Pie, Android 10, and Android 11

Our efforts until now have been focused on making OS updates easier and faster to deploy. The other side of this coin is supporting updates for a longer period of time, and today we’d like to provide an overview of the changes we are making to help our partners achieve this.

Project Treble was an ambitious re-architecture of Android that created a split between the OS framework and device-specific low-level software (called the vendor implementation) through a well-defined, stable vendor interface. As a part of this split, the Android OS framework guarantees backward compatibility with the vendor implementation, which is checked through a standardized compliance test suite - VTS. With each Android release, Project Treble publishes Generic System Images (GSIs) that are built from AOSP sources, and are guaranteed to be backwards-compatible with the previous 3 versions of vendor implementations, in addition of course to the current release—for a total span of four years. Devices launching with the new Android release must have vendor implementations compatible with that GSI. This is the primary vehicle for reducing fragmentation within the OS framework. While we allow and encourage our partners to modify the framework itself, the modifications post-Treble must be done in a way that reduces upgrade costs from one version to the next.

Besides the reuse of a vendor implementation across OS updates, the Treble architecture also facilitates the re-use of the same OS framework code across different vendor implementations.

Chart comparing Original OS framework to Updated OS framework

Another important change introduced by Project Treble is that new vendor-impacting requirements for Android devices are never retroactive. They apply only to devices launching on that Android version and not to devices upgrading from an older version. The term vendor-impacting here refers to requirements for new HALs, or for the shipping of a newer Linux kernel, to the device's vendor implementation. A good example might be a new revision of the camera HAL to support multiple rear camera sensors. Since the Android framework guarantees compatibility with the older HALs, we enable older vendor implementations to be reused by OEMs for upgrades without the considerable cost of updating them with new requirements.

This principle, combined with the backwards-compatibility guarantee, gives device manufacturers (OEMs) the flexibility to support upgrades both faster (since they have to upgrade just the framework, which would cover all of their devices, including those with older versions of the vendor implementation), as well as at a lower cost (since they do not have to touch the older vendor implementations).

However, seen from a System-on-Chip (SoC) manufacturer’s perspective, this design introduces additional complexity. For each SoC model, SoC manufacturers now need to create multiple combinations of vendor implementations to support OEMs who use that chipset to launch new devices and deploy OS upgrades on previously launched devices.

The result is that three years beyond the launch of a chipset, the SoC vendor would have to support up to 6 combinations of OS framework software and vendor implementations. The engineering costs associated with this support limited the duration for which SoC vendors offered Android OS software support on a chipset. For every single chipset, the software support timeline would look like this:

Timeline of OS framework

Considering that SoC providers have dozens of SoC models at any point of time, the full picture looks closer to this:

More accurate support timeline

The crux of the problem was that, while device requirements were never retroactive, the requirements for SoCs were. For example, on Android Pie, SoCs had to support two versions of the Camera HAL API on a chipset if it was used both for new device launches and for upgrades.

From this perspective, the solution was simple: we had to extend the no-retroactivity principle to the SoCs as well as to devices. With this change, the SoC provider would be able to support Android with the same vendor implementations on their SoCs for device launches as well as upgrades.

During the past year, we have been working hard to implement this solution. Building on our deep collaboration with our colleagues at Qualcomm, today we’re announcing the results of this work. Going forward, all new Qualcomm mobile platforms that take advantage of the no-retroactivity principle for SoCs will support 4 Android OS versions and 4 years of security updates. All Qualcomm customers will be able to take advantage of this stability to further lower both the costs of upgrades as well as launches and can now support their devices for longer periods of time.

Going one step further, we’re also reusing the same OS framework software across multiple Qualcomm chipsets. This dramatically lowers the number of OS framework and vendor implementation combinations that Qualcomm has to support across their mobile platforms and results in lowered engineering, development, and deployment costs. The diagram below indicates how significant the simplification is. From a software-support perspective, it's an altogether different situation:

Framework timeline with simplification

This change is taking effect with all SoCs launching with Android 11 and later. By working closely with Qualcomm to offer an extended period of OS and security updates, we are looking forward to delivering the best of Android to our users faster, and with greater security for an extended period of time.

Opening the Google Play Store for more car apps

Posted by Eric Bahna, Product Manager

In October, we published the Android for Cars App Library to beta so you could start bringing your navigation, parking, and charging apps to Android Auto. Thanks for sending your feedback through our issue tracker; it has shown us where to improve and clarify things. Now we’re ready to take the next step in delivering great in-car experiences.

Today, you can publish your apps to closed testing tracks in the Google Play Store. This is a great way to get feedback on how well your app meets the app quality guidelines, plus get your in-car experience in front of your first Android Auto users.

 Image of T map
Image of PlugShare
 Image of 2GIS

Three of our early access partners: T map, PlugShare, and 2GIS

We’re preparing the Play Store for open testing tracks soon. You can get your app ready today by publishing to closed testing. We’re eager to see what you’ve built!

What’s your MAD score?

Posted by Christopher Katsaros, your #MADscore tabulator

We’ve been talking to you a lot recently about modern Android development (MAD), through the MAD Skills series. Now it’s time to see: what’s your MAD score? From how many Jetpack libraries you’re using to what percent of your app is coded in Kotlin, today we’re launching a MAD scorecard that shows just how modern an Android developer you are.

Your MAD scorecard uses Android Studio to tell you interesting things like how much size savings your app is seeing through the Android App Bundle. It spotlights each of the key MAD technologies, including specific Jetpack libraries and Kotlin features you could be using. You’ll even get a special MAD character based on your MADest skill (who knows, you just might be a MAD scientist…).

Here’s how to get your scorecard

You can get a personalized look at your MAD score through a new Android Studio plugin. Here’s how to get and share your scorecard:

  • Step 1 - Install the plugin: In Android Studio’s plugin marketplace, find and download the MAD Scorecard plugin. Installation is quick and easy.
  • Step 2 - Run the plugin: You can always find the MAD Scorecard plugin under Analyze in the main Studio menu. Click Analyze, then Run, to start creating your very own scorecard.
  • Step 3 - View and share your scorecard: When the plugin has finished running, Studio will show you a notification with a personal link where you can view all the details of your scorecard. Enjoy your results and share them with others!

Level up with the MAD Skills series

Once you’re done with your scorecard, check out the episodes in MAD Skills, a series of videos and articles we’re creating to teach you how to use the latest technologies of Modern Android Development to create better applications more easily. Arranged as a series of three-week topics, from Navigation to Kotlin to Android Studio, each topic will conclude with a Q&A where we’ll answer your questions. You can check out some of our earlier topics, like Material Design Components, App Bundles, and Navigation, and tune into Android Developers on YouTube for future topics.

See your MAD scorecard and share it with all of your friends, here!

MAD Skills Material Design Components: Wrap-Up

Posted by Nick Rout

wrap up header image

It’s a wrap_content!

The third topic in the MAD Skills series of videos and articles on Modern Android Development is complete. This time around we covered Material Design Components (a.k.a. MDC). This library provides the Material Components as Android widgets and makes it easy to implement design patterns seen on material.io, such as Material Theming, Dark Theme, and Motion.

Check out the episodes and links below to see what we covered. We designed these videos to closely follow our recent series of MDC articles as well as existing sample apps and codelabs, so you’ve got a variety of ways to engage with the content. We also had a Q&A episode featuring engineers from the MDC team!

Episode 1: Why use MDC?

The first episode by Nick Butcher is an overview video of this entire MAD Skills series, including why we recommend MDC, then deep-dives on Material Theming, Dark Theme and Motion. It also covers MDC interop with Jetpack Compose and updates to Android Studio templates that include MDC and themes/styles best practices.

Or in article form:

https://medium.com/androiddevelopers/we-recommend-material-design-components-81e6d165c2dd

Episode 2: Material Theming

Episode 2 by Nick Rout covers Material Theming and goes through a tutorial on how to implement it on Android using MDC. Key topics include setting up a `Theme.MaterialComponents.*` app theme, choosing color, type, and shape attributes (using tools on material.io), and finally adding them to your theme to see how widgets automatically react and adapt their UI. Also covered are handy utility classes that MDC provides for certain scenarios, like resolving theme color attributes and applying shape to images.
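As a small sketch of one of those utility classes, here is how a theme color attribute can be resolved at runtime with MDC’s MaterialColors helper, instead of hard-coding a color (assumes the com.google.android.material library is on the classpath):

```kotlin
import android.view.View
import com.google.android.material.color.MaterialColors

// Resolve the colorPrimary attribute from the theme of the view's context.
// Because the value comes from the theme, it adapts automatically if the
// app theme (or dark mode) changes.
fun resolvePrimaryColor(view: View): Int =
    MaterialColors.getColor(view, com.google.android.material.R.attr.colorPrimary)
```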

Or in article form:

https://medium.com/androiddevelopers/material-theming-with-mdc-color-860dbba8ce2f

https://medium.com/androiddevelopers/material-theming-with-mdc-type-8c2013430247

https://medium.com/androiddevelopers/material-theming-with-mdc-shape-126c4e5cd7b4

Episode 3: Dark Theme

This episode by Chris Banes gets really dark… It takes you through implementing a dark theme for an Android app using MDC. Topics covered include using “force dark” for quick conversion (and how to exclude views from this), manually crafting a dark theme with deliberate design choices, `.DayNight` MDC app themes and `.PrimarySurface` MDC widget styles, and how to handle the system UI.
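With a `Theme.MaterialComponents.DayNight.*` app theme declared in your theme resources, switching between light and dark at runtime is a one-liner, as this minimal sketch shows (assumes the androidx.appcompat dependency):

```kotlin
import androidx.appcompat.app.AppCompatDelegate

// Toggle dark mode for the whole app; DayNight themes pick up the change
// and restyle activities accordingly.
fun setNightMode(enabled: Boolean) {
    AppCompatDelegate.setDefaultNightMode(
        if (enabled) AppCompatDelegate.MODE_NIGHT_YES
        else AppCompatDelegate.MODE_NIGHT_FOLLOW_SYSTEM
    )
}
```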

Or in article form:

https://medium.com/androiddevelopers/dark-theme-with-mdc-4c6fc357d956

Episode 4: Material Motion

Episode 4 by Nick Rout is all about Material’s motion system. It closely follows the steps in the existing “Building Beautiful Transitions with Material Motion for Android” codelab. It uses the Reply sample app to demonstrate how you can use transition patterns (container transform, shared axis, fade through, and fade) for a smoother, more understandable user experience. It goes through scenarios involving Fragments (including the Navigation component), Activities, and Views, and will feel familiar if you’ve used the AndroidX and platform transition frameworks before.
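As a minimal sketch of one of those patterns, here is a shared Z-axis transition applied to a fragment (DetailFragment is a hypothetical name; the com.google.android.material dependency is assumed):

```kotlin
import android.os.Bundle
import androidx.fragment.app.Fragment
import com.google.android.material.transition.MaterialSharedAxis

class DetailFragment : Fragment() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // Slide and fade along the Z axis when entering, reverse when returning.
        enterTransition = MaterialSharedAxis(MaterialSharedAxis.Z, /* forward = */ true)
        returnTransition = MaterialSharedAxis(MaterialSharedAxis.Z, /* forward = */ false)
    }
}
```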

Or in article form:

https://medium.com/androiddevelopers/material-motion-with-mdc-c1f09bb90bf9

Episode 5: Community tip

Episode 5 is by a member of the Android community—Google Developer Expert (GDE) for Android Zarah Dominguez—who takes us through using the MDC catalog app as a reference for widget functionality and API examples. She also explains how it’s been beneficial to build a ‘Theme Showcase’ page in the app she works on, to ensure a cohesive design language across different screens and flows.

Episode 6: Live Q&A

To wrap things up, Chet Haase hosted us for a Q&A session along with members of the MDC engineering team, Dan Nizri and Connie Shi. We answered questions asked by you on YouTube Live, Twitter, and elsewhere. We explored the origins of MDC, how it relates to AppCompat, and how it’s evolved over the years. Other topics included best practices for organizing your themes and resources, using different fonts and typography styles, and shape theming… a lot of shape theming. We also revealed all of our favorite Material components! Lastly, we looked to the future with new components coming out in MDC and Jetpack Compose, Android’s next-generation UI toolkit, which has Material Design built in by default.

Sample apps

During the series we used two different sample applications to demonstrate MDC:

  • “Build a Material Theme” (a.k.a MaterialThemeBuilder) is an interactive project that lets you create your own Material theme by customizing values for color, typography, and shape
  • Reply is one of the Material studies: an email app that uses Material Design components and Material Theming to create an on-brand communication experience

These can both be found alongside another Material study sample app — Owl — in the MDC examples GitHub repository.

https://github.com/material-components/material-components-android-examples

Improving urban GPS accuracy for your app

Posted by Frank van Diggelen, Principal Engineer and Jennifer Wang, Product Manager

At Android, we want to make it as easy as possible for developers to create the most helpful apps for their users. That’s why we aim to provide the best location experience with our APIs like the Fused Location Provider API (FLP). However, we’ve heard from many of you that the biggest location issue is inaccuracy in dense urban areas, such as wrong-side-of-the-street and even wrong-city-block errors.

This is particularly critical for the most-used location apps, such as rideshare and navigation. For instance, when users request a rideshare vehicle in a city, apps cannot easily locate them because of GPS errors.

The last great unsolved GPS problem

This wrong-side-of-the-street position error is caused by reflected GPS signals in cities, and we embarked on an ambitious project to help solve this great problem in GPS. Our solution uses 3D mapping aided corrections, and is only feasible at Google’s scale because it combines 3D building models, raw GPS measurements, and machine learning.

The December Pixel Feature Drop adds 3D mapping aided GPS corrections to Pixel 5 and Pixel 4a (5G). With a system API that provides feedback to the Qualcomm® Snapdragon™ 5G Mobile Platform that powers Pixel, the accuracy in cities (or “urban canyons”) improves spectacularly.

Picture of a pedestrian test with a Pixel 5 phone, walking along one side of the street, then the other. Yellow = path followed, red = without 3D mapping aided corrections, blue = with 3D mapping aided corrections. Without the corrections, the GPS results frequently wander to the wrong side of the street (or even the wrong city block); with them, the position is many times more accurate.

Why hasn’t this been solved before?

The problem is that GPS consistently locates you in the wrong place when you are in a city. This is because all GPS systems are based on line-of-sight operation from satellites. But in big cities, most or all signals reach you through non-line-of-sight reflections, because the direct signals are blocked by the buildings.

Diagram of the 3D mapping aided corrections module in Google Play services, with corrections feeding into the FLP API.   3D mapping aided corrections are also fed into the GNSS chip and software, which in turn provides GNSS measurements, position, and velocity back to the module.

The GPS chip assumes that the signal is line-of-sight and therefore introduces error when it calculates the excess path length that the signals traveled. The most common side effect is that your position appears on the wrong side of the street, although your position can also appear on the wrong city block, especially in very large cities with many skyscrapers.

There have been attempts to address this problem for more than a decade. But no solution existed at scale, until 3D mapping aided corrections were launched on Android.

How 3D mapping aided corrections work

The 3D mapping aided corrections module, in Google Play services, includes tiles of 3D building models that Google has for more than 3850 cities around the world. Google Play services 3D mapping aided corrections currently supports pedestrian use-cases only. When you use your device’s GPS while walking, Android’s Activity Recognition API will recognize that you are a pedestrian, and if you are in one of the 3850+ cities, tiles with 3D models will be downloaded and cached on the phone for that city. Cache size is approximately 20MB, which is about the same size as 6 photographs.

Inside the module, the 3D mapping aided corrections algorithms solve the chicken-and-egg problem, which is: if the GPS position is not in the right place, then how do you know which buildings are blocking or reflecting the signals? Having solved this problem, 3D mapping aided corrections provide a set of corrected positions to the FLP. A system API then provides this information to the GPS chip to help the chip improve the accuracy of the next GPS fix.

With this December Pixel feature drop, we are releasing version 2 of 3D mapping aided corrections on Pixel 5 and Pixel 4a (5G). This reduces wrong-side-of-street occurrences by approximately 75%. Other Android phones, using Android 8 or later, have version 1 implemented in the FLP, which reduces wrong-side-of-street occurrences by approximately 50%. Version 2 will be available to the entire Android ecosystem (Android 8 or later) in early 2021.

Android’s 3D mapping aided corrections work with signals from the USA’s Global Positioning System (GPS) as well as other Global Navigation Satellite Systems (GNSSs): GLONASS, Galileo, BeiDou, and QZSS.

Our GPS chip partners shared the importance of this work for their technologies:

“Consumers rely on the accuracy of the positioning and navigation capabilities of their mobile phones. Location technology is at the heart of ensuring you find your favorite restaurant and you get your rideshare service in a timely manner. Qualcomm Technologies is leading the charge to improve consumer experiences with its newest Qualcomm® Location Suite technology featuring integration with Google's 3D mapping aided corrections. This collaboration with Google is an important milestone toward sidewalk-level location accuracy,” said Francesco Grilli, vice president of product management at Qualcomm Technologies, Inc.

“Broadcom has integrated Google's 3D mapping aided corrections into the navigation engine of the BCM47765 dual-frequency GNSS chip. The combination of dual frequency L1 and L5 signals plus 3D mapping aided corrections provides unprecedented accuracy in urban canyons. L5 plus Google’s corrections are a game-changer for GNSS use in cities,” said Charles Abraham, Senior Director of Engineering, Broadcom Inc.

“Google's 3D mapping aided corrections is a major advancement in personal location accuracy for smartphone users when walking in urban environments. MediaTek’s Dimensity 5G family enables 3D mapping aided corrections in addition to its highly accurate dual-band GNSS and industry-leading dead reckoning performance to give the most accurate global positioning ever for 5G smartphone users,” said Dr. Yenchi Lee, Deputy General Manager of MediaTek’s Wireless Communications Business Unit.

How to access 3D mapping aided corrections

Android’s 3D mapping aided corrections work automatically when GPS is being used by a pedestrian in any of the 3850+ cities, on any phone that runs Android 8 or later. The best way for developers to take advantage of the improvement is to use the FLP to get location information. The additional 3D mapping aided corrections in the GPS chip are available on Pixel 5 and Pixel 4a (5G) today, and will roll out to the rest of the Android ecosystem (Android 8 or later) in the next several weeks. We will also soon support more modes, including driving.

Android’s 3D mapping aided corrections cover more than 3850 cities, including:

  • North America: All major cities in USA, Canada, Mexico.
  • Europe: All major cities (100% coverage, except Russia and Ukraine).
  • Asia: All major cities in Japan and Taiwan.
  • Rest of the world: All major cities in Brazil, Argentina, Australia, New Zealand, and South Africa.

As our Google Earth 3D models expand, so will 3D mapping aided corrections coverage.

Google Maps is also getting updates that will provide more street-level detail for pedestrians in select cities, such as sidewalks, crosswalks, and pedestrian islands. In 2021, you can get these updates for your app using the Google Maps Platform. Along with the improved location accuracy from 3D mapping aided corrections, we hope we can help developers like you better support use cases for the world’s two billion pedestrians who use Android.

Continuously making location better

In addition to 3D mapping aided corrections, we continue to work hard to make location as accurate and useful as possible. Below are the latest improvements to the Fused Location Provider API (FLP):

  • Developers wanted an easier way to retrieve the current location. With the new getCurrentLocation() API, developers can get the current location in a single request, rather than having to subscribe to ongoing location changes. By allowing developers to request location only when needed (and automatically timing out and closing open location requests), this new API also improves battery life. Check out our latest Kotlin sample.
  • Android 11's Data Access Auditing API provides more transparency into how your app and its dependencies access private data (like location) from users. With the new support for the API's attribution tags in the FusedLocationProviderClient, developers can more easily audit their apps’ location subscriptions in addition to regular location requests. Check out this Kotlin sample to learn more.
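A minimal sketch of the getCurrentLocation() call described above (assumes the play-services-location dependency and that the app already holds a location permission; error handling is omitted):

```kotlin
import android.annotation.SuppressLint
import com.google.android.gms.location.FusedLocationProviderClient
import com.google.android.gms.location.LocationRequest
import com.google.android.gms.tasks.CancellationTokenSource

// One-shot location request: no ongoing subscription to manage, and the
// cancellation token lets the request be abandoned cleanly.
@SuppressLint("MissingPermission")
fun fetchCurrentLocation(client: FusedLocationProviderClient) {
    val cancellationSource = CancellationTokenSource()
    client.getCurrentLocation(LocationRequest.PRIORITY_HIGH_ACCURACY, cancellationSource.token)
        .addOnSuccessListener { location ->
            // location may be null if no fix could be obtained in time
            location?.let { println("lat=${it.latitude}, lng=${it.longitude}") }
        }
}
```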



Qualcomm and Snapdragon are trademarks or registered trademarks of Qualcomm Incorporated.

Qualcomm Snapdragon and Qualcomm Location Suite are products of Qualcomm Technologies, Inc. and/or its subsidiaries.

6 new ways Android can help this holiday season


With the holidays around the corner, we’re sharing six new Google features for Android—a few more ways your phone gets more helpful over time, even outside of major OS updates. Whether you’re texting holiday greetings to loved ones or winding down with a book, here’s how Android can help.


1. Mix up more of your favorite emoji

Emoji Kitchen gets new stickers and mixing experiences

Click on the image above to see a video of the latest mixing experience coming to Emoji Kitchen 

With Emoji Kitchen on Gboard, people have mixed their favorite emoji into customized stickers over 3 billion times since it was released earlier this year. With this latest update, Emoji Kitchen is going from hundreds of unique design combinations to over 14,000. Each mix makes it easier for you to express yourself with a little extra flair. Now you can simply tap two emoji to quickly see suggested combinations, or double tap on one emoji to reveal some more intense emotions. 

Already available on Gboard beta, the new version of Emoji Kitchen will be available on Android 6.0 and above over the coming weeks. Download Gboard on Google Play to enjoy the new Emoji Kitchen stickers this holiday season. ❄️️⛄️

2. Enjoy more stories as audiobooks

Auto-narrated audiobooks give voices to more ebooks

Click on the image above to see a video of how Google Play is bringing more audiobooks to Android

The holiday season is the perfect time to wind down and catch up on some books, and audiobooks make it even more convenient to immerse yourself in a story. But not all books, like the one written by your favorite indie author, are converted into audiobooks. Now Google Play, working with publishers in the U.S. and the UK, will use auto-generated narrators so books without audio versions can be narrated, meaning you'll have more audio titles to choose from in the Play Store. The publisher tool to create auto-narrated audiobooks is currently in beta, but it will roll out to all publishers in early 2021.

3. Use Voice Access to navigate your device 

Easily use and navigate your phone by speaking out loud with Voice Access

Click on the image above to see a video showing how Voice Access can help you navigate your smartphone

Built with people with motor disabilities in mind, Voice Access lets you control your phone using your voice. Now, using machine learning, Voice Access can add labels to the screens of your Android apps to help you work within them with your voice. For example, you can say "open Photos," "tap Search," or "tap Your Map" to see a map of all your photos. This makes navigation convenient and hands-free. Previously rolled out on Android 11, this new version of Voice Access is now available globally on all devices running Android 6.0 and above. You can download Voice Access on Google Play and try the new version out by joining the Beta today.

4. Get around with the Go Tab

New Go Tab in Google Maps is available today

Click on the image above to see the video of the new Go Tab in Google Maps

With the new Go Tab in Google Maps, you can more easily navigate to frequently-visited places with just one tap. Pin your favorite driving destinations like school or a grocery store to quickly see directions, live traffic trends, disruptions on your route, and an accurate ETA—all without typing the place’s address. If you take public transit, you can pin specific routes, which will let you see accurate departure and arrival times, alerts from your local transit agency, and an up-to-date ETA right from the Go Tab. You can even pin multiple routes (including a driving route and a transit route) to the same destination to see which one will get you there most efficiently. The Go Tab starts rolling out on Android and iOS in the coming weeks. 


Be sure to check out other helpful Google Maps features on Android, like live transit crowdedness and Assistant driving mode to help you navigate and get things done this holiday season.


5. Android Auto expands to more countries

Android Auto is rolling out to more countries


Over the next few months, Android Auto will be expanding to new countries, bringing your favorite apps and services from your phone onto your car display. With Android Auto, you can talk to Google to play music, send messages, get directions, and more, so you can keep your eyes on the road and your hands on the wheel. With phones running Android 10 and above, all you need to do to get started is plug your Android phone into a compatible car. For Android 9 and earlier phones, you can download the app.


6. Share your favorite apps with Nearby Share

Send and receive apps without cell or wifi connection

An upcoming update to Nearby Share will let you share apps from Google Play with people around you who have an Android phone, even if you don't have a cell or Wi-Fi connection. Simply open Google Play, go to the "Share Apps" menu in "My Apps & Games," select the apps you want to share, and let your friend accept the incoming apps. This update will roll out in the coming weeks.

Use Voice Access to control your Android device with your voice

In 2018, we launched Voice Access, an Android app that lets you control your phone using your voice. The ability to use your phone hands-free has been helpful to people with disabilities, and also those without.

Today, on International Day of Persons with Disabilities, we’re rolling out an updated version of Voice Access, available in Beta, that is easier to use and available to more people. This version of Voice Access, which was previously available on Android 11, is now available globally to devices running Android 6.0 and above. 

Thanks to machine learning and a refreshed interface, it's easier to use your voice to control your phone. Previously, Voice Access would draw numbers over your phone screen so you could say commands like "tap 1," "scroll down on 5," and so on. With the new version, you can ask for labels instead of numbers: say "show labels" and use them as voice commands, which makes them easier to remember and use again later.

This update also adds new commands to help you get things done faster in your favorite apps. Instead of saying “tap search” and then “type kittens,” you can simply say "search for kittens" inside YouTube, Photos and many other apps where you’re looking for a kitten fix. 

When you first install or upgrade to the new version, you can choose to have Voice Access start whenever you use your phone. Or, if you like, you can just say "Hey Google, Voice Access" when you need it.

Voice Access was designed for and with people with motor disabilities (like ALS, spinal cord injuries or arthritis). But it’s also helpful for anyone with a temporary disability, like a broken arm, or whose hands are otherwise occupied, like if they’re cooking. Regardless of the reason, the updated Voice Access app makes it easier for anyone to use their phones hands-free! 

You can download Voice Access on Google Play and try the new version out by joining the Beta today. 

Emoji Kitchen cooks up a new batch of mashups

In a year when most of our relationships happen at a distance, digital communication can play a role in keeping us connected. When my brother shares pictures of his new puppy (when will I ever get to meet her?), Heart Eyes 😍 falls short of conveying just how cute she is. And not a day goes by when someone doesn’t send me a meme—and sometimes they’re so on point that Tears of Joy 😂 just won’t cut it as a response.


Since we introduced Emoji Kitchen earlier this year with a few hundred emoji combinations, people have shared more than 3 billion stickers to express the range of emotions they’ve felt in 2020. With today’s update, we're expanding to more than 14,000 combinations and improving the mixing experience so you can convey your feelings in more ways.



Click on the image above to see how to mix emoji in Emoji Kitchen

When we first launched, tapping on an emoji yielded a curated selection of designs to express yourself with a little extra zhuzh. Now you have more control and can pick two of any smileys (and then some!) to create a wider array of expressions. Combine Earth 🌍 with Face with Medical Mask 😷 to convey the state of the world, or Fire 🔥 with Smiling Face with Sunglasses 😎  for when ‘this is fine’.

Mixing the Earth emoji with Face with Medical Mask creates a sticker of the Earth wearing a mask, and mixing Smiling Face with Sunglasses with Fire creates a flame wearing fiery sunglasses

Other times, one heart ❤️ isn’t enough so you throw in a few extra hearts to convey the depth of your affection ❤️❤️❤️❤️❤️. You can now amplify the sentiment with Emoji Kitchen when the occasion arises. Just double tap thinking face 🤔 and get a very introspective thinker. Or double tap rolling on the floor laughing 🤣 and your emoji will be falling apart at the seams.

A variety of combinations is possible when you mix two of the same emoji with Emoji Kitchen, including a very introspective thinker, a 1000 emoji, and an emoji with very big eyes

The new Emoji Kitchen is available in Gboard beta today, and coming to all Gboard users in the coming weeks. We can’t wait to see what you make!