Tag Archives: Android

#Android11: The Beta Launch Show – Here’s how to join in and watch next week!

Posted by The #Android11 team

In just under a week, we’ll kick off #Android11: The Beta Launch Show, your opportunity to find out what’s new in Android from the people who build Android. Join us on June 3, 11AM ET (8AM PT, 4PM BST, 8:30PM IST) as we unveil new features packed inside the next release, Android 11, as well as updates to help developers get the most out of modern Android development. You’ll be able to watch the show live on YouTube (don’t forget to set a reminder) or Twitter, and can sign up for updates here.

Get your #AskAndroid questions answered live

Got a burning question? We’ve got experts ready to answer your #AskAndroid questions, and we’ll be wrapping up the show with a live Q&A session. All you have to do is share your question on Twitter using #AskAndroid, and we’ll select questions for Android engineering and product leads Dave Burke and Stephanie Cuthbertson to answer live on the air.

Check out the list of talks

Also on June 3, we’ll be sharing 12 talks on a range of topics, from Jetpack to Android Studio and Google Play (talks we had originally planned for Google I/O), to help you take advantage of the latest in Android development. We just posted the full list of talks on the event page.

Sketchnote with us


We want to see your take on the show, so grab your best pens, markers, and paper, download the template, and get ready to show off your sketchnote skills during The Beta Launch Show. Don’t forget to share your work using the hashtag #Android11 for a chance to be featured.

We can’t wait to share the latest we’ve been working on with you in just under a week at #Android11: The Beta Launch Show!

Accessibility updates that help tech work for everyone

Editor’s note: Today is Global Accessibility Awareness Day, and we’ll be sharing resources and tools for education, as well as new accessibility features for Android and Google Maps.

In 1993, Paul Amadeus Lane was an EMT with his whole life planned out. But at age 22, he was in a multi-car collision that left him fighting for his life and in recovery for eight months. After the crash, Paul became quadriplegic. He soon realized that his voice was one of his most powerful assets—professionally and personally. He went back to school to study broadcasting and became a radio producer and morning show host. Along the way, Paul discovered how he could use technology as an everyday tool to help himself and others. Today, he uses accessibility features, like Voice Access, to produce his own radio show and share his passion for technology.

Stories like Paul’s remind us why accessible technology matters to all of us every single day. Products built with and for people with disabilities help us all pursue our interests, passions and careers. Today, in honor of Global Accessibility Awareness Day, we’re announcing helpful Android features and applications for people with hearing loss, deafness, and cognitive differences. While these updates were designed for people with disabilities, the result is better products that can be helpful for everyone. 

Action Blocks: One-tap actions on Android for people with cognitive disabilities

Every day, people use their phones for routine tasks—whether it’s video calling family, checking the weather or reading the news. Typically, these activities require multiple steps. You might have to scroll to find your video chat app, tap to open it and then type in the name of the contact you’re looking for. 

For people with cognitive disabilities or age-related cognitive conditions, it can be difficult to learn and remember each of these steps. For others, it can be time consuming and cumbersome, especially if you have limited mobility. Now, you can perform these tasks with one tap, thanks to Action Blocks, a new Android app that allows you to create customizable home screen buttons.


With Action Blocks, tap on the customized button to launch an activity.

Create an Action Block for any action that the Google Assistant can perform, like making calls, sending texts, playing videos and controlling devices in your home. Then pick an image for the Action Block from your camera or photo gallery, and place it on your home screen for one-touch access.

Action Blocks is part of our ongoing effort to make technology more helpful for people with cognitive disabilities and their caregivers. The app is available on the Play Store and works on devices running Android 5.0 and above.

Live Transcribe: Real-time transcriptions for everyday conversations

In 2019, we launched Live Transcribe, an app that provides real-time, speech-to-text transcriptions of everyday conversations for people who are deaf or hard of hearing. Based on feedback we’ve received from people using the product, we’re rolling out new features:

  • Set your phone to vibrate when someone nearby says your name. If you’re looking elsewhere or want to maintain social distance, your phone will let you know when someone is trying to get your attention. 
  • Add custom names or terms for different places and objects not commonly found in the dictionary. With the ability to customize your experience, Live Transcribe can better recognize and spell words that are important to you. 
  • It’s now easier to search past conversations. Simply use the search bar to look through past transcriptions. To use the feature, turn on ‘Saving Transcriptions’ in Settings. Once turned on, transcriptions will be saved locally on your device for three days.
  • We’re expanding our supported languages to 70, adding Albanian, Burmese, Estonian, Macedonian, Mongolian, Punjabi, and Uzbek.

Live Transcribe is pre-installed on Pixel devices and is available on Google Play for devices running Android 5.0 and up.

Sound Amplifier: Making the sounds around you clearer and louder

Sound Amplifier, a feature that clarifies the sound around you, now works with Bluetooth headphones. Connect your Bluetooth headphones and place your phone close to the source of the sound, like a TV or a lecturer, so that you can hear more clearly. On Pixel, you can now also boost the audio from media playing on your device, whether you are watching YouTube videos, listening to music, or enjoying a podcast. Sound Amplifier is available on Google Play for devices running Android 6.0 and above.


Use Sound Amplifier to clarify sound playing on your phone.

Accessibility matters for everyone

We strive to build products that are delightful and helpful for people of all abilities. After all, that’s our mission: to make the world’s information universally accessible for everyone. If you have questions on how these features can be helpful for you, visit our Help Center, connect with our Disability Support team, or learn more about our accessibility products on Android.



Exposure Notification API launches to support public health agencies

Note: The following is a joint statement from Apple and Google.

One of the most effective techniques that public health officials have used during outbreaks is called contact tracing. Through this approach, public health officials contact, test, treat and advise people who may have been exposed to an affected person. One new element of contact tracing is Exposure Notifications: using privacy-preserving digital technology to tell someone they may have been exposed to the virus. Exposure Notification has the specific goal of rapid notification, which is especially important for slowing the spread of a virus that can be transmitted asymptomatically.

To help, Apple and Google cooperated to build Exposure Notifications technology that will enable apps created by public health agencies to work more accurately, reliably and effectively across both Android phones and iPhones. Over the last several weeks, our two companies have worked together, reaching out to public health officials, scientists, privacy groups and government leaders all over the world to get their input and guidance. 

Starting today, our Exposure Notifications technology is available to public health agencies on both iOS and Android. What we’ve built is not an app—rather public health agencies will incorporate the API into their own apps that people install. Our technology is designed to make these apps work better. Each user gets to decide whether or not to opt-in to Exposure Notifications; the system does not collect or use location from the device; and if a person is diagnosed with COVID-19, it is up to them whether or not to report that in the public health app. User adoption is key to success and we believe that these strong privacy protections are also the best way to encourage use of these apps.  

Today, this technology is in the hands of public health agencies across the world who will take the lead and we will continue to support their efforts. 

Distribute certificates for mobile devices via MDM

What’s changing 

We’re making it possible to issue digital certificates to iOS and Android devices for secure access even when those devices are not connected to the corporate network. This will make it easier to provide new mobile devices with identification, authentication, and access to G Suite and other corporate resources. This is available to G Suite Enterprise, G Suite Enterprise for Education, and Cloud Identity Premium customers using Google Endpoint Management via an on-premises connector.

Who’s impacted 

Admins

Why it’s important 

Certificates are an important way to identify and authenticate mobile devices so they are able to securely access corporate resources. These resources can include G Suite, enterprise WiFi hotspots, and more.

Some customers require devices to be on-premises and protected by a firewall before device certificates can be distributed. Because some users can no longer access corporate locations and networks, customers need a way to issue these certificates remotely.

By providing this feature, we are helping these customers keep their employees connected and productive even when they’re not in the office.

Getting started 



Rollout pace 


  • This feature is available now. 

Availability 


  • Available to G Suite Enterprise, G Suite Enterprise for Education, and Cloud Identity Premium customers 
  • Not available to G Suite Basic, G Suite Business, G Suite for Education, G Suite for Nonprofits, and Cloud Identity Free customers 

Resources 


Answers to your questions about app signing by Google Play

Posted by Dom Elliott, Product Manager, Google Play

Google Play's first priority is to build a trusted, safe, and secure platform for billions of users and millions of developers for many years into the future. The sustainability and success of the ecosystem depends on this.

As part of this goal, almost two years ago, we announced app signing by Google Play. With app signing by Google Play, Google manages and protects your app's signing key for you and uses it to sign your APKs for distribution. It’s a secure way to store your app signing key that helps protect you if your key is ever lost or compromised. If you’re not enrolled in app signing and you lose your signing key, you’ll lose the ability to update your app.

App signing by Play also enables you to publish your app or game with the Android App Bundle, the recommended publishing format in use by over 500,000 apps in production on Google Play. The Android App Bundle reduces the size of your app, simplifies your releases, and unlocks next generation distribution features such as dynamic features and dynamic asset delivery.

Developers often have questions when enrolling in app signing for the first time, so my colleague has written a Medium post with answers to some frequently asked questions. Read the post to find out more about the benefits of app signing and how we protect developer keys, and to learn about features like key upgrade for new installs and the new source stamp that bundletool will start adding to apps published with app bundles, giving you more peace of mind about Play-signed apps.



Android 11: Beta Plans

Posted by Dave Burke, VP of Engineering


When we started planning Android 11, we didn’t expect the kinds of changes that would find their way to all of us, across nearly every region in the world. These have challenged us to stay flexible and find new ways to work together, especially with our developer community.

To help us meet those challenges we’re announcing an update to our release timeline. We’re bringing you a fourth Developer Preview today and moving Beta 1 to June 3. And to tell you all about the release and give you the technical resources you need, we’re hosting an online developer event that we’re calling #Android11: the Beta Launch Show.

Join us for #Android11: The Beta Launch Show

While the circumstances prevent us from joining together with you in person at Shoreline Amphitheatre for Google I/O, our annual developer conference, we’re organizing an online event where we can share the best of what’s new in Android with you. We hope you’ll join us for #Android11: The Beta Launch Show, your opportunity to find out what’s new in Android from the people who build Android. Hosted by me, Dave Burke, the show kicks off at 11AM ET on June 3, and we’ll wrap it up with a post-show live Q&A; tweet your #AskAndroid questions to get them answered live!

Later that day, we’ll be sharing a number of talks on a range of topics, from Jetpack Compose to Android Studio and Google Play (talks we had originally planned for Google I/O), to help you take advantage of the latest in Android development. You can sign up to receive updates on this digital event at developer.android.com/android11.

Android 11 schedule update

Our industry moves really fast, and we know that many of our device-maker partners are counting on us to help them bring Android 11 to new consumer devices later this year. We also know that many of you have been working to prioritize early app and game testing on Android 11, based in part on our Platform Stability and other milestones. At the same time, all of us are collaborating remotely and prioritizing the well-being of our families, friends and colleagues.

So to help us meet the needs of the ecosystem while being mindful of the impacts on our developers and partners, we’ve decided to add a bit of extra time in the Android 11 release schedule. We’re moving out Beta 1 and all subsequent milestones by about a month, which gives everyone a bit more room but keeps us on track for final release later in Q3.

Here are some of the key changes in the new schedule:

  • We’re releasing a fourth Developer Preview today for testing and feedback.
  • Beta 1 release moves to June 3. We’ll include the final SDK and NDK APIs with this release and open up Google Play publishing for apps targeting Android 11.
  • Beta 2 moves to July. We’ll reach Platform Stability with this release.
  • Beta 3 moves to August and will include release candidate builds for final testing.

By bringing you the final APIs on the original timeline while shifting the other dates, we’re giving you an extra month to compile and test with the final APIs, while also ensuring that you have the same amount of time between Platform Stability and the final release, planned for later in Q3. Here’s a look at the timeline.

Android 11 timeline

You can read more about what the new timeline means to app developers in the preview program overview.

App compatibility

The schedule change adds some extra time for you to test your app for compatibility and identify any work you’ll need to do. We recommend releasing a compatible app update by Android 11 Beta on June 3rd to get feedback from the larger group of Android Beta users who will be getting the update.

With Beta 1 the SDK and NDK APIs will be final, and as we reach Platform Stability in July, the system behaviors and non-SDK greylists will also be finalized. At that time, plan on doing your final compatibility testing and releasing your fully compatible app, SDK, or library as soon as possible so that it is ready for the final Android 11 release. You can read more in the timeline for developers.

You can start compatibility testing today on a Pixel 2, 3, 3a, or 4 device, or you can use the Android Emulator. Just flash the latest build, install your current production app, and test the user flows. Make sure to review the behavior changes for areas where your app might be affected. There’s no need to change the app’s targetSdkVersion at this time, although we recommend evaluating the work since many changes apply once your app is targeting the new API level.
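For reference, the target level lives in your module’s Gradle configuration. Here is a minimal sketch (using current Android Gradle Plugin Kotlin DSL property names; the numeric levels below are placeholders, and during the preview you would compile against the preview SDK rather than a final level):

    // build.gradle.kts (module): a minimal sketch; levels are placeholders.
    android {
        compileSdk = 29            // compile against a stable SDK for compatibility testing
        defaultConfig {
            minSdk = 21
            // Leave targetSdk where it is for now: most Android 11 behavior changes
            // only apply once the app targets the new API level.
            targetSdk = 29
        }
    }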

Get started with Android 11

Today we're pushing Developer Preview 4 with the latest bug fixes, API tweaks, and features to try in your apps. It’s available by manual download and flash for Pixel 2, 3, 3a, or 4 devices, and if you’re already running a Developer Preview build, you’ll get an over-the-air (OTA) update to today’s release.

For complete information on Android 11, visit the Android 11 developer site, and please continue to let us know what you think!

High refresh rate rendering on Android

Posted by Ady Abraham, Software Engineer

For a long time, phones have had a display that refreshes at 60Hz. Application and game developers could just assume that the refresh rate is 60Hz, frame deadline is 16.6ms, and things would just work. This is no longer the case. New flagship devices are built with higher refresh rate displays, providing smoother animations, lower latency, and an overall nicer user experience. There are also devices that support multiple refresh rates, such as the Pixel 4, which supports both 60Hz and 90Hz.

A 60Hz display refreshes the display content every 16.6ms. This means that an image will be shown for a multiple of 16.6ms (16.6ms, 33.3ms, 50ms, etc.). A display that supports multiple refresh rates provides more options to render at different speeds without jitter. For example, a game that cannot sustain 60fps rendering must drop all the way to 30fps on a 60Hz display to remain smooth and stutter free (since the display is limited to presenting images at a multiple of 16.6ms, the next frame rate available is a frame every 33.3ms, or 30fps). On a 90Hz device, the same game can drop to 45fps (22.2ms for each frame), providing a much smoother user experience. A device that supports 90Hz and 120Hz can smoothly present content at 120, 90, 60 (120/2), 45 (90/2), 40 (120/3), 30 (90/3), 24 (120/5), etc. frames per second.
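To make that arithmetic concrete, here is a small illustrative Kotlin sketch (plain math, not an Android API): the stutter-free rates are each refresh rate divided by 1, 2, 3, and so on.

    // Illustrative: a frame can be shown for 1, 2, 3, ... vsyncs, so the smooth
    // frame rates on a display are its refresh rate divided by whole numbers.
    fun smoothFrameRates(refreshRatesHz: List<Double>, minFps: Double = 20.0): Set<Double> =
        refreshRatesHz.flatMap { rate ->
            generateSequence(1) { it + 1 }        // hold each frame for 1, 2, 3, ... vsyncs
                .map { rate / it }
                .takeWhile { it >= minFps }
                .toList()
        }.toSortedSet(compareByDescending<Double> { it })

    fun main() {
        // A device that supports 90Hz and 120Hz, as in the example above:
        println(smoothFrameRates(listOf(90.0, 120.0)))
        // prints [120.0, 90.0, 60.0, 45.0, 40.0, 30.0, 24.0, 22.5, 20.0]
    }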

Rendering at high rates

The higher the rendering rate, the harder it is to sustain that frame rate, simply because there is less time available for the same amount of work. To render at 90Hz, applications only have 11.1ms to produce a frame as opposed to 16.6ms at 60Hz.

To demonstrate that, let’s take a look at the Android UI rendering pipeline. We can break frame rendering into roughly five pipeline stages:

  1. Application’s UI thread processes input events, calls app’s callbacks, and updates the View hierarchy’s list of recorded drawing commands
  2. Application’s RenderThread issues the recorded commands to the GPU
  3. GPU draws the frame
  4. SurfaceFlinger, which is the system service in charge of displaying the different application windows on the screen, composes the screen and submits the frame to the display HAL
  5. Display presents the frame

The entire pipeline is controlled by the Android Choreographer. The Choreographer is driven by the display vertical synchronization (vsync) events, which indicate the time the display starts to scan out the image and update the display pixels, but it uses different wakeup offsets for the application and for SurfaceFlinger. The diagram below illustrates the pipeline on a Pixel 4 device running at 60Hz, where the application is woken up 2ms after the vsync event and SurfaceFlinger is woken up 6ms after the vsync event. This gives 20ms for an app to produce a frame, and 10ms for SurfaceFlinger to compose the screen.

Diagram that illustrates the pipeline on a Pixel 4 device
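For apps that schedule their own frames, stage 1 above is driven by a Choreographer frame callback. A minimal sketch (the class and the interval measurement are illustrative, not a platform API):

    import android.view.Choreographer

    // Hooks into the Choreographer that drives the UI-thread stage of the pipeline.
    // doFrame() runs on the UI thread at the app's vsync offset; frameTimeNanos is
    // the vsync timestamp this frame is being produced for.
    class FrameDriver : Choreographer.FrameCallback {
        private var lastFrameTimeNanos = 0L

        fun start() {
            Choreographer.getInstance().postFrameCallback(this)
        }

        override fun doFrame(frameTimeNanos: Long) {
            if (lastFrameTimeNanos != 0L) {
                val intervalMs = (frameTimeNanos - lastFrameTimeNanos) / 1_000_000.0
                // ~16.6ms at 60Hz, ~11.1ms at 90Hz
            }
            lastFrameTimeNanos = frameTimeNanos
            // ... update state and record drawing commands here ...
            Choreographer.getInstance().postFrameCallback(this)  // schedule the next frame
        }
    }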

When running at 90Hz, the application is still woken up 2ms after the vsync event. However, SurfaceFlinger is woken up 1ms after the vsync event to have the same 10ms for composing the screen. The app, on the other hand, has just 10ms to render a frame, which is very short.

Diagram of the pipeline running at 90Hz

To mitigate that, the UI subsystem in Android uses “render ahead” (starting a frame at the same time but delaying its presentation by one vsync) to deepen the pipeline. This gives the app 21ms to produce a frame, while keeping the throughput at 90Hz.

Diagram of the render-ahead pipeline, which gives the app 21ms to produce a frame

Some applications, including most games, have their own custom rendering pipelines. These pipelines might have more or fewer stages, depending on what they are trying to accomplish. In general, as the pipeline becomes deeper, more stages could be performed in parallel, which increases the overall throughput. On the other hand, this can increase the latency of a single frame (the latency will be number_of_pipeline_stages x longest_pipeline_stage). This tradeoff needs to be considered carefully.

Taking advantage of multiple refresh rates

As mentioned above, multiple refresh rates allow a broader range of available rendering rates to be used. This is especially useful for games which can control their rendering speed, and for video players which need to present content at a given rate. For example, to play a 24fps video on a 60Hz display, a 3:2 pulldown algorithm needs to be used, which creates jitter. However, if the device has a display that can present 24fps content natively (24/48/72/120Hz), it will eliminate the need for pulldown and the jitter associated with it.

The refresh rate that the device operates at is controlled by the Android platform. Applications and games can influence the refresh rate via various methods (explained below), but the ultimate decision is made by the platform. This is crucial when more than one app is present on the screen and the platform needs to satisfy all of them. A good example is a 24fps video player. 24Hz might be great for video playback, but it’s awful for responsive UI. A notification animating at only 24Hz feels janky. In situations like this, the platform will set the refresh rate to ensure that the content on the screen looks good.

For this reason, applications may need to know the current device refresh rate. This can be done in the following ways:

  • Query the current display’s refresh rate via Display.getRefreshRate().
  • Register a DisplayManager.DisplayListener and re-query the rate in onDisplayChanged.
  • In native code, use AChoreographer_registerRefreshRateCallback (API level 30).

Applications can influence the device refresh rate by setting a frame rate on their Window or Surface. This is a new capability introduced in Android 11 and allows the platform to know the rendering intentions of the calling application. Applications can call one of the following methods:

  • Surface.setFrameRate
  • SurfaceControl.Transaction.setFrameRate
  • ANativeWindow_setFrameRate (for native code)

Please refer to the frame rate guide on how to use these APIs.

The system will choose the most appropriate refresh rate based on the frame rate programmed on the Window or Surface.
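For example, a video player rendering into a SurfaceHolder-backed view might hint its content rate like this (a minimal sketch, assuming API 30+; the helper name is ours):

    import android.os.Build
    import android.view.Surface
    import android.view.SurfaceHolder

    // Hint that this surface shows fixed-rate 24fps content and let the platform
    // pick a refresh rate that presents it smoothly (e.g. 48/72/120Hz).
    fun hintVideoFrameRate(holder: SurfaceHolder, fps: Float = 24f) {
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.R) {
            holder.surface.setFrameRate(
                fps,
                // FIXED_SOURCE suits video; games and UI would typically use
                // FRAME_RATE_COMPATIBILITY_DEFAULT.
                Surface.FRAME_RATE_COMPATIBILITY_FIXED_SOURCE
            )
        }
    }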

On older Android versions (before Android 11), where the setFrameRate API doesn’t exist, applications can still influence the refresh rate by directly setting WindowManager.LayoutParams.preferredDisplayModeId to one of the available modes from Display.getSupportedModes. This approach is discouraged starting with Android 11 since the platform doesn’t know the rendering intention of the app. For example, if a device supports 48Hz, 60Hz and 120Hz and there are two applications present on the screen that call setFrameRate(60, …) and setFrameRate(24, …) respectively, the platform can choose 120Hz and make both applications happy. On the other hand, if those applications used preferredDisplayModeId, they would probably set the mode to 60Hz and 48Hz respectively, leaving the platform with no option to set 120Hz. The platform will choose either 60Hz or 48Hz, making one app unhappy.
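For comparison, here is what that legacy approach looks like (a sketch for pre-Android 11 devices; the helper name is ours):

    import android.app.Activity
    import kotlin.math.abs

    // Pin the window to the supported display mode closest to the desired rate.
    // Discouraged on Android 11+ for the reasons above; prefer setFrameRate there.
    fun requestModeClosestTo(activity: Activity, desiredHz: Float) {
        @Suppress("DEPRECATION")  // defaultDisplay is fine on pre-Android 11 targets
        val display = activity.windowManager.defaultDisplay
        val mode = display.supportedModes.minByOrNull { abs(it.refreshRate - desiredHz) } ?: return
        val params = activity.window.attributes
        params.preferredDisplayModeId = mode.modeId
        activity.window.attributes = params  // reassign so the change takes effect
    }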

Takeaways

Refresh rate is not always 60Hz - don’t assume 60Hz and don’t hardcode assumptions based on that historical artifact.

Refresh rate is not always constant - if you care about the refresh rate, you need to register a callback to find out when the refresh rate changes and update your internal data accordingly.
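On the Java/Kotlin side, one way to get that callback is a DisplayManager.DisplayListener (a minimal sketch; in native code, AChoreographer_registerRefreshRateCallback serves the same purpose):

    import android.content.Context
    import android.hardware.display.DisplayManager
    import android.os.Handler
    import android.os.Looper

    // Invokes onChange with the new refresh rate whenever a display changes,
    // e.g. when the platform switches a Pixel 4 between 60Hz and 90Hz.
    fun watchRefreshRate(context: Context, onChange: (Float) -> Unit) {
        val dm = context.getSystemService(Context.DISPLAY_SERVICE) as DisplayManager
        dm.registerDisplayListener(object : DisplayManager.DisplayListener {
            override fun onDisplayAdded(displayId: Int) {}
            override fun onDisplayRemoved(displayId: Int) {}
            override fun onDisplayChanged(displayId: Int) {
                dm.getDisplay(displayId)?.let { onChange(it.refreshRate) }
            }
        }, Handler(Looper.getMainLooper()))
    }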

If you are not using the Android UI toolkit and have your own custom renderer, consider changing your rendering pipeline according to the current refresh rate. Deepening the pipeline can be done by setting a presentation timestamp using eglPresentationTimeANDROID on OpenGL or VkPresentTimesInfoGOOGLE on Vulkan. Setting a presentation timestamp indicates to SurfaceFlinger when to present the image. If it is set to a few frames in the future, it will deepen the pipeline by the number of frames it is set to. The Android UI in the example above sets the present time to frameTimeNanos [1] + 2 * vsyncPeriod [2].

Tell the platform your rendering intentions using the setFrameRate API. The platform will match different requests by selecting the appropriate refresh rate.

Use preferredDisplayModeId only when necessary, either when the setFrameRate API is not available or when you need to use a very specific mode.

Lastly, familiarize yourself with the Android Frame Pacing library. This library handles proper frame pacing for your game and uses the methods described above to handle multiple refresh rates.

Notes


  1. frameTimeNanos received from Choreographer 

  2. vsyncPeriod received from Display.getRefreshRate()  

Fast Pair makes it easier to use your Bluetooth headphones

Bluetooth headphones help us take calls, listen to music while working out, and use our phones anywhere without getting tangled up in wires. And though pairing Bluetooth accessories is an increasingly common activity, it can be a frustrating process for many people.

Fast Pair makes Bluetooth pairing easier on Android 6.0+ phones (learn how to check your Android version). When you turn on your Fast Pair-enabled accessory, it automatically detects and pairs with your Android phone in a single tap. So far, there have been over three million Fast Pair pairings between Bluetooth accessories, like speakers and earbuds, and Android phones. Here are some new capabilities to make the Fast Pair experience even easier.

Easily find your lost accessory

It can be frustrating when you put your Bluetooth headphones down and immediately forget where you placed them. If they’re connected to your phone, you can locate your headphones by ringing them. If you have true wireless earbuds (earbuds that aren’t attached by cables or wires), you can choose to ring only the left or right bud. And in the coming months, when you misplace your headphones, you’ll be able to check their last known location in the Find My Device app if you have Location History turned on.


Know when to charge your true wireless earbuds

Upon opening the case of your true wireless earbuds, you’ll receive a phone notification about the battery level of each component (right bud, left bud, and the case itself if supported). You’ll also receive a notification when the battery in your earbuds or their case is running low, so you know when to charge them.


Manage and personalize your accessory easily

To personalize your headset or speakers, your accessory’s name will include your first name after it successfully pairs over Bluetooth. For example, Pixel Buds will be renamed “Alex’s Pixel Buds.”


On phones running Android 10, you can also adjust headphone settings, like linking them to Google Assistant and accessing Find My Device, right from the device details page. The available settings vary depending on your headphone model.


Harman Kardon FLY and the new Google Pixel Buds will be the first true wireless earbuds to enjoy all of these new features, with many others to come. We’ll continue to work with our partners to bring Fast Pair to more headset models. Learn about how to connect your Fast Pair accessory here.


The new Google Pixel Buds are available today for your listening pleasure

In October, we introduced the all-new Google Pixel Buds, with high-quality sound, an unobtrusive design that fits securely and comfortably in your ear, and helpful AI features. We wanted to make sure that whether you’re streaming content while working out or sitting in a noisy room on a conference call, you have the best possible audio experience. Today, Pixel Buds are available for $179 in Clearly White in the U.S.


We sat down with some of the team behind Pixel Buds to learn more about what’s new, and also to hear how they’ve been using them. 


Get started easily with Fast Pair

“I always used to use wired headphones because I had concerns about the reliability of Bluetooth® connectivity, as lots of other earbuds have pairing problems, including the original Pixel Buds. With the new Pixel Buds, we focused on improving Fast Pair to eliminate these pain points and easily connect to your phone.”

- Ethan Grabau, Product Manager


Clear calls with special mics and sensor

“To give you clear calls, even in noisy and windy environments, Pixel Buds combine signals from beamforming mics and a special sensor that detects when your jaw is moving. This means you don’t have to look for a quiet place to take a call. It’s come in particularly handy these past few weeks for me working from home with two young daughters.”

- Jae Lee, Audio Systems Engineer


Adaptive Sound for better audio  

“Adaptive Sound is perfect for those moments like when you’re steaming milk for a latte, or when you're washing your hands or the dishes. Those noises can eclipse your audio experience for a bit, until the latte or the dishes are done.”

- Basheer Tome, Senior Hardware Interface Designer


“To help, Adaptive Sound temporarily and subtly adjusts your volume to accommodate the new noise in your environment, and goes back to normal after it’s dissipated. It works kind of like auto-brightness on your phone screen: It momentarily adjusts to the world around you to make the experience of using your device a little simpler.”

- Frank Li, UX Engineer  

Hands-free help with Google Assistant

"When I’m working in the yard and wearing gloves, I can use  ’Hey, Google’ on my Pixel Buds and easily control my music. I can also hear my notifications, and reply to a text message with just my voice and Google Assistant. 


And when I'm taking my dog on our daily walk and using my Pixel Buds, I use Google Assistant to navigate and check my fitness progress hands-free while juggling a leash and bag of dog treats. The Pixel Buds are slim enough they fit snag-free under a hat or hoodie, too." 

- Max Ohlendorf, Technology Manager 


Real-time translations with conversation mode 

“We set out to see how we could use Google Translate on Pixel Buds to reduce language barriers. Making the conversation as natural as possible, even with the use of the phone, was important, so we decided to create the split-screen UI to show exactly what was being said, translating it in real time on the screen with conversation mode. Any exposure to a different language is also an opportunity to learn, so we wanted to make the feature not only as helpful as possible for things like being in a different country, but also simple enough to help bilingual households across generations connect through language.”

- Tricia Fu, Product Manager


Peace of mind with Find My Device

“The fear of losing expensive wireless earbuds is real, and in many cases a reason why people are afraid of trying them. We tried to reduce that fear a bit with Find My Device. If an earbud falls out when you’re walking or running, you know right away. But you may be less aware when you return home and absentmindedly put them down somewhere. So we built the ability to let you ring your earbuds from your phone. We also wanted to make sure we were thoughtful in what that experience is like. You can ring one earbud at a time, to focus on finding either the left or right earbud. The moment your hands touch the lost earbud, the ringing will stop. We hope people won’t need to use this feature often, but if they do, they can find misplaced earbuds more easily.”

- Alex Yee, Interaction Designer


Like Pixel phones and other Google devices, Pixel Buds will get better over time with new features, including an update to Find My Device that will show the last known location of your earbuds. Check out more cool features on Pixel Buds and see which features will work with your device.


Pixel Buds are available through the Google Store and retailers including AT&T, Best Buy, Target (coming early May), T-Mobile, U.S. Cellular, Verizon and Walmart. Other colors—Almost Black, Quite Mint and Oh So Orange—will be available in the coming months. Pixel Buds will come to more countries in the coming months as well.