Tag Archives: Pixel

Pixel artists take time to refresh, reflect and create

For Tim Kellner, a nomadic photographer and filmmaker in the program, the COVID-19 pandemic led him to take a step back from his art. “Quarantine gave me time to think more deeply about the types of things I wanted to create,” he says. “I was surprised after that break to feel a drive to create again that I hadn't felt for a few years.”

Tim is one of Google Pixel’s Creator Labs artists who’s been exploring the side effects of spending more time alone. Creator Labs is an incubator for emerging photographers, directors and YouTubers that launched last winter, pre-pandemic. All nine of the program’s recurring artists pivoted to working virtually with us this past summer.  

Image showing a person standing in a desert at dusk; the sky is dark blue. A small pink light shines around him while he holds a lit-up object in his hand.

Tim Kellner

Armed with the Pixel 5 and their imaginations, the artists set out to create work grounded in social impact and cultural narrative (captured in a COVID-safe way, of course). 

One theme all of the current Creator Labs artists are embracing is the idea of space. Los Angeles-based Creator Labs veteran Glassface has been exploring isolation and mental health throughout his tenure in the program. “We’re all going through a mass shared traumatic experience right now. It feels like a really necessary time for meaningful art. I’ve been able to hone in on the art and music I want to be making, and I’ve been reminded of why I create in the first place,” Glassface says. “I think art can be a guiding light during difficult times like right now, and that’s informed and inspired my approach heavily. I’m taking a lot more risks and only putting my energy into the creative projects that mean the most to me.”
Image showing a person sitting close to the camera, looking up at the sky and at a white house.

Glassface

New York-based program newcomer Andre Wagner, like Tim, decided to turn the camera on himself. “I’m always making self portraits but something about this time in particular led me to putting more focus on myself as the subject matter. There have definitely been surprises, and for me that’s needed because it helps sustain the effort.”

A black and white image showing a person sitting on a bench in between two trees. The person is sitting on top of the back of the bench looking up at the trees.

Andre Wagner

Other self portraits celebrated the artists’ heritage, including Los Angeles-based photographers June Canedo and Andrew Thomas Huang. June photographed herself wearing an embroidered handkerchief, representing her family’s history of domestic work, while Andrew’s photos pay homage to the Chinese Zodiac—with a Sci-Fi twist. 

Image showing a person with their back to the camera. There's a kerchief in their hair.

June Canedo

Creator Labs also includes artists Mayan Toledano, Kennedi Carter, Natalia Mantini and Anthony Prince Leslie. You can find their work on Pixel’s Instagram page.

Image showing a person wearing an ornate blue and green suit against a blue green background. They're wearing an intricate mask and holding up their hand, which is painted blue.

Andrew Thomas Huang

The work of our Creator Labs artists is a reminder for all of us that isolation can have a silver lining—in this case, giving us more space to think, reflect, refresh and create. 

Portrait Light: Enhancing Portrait Lighting with Machine Learning

Professional portrait photographers are able to create compelling photographs by using specialized equipment, such as off-camera flashes and reflectors, and expert knowledge to capture just the right illumination of their subjects. In order to allow users to better emulate professional-looking portraits, we recently released Portrait Light, a new post-capture feature for the Pixel Camera and Google Photos apps that adds a simulated directional light source to portraits, with the directionality and intensity set to complement the lighting from the original photograph.

Example image with and without Portrait Light applied. Note how Portrait Light contours the face, adding dimensionality, volume, and visual interest.

In the Pixel Camera on Pixel 4, Pixel 4a, Pixel 4a (5G), and Pixel 5, Portrait Light is automatically applied post-capture to images in the default mode and to Night Sight photos that include people — just one person or even a small group. In Portrait Mode photographs, Portrait Light provides more dramatic lighting to accompany the shallow depth-of-field effect already applied, resulting in a studio-quality look. But because lighting can be a personal choice, Pixel users who shoot in Portrait Mode can manually re-position and adjust the brightness of the applied lighting within Google Photos to match their preference. For those running Google Photos on Pixel 2 or newer, this relighting capability is also available for many pre-existing portrait photographs.

Pixel users can adjust a portrait’s lighting as they like in Google Photos, after capture.

Today we present the technology behind Portrait Light. Inspired by the off-camera lights used by portrait photographers, Portrait Light models a repositionable light source that can be added into the scene, with the initial lighting direction and intensity automatically selected to complement the existing lighting in the photo. We accomplish this by leveraging novel machine learning models, each trained using a diverse dataset of photographs captured in the Light Stage computational illumination system. These models enabled two new algorithmic capabilities:

  1. Automatic directional light placement: For a given portrait, the algorithm places a synthetic directional light in the scene consistent with how a photographer would have placed an off-camera light source in the real world.
  2. Synthetic post-capture relighting: For a given lighting direction and portrait, synthetic light is added in a way that looks realistic and natural.

These innovations enable Portrait Light to help create attractive lighting at any moment for every portrait — all on your mobile device.

Automatic Light Placement
Photographers usually rely on perceptual cues when deciding how to augment environmental illumination with off-camera light sources. They assess the intensity and directionality of the light falling on the face, and also adjust their subject’s head pose to complement it. To inform Portrait Light’s automatic light placement, we developed computational equivalents to these two perceptual signals.

First, we trained a novel machine learning model to estimate a high dynamic range, omnidirectional illumination profile for a scene based on an input portrait. This new lighting estimation model infers the direction, relative intensity, and color of all light sources in the scene coming from all directions, considering the face as a light probe. We also estimate the head pose of the portrait’s subject using MediaPipe Face Mesh.

Estimating the high dynamic range, omnidirectional illumination profile from an input portrait. The three spheres at the right of each image, diffuse (top), matte silver (middle), and mirror (bottom), are rendered using the estimated illumination, each reflecting the color, intensity, and directionality of the environmental lighting.

Using these clues, we determine the direction from which the synthetic lighting should originate. In studio portrait photography, the main off-camera light source, or key light, is placed about 30° above the eyeline and between 30° and 60° off the camera axis, when looking overhead at the scene. We follow this guideline for a classic portrait look, enhancing any pre-existing lighting directionality in the scene while targeting a balanced, subtle key-to-fill lighting ratio of about 2:1.
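
As a back-of-the-envelope illustration (not the production code), the key-light guideline above can be expressed as a direction vector. The coordinate convention and default angles below are assumptions made for this sketch:

```python
import math

def key_light_direction(elevation_deg=30.0, azimuth_deg=45.0):
    """Convert classic studio key-light angles into a unit direction vector.

    elevation_deg: angle above the subject's eyeline.
    azimuth_deg: angle off the camera axis, viewed from overhead.
    Assumed convention: +z points from subject toward camera, +x to
    camera-right, +y up.
    """
    el = math.radians(elevation_deg)
    az = math.radians(azimuth_deg)
    x = math.cos(el) * math.sin(az)
    y = math.sin(el)
    z = math.cos(el) * math.cos(az)
    return (x, y, z)

# 30° above the eyeline, 45° off-axis: the middle of the 30°-60° range.
d = key_light_direction()
# The result is a unit vector tilted 30° above the horizontal plane.
```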

Data-Driven Portrait Relighting
Given a desired lighting direction and portrait, we next trained a new machine learning model to add the illumination from a directional light source to the original photograph. Training the model required millions of pairs of portraits both with and without extra light. Photographing such a dataset in normal settings would have been impossible because it requires near-perfect registration of portraits captured across different lighting conditions.

Instead, we generated training data by photographing seventy different people using the Light Stage computational illumination system. This spherical lighting rig includes 64 cameras with different viewpoints and 331 individually-programmable LED light sources. We photographed each individual illuminated one-light-at-a-time (OLAT) by each light, which generates their reflectance field — or their appearance as illuminated by the discrete sections of the spherical environment. The reflectance field encodes the unique color and light-reflecting properties of the subject’s skin, hair, and clothing — how shiny or dull each material appears. Due to the superposition principle for light, these OLAT images can then be linearly added together to render realistic images of the subject as they would appear in any image-based lighting environment, with complex light transport phenomena like subsurface scattering correctly represented.
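
The superposition step is simple enough to sketch in a few lines of NumPy. The array shapes and toy data below are illustrative assumptions, not the production pipeline:

```python
import numpy as np

def relight_from_olat(olat_images, light_weights):
    """Render a subject under a novel environment by linear superposition.

    olat_images: (n_lights, H, W, 3) float array; the subject photographed
        under each Light Stage LED, one light at a time.
    light_weights: (n_lights, 3) RGB intensity of the target lighting
        environment sampled at each LED's direction.
    Returns an (H, W, 3) relit image.
    """
    # Each pixel is a linear combination of its OLAT values, so a single
    # tensor contraction performs the relighting.
    return np.einsum('nhwc,nc->hwc', olat_images, light_weights)

# Toy example: 331 lights, a tiny 2x2 image, uniform white environment.
olat = np.random.rand(331, 2, 2, 3)
env = np.ones((331, 3)) / 331.0  # normalized uniform lighting
relit = relight_from_olat(olat, env)
```

With uniform weights the result is simply the average OLAT image; a real image-based lighting environment would weight each LED by the environment map's radiance in that direction.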

Using the Light Stage, we photographed many individuals with different face shapes, genders, skin tones, hairstyles, and clothing/accessories. For each person, we generated synthetic portraits in many different lighting environments, both with and without the added directional light, rendering millions of pairs of images. This dataset encouraged model performance across diverse lighting environments and individuals.

Photographing an individual as illuminated one-light-at-a-time in the Google Light Stage, a 360° computational illumination rig.
Left: Example images from an individual’s photographed reflectance field, their appearance in the Light Stage as illuminated one-light-at-a-time. Right: The images can be added together to form the appearance of the subject in any novel lighting environment.

Learning Detail-Preserving Relighting Using the Quotient Image
Rather than trying to directly predict the output relit image, we trained the relighting model to output a low-resolution quotient image, i.e., a per-pixel multiplier that when upsampled can be applied to the original input image to produce the desired output image with the contribution of the extra light source added. This technique is computationally efficient and encourages only low-frequency lighting changes, without impacting high-frequency image details, which are directly transferred from the input to maintain image quality.
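
A minimal sketch of how such a quotient image might be applied (the model's prediction is taken as given here, and the nearest-neighbor upsampling is a simplifying assumption):

```python
import numpy as np

def apply_quotient_image(original, quotient_lowres):
    """Apply a low-resolution per-pixel multiplier to a full-res portrait.

    original: (H, W, 3) input image with values in [0, 1].
    quotient_lowres: (h, w, 3) predicted multiplier at reduced resolution.
    """
    H, W, _ = original.shape
    h, w, _ = quotient_lowres.shape
    # Upsample by index mapping, then multiply. High-frequency detail
    # comes entirely from `original`, as described above.
    rows = np.arange(H) * h // H
    cols = np.arange(W) * w // W
    quotient = quotient_lowres[rows[:, None], cols]
    return np.clip(original * quotient, 0.0, 1.0)

img = np.full((4, 4, 3), 0.5)
q = np.full((2, 2, 3), 1.2)  # brighten everything by 20%
out = apply_quotient_image(img, q)
```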

Supervising Relighting with Geometry Estimation
When photographers add an extra light source into a scene, its orientation relative to the subject’s facial geometry determines how much brighter each part of the face appears. To model the optical behavior of light sources reflecting off relatively matte surfaces, we first trained a machine learning model to estimate surface normals given the input photograph, and then applied Lambert’s law to compute a “light visibility map” for the desired lighting direction. We provided this light visibility map as input to the quotient image predictor, ensuring that the model is trained using physics-based insights.
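
Lambert's law itself is a one-liner; here is a sketch of computing a light visibility map from estimated normals (shapes and data are illustrative):

```python
import numpy as np

def light_visibility_map(normals, light_dir):
    """Lambertian shading term for a directional light.

    normals: (H, W, 3) unit surface normals estimated from the portrait.
    light_dir: (3,) unit vector pointing from the surface toward the light.
    Returns an (H, W) map in [0, 1] equal to max(0, n . l).
    """
    n_dot_l = np.einsum('hwc,c->hw', normals, light_dir)
    return np.clip(n_dot_l, 0.0, 1.0)

# A surface facing the light receives full intensity; one tilted 60°
# away receives half (cos 60° = 0.5).
normals = np.zeros((1, 2, 3))
normals[0, 0] = [0.0, 0.0, 1.0]
normals[0, 1] = [np.sin(np.pi / 3), 0.0, np.cos(np.pi / 3)]
vis = light_visibility_map(normals, np.array([0.0, 0.0, 1.0]))
```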

The pipeline of our relighting network. Given an input portrait, we estimate per-pixel surface normals, which we then use to compute a light visibility map. The model is trained to produce a low-resolution quotient image that, when upsampled and applied as a multiplier to the original image, produces the original portrait with an extra light source added synthetically into the scene.

We optimized the full pipeline to run at interactive frame-rates on mobile devices, with total model size under 10 MB. Here are a few examples of Portrait Light in action.

Portrait Light in action.

Getting the Most Out of Portrait Light
You can try Portrait Light in the Pixel Camera and change the light position and brightness to your liking in Google Photos. For those who use Dual Exposure Controls, Portrait Light can be applied post-capture for additional creative flexibility to find just the right balance between light and shadow. On existing images from your Google Photos library, try it where faces are slightly underexposed, where Portrait Light can illuminate and highlight your subject. It will especially benefit images with a single individual posed directly at the camera.

We see Portrait Light as the first step on the journey towards creative post-capture lighting controls for mobile cameras, powered by machine learning.

Acknowledgements
Portrait Light is the result of a collaboration between Google Research, Google Daydream, Pixel, and Google Photos teams. Key contributors include: Yun-Ta Tsai, Rohit Pandey, Sean Fanello, Chloe LeGendre, Michael Milne, Ryan Geiss, Sam Hasinoff, Dillon Sharlet, Christoph Rhemann, Peter Denny, Kaiwen Guo, Philip Davidson, Jonathan Taylor, Mingsong Dou, Pavel Pidlypenskyi, Peter Lincoln, Jay Busch, Matt Whalen, Jason Dourgarian, Geoff Harvey, Cynthia Herrera, Sergio Orts Escolano, Paul Debevec, Jonathan Barron, Sofien Bouaziz, Clement Ng, Rachit Gupta, Jesse Evans, Ryan Campbell, Sonya Mollinger, Emily To, Yichang Shih, Jana Ehmann, Wan-Chun Alex Ma, Christina Tong, Tim Smith, Tim Ruddick, Bill Strathearn, Jose Lima, Chia-Kai Liang, David Salesin, Shahram Izadi, Navin Sarma, Nisha Masharani, Zachary Senzer.


1  Work conducted while at Google. 

Source: Google AI Blog


Take holiday photos with Night Sight in Portrait Mode

It’s officially the holiday season, which means I can finally decorate my house—so there are lights everywhere. Usually, I’d have friends and family over to see my setup, but this year I’ll be celebrating with just my household. Instead of gathering in person, my friends, family and I are sending each other digital holiday cards, and I’ll be using Night Sight in Portrait Mode on my new Pixel 5 to get the perfect photo.

Night Sight in Portrait Mode is a new feature only on Pixel 4a (5G) and Pixel 5, and it lets you capture beautiful low-light images with sharp subjects and artistically-blurred backgrounds. While Night Sight in Portrait Mode takes incredible photos year-round, it’s perfect for capturing a selfie or photo in front of holiday lights—whether those are on your house, a tree or from a menorah or kinara’s candlelight.

Image showing the author with his mother in front of a decorated tree in a dark room. The subjects are well-lit and in focus while the background is blurred.

Night Sight in Portrait Mode was designed to create professional quality low-light portraits with the tap of a button. Night Sight automatically engages in Portrait Mode when it’s dark enough, and, when you press the shutter button, Pixel’s new exposure bracketing technology will capture, align and merge up to 15 photos to improve low-light detail. To produce bright and vibrant portraits, Portrait Light was integrated directly into Pixel Camera to automatically enhance the lighting on people, and, in really dark scenes, Night Sight in Portrait Mode will autofocus using machine learning to keep your subjects sharp. After predicting the depth of the photo, Pixel will blur the background to create the beautiful bokeh that we love in professional portraits. 

Here are a few tips and tricks to help you nail the perfect holiday shot using Night Sight in Portrait Mode on your Pixel 4a (5G) or Pixel 5:

Tip #1: Accentuate the background lighting. Holiday lights can make for a perfect background because Portrait Mode will turn these small lights into beautiful bokeh circles. Just make sure you also tap on the subject you’d like to be in focus.

Image showing two women smiling at the camera. They are in focus while the decorated tree in the background is blurry.

Tip #2: Distance is important, so get properly set up. The photographer should be close to the subject, and the subject should have some distance from the background. My best photos position the photographer within four feet of the subject and the subject more than six feet from the background. If you’re socially distancing while taking a picture of a friend or family member who’s not in the same household, try placing Pixel on a tripod with the timer enabled, so that you can compose the photo, press the shutter button and move away as the subject enters the frame.

Image showing a woman in a yellow dress standing in a dark room in front of a decorated tree. The woman is well-lit and in focus while the tree in the background is blurred.


Tip #3: If you’re taking a Portrait Mode selfie or photo of someone else, make sure their face has some soft and ambient lighting; otherwise, the photo may be backlit and too dark. Portrait Light in Google Photos can also help you adjust the lighting on your photos after you take them.

Animated GIF showing a Pixel phone using the photo editor to choose what area of a photo shot in low light will be well-lit and in focus. The editor chooses the face of a woman who's smiling in front of a lit up tree.

Tip #4: If you want to capture a close-up of an ornament or other holiday decorations, make sure Pixel is really close to the subject for a macro shot. If you compose the photo such that small lights are far in the background, they will turn into large and beautiful bokeh discs that capture the beauty of the holidays.

Image showing a macro shot of an ornament hanging in a tree. The lights are low and there is decorative lighting, but the ornament remains in focus while the background is blurry.

Tip #5: If your photo isn’t coming out perfect, don’t worry—there are a few things you can try. If you see lens reflections in the viewfinder, try to angle the camera differently so that they disappear. And make sure the lens is cleaned and fingerprint-free; using a clean microfiber cloth can fix shots that are coming out soft and hazy. Lastly, remember to experiment! If you’re not happy with the lighting on your subject, try moving the subject or lighting around to get a better result.

On behalf of #teampixel, I hope you enjoy the holidays safely and capture beautiful memories with Night Sight in Portrait Mode on your Pixel 4a (5G) or Pixel 5.

The latest features for Pixel owners are here



One of the best parts of Pixel is regular feature drops that make the phone better and better (and better). With the December update, even more Pixel owners will get to experience our most recent updates, along with a few new surprises.

The latest and greatest, now on more Pixels

Many of the new features launched with the Pixel 5 are now rolling out to Pixel 3 and newer devices. That includes Hold for Me, which saves you time when a business puts you on hold: Google Assistant waits on the line for you and lets you know when someone’s ready to talk (available for Pixel owners in the U.S. in English). We’ve found that when Hold for Me is enabled, it saves eight minutes per call on average. Also rolling out is Extreme Battery Saver, which lets your Pixel automatically limit some apps and run only the essentials so your battery lasts as long as possible.*

And now friends and families can share in the joy of watching the same video, cheer on live sports together and plan activities—even when they’re far apart. Duo screen sharing is already available in one-to-one calls, and it’s now coming to group calls as well, so long as you’re using Wi-Fi or a 5G connection.




Finally, we showed off a redesigned, more helpful editor in Google Photos with a new tab that gives you suggestions powered by machine learning that are tailored to the picture you’re editing. Now on Pixel, we’re rolling out new suggestions, including Dynamic, which enhances brightness, contrast and color, and a set of sky suggestions, which help you create stunning sunset and sunrise images in just one tap.

Adapting to you, for you

Google devices are most helpful when they seamlessly assist you throughout the day—wherever you are. We call this ambient computing, and it drives our approach to how Pixel should adapt to your needs in real time.

For example, Adaptive Sound improves the sound quality of your phone speaker based on your surroundings. It uses the microphone to assess the acoustics near you, then adjusts the sound equalizer settings in certain apps. Bringing your Pixel from the bedroom to the bathroom while getting ready in the morning? Your audio will sound great wherever you are.

Speaking of where you’re going, the GPS on Pixel 5 and Pixel 4a (5G) is now more accurate on foot than on previous generations. This means your rideshare service can find you more easily, and there’s no more guessing which side of the street you need to be on when you’re walking somewhere. (Requires an internet connection and Android 8.0 or later.)

Your Pixel can also now detect if you’re viewing a website or app in a different language and translate it using Google Lens. Just take a screenshot or swipe into App Overview, and tap the Lens chip to see the translation. For available Google Lens languages go to g.co/help/lens.

And for a little more help between charges, there are new context-aware battery features. Additional improvements to Adaptive Battery for Pixel 5 and Pixel 4a (5G) can automatically save even more power if a user is likely to miss their next charge, keeping the device powered even longer. Adaptive Charging helps preserve battery health over time by dynamically controlling how quickly a Pixel device charges. Just plug in your phone in the evening, set an alarm, and Adaptive Charging will work its magic.

And for Pixel 5 and Pixel 4a (5G) owners, our new Adaptive Connectivity feature helps you get the most out of your battery by automatically switching from 5G to 4G based on the app you’re using. It’ll choose 4G for things like browsing the web or sending texts, and switch to 5G when you’re watching movies or downloading large files. (Not available on all carriers or for all apps or features.)

Make your Pixel even more yours

Your phone should feel uniquely yours. Now you can personalize your home screen with new icons, grid views and app shapes, or even choose custom wallpapers of famous artworks provided by cultural institutions from around the world on Google Arts & Culture (wallpapers coming soon directly into the wallpaper categories in settings).

Plus, a special treat for Star Wars fans: Google, Disney and Lucasfilm worked together to launch “The Mandalorian” AR Experience, an augmented reality app available on Google Play for 5G Google Pixel devices and other select 5G Android phones. Now all Pixel 3 and newer devices can customize the home screen with original new Mandalorian wallpapers.




And something music lovers can appreciate: Your Pixel can already recognize songs that are playing around you if you enable Now Playing; all the tracks you hear are stored in your Now Playing History. Now you can select all the songs you heard while you were driving or watching TV and export them to a playlist in YouTube Music.





For those who use other Android devices, there are plenty of new things to get excited about: check out our blog post on everything new for Android phones.



* Battery life depends upon many factors, and usage of certain features will decrease battery life. Actual battery life may be lower. 



Posted by Harrison Lingren, Technical Program Manager

Improving urban GPS accuracy for your app

Posted by Frank van Diggelen, Principal Engineer and Jennifer Wang, Product Manager

At Android, we want to make it as easy as possible for developers to create the most helpful apps for their users. That’s why we aim to provide the best location experience with our APIs like the Fused Location Provider API (FLP). However, we’ve heard from many of you that the biggest location issue is inaccuracy in dense urban areas, such as wrong-side-of-the-street and even wrong-city-block errors.

This is particularly critical for the most-used location apps, such as rideshare and navigation. For instance, when users request a rideshare vehicle in a city, apps cannot easily locate them because of GPS errors.

The last great unsolved GPS problem

This wrong-side-of-the-street position error is caused by reflected GPS signals in cities, and we embarked on an ambitious project to help solve this long-standing problem in GPS. Our solution uses 3D mapping aided corrections, and is only feasible at scale because it combines 3D building models, raw GPS measurements, and machine learning.

The December Pixel Feature Drop adds 3D mapping aided GPS corrections to Pixel 5 and Pixel 4a (5G). With a system API that provides feedback to the Qualcomm® Snapdragon™ 5G Mobile Platform that powers Pixel, the accuracy in cities (or “urban canyons”) improves spectacularly.

Picture of a pedestrian test with a Pixel 5 phone, walking along one side of the street, then the other. Yellow = path followed, red = without 3D mapping aided corrections, blue = with 3D mapping aided corrections. Without the corrections, the GPS results frequently wander to the wrong side of the street (or even the wrong city block); with the corrections, the position is many times more accurate.

Why hasn’t this been solved before?

The problem is that GPS consistently locates you in the wrong place when you are in a city. This is because all GPS systems are based on line-of-sight operation from satellites. But in big cities, most or all signals reach you through non-line-of-sight reflections, because the direct signals are blocked by the buildings.

Diagram of the 3D mapping aided corrections module in Google Play services, with corrections feeding into the FLP API.   3D mapping aided corrections are also fed into the GNSS chip and software, which in turn provides GNSS measurements, position, and velocity back to the module.

The GPS chip assumes that the signal is line-of-sight and therefore introduces error when it calculates the excess path length that the signals traveled. The most common side effect is that your position appears on the wrong side of the street, although your position can also appear on the wrong city block, especially in very large cities with many skyscrapers.
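
To see where that error comes from, consider a toy 2D mirror-reflection calculation (hypothetical geometry, not Android's implementation): the receiver measures the reflected path length but treats it as a direct range.

```python
import math

def excess_path_length(satellite, receiver, wall_x):
    """Extra distance traveled by a signal reflecting off a vertical wall.

    satellite, receiver: (x, y) positions in meters (2D toy geometry).
    wall_x: x-coordinate of a vertical reflecting wall.
    The reflected path length equals the direct distance from the
    receiver to the satellite's mirror image across the wall.
    """
    sx, sy = satellite
    rx, ry = receiver
    direct = math.hypot(sx - rx, sy - ry)
    # Mirror the satellite across the wall to "unfold" the reflected path.
    mirrored_sx = 2 * wall_x - sx
    reflected = math.hypot(mirrored_sx - rx, sy - ry)
    return reflected - direct

# Satellite low behind the receiver, reflecting wall 20 m ahead:
err = excess_path_length(satellite=(-1000.0, 800.0),
                         receiver=(0.0, 0.0), wall_x=20.0)
# A few tens of meters of excess path -> tens of meters of range error.
```

If the direct signal is blocked, the chip only sees the longer reflected path, and the extra distance shows up directly as a pseudorange error of the same magnitude.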

There have been attempts to address this problem for more than a decade. But no solution existed at scale, until 3D mapping aided corrections were launched on Android.

How 3D mapping aided corrections work

The 3D mapping aided corrections module, in Google Play services, includes tiles of 3D building models that Google has for more than 3850 cities around the world. Google Play services 3D mapping aided corrections currently supports pedestrian use-cases only. When you use your device’s GPS while walking, Android’s Activity Recognition API will recognize that you are a pedestrian, and if you are in one of the 3850+ cities, tiles with 3D models will be downloaded and cached on the phone for that city. Cache size is approximately 20MB, which is about the same size as 6 photographs.

Inside the module, the 3D mapping aided corrections algorithms solve the chicken-and-egg problem, which is: if the GPS position is not in the right place, then how do you know which buildings are blocking or reflecting the signals? Having solved this problem, 3D mapping aided corrections provide a set of corrected positions to the FLP. A system API then provides this information to the GPS chip to help the chip improve the accuracy of the next GPS fix.
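That loop can be illustrated with a deliberately tiny one-dimensional toy (our own simplification, not Google's algorithm): start from the naive fix, ask the building model which signals would be reflected from there, correct those ranges, and re-solve:

```python
# Hypothetical 1D toy: two beacons at x = -100 and x = +100, receiver truly
# at x = 0. A building spanning x in [6, 10] blocks the eastern beacon, whose
# reflected signal arrives 16 m late, so the naive fix lands on the wrong
# side of the street. All constants are made up for illustration.

EXCESS_M = 16.0           # assumed excess path of the reflected signal
BUILDING_WEST_EDGE = 6.0  # assumed footprint from the 3D building model

def solve(r_west, r_east):
    """Naive 1D fix from two ranges, assuming both are line-of-sight."""
    return (r_west - r_east) / 2.0

def east_is_blocked(x):
    """Does the building model say the eastern beacon is NLOS from x?"""
    return x < BUILDING_WEST_EDGE

def corrected_fix(r_west, r_east, iterations=5):
    """Classify LOS/NLOS at the current estimate, correct, and re-solve."""
    x = solve(r_west, r_east)
    for _ in range(iterations):
        r_east_corr = r_east - EXCESS_M if east_is_blocked(x) else r_east
        x = solve(r_west, r_east_corr)
    return x
```

In this toy, the naive fix lands at x = -8 (the wrong side of the street), while the corrected loop converges back to the true position at x = 0; the real system plays the same game in three dimensions against full building models.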

With this December Pixel feature drop, we are releasing version 2 of 3D mapping aided corrections on Pixel 5 and Pixel 4a (5G). This reduces wrong-side-of-street occurrences by approximately 75%. Other Android phones, using Android 8 or later, have version 1 implemented in the FLP, which reduces wrong-side-of-street occurrences by approximately 50%. Version 2 will be available to the entire Android ecosystem (Android 8 or later) in early 2021.

Android’s 3D mapping aided corrections work with signals from the USA’s Global Positioning System (GPS) as well as other Global Navigation Satellite Systems (GNSSs): GLONASS, Galileo, BeiDou, and QZSS.

Our GPS chip partners shared the importance of this work for their technologies:

“Consumers rely on the accuracy of the positioning and navigation capabilities of their mobile phones. Location technology is at the heart of ensuring you find your favorite restaurant and you get your rideshare service in a timely manner. Qualcomm Technologies is leading the charge to improve consumer experiences with its newest Qualcomm® Location Suite technology featuring integration with Google's 3D mapping aided corrections. This collaboration with Google is an important milestone toward sidewalk-level location accuracy,” said Francesco Grilli, vice president of product management at Qualcomm Technologies, Inc.

“Broadcom has integrated Google's 3D mapping aided corrections into the navigation engine of the BCM47765 dual-frequency GNSS chip. The combination of dual frequency L1 and L5 signals plus 3D mapping aided corrections provides unprecedented accuracy in urban canyons. L5 plus Google’s corrections are a game-changer for GNSS use in cities,” said Charles Abraham, Senior Director of Engineering, Broadcom Inc.

“Google's 3D mapping aided corrections is a major advancement in personal location accuracy for smartphone users when walking in urban environments. MediaTek’s Dimensity 5G family enables 3D mapping aided corrections in addition to its highly accurate dual-band GNSS and industry-leading dead reckoning performance to give the most accurate global positioning ever for 5G smartphone users,” said Dr. Yenchi Lee, Deputy General Manager of MediaTek’s Wireless Communications Business Unit.

How to access 3D mapping aided corrections

Android’s 3D mapping aided corrections work automatically when GPS is used by a pedestrian in any of the 3850+ cities, on any phone that runs Android 8 or later. The best way for developers to take advantage of the improvement is to use the FLP to get location information. The additional 3D mapping aided corrections in the GPS chip are available on Pixel 5 and Pixel 4a (5G) today, and will roll out to the rest of the Android ecosystem (Android 8 or later) in the next several weeks. We will also soon support more modes, including driving.

Android’s 3D mapping aided corrections cover more than 3850 cities, including:

  • North America: All major cities in USA, Canada, Mexico.
  • Europe: All major cities. (100%, except Russia & Ukraine)
  • Asia: All major cities in Japan and Taiwan.
  • Rest of the world: All major cities in Brazil, Argentina, Australia, New Zealand, and South Africa.

As our Google Earth 3D models expand, so will 3D mapping aided corrections coverage.

Google Maps is also getting updates that will provide more street-level detail for pedestrians in select cities, such as sidewalks, crosswalks, and pedestrian islands. In 2021, you can get these updates for your app using the Google Maps Platform. Along with the improved location accuracy from 3D mapping aided corrections, we hope we can help developers like you better support use cases for the world’s 2B pedestrians who use Android.

Continuously making location better

In addition to 3D mapping aided corrections, we continue to work hard to make location as accurate and useful as possible. Below are the latest improvements to the Fused Location Provider API (FLP):

  • Developers wanted an easier way to retrieve the current location. With the new getCurrentLocation() API, developers can get the current location in a single request, rather than having to subscribe to ongoing location changes. By allowing developers to request location only when needed (and automatically timing out and closing open location requests), this new API also improves battery life. Check out our latest Kotlin sample.
  • Android 11's Data Access Auditing API provides more transparency into how your app and its dependencies access private data (like location) from users. With the new support for the API's attribution tags in the FusedLocationProviderClient, developers can more easily audit their apps’ location subscriptions in addition to regular location requests. Check out this Kotlin sample to learn more.



Qualcomm and Snapdragon are trademarks or registered trademarks of Qualcomm Incorporated.

Qualcomm Snapdragon and Qualcomm Location Suite are products of Qualcomm Technologies, Inc. and/or its subsidiaries.

The U.K.’s top nostalgic films: Access now on Pixel’s 5G

With so many countries now returning to various forms of lockdown, and winter steadily drawing in, many of us are turning to our favorite films and movie moments to find some familiarity in a time of uncertainty. 

In other words, we’re embracing nostalgia.

And why not? The movies we love are usually steeped in happy memories, attached to dreamy locations or feature music that temporarily transports us out of the present moment. They bring us joy and a sense of change, breaking up some of the monotony of life in lockdown.

We asked Dr. Wing Yee Cheung, a Senior Lecturer in Psychology at the University of Winchester, about this, and learned that films are a great way to relive memories of happier times. “Movies are embedded with sensory memories of when we first watched them and whom we watched them with,” she writes. “Sensory inputs and social interactions are two key triggers of nostalgia. Watching these can be a way to walk down memory lane and reminisce [about] the way life used to be, what we used to do, and the people surrounding us.”

And because it’s the season of giving, we have our own gift for you: If you’re in the U.K., you can download classic films, such as "Four Weddings and a Funeral" or "Monty Python’s Life of Brian," all from a unique Google Map, now until Dec. 10. Transport yourself to a world of nostalgia by searching the map for symbols that represent the films in relevant locations. If you find one, you’ll receive a code to rediscover and enjoy the movie on Google Play.*1

Image showing Four Weddings and a Funeral on a Pixel phone.

Anyone in the U.K. can take part, regardless of what type of phone you have—but of course if you do happen to own a new Pixel 5G-enabled device, you’ll be able to start your viewing party in a matter of seconds1. Thanks to movies on-demand combined with the technology of 5G networks2, you can choose your film, download1 it and settle in on the couch, all while the popcorn is still warm. Currently, 5G2 is one of the fastest ways to download a movie on any device. Both Pixel 5 and Pixel 4a with 5G2 enable you to download a film in seconds1. Whether you’re curled up on your sofa, pottering around the house, or outside on a walk, Pixel with 5G2 gives you access to the stories and characters you know and love, on the go; the speed of a 5G2 device immediately transports you to where you want to be.

So let’s lean into the nostalgia. As Dr. Cheung notes, it actually helps us cope with uncertainty: “Immersing ourselves in nostalgic moments is not about hiding our heads in the past. On the contrary, it can create new memories which can feed into future nostalgic experiences.”

It’s a great way to spend lockdown with your family: Watching much-loved classics is a natural way for parents to share their experiences with their children and to make new memories together. And even if you’re physically on your own, you can use Google Duo on Pixel 5 to share your screen and watch your favorites with socially distant family and friends.3

“An old movie that makes us feel nostalgia can inject us with a complex range of emotions,” concludes Dr. Cheung. “We feel sentimental, predominantly happy, but with a tinge of longing.” And that’s something we can probably all relate to right now. 


*Offer begins on 25th November 2020 and ends 10th December 2020. Limited number of codes available. Subject to availability. Terms apply. See here for full terms. 

1.  Testing based on download speeds for content file sizes between 449MB and 749MB at off-peak times. Average download time was twenty seconds or less. Download speed depends upon many factors, such as file size, content provider and carrier network connection and capabilities. Testing conducted by Google on pre-production hardware in the UK in August 2020. Actual download speeds may be slower.  

2. Requires a 5G data plan (sold separately). 5G service and roaming not available on all carrier networks or in all areas and may vary by country. Contact carrier for details about current 5G network performance, compatibility, and availability. Phone connects to 5G networks, but 5G service, speed and performance depend on many factors including, but not limited to, carrier network capabilities, device configuration and capabilities, network traffic, location, signal strength and signal obstruction. Actual results may vary. Some features not available in all areas. Data rates may apply. See g.co/pixel/networkinfo for info.

3. Requires a Google Duo account. Screen sharing not available on group calls. Requires Wi-Fi or 5G internet connection. Not available on all apps and content. Data rates may apply. 5G service, speed and performance depend on many factors including, but not limited to, carrier network capabilities, device configuration and capabilities, network traffic, location, signal strength, and signal obstruction.

*Promotional code offer is provided by Google Commerce Limited (Google) for use on Google Play Store UK only, and subject to the following terms. Offer begins on 25th November 2020 and ends 10th December 2020 (‘Offer Period’). One (1) promotional code per user per film release, and up to a maximum of five (5) promotional codes per User during the Offer Period. Limited number of codes available. Subject to availability.

Available only to Users 18 or older with a delivery and billing address in the United Kingdom. Users must have internet access and must have or add a form of payment at checkout. Promotional codes cannot be used with Guest checkout; Users must be signed in to their Google account to redeem the code. 

Promotional codes can be redeemed by visiting play.google.com/redeem or the Google Play Store app and entering the 16 digit code to receive a £5 or £10 discount for purchase or rental of any product on the Google Play Store UK. The discount will be applied at checkout. Promotional code must be redeemed by 31st December, 2021 or it will expire. Promotional codes may only be used once and may not be used in conjunction with any other offer or promotion. Any unused promotional balance will be applied to the associated Google account. Users may continue to use the unused promotional balance for Google Play purchases until such balance is £0, or any remaining promotional balance expires. Promotional codes are a discount off price for up to the promotional amount, are for one-time use only, cannot be transferred to other users, are not reloadable, cannot be exchanged for cash. Google and its third party partners if applicable, are not liable for lost or stolen promotional codes, or for expired promotional codes that are not redeemed within the redemption period. Terms subject to applicable laws. Void where prohibited.


“The Mandalorian” in AR? This is the way.

In a galaxy far, far away, the Mandalorian and the Child continue their journey, facing enemies and rallying allies in the tumultuous era after the collapse of the Galactic Empire. But you don’t need a tracking fob to explore the world of the hit STAR WARS streaming series. Google and Lucasfilm have teamed up to bring iconic moments from the first season of “The Mandalorian” to life with “The Mandalorian” AR Experience (available on the Play Store for 5G Google Pixels and other select 5G Android phones) as fans follow the show’s second season. (Check your phone to see if it meets app requirements.)

Animated GIF showing a person's hand holding a Pixel phone while using the Mandalorian AR app.

From dinosaurs to astronauts, Google has been bringing objects and creatures to life with augmented reality. Now, people using compatible Android 5G devices can interact with heroes from the Mandalorian in their own space.

“The Mandalorian” AR Experience puts you in the shoes of a bounty hunter following the trail of Mando himself, Din Djarin and the Child. Explore the world of “The Mandalorian,” interact with characters in augmented reality and capture your very own scenes to share with friends.

To create this original experience, Google, Disney and Lucasfilm worked together to imagine a next-generation augmented reality app optimized for 5G devices. Our teams collaborated to build hyper-detailed models and life-like animations—all while packing scenes with fun surprises.

Using ARCore, Google’s developer platform for building augmented reality experiences, we created scenes that interact with your environment and respond to your surroundings. You can discover and unlock even more effects based on your actions. And thanks to the new ARCore Depth API, we also enabled occlusion, allowing 3D scenes to blend more naturally with our world.
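The idea behind depth-based occlusion can be sketched in a few lines (a conceptual illustration only, not ARCore's implementation): for each pixel, virtual content is drawn only where it is closer to the camera than the real surface reported by the depth map.

```python
def composite_row(camera_row, virtual_row, real_depth, virtual_depth):
    """Per-pixel occlusion test for one row of the image.

    `virtual_depth` entries are None where no virtual content covers the
    pixel; smaller depth values are closer to the camera.
    """
    out = []
    for cam, virt, d_real, d_virt in zip(camera_row, virtual_row,
                                         real_depth, virtual_depth):
        # Draw the virtual object only where it sits in front of the real scene.
        out.append(virt if d_virt is not None and d_virt < d_real else cam)
    return out

# Toy frame: the character (2 m away) shows in front of a far wall (4 m)
# but is hidden behind a nearer kitchen counter (1 m).
row = composite_row(["wall", "counter"], ["mando", "mando"],
                    [4.0, 1.0], [2.0, 2.0])
```

This per-pixel depth comparison is what lets a 3D character appear to walk behind real furniture instead of floating on top of it.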

Animated GIF showing the character the Mandalorian in AR standing in someone's kitchen on the screen of a Pixel phone.

New content will keep rolling out in the app each week on Mando Mondays, so stay tuned—and Pixel owners should keep an eye out for additional exclusive content outside of the app as well.

Lucasfilm, the Lucasfilm logo, STAR WARS and related properties are trademarks and/or copyrights, in the United States and other countries, of Lucasfilm Ltd. and/or its affiliates. © & ™ 2020 Lucasfilm Ltd. All rights reserved.

Source: Android