
From the seas, to more ZZZs: Your new Pixel features

The best part of your Pixel is that it keeps getting even more helpful, and even more unique. With regular updates, Pixels get smarter, more capable and more fun. This latest drop is no exception, and, for Pixel 3 and newer devices, includes the ability to easily access and share audio recordings, a new way to use the Pixel Camera app underwater and new wallpapers to celebrate International Women's Day. 

A more shareable Recorder 
Whether you're capturing that guitar riff you've been working on or reviewing transcripts from a class lecture, Recorder makes it easy for Pixel owners to record, transcribe (English only) and search the audio moments that matter. Now you can share links to your Recorder audio files, so anyone can listen, even if they don’t have a Pixel. At recorder.google.com, you can hear recordings, see transcripts and even search through files — you get the entire Recorder playback experience in one shareable link. 
You can also back up recordings to your Google Account to help keep them safe, and easily access them from any device. See more at g.co/pixel/recorder.

Capture the seas with Kraken Sports 
Now Pixel users can capture the same kinds of high-quality images they’re accustomed to above water, and do it underwater without the cumbersome cameras and cases scuba divers have traditionally used. Pixel camera software engineer José Ricardo Lima was scuba diving with his husband in the Philippines when he wondered what it would be like to use his Pixel camera underwater. His idea for a custom integration that combined Pixel’s camera with a case safe for diving led to a collaboration between Pixel and Kraken Sports. Now divers can use their Pixel camera with Kraken Sports’ Universal Smart Phone Housing (sold separately) to capture marine life and seascapes. Get access to your Pixel’s camera features, including Night Sight, Portrait Mode, Motion Photos and video, directly through Pixel’s Camera app for high-quality images of you and your underwater friends. See g.co/pixel/dive-case-connector-setup for more information. 
Photo captured on Pixel 5 using KRH03 Kraken Sports Universal Smart Phone Housing. Kraken Sports is a registered trademark of Kraken Sports Ontario, Canada. 

Attention-grabbing graphics 
Part of Pixel’s latest drop also includes new wallpapers that celebrate different cultural moments throughout the year with artwork from artists around the world. And for International Women’s Day on March 8, Pixel will add new wallpapers illustrated by Spanish duo Cachetejack, which focus on the strength and transformation of women. 
Adapting to you and your routine
Smart Compose uses machine learning to help you complete your sentences as you type, so sending and replying to messages is easier than ever, and it’s now available for select messaging apps on your Pixel. Smart Compose suggests common phrases to help you cut back on repetitive typing and potential typos. Smart Compose is currently available in the U.S. in English only; see more at g.co/gboardsuggestions.
Your Pixel can also help you catch more ZZZs with a more seamless bedtime schedule on your Pixel Stand. When you use the bedtime features in Clock with your Pixel Stand, you’ll see a new, updated bedtime screen, along with redesigned notifications to help you ease into sleep. This feature is available on Pixel phones with wireless charging capability: Pixel 3, Pixel 3 XL, Pixel 4, Pixel 4 XL and Pixel 5. Pixel Stand is sold separately. 
For more information on the new features that just dropped and to see phone compatibility, head to http://g.co/pixel/updates. And if you’re looking for more helpfulness across your device, check out all of the latest updates announced from Android.

Pixel 5G devices can now access 5G in dual SIM mode 
Software updates also mean that Pixel 4a with 5G and Pixel 5 devices will now be able to access 5G even when in dual SIM mode (eSIM+physical SIM DSDS).


And as a bonus, we recently announced a new Google Fit feature that allows you to measure your heart rate and respiratory rate using just your phone’s camera. This feature will roll out to Pixel owners next week (and is not intended for medical purposes). 



1. Works with Pixel 2 or newer phones. Requires Android R, Camera Update 8.1 (Nov. 2020), Dive Case Connector app for Google Camera, KRH04 or KRH03 Kraken Sports Universal Smart Phone Housing (sold separately). See g.co/pixel/dive-case-connector-setup for more information on setup. Google is not responsible for the operation of Kraken Sports products or their compliance with any applicable safety or other requirements. Photo captured on Pixel 5 using KRH03 Kraken Sports Universal Smart Phone Housing. Kraken Sports is a registered trademark of Kraken Sports Ontario, Canada. 
2. Transcription is available in English only. Recorder sharing requires an Internet connection and a Google Account. 
3. Cloud storage requires an Internet connection and a Google Account. 
4. Your Pixel will receive feature drops during the applicable Android update and support periods for the phone. See g.co/pixel/updates for details.
5. Requires a 5G data plan (sold separately). 5G service not available on all carrier networks or in all areas. Contact carrier for details. 5G service, speed and performance depend on many factors including, but not limited to, carrier network capabilities, device configuration and capabilities, network traffic, location, signal strength and signal obstruction. Actual results may vary. Some features are not available in all areas. Data rates may apply. See g.co/pixel/networkinfo for info. 
 


Posted by: Shenaz Zack, Group Product Manager  





Pixel artists take time to refresh, reflect and create

For Tim Kellner, a nomadic photographer and filmmaker in the program, the COVID-19 pandemic led to him taking a step back from his art. “Quarantine gave me time to think more deeply about the types of things I wanted to create,” he says. “I was surprised after that break to feel a drive to create again that I hadn't felt for a few years.”

Tim is one of Google Pixel’s Creator Labs artists who’s been exploring the side effects of spending more time alone. Creator Labs is an incubator for emerging photographers, directors and YouTubers that launched last winter, pre-pandemic. All nine of the program’s recurring artists pivoted to working virtually with us this past summer.  

Image showing a person standing in a desert at dusk; the sky is dark blue. A small pink light shines around him while he holds a lit-up object in his hand.

Tim Kellner

Armed with the Pixel 5 and their imaginations, the artists set out to create work grounded in social impact and cultural narrative (captured in a COVID-safe way, of course). 

One theme all of the current Creator Labs artists are embracing is the idea of space. Los Angeles-based Creator Labs veteran Glassface has been exploring isolation and mental health throughout his tenure in the program. “We’re all going through a mass shared traumatic experience right now. It feels like a really necessary time for meaningful art. I’ve been able to hone in on the art and music I want to be making, and I’ve been reminded of why I create in the first place,” Glassface says. “I think art can be a guiding light during difficult times like right now, and that’s informed and inspired my approach heavily. I’m taking a lot more risks and only putting my energy into the creative projects that mean the most to me.”
Image showing a person sitting close to the camera, looking up at the sky and at a white house.

Glassface

New York-based program newcomer Andre Wagner, like Tim, decided to turn the camera on himself. “I’m always making self portraits, but something about this time in particular led me to putting more focus on myself as the subject matter. There have definitely been surprises, and for me that’s needed because it helps sustain the effort.”

A black and white image showing a person sitting on a bench in between two trees. The person is sitting on top of the back of the bench looking up at the trees.

Andre Wagner

Other self portraits celebrated the artists’ heritage, including Los Angeles-based photographers June Canedo and Andrew Thomas Huang. June photographed herself wearing an embroidered handkerchief, representing her family’s history of domestic work, while Andrew’s photos pay homage to the Chinese Zodiac—with a Sci-Fi twist. 

Image showing a person with their back to the camera. There's a kerchief in their hair.

June Canedo

Creator Labs also includes artists Mayan Toledano, Kennedi Carter, Natalia Mantini and Anthony Prince Leslie. You can find their work on Pixel’s Instagram page.

Image showing a person wearing an ornate blue and green suit against a blue green background. They're wearing an intricate mask and holding up their hand, which is painted blue.

Andrew Thomas Huang

The work of our Creator Labs artists is a reminder for all of us that isolation can have a silver lining—in this case, giving us more space to think, reflect, refresh and create. 

Portrait Light: Enhancing Portrait Lighting with Machine Learning

Professional portrait photographers are able to create compelling photographs by using specialized equipment, such as off-camera flashes and reflectors, and expert knowledge to capture just the right illumination of their subjects. In order to allow users to better emulate professional-looking portraits, we recently released Portrait Light, a new post-capture feature for the Pixel Camera and Google Photos apps that adds a simulated directional light source to portraits, with the directionality and intensity set to complement the lighting from the original photograph.

Example image with and without Portrait Light applied. Note how Portrait Light contours the face, adding dimensionality, volume, and visual interest.

In the Pixel Camera on Pixel 4, Pixel 4a, Pixel 4a (5G), and Pixel 5, Portrait Light is automatically applied post-capture to images in the default mode and to Night Sight photos that include people — just one person or even a small group. In Portrait Mode photographs, Portrait Light provides more dramatic lighting to accompany the shallow depth-of-field effect already applied, resulting in a studio-quality look. But because lighting can be a personal choice, Pixel users who shoot in Portrait Mode can manually re-position and adjust the brightness of the applied lighting within Google Photos to match their preference. For those running Google Photos on Pixel 2 or newer, this relighting capability is also available for many pre-existing portrait photographs.

Pixel users can adjust a portrait’s lighting as they like in Google Photos, after capture.

Today we present the technology behind Portrait Light. Inspired by the off-camera lights used by portrait photographers, Portrait Light models a repositionable light source that can be added into the scene, with the initial lighting direction and intensity automatically selected to complement the existing lighting in the photo. We accomplish this by leveraging novel machine learning models, each trained using a diverse dataset of photographs captured in the Light Stage computational illumination system. These models enabled two new algorithmic capabilities:

  1. Automatic directional light placement: For a given portrait, the algorithm places a synthetic directional light in the scene consistent with how a photographer would have placed an off-camera light source in the real world.
  2. Synthetic post-capture relighting: For a given lighting direction and portrait, synthetic light is added in a way that looks realistic and natural.

These innovations enable Portrait Light to help create attractive lighting at any moment for every portrait — all on your mobile device.

Automatic Light Placement
Photographers usually rely on perceptual cues when deciding how to augment environmental illumination with off-camera light sources. They assess the intensity and directionality of the light falling on the face, and also adjust their subject’s head pose to complement it. To inform Portrait Light’s automatic light placement, we developed computational equivalents to these two perceptual signals.

First, we trained a novel machine learning model to estimate a high dynamic range, omnidirectional illumination profile for a scene based on an input portrait. This new lighting estimation model infers the direction, relative intensity, and color of all light sources in the scene coming from all directions, considering the face as a light probe. We also estimate the head pose of the portrait’s subject using MediaPipe Face Mesh.

Estimating the high dynamic range, omnidirectional illumination profile from an input portrait. The three spheres at the right of each image, diffuse (top), matte silver (middle), and mirror (bottom), are rendered using the estimated illumination, each reflecting the color, intensity, and directionality of the environmental lighting.

Using these clues, we determine the direction from which the synthetic lighting should originate. In studio portrait photography, the main off-camera light source, or key light, is placed about 30° above the eyeline and between 30° and 60° off the camera axis, when looking overhead at the scene. We follow this guideline for a classic portrait look, enhancing any pre-existing lighting directionality in the scene while targeting a balanced, subtle key-to-fill lighting ratio of about 2:1.
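As a rough geometric sketch of that placement guideline, the key light's direction can be written as a unit vector from its elevation and azimuth angles. The coordinate conventions below are our own illustrative assumptions, not the actual pipeline's:

```python
import numpy as np

def key_light_direction(elevation_deg=30.0, azimuth_deg=45.0):
    """Unit vector toward a classic portrait key light.

    Assumed (illustrative) axes: the camera looks down -z, +y is up,
    +x is horizontal. Defaults follow the studio guideline quoted
    above: about 30 degrees above the eyeline and 30-60 degrees off
    the camera axis.
    """
    el = np.radians(elevation_deg)
    az = np.radians(azimuth_deg)
    d = np.array([
        np.cos(el) * np.sin(az),   # horizontal offset from the camera axis
        np.sin(el),                # height above the eyeline
        np.cos(el) * np.cos(az),   # component along the camera axis
    ])
    return d / np.linalg.norm(d)
```

With the defaults (30° elevation, 45° azimuth), this lands squarely inside the classic key-light zone described above.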

Data-Driven Portrait Relighting
Given a desired lighting direction and portrait, we next trained a new machine learning model to add the illumination from a directional light source to the original photograph. Training the model required millions of pairs of portraits both with and without extra light. Photographing such a dataset in normal settings would have been impossible because it requires near-perfect registration of portraits captured across different lighting conditions.

Instead, we generated training data by photographing seventy different people using the Light Stage computational illumination system. This spherical lighting rig includes 64 cameras with different viewpoints and 331 individually-programmable LED light sources. We photographed each individual illuminated one-light-at-a-time (OLAT) by each light, which generates their reflectance field — or their appearance as illuminated by the discrete sections of the spherical environment. The reflectance field encodes the unique color and light-reflecting properties of the subject’s skin, hair, and clothing — how shiny or dull each material appears. Due to the superposition principle for light, these OLAT images can then be linearly added together to render realistic images of the subject as they would appear in any image-based lighting environment, with complex light transport phenomena like subsurface scattering correctly represented.

Using the Light Stage, we photographed many individuals with different face shapes, genders, skin tones, hairstyles, and clothing/accessories. For each person, we generated synthetic portraits in many different lighting environments, both with and without the added directional light, rendering millions of pairs of images. This dataset encouraged model performance across diverse lighting environments and individuals.

Photographing an individual as illuminated one-light-at-a-time in the Google Light Stage, a 360° computational illumination rig.
Left: Example images from an individual’s photographed reflectance field, their appearance in the Light Stage as illuminated one-light-at-a-time. Right: The images can be added together to form the appearance of the subject in any novel lighting environment.

Learning Detail-Preserving Relighting Using the Quotient Image
Rather than trying to directly predict the output relit image, we trained the relighting model to output a low-resolution quotient image, i.e., a per-pixel multiplier that when upsampled can be applied to the original input image to produce the desired output image with the contribution of the extra light source added. This technique is computationally efficient and encourages only low-frequency lighting changes, without impacting high-frequency image details, which are directly transferred from the input to maintain image quality.
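A minimal sketch of applying a quotient image, assuming nearest-neighbor upsampling for simplicity (the real pipeline presumably uses a smoother upsampling filter):

```python
import numpy as np

def apply_quotient_image(input_image, quotient_lowres):
    """Apply a low-resolution per-pixel multiplier (quotient image)
    to a full-resolution portrait.

    input_image:     (H, W, 3) float array.
    quotient_lowres: (h, w, 3) float array, with H divisible by h and
                     W divisible by w for this toy upsampler.
    """
    H, W, _ = input_image.shape
    h, w, _ = quotient_lowres.shape
    # Nearest-neighbor upsample via np.repeat; the multiplier can only
    # express low-frequency lighting changes at this resolution.
    q = np.repeat(np.repeat(quotient_lowres, H // h, axis=0), W // w, axis=1)
    return input_image * q
```

Because the multiplier is predicted at low resolution, it only encodes low-frequency lighting changes; the input image's high-frequency detail passes through untouched.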

Supervising Relighting with Geometry Estimation
When photographers add an extra light source into a scene, its orientation relative to the subject’s facial geometry determines how much brighter each part of the face appears. To model the optical behavior of light sources reflecting off relatively matte surfaces, we first trained a machine learning model to estimate surface normals given the input photograph, and then applied Lambert’s law to compute a “light visibility map” for the desired lighting direction. We provided this light visibility map as input to the quotient image predictor, ensuring that the model is trained using physics-based insights.
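The light visibility map itself is just Lambert's cosine law applied per pixel. A minimal sketch, assuming unit-length normals and light direction:

```python
import numpy as np

def light_visibility_map(normals, light_dir):
    """Lambertian light visibility map: max(0, n . l) per pixel.

    normals:   (H, W, 3) unit surface normals estimated from the portrait.
    light_dir: (3,) unit vector from the surface toward the light.
    """
    ndotl = np.einsum('hwc,c->hw', normals, light_dir)
    return np.clip(ndotl, 0.0, None)  # surfaces facing away receive 0
```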

The pipeline of our relighting network. Given an input portrait, we estimate per-pixel surface normals, which we then use to compute a light visibility map. The model is trained to produce a low-resolution quotient image that, when upsampled and applied as a multiplier to the original image, produces the original portrait with an extra light source added synthetically into the scene.

We optimized the full pipeline to run at interactive frame-rates on mobile devices, with total model size under 10 MB. Here are a few examples of Portrait Light in action.

Portrait Light in action.

Getting the Most Out of Portrait Light
You can try Portrait Light in the Pixel Camera and change the light position and brightness to your liking in Google Photos. For those who use Dual Exposure Controls, Portrait Light can be applied post-capture for additional creative flexibility to find just the right balance between light and shadow. On existing images from your Google Photos library, try it on photos where faces are slightly underexposed; Portrait Light can illuminate and highlight your subject. It will especially benefit images with a single individual posed directly at the camera.

We see Portrait Light as the first step on the journey towards creative post-capture lighting controls for mobile cameras, powered by machine learning.

Acknowledgements
Portrait Light is the result of a collaboration between Google Research, Google Daydream, Pixel, and Google Photos teams. Key contributors include: Yun-Ta Tsai, Rohit Pandey, Sean Fanello, Chloe LeGendre, Michael Milne, Ryan Geiss, Sam Hasinoff, Dillon Sharlet, Christoph Rhemann, Peter Denny, Kaiwen Guo, Philip Davidson, Jonathan Taylor, Mingsong Dou, Pavel Pidlypenskyi, Peter Lincoln, Jay Busch, Matt Whalen, Jason Dourgarian, Geoff Harvey, Cynthia Herrera, Sergio Orts Escolano, Paul Debevec, Jonathan Barron, Sofien Bouaziz, Clement Ng, Rachit Gupta, Jesse Evans, Ryan Campbell, Sonya Mollinger, Emily To, Yichang Shih, Jana Ehmann, Wan-Chun Alex Ma, Christina Tong, Tim Smith, Tim Ruddick, Bill Strathearn, Jose Lima, Chia-Kai Liang, David Salesin, Shahram Izadi, Navin Sarma, Nisha Masharani, Zachary Senzer.


1  Work conducted while at Google. 

Source: Google AI Blog


Take holiday photos with Night Sight in Portrait Mode

It’s officially the holiday season, which means I can finally decorate my house—so there are lights everywhere. Usually, I’d have friends and family over to see my setup, but this year I’ll be celebrating with just my household. Instead of gathering in person, my friends, family and I are sending each other digital holiday cards, and I’ll be using Night Sight in Portrait Mode on my new Pixel 5 to get the perfect photo.

Night Sight in Portrait Mode is a new feature only on Pixel 4a (5G) and Pixel 5, and it lets you capture beautiful low-light images with sharp subjects and artistically blurred backgrounds. While Night Sight in Portrait Mode takes incredible photos year-round, it’s perfect for capturing a selfie or photo in front of holiday lights—whether those are on your house, a tree or from a menorah or kinara’s candlelight.

Image showing the author with his mother in front of a decorated tree in a dark room. The subjects are well-lit and in focus while the background is blurred.

Night Sight in Portrait Mode was designed to create professional quality low-light portraits with the tap of a button. Night Sight automatically engages in Portrait Mode when it’s dark enough, and, when you press the shutter button, Pixel’s new exposure bracketing technology will capture, align and merge up to 15 photos to improve low-light detail. To produce bright and vibrant portraits, Portrait Light was integrated directly into Pixel Camera to automatically enhance the lighting on people, and, in really dark scenes, Night Sight in Portrait Mode will autofocus using machine learning to keep your subjects sharp. After predicting the depth of the photo, Pixel will blur the background to create the beautiful bokeh that we love in professional portraits. 

Here are a few tips and tricks to help you nail the perfect holiday shot using Night Sight in Portrait Mode on your Pixel 4a (5G) or Pixel 5:

Tip #1: Accentuate the background lighting. Holiday lights can make for a perfect background because Portrait Mode will turn these small lights into beautiful bokeh circles. Just make sure you also tap on the subject you’d like to be in focus.

Image showing two women smiling at the camera. They are in focus while the decorated tree in the background is blurry.

Tip #2: Distance is important, so get properly set up. The photographer should be close to the subject, and the subject should have some distance from the background. My best photos position the photographer within four feet of the subject and the subject more than six feet from the background. If you’re socially distancing while taking a picture of a friend or family member who’s not in the same household, try placing Pixel on a tripod with the timer enabled, so that you can compose the photo, press the shutter button and move away as the subject enters the frame.

Image showing a woman in a yellow dress standing in a dark room in front of a decorated tree. The woman is well-lit and in focus while the tree in the background is blurred.


Tip #3: If you’re taking a Portrait Mode selfie or photo of someone else, make sure their face has some soft and ambient lighting; otherwise, the photo may be backlit and too dark. Portrait Light in Google Photos can also help you adjust the lighting on your photos after you take them.

Animated GIF showing a Pixel phone using the photo editor to choose what area of a photo shot in low light will be well-lit and in focus. The editor chooses the face of a woman who's smiling in front of a lit up tree.

Tip #4: If you want to capture a close-up of an ornament or other holiday decorations, make sure Pixel is really close to the subject for a macro shot. If you compose the photo such that small lights are far in the background, they will turn into large and beautiful bokeh discs that capture the beauty of the holidays.

Image showing a macro shot of an ornament hanging in a tree. The lights are low and there is decorative lighting, but the ornament remains in focus while the background is blurry.

Tip #5: If your photo isn’t coming out perfect, don’t worry—there are a few things you can try. If you see lens reflections in the viewfinder, try to angle the camera differently so that they disappear. And make sure the lens is cleaned and fingerprint-free; using a clean microfiber cloth can fix shots that are coming out soft and hazy. Lastly, remember to experiment! If you’re not happy with the lighting on your subject, try moving the subject or lighting around to get a better result.

On behalf of #teampixel, I hope you enjoy the holidays safely and capture beautiful memories with Night Sight in Portrait Mode on your Pixel 4a (5G) or Pixel 5.

The latest features for Pixel owners are here



One of the best parts of Pixel is regular feature drops that make the phone better and better (and better). With the December update, even more Pixel owners will get to experience our most recent updates, along with a few new surprises.

The latest and greatest, now on more Pixels

Many of the new features launched with the Pixel 5 are now rolling out to Pixel 3 and newer devices. That includes Extreme Battery Saver: when it’s turned on, your Pixel automatically limits some apps and runs only the essentials so your battery lasts as long as possible.*

And now friends and families can share in the joy of watching the same video, cheering on live sports and planning activities together, even when they’re far apart. Duo screen sharing is already available in one-to-one calls, and it's now coming to group calls, too, as long as you’re using Wi-Fi or a 5G connection.




Finally, we showed off a redesigned, more helpful editor in Google Photos with a new tab that gives you suggestions powered by machine learning that are tailored to the picture you’re editing. Now on Pixel, we’re rolling out new suggestions, including Dynamic, which enhances brightness, contrast and color, and a set of sky suggestions, which help you create stunning sunset and sunrise images in just one tap.

Adapting to you, for you

Google devices are most helpful when they seamlessly assist you throughout the day—wherever you are. We call this ambient computing, and it drives our approach to how Pixel should adapt to your needs in real time.

For example, Adaptive Sound improves the sound quality of your phone speaker based on your surroundings. It uses the microphone to assess the acoustics near you, then adjusts the sound equalizer settings in certain apps. Bringing your Pixel from the bedroom to the bathroom while getting ready in the morning? Your audio will sound great wherever you are.

Speaking of where you’re going, the GPS on Pixel 5 and Pixel 4a (5G) is now more accurate when you’re on foot than it was on previous generations. This means your rideshare service can find you more easily, and there’s no more guessing which side of the street you need to be on when you’re walking somewhere. (Requires an internet connection and Android 8.0 or later.)

Your Pixel can also now detect if you’re viewing a website or app in a different language and translate it using Google Lens. Just take a screenshot or swipe into App Overview, and tap the Lens chip to see the translation. For available Google Lens languages go to g.co/help/lens.

And for a little more help between charges, there are new context-aware battery features. Additional improvements to Adaptive Battery for Pixel 5 and Pixel 4a (5G) can automatically save even more power if you're likely to miss your next charge, keeping the device powered even longer. Adaptive Charging helps preserve battery health over time by dynamically controlling how quickly a Pixel device charges. Just plug in your phone in the evening, set an alarm, and Adaptive Charging will work its magic.

And for Pixel 5 and Pixel 4a (5G) owners, our new Adaptive Connectivity feature helps you get the most out of your battery by automatically switching from 5G to 4G based on the app you’re using. It’ll choose 4G for things like browsing the web or sending texts, and switch to 5G when you’re watching movies or downloading large files. (Not available on all carriers or for all apps or features.)

Make your Pixel even more yours

Your phone should feel uniquely yours. Now you can personalize your home screen with new icons, grid views and app shapes, or even choose custom wallpapers of famous artworks provided by cultural institutions from around the world on Google Arts & Culture (wallpapers coming soon directly into the wallpaper categories in settings).

Plus, a special treat for Star Wars fans: Google, Disney and Lucasfilm worked together to launch “The Mandalorian” AR Experience, an augmented reality app available on Google Play for 5G Google Pixel devices and other select 5G Android phones. Now all Pixel 3 and newer devices can customize the home screen with original new Mandalorian wallpapers.




And something music lovers can appreciate: Your Pixel can already recognize songs that are playing around you if you enable Now Playing; all the tracks you hear are stored in your Now Playing History. Now you can select all the songs you heard while you were driving or watching TV and export them to a playlist in YouTube Music.





For those who use other Android devices, there are plenty of new things to get excited about: check out our blog post on everything new for Android phones.



* Battery life depends upon many factors, and usage of certain features will decrease battery life. Actual battery life may be lower. 



Posted by Harrison Lingren, Technical Program Manager



Improving urban GPS accuracy for your app

Posted by Frank van Diggelen, Principal Engineer and Jennifer Wang, Product Manager

At Android, we want to make it as easy as possible for developers to create the most helpful apps for their users. That’s why we aim to provide the best location experience with our APIs like the Fused Location Provider API (FLP). However, we’ve heard from many of you that the biggest location issue is inaccuracy in dense urban areas, such as wrong-side-of-the-street and even wrong-city-block errors.

This is particularly critical for the most used location apps, such as rideshare and navigation. For instance, when users request a rideshare vehicle in a city, apps cannot easily locate them because of the GPS errors.

The last great unsolved GPS problem

This wrong-side-of-the-street position error is caused by GPS signals reflecting off buildings in cities, and we embarked on an ambitious project to help solve it. Our solution uses 3D mapping aided corrections, and it is feasible at scale because it combines 3D building models, raw GPS measurements, and machine learning.

The December Pixel Feature Drop adds 3D mapping aided GPS corrections to Pixel 5 and Pixel 4a (5G). With a system API that provides feedback to the Qualcomm® Snapdragon™ 5G Mobile Platform that powers Pixel, the accuracy in cities (or “urban canyons”) improves spectacularly.

Picture of a pedestrian test, with Pixel 5 phone, walking along one side of the street, then the other. Yellow = Path followed, Red = without 3D mapping aided corrections, Blue = with 3D mapping aided corrections.  The picture shows that without 3D mapping aided corrections, the GPS results frequently wander to the wrong side of the street (or even the wrong city block), whereas, with 3D mapping aided corrections, the position is many times more accurate.


Why hasn’t this been solved before?

The problem is that in a city, GPS consistently locates you in the wrong place. All GPS systems are based on line-of-sight operation from satellites, but in big cities most or all signals reach you through non-line-of-sight reflections, because the direct signals are blocked by the buildings.

Diagram of the 3D mapping aided corrections module in Google Play services, with corrections feeding into the FLP API.   3D mapping aided corrections are also fed into the GNSS chip and software, which in turn provides GNSS measurements, position, and velocity back to the module.

The GPS chip assumes that every signal is line-of-sight, so the excess path length that a reflected signal travels is misinterpreted as extra distance to the satellite, which introduces error into the computed position. The most common side effect is that your position appears on the wrong side of the street, although it can also appear on the wrong city block, especially in very large cities with many skyscrapers.
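A back-of-the-envelope model shows why this lands you across the street. Assume, purely for illustration, that the direct signal is blocked and only a reflection off a vertical facade reaches the receiver; mirroring the receiver across the facade gives the extra distance the reflected signal travels:

```kotlin
import kotlin.math.cos

// Toy geometry, not the chip's actual model: the direct signal is blocked
// and only the reflection off a vertical facade `wallDistanceM` meters away
// (with the satellite's azimuth perpendicular to the facade) reaches the
// receiver. Mirroring the receiver across the facade shows the reflected
// path is longer than the direct one by about 2 * d * cos(elevation).
fun excessPathLengthM(wallDistanceM: Double, elevationRad: Double): Double =
    2.0 * wallDistanceM * cos(elevationRad)
```

For a facade 10 m away and a satellite 30° above the horizon, that is roughly 17 m of extra range; a chip that treats the measurement as line-of-sight folds those extra meters into the position solution, easily pushing the fix across a typical street.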

There have been attempts to address this problem for more than a decade. But no solution existed at scale, until 3D mapping aided corrections were launched on Android.

How 3D mapping aided corrections work

The 3D mapping aided corrections module, in Google Play services, includes tiles of 3D building models that Google has for more than 3850 cities around the world. Google Play services 3D mapping aided corrections currently supports pedestrian use-cases only. When you use your device’s GPS while walking, Android’s Activity Recognition API will recognize that you are a pedestrian, and if you are in one of the 3850+ cities, tiles with 3D models will be downloaded and cached on the phone for that city. Cache size is approximately 20MB, which is about the same size as 6 photographs.

Inside the module, the 3D mapping aided corrections algorithms solve the chicken-and-egg problem, which is: if the GPS position is not in the right place, then how do you know which buildings are blocking or reflecting the signals? Having solved this problem, 3D mapping aided corrections provide a set of corrected positions to the FLP. A system API then provides this information to the GPS chip to help the chip improve the accuracy of the next GPS fix.
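One classic way to break that circularity is "shadow matching": hypothesize candidate positions (say, one on each side of the street), use the building model to predict which satellites each candidate should see line-of-sight, and keep the candidate whose prediction best matches the observed signal strengths. The sketch below is a toy illustration of that idea, not the actual Play services module; all the types and the scoring rule are invented for the example.

```kotlin
// Toy "shadow matching" sketch; every type here is illustrative and not
// part of the real 3D mapping aided corrections module.
data class Satellite(val azimuthDeg: Double, val elevationDeg: Double, val strongSignal: Boolean)

// Hypothetical building model: does the line of sight from `candidate`
// (x, y in meters) toward `sat` hit a building?
typealias BuildingModel = (candidate: Pair<Double, Double>, sat: Satellite) -> Boolean

// Score each candidate by how well the predicted line-of-sight pattern
// matches the measurements: a strong signal should be unblocked, while a
// weak one is plausibly a blocked, reflected signal.
fun bestCandidate(
    candidates: List<Pair<Double, Double>>,
    sats: List<Satellite>,
    blocked: BuildingModel
): Pair<Double, Double> =
    candidates.maxByOrNull { cand ->
        sats.count { sat -> blocked(cand, sat) != sat.strongSignal }
    }!!
```

Google's production system goes further, combining the tiles with raw GNSS measurements and machine learning, but the same blocked-versus-reflected reasoning is at the heart of it.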

With this December Pixel feature drop, we are releasing version 2 of 3D mapping aided corrections on Pixel 5 and Pixel 4a (5G). This reduces wrong-side-of-street occurrences by approximately 75%. Other Android phones, using Android 8 or later, have version 1 implemented in the FLP, which reduces wrong-side-of-street occurrences by approximately 50%. Version 2 will be available to the entire Android ecosystem (Android 8 or later) in early 2021.

Android’s 3D mapping aided corrections work with signals from the USA’s Global Positioning System (GPS) as well as other Global Navigation Satellite Systems (GNSSs): GLONASS, Galileo, BeiDou, and QZSS.

Our GPS chip partners shared the importance of this work for their technologies:

“Consumers rely on the accuracy of the positioning and navigation capabilities of their mobile phones. Location technology is at the heart of ensuring you find your favorite restaurant and you get your rideshare service in a timely manner. Qualcomm Technologies is leading the charge to improve consumer experiences with its newest Qualcomm® Location Suite technology featuring integration with Google's 3D mapping aided corrections. This collaboration with Google is an important milestone toward sidewalk-level location accuracy,” said Francesco Grilli, vice president of product management at Qualcomm Technologies, Inc.

“Broadcom has integrated Google's 3D mapping aided corrections into the navigation engine of the BCM47765 dual-frequency GNSS chip. The combination of dual frequency L1 and L5 signals plus 3D mapping aided corrections provides unprecedented accuracy in urban canyons. L5 plus Google’s corrections are a game-changer for GNSS use in cities,” said Charles Abraham, Senior Director of Engineering, Broadcom Inc.

“Google's 3D mapping aided corrections is a major advancement in personal location accuracy for smartphone users when walking in urban environments. MediaTek’s Dimensity 5G family enables 3D mapping aided corrections in addition to its highly accurate dual-band GNSS and industry-leading dead reckoning performance to give the most accurate global positioning ever for 5G smartphone users,” said Dr. Yenchi Lee, Deputy General Manager of MediaTek’s Wireless Communications Business Unit.

How to access 3D mapping aided corrections

Android’s 3D mapping aided corrections work automatically when a pedestrian uses GPS in any of the 3850+ cities, on any phone that runs Android 8 or later. The best way for developers to take advantage of the improvement is to use the FLP to get location information. The additional 3D mapping aided corrections in the GPS chip are available on Pixel 5 and Pixel 4a (5G) today, and will roll out to the rest of the Android ecosystem (Android 8 or later) over the next several weeks. We will also soon support more modes, including driving.

Android’s 3D mapping aided corrections cover more than 3850 cities, including:

  • North America: All major cities in USA, Canada, Mexico.
  • Europe: All major cities (100%, except Russia and Ukraine).
  • Asia: All major cities in Japan and Taiwan.
  • Rest of the world: All major cities in Brazil, Argentina, Australia, New Zealand, and South Africa.

As our Google Earth 3D models expand, so will 3D mapping aided corrections coverage.

Google Maps is also getting updates that will provide more street-level detail for pedestrians in select cities, such as sidewalks, crosswalks, and pedestrian islands. In 2021, you can get these updates for your app using the Google Maps Platform. Along with the improved location accuracy from 3D mapping aided corrections, we hope we can help developers like you better support use cases for the world’s two billion pedestrians who use Android.

Continuously making location better

In addition to 3D mapping aided corrections, we continue to work hard to make location as accurate and useful as possible. Below are the latest improvements to the Fused Location Provider API (FLP):

  • Developers wanted an easier way to retrieve the current location. With the new getCurrentLocation() API, developers can get the current location in a single request, rather than having to subscribe to ongoing location changes. By allowing developers to request location only when needed (and automatically timing out and closing open location requests), this new API also improves battery life. Check out our latest Kotlin sample.
  • Android 11's Data Access Auditing API provides more transparency into how your app and its dependencies access private data (like location) from users. With the new support for the API's attribution tags in the FusedLocationProviderClient, developers can more easily audit their apps’ location subscriptions in addition to regular location requests. Check out this Kotlin sample to learn more.
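As a sketch of the one-shot pattern the first bullet describes (assuming the play-services-location dependency and that location permission has already been granted; this mirrors the API shape, not the official sample):

```kotlin
import android.annotation.SuppressLint
import android.content.Context
import com.google.android.gms.location.LocationRequest
import com.google.android.gms.location.LocationServices
import com.google.android.gms.tasks.CancellationTokenSource

// Single-shot current location: no ongoing subscription to manage, and the
// request can be cancelled (or times out on its own), which helps battery.
@SuppressLint("MissingPermission") // assumes ACCESS_FINE_LOCATION is already granted
fun requestSingleFix(context: Context, onFix: (latitude: Double, longitude: Double) -> Unit) {
    val client = LocationServices.getFusedLocationProviderClient(context)
    val cancellation = CancellationTokenSource()
    client.getCurrentLocation(LocationRequest.PRIORITY_HIGH_ACCURACY, cancellation.token)
        .addOnSuccessListener { location ->
            // location is null if no fix could be obtained before the timeout.
            location?.let { onFix(it.latitude, it.longitude) }
        }
}
```

Compared with subscribing via `requestLocationUpdates` and remembering to unsubscribe, the one-shot call cleans up after itself, which is exactly the battery benefit described above.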



Qualcomm and Snapdragon are trademarks or registered trademarks of Qualcomm Incorporated.

Qualcomm Snapdragon and Qualcomm Location Suite are products of Qualcomm Technologies, Inc. and/or its subsidiaries.