
Google Tensor debuts on the new Pixel 6 this fall

In 2016, we launched the first Pixel. Our goal was to give people a more helpful, smarter phone. Over the years, we introduced features like HDR+ and Night Sight, which used artificial intelligence (AI) to create beautiful images with computational photography. In later years, we applied powerful speech recognition models to build Recorder, which can record, transcribe and search audio clips, all on device.

AI is the future of our innovation work, but we’ve run into computing limitations that prevented us from fully pursuing our mission. So we set about building a technology platform designed for mobile that would let us bring our most innovative AI and machine learning (ML) to Pixel users: our own System on a Chip (SoC) to power the Pixel 6. Now, years later, it’s almost here.

Tensor is our first custom-built SoC specifically for Pixel phones, and it will power the Pixel 6 and Pixel 6 Pro* later this fall.

Google Tensor

Pixel 6 and Pixel 6 Pro

Pixel 6 and Pixel 6 Pro debut this fall, and that’s when we'll share all the details we normally release at launch, like new features, technical specs, pricing and availability. But today, we’re giving you a preview of what’s to come.

Industrial design

These new phones redefine what it means to be a Pixel. From the new design, which brings the same beautiful aesthetic to the hardware and to the Android 12 software, to the new Tensor SoC, everything about using the Pixel is better.

We also upgraded the rear camera system. The improved sensors and lenses are now too big to fit into the traditional square — so the new design gives the whole camera system a new home with the camera bar. 

Pixel 6 and Pixel 6 Pro

The Pixel 6 and Pixel 6 Pro have new materials and finishes, too — like the Pro’s light polished aluminum frame, and the 6’s matte aluminum finish. And they both feel great in your hand. 

Material You

Google announced Android 12 and the new Material You design language at Google I/O. With Material You, we’re mixing color science with years of work in interaction design and engineering. These UI updates are grounded in the new animation and design framework, and they make using your Pixel feel incredibly natural because everything runs smoothly on the Tensor chip.

Tensor 

Tensor was built for how people use their phones today and how they’ll use them in the future. As more and more features are powered by AI and ML, it’s not simply about adding more computing resources; it’s about using that ML to unlock specific experiences for our Pixel users.


The team that designed our silicon wanted to make Pixel even more capable. For example, with Tensor we thought about every piece of the chip and customized it to run Google's computational photography models. For users, this means entirely new features, plus improvements to existing ones. 


Tensor enables us to make the Google phones we’ve always envisioned — phones that keep getting better, while tapping the most powerful parts of Google, all in a highly personalized experience. And with Tensor’s new security core and Titan M2, Pixel 6 will have the most layers of hardware security in any phone**.


You’ll see this in everything from the completely revamped camera system to speech recognition and much more. So whether you're trying to capture that family photo when your kids won’t stand still, or communicate with a relative in another language, Pixel will be there — and it will be more helpful than ever. We look forward to sharing more about Pixel 6 and Pixel 6 Pro later this year. 


* These devices have not been authorized as required by the rules of the Federal Communications Commission or other regulators. These devices may not be sold or otherwise distributed until required legal authorizations have been obtained.

**Based on a count of independent hardware security subsystems and components.

Rediscover your city through a new Lens this summer

With warmer weather upon us and many places reopening in the U.K., it’s the perfect time to go out and reconnect with your surroundings. Whether it’s soaking up that panoramic view of a city skyline that you’ve really missed, or wondering about that interesting tree species you pass every day on your park walk, many of us feel ready to reconnect with our cities in new ways.


British cities are especially ripe for rediscovery. As the country emerges from a long lockdown and people start to reintegrate with their cities this summer, we’re launching a campaign called Behind the Lens with Google Pixel, which aims to help people rediscover their cities using Google Lens on Pixel. We’ll do that through a series of events over the coming weeks, alongside some very special guests in London, Bristol and Liverpool.


Vibrant orange and purple flower shown on a Google Pixel 5 using Google Lens, which has identified the flower as a bird of paradise. The result shows information about the plant: “Strelitzia reginae, commonly called a crane flower or bird of paradise, is a genus of perennial plants, native to South Africa…”


Behind the Lens with Google Pixel encourages people to search what they see using the magic of Lens, and rediscover some forgotten pockets of their city using its updated features. Identifying the species of that bird you keep seeing in the communal gardens of London has never been easier, while discovering new, secret ingredients at a farmer’s market in Liverpool can also be done in a snap. Or, perhaps you’ve always wanted to know more about that forgotten landmark from a viewpoint in Bristol. Lens can give you on-the-spot information about a subject with a single long tap on the Pixel camera viewfinder, which is handy since we often have our cameras open and ready to capture the moment. 


With restrictions being lifted in the U.K. this summer, Search trends reveal an opportunity to rediscover our cities through the interests we acquired over lockdown. From March 23, 2020 through April 21, 2021, Google searches for new skills and classes steadily increased: Hiking trails near me (+200%), Online gardening courses (+300%) and Online cooking classes (+800%).


This suggests not only that some of the hobbies the nation nurtured during lockdown are still very much of interest, but also that people can now rediscover them against the backdrop of their city, alongside their communities and friends.


Within Google Lens, the Places filter is selected and the view is showing a clock tower against a bright, cloudy sky. Lens identifies the clock tower as Big Ben and gives results, including a star rating, two alternative views of the tower and an option to search Google.


A new tool for rediscovery


Google Lens is now used over three billion times per month by people around the world, and with many ready to explore this summer and rediscover their cities, we’re officially launching the new Places filter in Lens. Now available globally, the Places filter makes it easy to identify buildings and landmarks using your phone camera, combining 3D models from Google Earth and Lens’ powerful image recognition technology to create an in-depth, real-time AR experience, similar to Live View on Google Maps.


The Google Lens app Places filter is open on a black Google Pixel 5, showing a view that scans the River Thames and settles on a large bridge with two towers. Upon identification of the structure as Tower Bridge, Lens results show the star rating, alternative images of Tower Bridge to scroll through, and the option to search Google for more information.


Just open the Google app on your phone and tap the camera icon in the search bar to open Lens. Then, switch to the Places filter and point your camera at notable places around you.


We hope Lens makes rediscovering and learning about your city even more enjoyable.


New for Pixel: Starry night clips, Pride wallpapers and more

Our latest Feature Drop is making your Pixel smarter and more fun, while also helping you feel safer. From Locked Folder, to features expanding to more countries, to short videos in astrophotography, there’s a lot to unpack.

Summertime fun for Pixel

Pixel owners love using astrophotography in Night Sight to take incredible photos of the night sky, and now it's getting even better. You can now create videos of the stars moving across the sky, all during the same exposure. Once you take a photo in Night Sight, both the photo and video will be saved in your camera roll. Try waiting longer to capture even more of the stars in your video. This feature is available on Pixel 4 and newer phones, and you can learn more at g.co/pixel/astrophotography.

Animated gif showing a Pixel 5 using the Night Sight feature to capture a long exposure shot of the night sky, which is then turned into a video.

Pride events across many parts of the world kick off in June, and Pixel has new wallpapers and ringtones to celebrate. Three bold, joyful wallpaper designs were created exclusively for Pixel by Ashton Attzs. Check out new Pride-themed ringtones and notifications created by other LGBTQ+ artists and YouTube Creators.

More privacy, more safety

Last month at Google I/O we previewed Locked Folder in Google Photos, which is now rolling out to Pixel users. For photos and videos that need a little extra privacy — like pictures of an upcoming gift for a loved one, or screenshots of your recent receipts — you can save them to the new Locked Folder. To make it even easier to add photos and videos to Locked Folder on Pixel, you can choose to save them there straight from the camera. They’re saved on your device and won’t show up in shared albums, Memories or any other apps on your device, and can only be accessed using your device passcode or fingerprint.
Animated gif showing a Pixel 5 capturing a photo, and then saving it into a dedicated locked folder that can only be accessed using the device passcode or fingerprint.

Pixel’s car crash detection has already helped people in serious car accidents, and now it’s expanding to more areas. Pixel users in Spain, Ireland and Singapore will now have car crash detection capabilities in the Personal Safety app. The feature can help detect if you’ve been in a severe accident and will check in to see if you’re OK. If there’s no response, your Pixel can automatically call emergency responders and share your location and other relevant details. Car crash detection is already available in the U.K., Australia and the U.S. This feature is dependent upon network connectivity and other factors and may not be reliable for emergency communications or available in all areas. To learn more, visit the Help Center.

Pixel can also help you stay alert while walking. With the new Heads Up feature inside Digital Wellbeing settings, your Pixel can detect when you’re walking and periodically remind you to look up from your screen.


Your Pixel keeps getting smarter 

When your phone is ringing but isn’t within reach, you can answer or reject a call with Google Assistant. Say "Hey Google, answer call" or "Hey Google, reject call." 

Cutting and pasting important information also just got easier thanks to an even smarter Gboard. When you copy text that includes a phone number, email address or URL, you’ll see those key snippets of text suggested in the clipboard. You can easily drop them into places like Messages to share contact information quickly, or Maps to get started on your road trip faster. For more information, including language availability, see g.co/gboard/clipboard.

Image showing a Pixel 5 with the messages app open, and a suggested keyboard clip of a recently copied phone number.

We’re always working to bring helpful Pixel phone and voice features to more languages and dialects. Call Screen, which helps you avoid spam calls by answering unknown numbers to find out who’s calling and why, is now available in Japan. Recorder, the app that transcribes audio into text so you can search, edit and share your recordings, will be available in more English dialects including Singaporean, Australian, Irish and British English. The Recorder expansion will roll out to all Pixel 3 or newer phones by the end of July. 

To see the full list of new and expanded features for Pixel, see the Pixel forum post.

For the first time I was able to call my 23-year-old son

Last August — for the first time in my life — I had the opportunity to do something that people do every day. I called my son.

“Dad, do you realize this is the first time we’ve talked on the phone?” said my 23-year-old son, Harry, who lives in Hong Kong. Since birth, I’ve been profoundly deaf. In my daily life, I rely on captions and lip-reading for communication. Until recently, I haven’t been able to simply pick up the phone to chat with someone because I didn’t have a way to caption what they were saying. But Live Caption on the Google Pixel changed all of this.

Thanks to this feature, when Harry spoke, his words were instantly converted to text. I was able to simultaneously read what Harry was saying and respond to him in a way that was natural and fluid.

Since the first call with my son, I’ve called my bank manager, handyman, colleagues, family and friends. I even used Live Caption on a recent call with my doctor who noted that “it worked extremely well.” And last week, my partner asked me to make a phone call for her — never in my life could I have imagined that scenario.

Live Caption continues to delight me every day, and it’s clear how it can have a huge impact on the millions of people around the world who live with hearing loss every day. These days, I can’t wait to pick up the phone again.

Live Caption for calls is currently available in English only on Pixel 2 or newer phones. The accuracy of the captions can vary depending on the quality and clarity of the audio source. For more information check out our help center page.

Pixel artists show what ‘progress’ means to them

Photos by Natalia Mantini, MaryV and Tim Kellner


In her latest piece, "A gente mora por cima e abaixo do perigo," June Canedo documents her family's immigration from Brazil. “The many lessons from my home place and family, which I  interpret into objects, are markers of my movement,” she explains. “Often forward but with many detours along the way.”

A black and white photo of a piece of cloth hanging from a tree, blowing in the wind.

June Canedo


The same could be said of the latest body of work from Google Pixel’s Creator Labs: the program most recently invited nine artists to explore the idea of progress, captured on Pixel 5.

Like June, photographers MaryV and Andrew Thomas Huang explored heritage in their work, looking at how we carry forward certain traditions. MaryV commemorated her friend Aeron and daughter Becca in Korean Hanboks. Andrew showcased his reconnection to his Chinese ancestry by incorporating Taoist scripture and symbolism; he felt compelled to share an intimate self-portrait literally reflecting his self-discovery.

Two photos next to one another; the first is of a mother and daughter embracing, wearing traditional Korean Hanboks. The second is of a person bent over facing the ground against a dark background with Chinese lettering.

From left to right: MaryV's and Andrew Thomas Huang's work.

Two of our artists trekked alone into state parks with their Pixel devices — thanks to exceptional battery life, no extra equipment needed. Tim Kellner captured vibrant images of flora in his piece “Distant” while Natalia Mantini said she wanted to “soothe the viewer with beautiful, meditative imagery amplifying the historical practice of healing through Earth” in her series. 


Two photographs side by side; the first is of a person looking into a mirror surrounded by plants and flowers. The other is a close-up of waves on the beach.

From left to right: Tim Kellner's and Natalia Mantini's work.

Inspired by the motto of his native Jamaica, Anthony Prince Leslie created a piece inviting us all to find common ground.  

Photograph showing people lying down to write out the words "We are one."

Anthony Prince Leslie

All of the artists interpreted progress differently, but each left us with a similar feeling — a feeling of forward movement and positive momentum. Creator Labs artists also include Josh Goldenberg (Glassface), Kennedi Carter and Mayan Toledano. You can see examples of their work and more from the artists above on the Pixel Instagram page.

HDR+ with Bracketing on Pixel Phones

We're continuously working to improve the Pixel — making it more helpful, more capable, and more fun — with regular updates, such as the recent V8.2 update to the Camera app. One such improvement (launched on Pixel 5 and Pixel 4a 5G in October) is a feature that operates “under the hood”: HDR+ with Bracketing. It works by merging images taken with different exposure times to improve image quality (especially in shadows), resulting in more natural colors, improved details and texture, and reduced noise.

Why Are HDR Scenes Hard to Capture?
The original HDR+ burst photography system, the engine behind Pixel's high-quality mobile photography, captures a rapid series of deliberately underexposed images, then combines and renders them in a way that preserves detail across the range of tones. But this system had one limitation: scenes with high dynamic range (HDR) like the one below were noisy in the shadows because all of the captured images are underexposed.

The same photo using HDR+ (red outline) and HDR+ with Bracketing (green outline). While the characteristic HDR+ look remains the same, bracketing improves image quality, especially in shadows, with more natural colors, improved details and texture, and reduced noise.

Capturing HDR scenes is difficult because of the physical constraints of image sensors combined with limited signal in the shadows. We can correctly expose either the shadows or the highlights, but not both at the same time.

The same scene shot with different exposure settings and tonemapped to similar overall brightness. Left/Top: Exposure set for the highlights. The bright blue sky is preserved, but the shadows are very noisy. Right/Bottom: Exposure set for the shadows. Noise in the shadows is reduced, but the sky is clipped (white).

Photographers sometimes work around these limitations by taking two different exposures and combining them. This approach, known as exposure bracketing, can deliver the best of both worlds, but it is time-consuming to do by hand. It is also challenging in computational photography because it requires:

  1. Capturing additional long exposure frames while maintaining the fast, predictable capture experience of the Pixel camera.
  2. Taking advantage of long exposure frames while avoiding ghosting artifacts caused by motion between frames.

To avoid these challenges, the original HDR+ system used a different approach to handle high dynamic range scenes.

The Limits of HDR+
The capture strategy used by HDR+ is based on underexposure, which avoids loss of detail in the highlights. While this strategy comes at the expense of noise in the shadows, HDR+ offsets the increased noise through the use of burst photography.

Using bursts to improve image quality. HDR+ starts from a burst of full-resolution raw images (left). Depending on conditions, between 2 and 15 images are aligned and merged into a computational raw image (middle). The merged image has reduced noise and increased dynamic range, leading to a higher quality final result (right).

This approach works well for scenes with moderate dynamic range, but breaks down for HDR scenes. To understand why, we need to take a closer look at how two types of noise get into an image.

Noise in Burst Photography
One important type of noise is called shot noise, which depends only on the total amount of light captured — the sum of N frames, each with E seconds of exposure time, has the same amount of shot noise as a single frame exposed for N × E seconds. If this were the only type of noise present in captured images, burst photography would be as efficient as taking longer exposures. Unfortunately, a second type of noise, read noise, is introduced by the sensor every time a frame is captured. Read noise doesn’t depend on the amount of light captured but instead depends on the number of frames taken — that is, with each frame taken, an additional fixed amount of read noise is added.

This is why using burst photography to reduce total noise isn’t as efficient as simply taking longer exposures: taking multiple frames can reduce the effect of shot noise, but will also increase read noise. Even though read noise increases with the number of frames, it is still possible to reduce the overall noisiness with burst photography, but it becomes less efficient. If one were to break a long exposure into N shorter exposures, the ratio of signal to noise in the final image would be lower because of the additional read noise. In this case, to get back to the signal-to-noise ratio in the single long exposure, one would need to merge N² short-exposure frames. In the example below, if a long exposure were divided into 12 short exposures, we'd have to capture 144 (12 × 12) short frames to match the signal-to-noise ratio in the shadows! Capturing and processing this many frames would be much more time consuming — burst capture and processing could take over a minute and result in a poor user experience. Instead, with bracketing one can capture both short and long exposures — combining highlight protection and noise reduction.

Left: The result of merging 12 short-exposure frames in Night Sight mode. Right: A single frame whose exposure time is 12 times longer than an individual short exposure. The longer exposure has significantly less noise in the shadows but sacrifices the highlights.
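To make the arithmetic concrete, here is a minimal noise simulation — a toy sensor model with made-up numbers, not the actual HDR+ pipeline. Shot noise is Poisson in the total light collected, while every captured frame adds an independent dose of Gaussian read noise:

```python
import numpy as np

rng = np.random.default_rng(0)
s = 0.5       # mean photons per short frame (a deep shadow; hypothetical value)
r = 10.0      # read-noise standard deviation added per captured frame
N = 12        # one long exposure split into N short exposures
trials = 200_000

def snr(num_frames, signal_per_frame):
    """SNR after merging `num_frames` frames: shot noise scales with the
    total light collected, read noise with the number of frames captured."""
    shot = rng.poisson(signal_per_frame * num_frames, trials)     # total photons
    read = rng.normal(0.0, r * np.sqrt(num_frames), trials)      # accumulated read noise
    merged = shot + read
    return merged.mean() / merged.std()

print(f"1 long exposure (N x E seconds): SNR = {snr(1, N * s):.2f}")
print(f"{N} short frames merged:          SNR = {snr(N, s):.2f}")
print(f"{N * N} short frames merged:         SNR = {snr(N * N, s):.2f}")
```

In this read-noise-dominated regime, the 144-frame merge roughly matches the single long exposure's SNR, while the 12-frame merge falls far short — exactly the inefficiency that bracketing sidesteps.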

Solving with Bracketing
While the challenges of bracketing prevented the original HDR+ system from using it, incremental improvements since then, plus a recent concentrated effort, have made it possible in the Camera app. To start, adding bracketing to HDR+ required redesigning the capture strategy. Capturing is complicated by zero shutter lag (ZSL), which underpins the fast capture experience on Pixel. With ZSL, the frames displayed in the viewfinder before the shutter press are the frames we use for HDR+ burst merging. For bracketing, we capture an additional long exposure frame after the shutter press, which is not shown in the viewfinder. Note that holding the camera still for half a second after the shutter press to accommodate the long exposure can help improve image quality, even with a typical amount of handshake.

Capture strategy. Top: The original HDR+ method captures short exposures before the shutter press, six in this example. Bottom: HDR+ with Bracketing captures five short exposures before the shutter press and one long exposure after the shutter press.
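As a sketch of that flow (class and method names here are invented for illustration; the real capture pipeline is considerably more involved):

```python
from collections import deque

class BracketedZslCapture:
    """Toy model of the bracketed capture strategy: a ring buffer of short
    viewfinder frames is reused at shutter press (zero shutter lag), and
    one long exposure is captured afterward."""

    def __init__(self, num_short=5):
        self.ring = deque(maxlen=num_short)   # ZSL buffer of short exposures

    def on_viewfinder_frame(self, frame):
        self.ring.append(frame)               # oldest frame drops out automatically

    def on_shutter_press(self, capture_long_exposure):
        shorts = list(self.ring)              # frames captured *before* the press
        long_frame = capture_long_exposure()  # taken after the press; never
                                              # shown in the viewfinder
        return shorts + [long_frame]

# Usage: viewfinder frames stream in; pressing the shutter yields 5 shorts + 1 long.
cap = BracketedZslCapture()
for i in range(30):
    cap.on_viewfinder_frame(f"short_{i}")
burst = cap.on_shutter_press(lambda: "long_0")
```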

For Night Sight, the capture strategy isn't constrained by the viewfinder — because all frames are captured after the shutter press while the viewfinder is stopped, this mode easily accommodates capturing longer exposure frames. In this case, we capture three long exposures to further reduce noise.

Capture strategy for Night Sight. Top: The original Night Sight captured 15 short exposure frames. Bottom: Night Sight with bracketing captures 12 short and 3 long exposures.

The Merging Algorithm
When merging bracketed shots, we choose one of the short frames as the reference frame to avoid potentially clipped highlights and motion blur. All other frames are aligned to this frame before they are merged. This introduces a challenge — for complex scene motion or occluded regions, it is impossible to find exactly matching regions and a naïve merge algorithm would produce ghosting artifacts in these cases.

Left: Ghosting artifacts are visible around the silhouette of a moving person, when deghosting is disabled.
Right: Robust merging produces a clean image.

To address this, we designed a new spatial merge algorithm, similar to the one used for Super Res Zoom, that decides per pixel whether image content should be merged or not. This deghosting is more complicated for frames with different exposures. Long exposure frames have different noise characteristics, clipped highlights, and different amounts of motion blur, which makes comparisons with the short exposure reference frame more difficult. In addition, ghosting artifacts are more visible in bracketed shots, because noise that would otherwise mask these errors is reduced. Despite those challenges, our algorithm is as robust to these issues as the original HDR+ and Super Res Zoom and doesn’t produce ghosting artifacts. At the same time, it merges images 40% faster than its predecessors. Because it merges RAW images early in the photographic pipeline, we were able to achieve all of those benefits while keeping the rest of processing and the signature HDR+ look unchanged. Furthermore, users who prefer to use computational RAW images can take advantage of those image quality and performance improvements.
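A toy version of that per-pixel decision might look like the following — the function and its threshold are invented for illustration, and the real algorithm also has to model the long frame's different noise characteristics, clipping and motion blur:

```python
import numpy as np

def merge_with_deghosting(reference, aligned_frames, noise_sigma, k=3.0):
    """Average each aligned frame into the short-exposure reference only
    where the two agree within k standard deviations of the expected
    noise; elsewhere keep the reference, so moving objects leave no ghosts."""
    ref = reference.astype(np.float64)
    acc = ref.copy()
    weight = np.ones_like(ref)
    for frame in aligned_frames:
        frame = frame.astype(np.float64)
        consistent = np.abs(frame - ref) < k * noise_sigma
        acc += np.where(consistent, frame, 0.0)   # merge only agreeing pixels
        weight += consistent                      # bool adds as 0/1
    return acc / weight

# Synthetic check: a static scene plus one frame with a "moving object".
rng = np.random.default_rng(1)
ref = rng.normal(100.0, 5.0, (4, 4))
frames = [ref + rng.normal(0.0, 5.0, ref.shape) for _ in range(3)]
frames[0][0, 0] += 200.0   # inconsistent pixel: excluded from the merge
out = merge_with_deghosting(ref, frames, noise_sigma=5.0)
```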

Bracketing on Pixel
HDR+ with Bracketing is available to users of Pixel 4a (5G) and 5 in the default camera, as well as in Night Sight and Portrait modes. For users of Pixel 4 and 4a, the Google Camera app supports bracketing in Night Sight mode. No user interaction is needed to activate HDR+ with Bracketing — depending on the dynamic range of the scene and the presence of motion, HDR+ with Bracketing chooses the best exposures to maximize image quality (examples).

Acknowledgements
HDR+ with Bracketing is the result of a collaboration across several teams at Google. The project would not have been possible without the joint efforts of Sam Hasinoff, Dillon Sharlet, Kiran Murthy, Mike Milne, Andy Radin, Nicholas Wilson, Navin Sarma‎, Gabriel Nava, Emily To, Sushil Nath, Alexander Schiffhauer, Isaac Reynolds, Bill Strathearn, Marius Renn, Alex Hong, Jose Ricardo Lima, Bob Hung, Ying Chen Lou, Joy Hsu, Blade Chiu, David Massoud, Jean Hsu, Ellie Yang, and Marc Levoy.

Source: Google AI Blog


From the seas, to more ZZZs: Your new Pixel features

The best part of your Pixel is that it keeps getting even more helpful, and even more unique. With regular updates, Pixels get smarter, more capable and more fun. This latest drop is no exception: for Pixel 3 and newer devices, it includes the ability to easily access and share audio recordings, a new way to use the Pixel Camera app underwater and new wallpapers to celebrate International Women's Day.

A more shareable Recorder 
Whether it’s that guitar riff you've been working on or a class lecture you need to review, Recorder makes it easy for Pixel owners to record, transcribe (English only) and search the audio moments that matter to you. Now you can share links to your Recorder audio files, so anyone can listen, even if they don’t have a Pixel. At recorder.google.com, you can hear recordings, see transcripts and even search through files — you get the entire Recorder playback experience in one shareable link.
You can also back up recordings to your Google Account to help keep them safe, and easily access them from any device. See more at g.co/pixel/recorder.

Capture the seas with Kraken Sports 
Now Pixel users can capture the same kinds of high-quality images they’re accustomed to above water, and do it underwater without the cumbersome cameras and cases scuba divers have traditionally used. Pixel camera software engineer José Ricardo Lima was scuba diving with his husband in the Philippines when he wondered what it would be like to use his Pixel camera underwater. His idea was to create a custom integration that combined Pixel’s camera with a case made for diving. Now, divers can use their Pixel camera with the Kraken Sports Universal Smart Phone Housing to capture marine life and seascapes. Get access to your Pixel’s camera features, including Night Sight, Portrait Mode, Motion Photos and video, directly through Pixel’s Camera app for high-quality images of you and your underwater friends. See g.co/pixel/diveconnector for more information.
Photo captured on Pixel 5 using KRH03 Kraken Sports Universal Smart Phone Housing. Kraken Sports is a registered trademark of Kraken Sports Ontario, Canada.

Attention-grabbing graphics 
Part of Pixel’s latest drop also includes new wallpapers that celebrate different cultural moments throughout the year with artwork from artists around the world. And for International Women’s Day on March 8, Pixel will add new wallpapers illustrated by Spanish duo Cachetejack, which focus on the strength and transformation of women. 
Adapting to you and your routine 
Smart Compose uses machine learning to help you complete your sentences as you type, so sending and replying to messages is easier than ever — and it’s now available for select messaging apps on your Pixel. Smart Compose suggests common phrases to help you cut back on repetitive typing and potential typos. Smart Compose is currently available in the U.S. in English only; see more at g.co/gboardsuggestions.
Your Pixel can also help you catch more ZZZs with a more seamless bedtime schedule on your Pixel Stand. When you use the bedtime features in Clock with your Pixel Stand, you’ll see a new, updated bedtime screen, along with redesigned notifications to help you ease into sleep. This feature is available on Pixel phones with wireless charging capability: Pixel 3, Pixel 3 XL, Pixel 4, Pixel 4 XL and Pixel 5. Pixel Stand is sold separately. 
For more information on the new features that just dropped and to see phone compatibility, head to http://g.co/pixel/updates. And if you’re looking for more helpfulness across your device, check out all of the latest updates announced from Android.

Pixel 5G devices can now access 5G in dual SIM mode 
Software updates also mean that Pixel 4a with 5G and Pixel 5 devices will now be able to access 5G even when in dual SIM mode (eSIM + physical SIM, dual SIM dual standby).


And as a bonus, we recently announced a new Google Fit feature that allows you to measure your heart rate and respiratory rate using just your phone’s camera. This feature will roll out to Pixel owners next week (and is not intended for medical purposes). 



1. Works with Pixel 2 or newer phones. Requires Android R, Camera Update 8.1 (Nov. 2020), Dive Case Connector app for Google Camera, KRH04 or KRH03 Kraken Sports Universal Smart Phone Housing (sold separately). See g.co/pixel/dive-case-connector-setup for more information on setup. Google is not responsible for the operation of Kraken Sports products or their compliance with any applicable safety or other requirements. Photo captured on Pixel 5 using KRH03 Kraken Sports Universal Smart Phone Housing. Kraken Sports is a registered trademark of Kraken Sports Ontario, Canada. 
2. Transcription is available in English only. Recorder sharing requires an Internet connection and a Google Account. 
3. Cloud storage requires an Internet connection and a Google Account. 
4. Your Pixel will receive feature drops during the applicable Android update and support periods for the phone. See g.co/pixel/updates for details.
5. Requires a 5G data plan (sold separately). 5G service not available on all carrier networks or in all areas. Contact carrier for details. 5G service, speed and performance depend on many factors including, but not limited to, carrier network capabilities, device configuration and capabilities, network traffic, location, signal strength and signal obstruction. Actual results may vary. Some features are not available in all areas. Data rates may apply. See g.co/pixel/networkinfo for info. 
 

