Tag Archives: Pixel

Team Pixel is in bloom this spring

Our community of photographers is on the rise, and the #teampixel tribe is officially 35,000 members strong (and counting)! This week’s highlights range from colorful plum blossoms in Sakura, Japan, to a confetti-filled wedding.

If you’re looking for a daily dose of #teampixel photos, follow our feed on Instagram and keep spreading the love and likes with fellow Pixel photographers.

Behind the Motion Photos Technology in Pixel 2


One of the most compelling things about smartphones today is the ability to capture a moment on the fly. With motion photos, a new camera feature available on the Pixel 2 and Pixel 2 XL phones, you no longer have to choose between a photo and a video, so every photo you take captures more of the moment. When you take a photo with motion enabled, your phone also records and trims up to 3 seconds of video. Using advanced stabilization built upon technology we pioneered in Motion Stills for Android, these pictures come to life in Google Photos. Let’s take a look behind the technology that makes this possible!
Motion photos on the Pixel 2 in Google Photos. With the camera frozen in place, the focus is put directly on the subjects. For more examples, check out this Google Photos album.
Camera Motion Estimation by Combining Hardware and Software
The image and video pair that is captured every time you hit the shutter button is a full-resolution JPEG with an embedded 3-second video clip. On the Pixel 2, the video portion also contains motion metadata derived from the gyroscope and optical image stabilization (OIS) sensors to aid the trimming and stabilization of the motion photo. By combining software-based visual tracking with the motion metadata from the hardware sensors, we built a new hybrid motion estimation for motion photos on the Pixel 2.
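As a concrete illustration of the container, here is a minimal Python sketch that splits a motion photo into its still and video parts. It assumes the Microvideo convention used by Google camera apps of this era, in which the MP4 is appended after the JPEG data and the XMP header records the video's byte offset measured from the end of the file; the function and file names are hypothetical.

import re

def extract_embedded_video(motion_photo_path, video_out_path):
    # Minimal sketch, assuming the Microvideo convention: the MP4 is
    # concatenated after the JPEG, and the XMP header stores
    # MicroVideoOffset, the offset of the video from the END of the
    # file. This is not an official API.
    data = open(motion_photo_path, "rb").read()
    # XMP is stored as plain text in the JPEG header, so a regex suffices.
    match = re.search(rb'MicroVideoOffset="(\d+)"', data)
    if match is None:
        raise ValueError("no embedded video found")
    offset = int(match.group(1))
    with open(video_out_path, "wb") as out:
        out.write(data[len(data) - offset:])

# Hypothetical usage:
extract_embedded_video("IMG_1234_MP.jpg", "clip.mp4")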

Our approach aligns the background more precisely than the technique used in Motion Stills or a purely hardware-sensor-based approach. Building on our Fused Video Stabilization technology, it reduces the artifacts that visual analysis alone produces in complex scenes with many depth layers, or when a foreground object occupies a large portion of the field of view. It also improves on the hardware-only approach by refining the motion estimation to be more accurate, especially at close distances.
Motion photo as captured (left) and after freezing the camera by combining hardware and software (right). For more comparisons, check out this Google Photos album.
The purely software-based technique we introduced in Motion Stills uses the visual data from the video frames, detecting and tracking features over consecutive frames to yield motion vectors. It then classifies the motion vectors into foreground and background using motion models such as an affine transformation or a homography. However, this classification is not perfect and can be misled, e.g. by a complex scene or a dominant foreground.
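To make the software-only step concrete, here is a minimal sketch of that idea using OpenCV: track features with pyramidal Lucas-Kanade optical flow, fit a single dominant homography with RANSAC, and treat inliers as background and outliers as foreground. The function name and thresholds are illustrative, not the production pipeline.

import cv2
import numpy as np

def classify_motion_vectors(prev_gray, curr_gray, inlier_thresh=3.0):
    # Track corner features between consecutive frames with pyramidal
    # Lucas-Kanade optical flow, then fit one dominant homography with
    # RANSAC: inliers are treated as background, outliers as foreground.
    pts_prev = cv2.goodFeaturesToTrack(prev_gray, maxCorners=500,
                                       qualityLevel=0.01, minDistance=8)
    pts_curr, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray,
                                                   pts_prev, None)
    ok = status.ravel() == 1
    p0 = pts_prev[ok].reshape(-1, 2)
    p1 = pts_curr[ok].reshape(-1, 2)
    H, inlier_mask = cv2.findHomography(p0, p1, cv2.RANSAC, inlier_thresh)
    background = p0[inlier_mask.ravel() == 1]
    foreground = p0[inlier_mask.ravel() == 0]
    return H, background, foreground

As the post notes, a scene dominated by a nearby foreground can fool the RANSAC fit into locking onto the wrong motion, which is exactly the failure mode the sensor metadata addresses.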
Feature classification into background (green) and foreground (orange) using the motion metadata from the hardware sensors of the Pixel 2. Notice how the new approach accurately labels not only the skateboarder as foreground, but also the half-pipe that is at roughly the same depth.
For motion photos on Pixel 2, we improved this classification by using the motion metadata derived from the gyroscope and the OIS. This metadata accurately captures the camera motion with respect to the scene at infinity, which one can think of as the background in the distance. For pictures taken at closer range, however, parallax is introduced for scene elements at different depth layers, which is not accounted for by the gyroscope and OIS. We therefore mark motion vectors that deviate too much from the motion metadata as foreground. This results in a significantly more accurate classification of foreground and background, which also enables us to use a more complex motion model, known as mixture homographies, that can account for rolling shutter and undo the distortions it causes.
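A sketch of how the sensor metadata can sharpen that split: predict where a background point should land using a transform reconstructed from the gyroscope and OIS readings, then mark any tracked feature that strays too far from the prediction as foreground. H_sensor and the pixel threshold are assumptions for illustration; the post does not specify the exact model or tolerance.

import numpy as np

def classify_with_sensor_metadata(p0, p1, H_sensor, thresh_px=2.0):
    # p0, p1: (N, 2) arrays of tracked feature positions in two frames.
    # H_sensor: 3x3 homography predicting where a point at infinity (the
    # distant background) should move, reconstructed from gyroscope and
    # OIS readings. Its name and derivation are assumptions of this sketch.
    ones = np.ones((len(p0), 1))
    projected = np.hstack([p0, ones]) @ H_sensor.T
    predicted = projected[:, :2] / projected[:, 2:3]  # expected background position
    deviation = np.linalg.norm(p1 - predicted, axis=1)
    return deviation > thresh_px  # True marks a foreground feature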
Background motion estimation in motion photos. By using the motion metadata from the gyroscope and OIS, we are able to accurately classify features from the visual analysis into foreground and background.
Motion Photo Stabilization and Playback
Once we have accurately estimated the background motion for the video, we determine an optimally stable camera path to align the background using linear programming techniques outlined in our earlier posts. Further, we automatically trim the video to remove any accidental motion caused by putting the phone away. All of this processing happens on your phone and produces a small amount of metadata per frame that is used to render the stabilized video in real time with a GPU shader when you tap the Motion button in Google Photos. In addition, we play the video starting at the exact timestamp of the HDR+ photo, producing a seamless transition from still image to video.
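To give a flavor of that linear-programming step, here is a toy one-dimensional version in Python: it finds a virtual camera path that minimizes the total absolute frame-to-frame motion while staying within a fixed margin of the estimated real camera path, so the stabilizing crop never runs off the frame. The real system optimizes richer 2D camera models; the margin and variable names here are illustrative assumptions.

import numpy as np
from scipy.optimize import linprog

def smooth_camera_path(camera_x, margin=20.0):
    # Toy L1-smoothing LP: minimize sum_t |p[t+1] - p[t]| subject to
    # |p[t] - camera_x[t]| <= margin. Variables are the path p (n values)
    # followed by slacks s (n-1 values) encoding |p[t+1] - p[t]| <= s[t].
    n = len(camera_x)
    cost = np.concatenate([np.zeros(n), np.ones(n - 1)])
    A, b = [], []
    for t in range(n - 1):
        row = np.zeros(2 * n - 1)
        row[t], row[t + 1], row[n + t] = -1.0, 1.0, -1.0   # p[t+1]-p[t] <= s[t]
        A.append(row.copy())
        b.append(0.0)
        row[t], row[t + 1] = 1.0, -1.0                     # p[t]-p[t+1] <= s[t]
        A.append(row)
        b.append(0.0)
    bounds = [(x - margin, x + margin) for x in camera_x] + [(0, None)] * (n - 1)
    result = linprog(cost, A_ub=np.array(A), b_ub=np.array(b), bounds=bounds)
    return result.x[:n]  # smoothed path; per-frame crop offset is p - camera_x

The L1 objective is what makes the result feel like deliberate camerawork: it prefers a path made of perfectly static segments joined by occasional moves, rather than spreading small jitter across every frame.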
Motion photos stabilize even complex scenes with large foreground motions.
Motion Photo Sharing
Using Google Photos, you can share motion photos with your friends as videos and GIFs, watch them on the web, or view them on any phone. This is another example of combining hardware, software, and machine learning to create new features for Pixel 2.

Acknowledgements
Motion photos is a result of a collaboration across several Google Research teams, Google Pixel and Google Photos. We especially want to acknowledge the work of Karthik Raveendran, Suril Shah, Marius Renn, Alex Hong, Radford Juang, Fares Alhassen, Emily Chang, Isaac Reynolds, and Dave Loxton.

Source: Google AI Blog


135 countries and counting: travel the world with #teampixel

Wondering what our community of Pixel photographers has been up to lately? Check out this week’s selects, from a dancer’s impeccable form (yay burst mode!) to finding love in the streets of Spain. #teampixel photos have been geo-tagged and posted from 135 countries and counting—shout out to our community for their amazing work!

We’d love to see more of your Pixel photos, so keep tagging them with #teampixel and you might be featured next.

Google Pixel earns Android enterprise seal of approval

With Pixel 2, we set out to make a mobile experience that is smart, simple, and secure, with a great camera, the Google Assistant to help you get more done, long battery life, and much more. We’ve seen a great response from consumers, and we’ve also gotten fantastic reviews from businesses, employees, and industry analysts for the security and productivity features built into Pixel 2.


Today, Pixel 2 and first-generation Pixel phones have been recognized in the new Android Enterprise Recommended program, which means the phones are endorsed for the workplace. This new initiative from our colleagues on the Android team showcases enterprise devices and services that meet high standards for security, reliability, and productivity, while also enabling the teams that deploy corporate devices to manage them easily and securely.
Android Enterprise Recommended badge

Not only does Pixel meet the baseline requirements of the program, it exceeds many of them. For example, while all devices in the Android Enterprise Recommended program must receive a security update within 90 days, Pixel goes further by delivering security patches and feature updates every month. Pixel gets the yearly Android operating system upgrades first, directly from Google, so users always have the latest software. Pixel 2 also offers a tamper-resistant hardware security module that reinforces the lock screen, to better defend against malware and hardware attacks.


Alongside its security protections, Pixel has lots of features to help you out at work: you can use the Google Assistant to find out when your next meeting is and the best route to get there, multitask with split-screen, which lets you keep two apps open at once, or check notes while on a video call with picture-in-picture mode.


With Pixel recognized in the Android Enterprise Recommended program, we offer peace of mind to administrators who manage corporate devices, while helping employees get more out of their phones at work and beyond. We look forward to seeing how Pixel will power mobile productivity at work.

Go behind the scenes of “Isle of Dogs” with Pixel

"Isle of Dogs" tells the story of Atari Kobayashi, 12-year-old ward to corrupt Mayor Kobayashi. When, by Executive Decree, all the canine pets of Megasaki City are exiled to a vast garbage-dump, Atari sets off alone in a miniature Junior-Turbo Prop and flies to Trash Island in search of his bodyguard-dog, Spots. There, with the assistance of a pack of newly-found mongrel friends, he begins an epic journey that will decide the fate and future of the entire Prefecture.

The film isn’t out until March 23—but Pixel owners will get an exclusive sneak peek this week.

In “Isle of Dogs Behind the Scenes (in Virtual Reality),” the audience is taken behind the scenes in a 360-degree VR experience featuring on-set interviews with the film’s cast (voiced by Bryan Cranston, Bill Murray, Edward Norton, Liev Schreiber, Jeff Goldblum, Scarlett Johansson, Tilda Swinton, F. Murray Abraham and Bob Balaban). Get nose-to-nose with Chief, Boss, Rex and the rest of the cast while the crew works around you, for an inside look at the unique craft of stop-motion animation.


Pixel’s powerful front-firing stereo speakers and brilliant display make it perfect for watching immersive VR content like this. Presented in 4K video with interactive spatial audio that responds to where you’re looking, “Isle of Dogs Behind the Scenes (in Virtual Reality)” is a collaboration between FoxNext VR Studio, Fox Searchlight Pictures, Felix & Paul Studios, the Isle of Dogs production team, and Google Spotlight Stories.


“Isle of Dogs Behind the Scenes (in Virtual Reality)” is available today on the Google Spotlight Stories app, exclusively for Google Pixel phones (Pixel and Pixel 2) and best watched on the Daydream View headset. To watch, download the Spotlight Stories app.

On March 2, “Isle of Dogs Behind the Scenes (in Virtual Reality)” will become available in VR, 360, and 2D via YouTube VR, the Fox Searchlight YouTube channel, and any platform that has the YouTube VR app, including Daydream and Sony PlayStation VR. “Isle of Dogs,” from Fox Searchlight, hits theaters on March 23.

Two #teampixel photographers say “I do” to Pixel 2

Jenny and Colin Hayles are professional photographers (Jenny does weddings and Colin captures nature and wildlife) and proud #teampixel members. Knowing that a Pixel 2 can take high-quality photos, they wanted to see how their phones would fare in the most picture-worthy setting: a wedding. We spoke with Jenny and Colin about their experience using a Pixel 2 at an experimental wedding photo shoot.


Tell us about your wedding experiment. How’d you come up with the idea?
Colin: The concept developed when one of my shots was featured on #teampixel, and I realized just how amazing the Pixel camera was. At first, I wanted to show that wedding guests have no excuse for taking lousy pictures if they have a Pixel. But Jenny and her creative team (shout out to our planner from Jaqueline Rae Weddings) took it to the next level—she wanted to shoot professional wedding photos with a Pixel. Before we tried it out at a real wedding, we had to see what the Pixel was capable of—from details, to portraits, to action shots. We simulated the details of a wedding day—the gown and tux, rings, stationery, cake and flowers—and recruited our friends Michele and Tom (a real-life couple) to be our models. We used only a Pixel 2 (no reflectors, lights, or tripods) for the entire photo shoot. The results were, I think, better than any of us dared to hope.

Which Pixel features did you use most during the shoot?
Jenny: We used the portrait feature the most—it’s pretty much like shooting with a high-end prime lens with a large aperture. In other words, it beautifully blurs the foreground and background to create that fine art look. Shooting macro shots without an extra lens is fantastic for the details that brides love to see (like shots of their wedding rings).

Colin and the models

What's the biggest pro of shooting a wedding with a phone?
Jenny: I loved being able to send images to the couple right away. Often brides and grooms see poor-quality images first, as guests begin to post on social media, but shooting with a Pixel, I can share beautiful images right away.

Did the couple feel more comfortable and natural when the photos were taken on a Pixel, rather than a big professional camera?
Colin: Shooting with a Pixel 2 was a novel idea, so there was some curiosity. We shared images throughout the shoot with the team and the bride and groom. There were comments that the next phone they get will be a Pixel 2! It made me think it could be an invaluable tool for non-photographers who work in the wedding industry—like florists—to take high-quality images of their work as well.

Behind the scenes at the wedding shoot: here's Jenny with her Pixel 2 (decor and furniture came from @modernluxerental)

What other big events are you going to tackle next?
Colin: We’d love to use the Pixel 2 for a honeymoon or engagement shoot. The idea of not taking along a heavy, conspicuous camera bag and still coming away with high-quality images is an exciting (and back-saving) one. We traveled to Cuba last year and used our first-generation Pixels to capture the bulk of our photos, and I was so impressed. I only brought my camera gear along on one day of the whole trip.

#teampixel photographer Dave East heads to Iceland

This week, #teampixel heads north of the equator on a trip with photographer Dave East. Check out his photos from Iceland (shot in subzero temperatures!), which capture everything from giant glaciers to epic waterfalls, and hear why he loves his Pixel camera.

Can you tell us about your recent trip to Iceland shooting on Pixel 2 XL?
Iceland was absolutely surreal. It is one of the most amazing places I have ever been to. The landscapes feel like another planet. Shooting on the Pixel 2 was just so easy. The camera is so good that when we were in a hurry, I didn't even take out my DSLR because I knew the shots were going to be just as great on the Pixel 2.

What are your favorite features on Pixel's camera?
The portrait mode is amazing; it really gets the sense of depth perfectly. I also just think the photos are so crisp, it doesn't feel like a phone camera at all.

What piece of advice do you have for capturing great shots?
For landscape work, check that your horizons are straight and that there is some depth to the shot, or leading lines that draw your eye to something in particular. And make sure you're shooting at a good time of day, when the light isn't too bright.

Anything else you'd like to share with us and the #teampixel community?
Just keep shooting, keep creating and keep exploring. My favourite thing in the world is travelling to new places and seeing new things.


Use Pixel 2 for better photos in Instagram, WhatsApp and Snapchat

With Pixel 2, we wanted to build the best smartphone camera in the world. One of the ways we did that is with HDR+ technology, which helps you capture better photos in challenging lighting conditions, like scenes with both bright and shaded areas or those with dim light. This technology has always been available when you take photos from Pixel’s main camera app. Now we’re bringing it to your favorite photography, social media, and camera apps.


Today we’re turning on Pixel Visual Core for Pixel 2 users—a custom-designed co-processor for Pixel 2. Using computational photography and machine learning (which powers Pixel’s HDR+ technology), Pixel Visual Core improves image quality in apps that take photos. This means it’ll be easier to shoot and share amazing photos on Instagram, WhatsApp, and Snapchat, along with many other apps that use the Pixel 2 camera. All you need to do is take the photo and Pixel 2 will do the rest. Your photos will be bright, detailed, and clear.


Same picture taken without (left) and with HDR+ on Pixel Visual Core (right).
Pixel Visual Core is built to do heavy-lifting image processing while using less power, which saves battery. That means we're able to use that additional computing power to improve the quality of your pictures by running the HDR+ algorithm. Like the main Pixel camera, Pixel Visual Core also runs RAISR, which means zoomed-in shots look sharper and more detailed than ever before. Plus, it has Zero Shutter Lag to capture the frame right when you press the shutter, so you can time shots perfectly. What’s also exciting is that these new features are available to any app—developers can find more information on Google Open Source.
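The HDR+ pipeline itself is proprietary, but the burst-photography principle behind it (align several frames, then merge them to cut noise before tone mapping) can be sketched in a few lines. The following Python/OpenCV example is only an illustration of that principle, not the Pixel Visual Core implementation: it aligns each frame of a burst to the first with a simple translation model and averages.

import cv2
import numpy as np

def merge_burst(frames):
    # Illustrative only: HDR+ on Pixel Visual Core uses tile-based
    # alignment and robust merging on raw bursts. This sketch aligns
    # whole frames with phase correlation and averages them, which
    # already reduces noise by roughly sqrt(number of frames).
    ref = np.float32(cv2.cvtColor(frames[0], cv2.COLOR_BGR2GRAY))
    h, w = ref.shape
    acc = frames[0].astype(np.float32)
    for frame in frames[1:]:
        gray = np.float32(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
        (dx, dy), _ = cv2.phaseCorrelate(ref, gray)  # shift vs. reference
        shift_back = np.float32([[1, 0, -dx], [0, 1, -dy]])
        acc += cv2.warpAffine(frame, shift_back, (w, h)).astype(np.float32)
    return (acc / len(frames)).astype(np.uint8)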


These updates are rolling out over the next few days, along with other Pixel software improvements, so download the February monthly update when you see the notification.


These aren’t the only updates coming to Pixel this month. As we announced last year, our goal is to build new features for Pixel over time so your phone keeps getting better. Later this week, we’re adding new Augmented Reality (AR) Stickers themed around winter sports, so you can dress up videos and photos with freestyle skiers, twirling ice skaters, hockey players, and more. Like all AR stickers, these characters interact with both the camera and each other, creating a fun-filled way to enhance the moments you capture and share.


If you post photos or videos to your favorite apps, tag your pictures with #teampixel so we can see all the great moments you’ve captured.

Posted by Ofer Shacham, Engineering Manager for Pixel Visual Core
