Tag Archives: Pixel

Take a look at our new Pixel portfolio, made to be helpful

From phones and smartwatches to tablets and laptops — our day-to-day lives can be filled with so many devices, and dealing with them should be easy. This is why we’re focused on building hardware and software that work together to anticipate and react to your requests, so you don’t have to spend time fussing with technology.

To bring this vision to life, we’ve spent years focusing on ambient computing and how it can help us build technology that fades into the background, while being more useful than ever. Today at I/O, I shared several important updates to our hardware portfolio that lay the groundwork for creating a family of devices that not only work better together, but work together for you.

Meet the new Pixel portfolio

We’ve thoughtfully designed the Pixel portfolio so the helpfulness and intelligence of Google can adapt to you in a non-intrusive way. This is all possible thanks to multi-device work from the Android team combined with our work to layer cutting-edge AI research and helpful software and services onto our devices. And of course, we always tightly integrate powerful data security directly into our hardware.

Pixel products grouped together on a white background. Products include Pixel Bud Pros, the Google Pixel Watch and Pixel phones.

Last year we launched Google Tensor, our first custom-designed mobile system on a chip (SoC), to create a common platform for our Pixel phones. The first Pixels built with Tensor, Pixel 6 and Pixel 6 Pro, are the fastest-selling Pixel phones to date. And today we introduced the new Pixel 6a, which has the same Tensor processor and industry-leading security from our Titan M2 chip.

Our Pixel Buds are designed to perfectly complement your Pixel phone, and we’re excited to expand the earbuds offerings with Pixel Buds Pro. These premium earbuds include a new, custom 6-core audio chip that runs Google-developed algorithms — all tuned by our in-house audio engineering team.

A sneak peek of what’s to come

Building on our ambient computing vision, we’re focused on how Pixel devices can be even more helpful to you — now and in the future. Today, we gave a preview of our new Google Pixel Watch — the first watch we’ve built inside and out. It has a bold circular, domed design, a tactile crown, recycled stainless steel and customizable bands that easily attach. With this watch, you’ll get the new Wear OS by Google experience and Fitbit’s industry-leading health and fitness tools — right on your wrist. Google Pixel Watch is a natural extension of the Pixel family, providing help whenever and wherever you need it. It will be available this fall, and we’ll share more details in the coming months.

Animated GIF showing the Google Pixel Watch with a white band.

We also previewed our Pixel 7 phones, coming this fall. Our next version of Google Tensor will power these devices, which are built for those who want the latest technology and fastest performance.

And finally, we shared an early look at our Android tablet, powered by Google Tensor. Built to be the perfect companion for your Pixel phone, our tablet will blend into your day-to-day routine and help connect the moments you’re on the go with the moments you’re at home. We hope to have more to share here in 2023, so stay tuned.

We’re building out the Pixel portfolio to give you more options for varying budgets and needs. I can’t wait for everyone to see for themselves how helpful these devices and technology can be — from wearables, phones and tablets to audio and smart home technology. And if you’re headed to the New York area, you can see these devices in action at our second Google Store that’s opening this summer in Brooklyn.

Pixel 6a: More of what you want for less than you expect

Our latest A-series phone, Google Pixel 6a, gives you more of what you want — for less than you’d expect. Pixel 6a is packed with the same powerful brains, Google Tensor, and many of the same must-have features as our premium phones, Pixel 6 and Pixel 6 Pro — at a lower price of $449.

Designed with you in mind

Pixel 6a borrows many of the same design elements from Pixel 6 — including the iconic camera bar — along with a metal frame that is durable by design. You’ll also get the updated Material You design UX that lets you personalize the look and feel of your phone, making it truly yours. Show off your colorful side and coordinate your aesthetic with one of three phone colors: Chalk, Charcoal and Sage.

Image of the phone in Charcoal, Chalk and Sage

For added protection and even more color options, pick out one of the cases made specifically for Pixel 6a — they're translucent and can be mixed and matched to create unique color combos. You’ll also have your choice of cases from our Made for Google partners.

Translucent Pixel 6a cases in Carbon, Frosted Clear and Seafoam.

Fully loaded with the features you love

From exceptional camera features to speech recognition to security you can trust, many of your favorite features from Pixel 6 and Pixel 6 Pro will be joining the party — thanks to Google Tensor. Here’s a look at some of them.

Pixel 6a helps capture your most important moments with a Camera Bar that includes dual rear cameras: a main lens and an ultrawide lens. So rest assured you can capture the whole scene. As for the selfie camera on Pixel 6a, it’s the same great camera as Pixel 6.

The Pixel Camera is built to be versatile and adapt to your needs, and you’ll see some of those features and technologies on Pixel 6a — from Real Tone, which authentically represents all skin tones, to Night Sight, which makes low-light photography a breeze, to Magic Eraser in Google Photos, which makes distractions disappear. And good news, we’ve enhanced Magic Eraser so you can also change the color of distracting objects in your photo. In just a few taps, the object’s colors and shading blend in naturally. So the focus is on the subjects — where it should be.

Pixel 6a comes with the same highly accurate speech recognition as Pixel 6 Pro. That includes features like Recorder, Live Caption and Live Translate.

Pixel 6a is your personal translator with Live Translate. Animated GIF showing Live Translate in action.

With Live Translate, you’ll have a personal translator wherever you go! Find more details and availability here.

The power and safety of Google Tensor

You’ll get the full hardware and software experience you’d expect with Google Tensor without compromising on battery life. Pixel 6a comes with an all-day battery that can last up to 72 hours when in the Extreme Battery Saver mode — a first for Pixel phones. With Google Tensor, Pixel 6a shares the same security architecture as Pixel 6 Pro, including our dedicated security chip Titan M2 that gives you the peace of mind that your sensitive data is safe.

With this common hardware platform across our latest phones, Pixel 6a will receive five years of security updates from when the device first becomes available on GoogleStore.com in the U.S., just like Pixel 6 and Pixel 6 Pro. Plus, Pixel 6a comes with Feature Drops so you get the latest and greatest features and updates. And as with other Pixel devices, Pixel 6a will be among the first Android devices to receive the upcoming Android 13 update.

Pixel 6a will be available for pre-order starting at $449 on July 21 and on shelves on July 28. Find out what countries Pixel 6a will be available in, and sign up for product updates.

Get more of the game in the NBA and Google Pixel Arena

The NBA Playoffs are back this Saturday, and we have a game-changing virtual experience to take the game you love to the next level. The NBA and Google Pixel Arena is a virtual space based on live in-game action where you can create, play, share and celebrate your 2022 NBA Playoffs skills.

Access the experience during halftime or between games via the NBA App — and keep tabs on all the action throughout the NBA Playoffs presented by Google Pixel, and NBA Finals presented by YouTube TV.

Once inside the Pixel Arena, pick a game to enter — including past games — and immerse yourself into a 3D virtual arena using your phone’s gyroscope to navigate around. You can even create your own avatars, and outfit them in your favorite team’s gear and accessories.

Phone screen showing 3D virtual locker room

The experience brings a whole new meaning to courtside. During halftime and post-game, you can relive how top scorers and teams performed with 3D shot recaps that map shots to the Pixel Arena court based on real-time NBA data from the first half of the game. Want to test your basketball trivia expertise? At halftime, challenge your NBA knowledge with game-specific trivia based on live NBA data feeds. With each game you attend, you have more opportunities to climb the leaderboard, create shareable content and unlock new levels and swag for your avatar.

This is the first immersive experience jointly developed by Google and the NBA to connect fans to the game and each other during live games. And, staying true to our tagline “For all the fans,” the Pixel Arena will be available to everyone — regardless of what device or operating system you use.

We’ll see you in the Pixel Arena!

Pixel creators document what it means to be truly seen

Back in February, many of you saw our Super Bowl ad, “Seen on Pixel.” The spot told the story behind Real Tone, Google’s years-long effort to ensure all our camera and imaging products accurately represent all skin tones.

“Seen on Pixel” doesn’t just represent Google’s commitment to image equity, it's an invitation for all Pixel owners to take more beautiful, more equitable images. So for Season 6 of Creator Labs, we invited 23 artists to tell their stories.

While these 23 artists — our largest Creator Labs incubator to date — come from varying backgrounds, identities and places, common themes emerged from their respective interpretations of “Seen on Pixel” as their prompt. They focused on things like individuality, community and identity.

MaryV, Coyote Park and Anthony Prince Leslie turned the camera on themselves, exploring what it means to be “seen” through your own lens. New artist Coyote Park says they want their work to speak to trans, Indigenous youth. “I want them to know that we are powerful and beautiful,” Coyote says. “For me, the act of shapeshifting is spiritual and a mode of self love. It allows me to embrace myself in my entirety.”

A person in a long black gown stands in front of a small water fall holding a crystal and looking into the camera.

Coyote Park

Chiara Gabellini and Tim Kellner also wanted to capture a sense of vulnerability in their projects, while Natalia Mantini and Andy Jackson took intimate portraits of their chosen families.

Two photographs side by side: The first is of a woman lying in a bed staring at the ceiling, surrounded by knick-knacks. The second is of a person getting out of the water at a dock on a lake.

At right: Chiara Gabellini's work; at left: Tim Kellner's work

Two side by side photographs: One of a woman cuddling on a couch with a dog, the other of a person with flowers draped around their head and shoulders.

At right: Andy Jackson's work; at left: Natalia Mantini's work

Shikeith and Lawrence Agyei photographed churches, as well as the youth boxers of The Bloc Chicago. They both focused on the idea of what it means to be Black, queer and spiritual.

Two side by side photographs: The first is of two men as seen through the blue light of a stained glass window. The second is a boxing ring inside a chapel.

At left: Shikeith's work; at right: Lawrence Agyei's work

Andre Wagner documented Black skate culture. He photographed his subject, Ant Lava, who finds reprieve in the roller rink, where he says he feels safe.

Photograph of a man roller skating in a rink.

Andre Wagner

Myesha Evon Gardner, who photographed her hometown of Cleveland, was inspired by a childhood memory: “My mother had a kitchen towel set that depicted a honey bear with a variation on the phrase ‘home is where the honey is,’ which inspired and shares the name of this project,” Myesha says. “I wanted to show the complexities of Cleveland in a light that goes against its negative associations.”

Ultimately, that’s what Season 6 is about: Celebrating what it means to truly be yourself in spaces that feel sacred to you — where you feel seen.

Other artists who were part of this season include: Kennedi Carter, Glassface, MaryV, Adrian Octavius Walker, June Canedo, Anthony Prince Leslie, Aidan Cullen, Zamar Velez, Chiara Gabellini, Pegah Farahmand, Neva Wireko and Myles Loftin.

Coming soon: More ways to repair your Pixel phone

We want you to have a great experience with your Pixel phone, and that includes easy access to high-quality and safe device repair if your phone is ever damaged. That’s why we’re working with iFixit to make it easier for independent repair professionals and skilled consumers with the relevant technical experience to access the genuine Google parts they need to repair Pixel phones.

Starting later this year, genuine Pixel spare parts will be available for purchase at ifixit.com for Pixel 2 through Pixel 6 Pro, as well as future Pixel models, in the U.S., UK, Canada, Australia and EU countries where Pixel is available. The full range of spare parts for common Pixel phone repairs — things like batteries, replacement displays, cameras and more — will be available either individually or in iFixit Fix Kits, which include tools like screwdriver bits and spudgers.

Easier, more accessible repairs

If you don’t want to make repairs yourself and would prefer professional help, you can always take your phone somewhere local to have it quickly and affordably repaired by an authorized technical expert. We already partner with independent repair providers like uBreakiFix, which has more than 750 locations across the U.S. and Canada supporting in-warranty and out-of-warranty Pixel repairs. We have similar partnerships with walk-in support providers in Canada, Germany, Japan and the U.K., with more to come. Pixel repair options are available in all countries where we sell Pixel phones.

We evaluate each new Pixel model on how easy it is to repair so we can reduce the effort, tools, parts and materials involved in the repair process. We also make training, documentation, tools and spare parts available to our authorized repair partners, and we plan on expanding this availability in the future.

We’re taking steps to expand repair options for other devices, too. We recently partnered with companies like Acer and Lenovo to launch the Chromebook repair program, helping schools find information about repairable Chromebooks and develop in-house repair programs. We also introduced Chrome OS Flex, which lets education and enterprise users repurpose old Mac or PC devices to run a version of Chrome OS alongside their Chromebook fleet. This helps users save on hardware costs, effectively recycle unused devices and manage their fleet sustainably and efficiently.

Helping you make sustainable choices

Improving repairability is an important way to help extend the life and usefulness of your phone. And it’s just one of several steps we’re taking to help you make more sustainable choices. One way we’re working toward our hardware sustainability commitments is by making sure our products can be sustainable from the start — and we incorporate that across our operations.

For example, starting in 2022, 100% of Google hardware products will include recycled materials with a drive to maximize recycled content wherever possible. Additionally, we’re enabling 100% carbon-neutral shipments of Google hardware products to and from our direct customers, as well as working to achieve Zero Waste to landfill certification in 2022 and plastic-free packaging by 2025.

To help older devices work like new and last longer, we’re committed to at least five years of security updates for the Pixel 6 and Pixel 6 Pro, and at least five years of automatic security updates for Nest-connected home devices from the date we start selling them on the U.S. Google Store. We’re also providing quarterly software Feature Drops for Pixel phones, updates for Nest products and extended software updates for Chromebooks.

You also have options if you’re not sure what to do with your phone when you don’t want it anymore. In select countries, our trade-in program accepts phones from a variety of major consumer electronics brands, so you can earn credit rewards for a new Pixel and rest assured that your old phone is responsibly recycled or reused. And with our recycling program, you can mail us your old or unused devices and we’ll responsibly recycle them for you. We make efforts to reclaim or recycle parts from used or returned devices, so that they don’t end up in landfills. We’re also always researching other safe and responsible ways to help you dispose of your old and outdated electronics.

When we built the first Pixel phone just five years ago, we made a commitment to design our hardware products in a way that’s sustainable and puts our customers first. There’s more to do, including expanding our repair network and improving repairability across our products. We look forward to sharing more as we make progress on this promise.

Snap at night, type responses in calls and more from Pixel

Today’s update marks our tenth Pixel Feature Drop 🎉 🎊. The latest updates begin rolling out to Pixel 3a through Pixel 5a (5G) devices today, while Pixel 6 and Pixel 6 Pro devices will begin receiving their updates later this month.

Night Sight in Snapchat

For those moments when the lighting isn’t quite right, Night Sight in Pixel Camera helps you capture clear low-light pictures and video. Now we're bringing Night Sight to Snapchat so you can snap vibrant, detailed, low-light videos or photos without flash. Give it a try on your Pixel 6 or Pixel 6 Pro by selecting low light mode in the Snapchat app to make sure you never miss a moment — even in the dark.

Communicating with captions

Animated GIF showing Live Caption being used on a phone call on a Pixel.

For people who can't or prefer not to speak on calls, there's a new way to communicate with Live Caption. Now when you’re on a phone call you can see captions of what the other person says and type back a response that will be read out loud on the other end.

Make messages and video calls more fun

Animated GIF showing how emoji stickers work on Pixel while texting.

A picture is worth a thousand words. When you’re typing in messaging apps, Gboard can convert your words into colorful stickers built with your exact text. With emoji, Emoji Kitchen and custom sticker suggestions while you type, your Pixel helps you express exactly how you feel. Coming to Pixel users globally typing in English (U.S.) starting today.

It's also getting easier to connect with friends and family on video calls: Host your own YouTube watch parties or share your favorite app with live sharing for Duo, all on your Pixel phone.

More languages. More ways to translate.

Live Translate on Pixel 6 and Pixel 6 Pro makes communicating in different languages easy. And with Interpreter mode, you can now translate your face-to-face conversations with Spanish, Italian and French speakers right on your phone, as all translations stay on device. Give it a try by saying, “Hey Google, be my Spanish interpreter.”

Animated GIF interpreter options.

Plus, Pixel 6 and Pixel 6 Pro can now identify Spanish in videos and other media, and translate it automatically into any of our supported languages: English, French, Italian, German, Japanese (beta) and Spanish.

Useful information at your fingertips

Your home and lock screens are the first things you see on your Pixel, and new widgets help make sure you get the info you want.

At a Glance on Pixel makes sure you get the most helpful information at just the right time. At a Glance will now surface even more — including the battery levels of your Pixel Buds and other Bluetooth-connected devices, a safety check countdown from the Personal Safety app, reminders to turn off your alarm if the next day is a holiday and earthquake alerts for your area — all right on your lock and home screen.

Animated GIF showing the new widget options.

And the new Pixel battery widget can show up-to-date battery information for your Pixel, as well as Bluetooth-connected devices like Pixel Buds. Add the widget to your home screen to keep an eye on battery life across your devices.

More helpful features for more Pixels

In addition to the features reaching Pixel 6 and Pixel 6 Pro, we’re also bringing existing features to more languages and devices.

Making sure everyone feels “Seen on Pixel”

The Super Bowl has always been a special moment for Google. From our first Super Bowl ad in 2010, “Parisian Love,” to our 2020 spot “Loretta,” we try to shine a light on the challenges we’re focused on solving with our technology and tell the stories of real people impacted by our products.

And today, we’re continuing this legacy with our latest Super Bowl ad, “Seen on Pixel,” which tells the story of Real Tone, Google’s years-long efforts to ensure all our camera and imaging products accurately represent all skin tones.

For too long, camera technology, including our own, has failed people of color by either making them look washed out or too unnaturally bright or dark. Because everyone deserves to be seen as they truly are, we are committed to addressing this gap. Internally, Googlers of color volunteered to test the camera on Pixel 6 before we launched it and provided input on what was working and what could be better. Externally, we partnered with image experts who spent months with our engineers, testing the camera and providing detailed and thoughtful feedback that helped improve our camera and editing products, including adding significantly more portraits of people of color in the image datasets that train our camera models. This collective teamwork allowed us to launch what we call Real Tone, with Pixel 6 as our first camera to feature these improvements.

Since the launch of Real Tone on Google Pixel 6 and Pixel 6 Pro last October, we have seen the difference camera representation can make. “Seen on Pixel” brings to life what Real Tone represents. It is a montage of beautiful photography of individuals and families from all walks of life, all photographed on Pixel 6 by our director Joshua Kissi and contributing photographers Deun Ivory and Aundre Larrow. We partnered with award-winning artist Lizzo, who truly embodies the spirit of our campaign by always being her authentic self, unapologetically. Her powerful vocals as the soundtrack bring “Seen on Pixel” to life with a preview of her new song, “If You Love Me.”

Representation and equity in everything should always be the norm and the default. And until we reach it, our goal at Google will always be to make gains in the world every day through our products and storytelling.

Accurate Alpha Matting for Portrait Mode Selfies on Pixel 6

Image matting is the process of extracting a precise alpha matte that separates foreground and background objects in an image. This technique has been traditionally used in the filmmaking and photography industry for image and video editing purposes, e.g., background replacement, synthetic bokeh and other visual effects. Image matting assumes that an image is a composite of foreground and background images, and hence, the intensity of each pixel is a linear combination of the foreground and the background.
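To make the matting equation concrete, here is a minimal per-pixel sketch in Python (an illustration only, not code from the Pixel camera pipeline):

```python
def composite(f, b, alpha):
    """Matting equation for one pixel channel: I = alpha * F + (1 - alpha) * B.
    alpha is the foreground opacity in [0, 1]: 1 keeps the foreground,
    0 keeps the background, and fractional values (e.g. a semi-transparent
    hair strand) mix the two."""
    return alpha * f + (1.0 - alpha) * b

# A pixel that is 30% foreground (f = 0.1) over a bright background (b = 0.9):
print(round(composite(0.1, 0.9, 0.3), 2))  # -> 0.66
```

Applying this equation per channel at every pixel is what enables effects like background replacement and synthetic bokeh.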

In the case of traditional image segmentation, the image is segmented in a binary manner, in which a pixel either belongs to the foreground or background. This type of segmentation, however, is unable to deal with natural scenes that contain fine details, e.g., hair and fur, which require estimating a transparency value for each pixel of the foreground object.

Alpha mattes, unlike segmentation masks, are usually extremely precise, preserving strand-level hair details and accurate foreground boundaries. While recent deep learning techniques have shown their potential in image matting, many challenges remain, such as generating accurate ground truth alpha mattes, improving generalization on in-the-wild images and performing inference on high-resolution images on mobile devices.

With the Pixel 6, we have significantly improved the appearance of selfies taken in Portrait Mode by introducing a new approach to estimate a high-resolution and accurate alpha matte from a selfie image. When synthesizing the depth-of-field effect, the usage of the alpha matte allows us to extract a more accurate silhouette of the photographed subject and have a better foreground-background separation. This allows users with a wide variety of hairstyles to take great-looking Portrait Mode shots using the selfie camera. In this post, we describe the technology we used to achieve this improvement and discuss how we tackled the challenges mentioned above.

Portrait Mode effect on a selfie shot using a low-resolution and coarse alpha matte compared to using the new high-quality alpha matte.

Portrait Matting
In designing Portrait Matting, we trained a fully convolutional neural network consisting of a sequence of encoder-decoder blocks to progressively estimate a high-quality alpha matte. We concatenate the input RGB image with a coarse alpha matte (generated using a low-resolution person segmenter) and pass it as input to the network. The new Portrait Matting model uses a MobileNetV3 backbone and a shallow (i.e., having a low number of layers) decoder to first predict a refined low-resolution alpha matte from a low-resolution image. Then we use a shallow encoder-decoder and a series of residual blocks to process a high-resolution image together with the refined alpha matte from the previous step. The shallow encoder-decoder relies more on lower-level features than the MobileNetV3 backbone, focusing on high-resolution structural features to predict final transparency values for each pixel. In this way, the model is able to refine an initial foreground alpha matte and accurately extract very fine details like hair strands. The proposed neural network architecture runs efficiently on Pixel 6 using TensorFlow Lite.

The network predicts a high-quality alpha matte from a color image and an initial coarse alpha matte. We use a MobileNetV3 backbone and a shallow decoder to first predict a refined low-resolution alpha matte. Then we use a shallow encoder-decoder and a series of residual blocks to further refine the initially estimated alpha matte.
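The coarse-to-fine idea behind this architecture — upsample a low-resolution matte, then apply a high-resolution correction — can be sketched in a few lines of Python. This toy example is ours, not the actual model: the `residual` values simply stand in for the refinement the network would predict.

```python
def upsample_nearest(matte, factor):
    """Nearest-neighbor upsampling of a 2D alpha matte (lists of floats)."""
    return [[v for v in row for _ in range(factor)]
            for row in matte for _ in range(factor)]

def refine(coarse_up, residual):
    """Coarse-to-fine step: add a high-resolution residual correction to the
    upsampled coarse matte, clamping each alpha value to [0, 1]."""
    return [[min(1.0, max(0.0, c + r)) for c, r in zip(c_row, r_row)]
            for c_row, r_row in zip(coarse_up, residual)]

coarse = [[1.0, 0.0]]               # 1x2 coarse matte: subject left, background right
up = upsample_nearest(coarse, 2)    # -> 2x4 matte with a blocky boundary
residual = [[0.0, -0.2, 0.1, 0.0],  # hypothetical corrections recovering a
            [0.0, -0.1, 0.0, 0.0]]  # soft, hair-like transition at the edge
refined = refine(up, residual)
print(refined[0])  # -> [1.0, 0.8, 0.1, 0.0]
```

The real model predicts these corrections from high-resolution image features rather than taking them as input, but the data flow — coarse estimate in, refined high-resolution matte out — is the same.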

Most recent deep learning work for image matting relies on manually annotated per-pixel alpha mattes used to separate the foreground from the background that are generated with image editing tools or green screens. This process is tedious and does not scale for the generation of large datasets. Also, it often produces inaccurate alpha mattes and foreground images that are contaminated (e.g., by reflected light from the background, or “green spill”). Moreover, this does nothing to ensure that the lighting on the subject appears consistent with the lighting in the new background environment.

To address these challenges, Portrait Matting is trained using a high-quality dataset generated using a custom volumetric capture system, Light Stage. Compared with previous datasets, this is more realistic, as relighting allows the illumination of the foreground subject to match the background. Additionally, we supervise the training of the model using pseudo–ground truth alpha mattes from in-the-wild images to improve model generalization, explained below. This ground truth data generation process is one of the key components of this work.

Ground Truth Data Generation
To generate accurate ground truth data, Light Stage produces near-photorealistic models of people using a geodesic sphere outfitted with 331 custom color LED lights, an array of high-resolution cameras, and a set of custom high-resolution depth sensors. Together with Light Stage data, we compute accurate alpha mattes using time-multiplexed lights and a previously recorded “clean plate”. This technique is also known as ratio matting.

This method works by recording an image of the subject silhouetted against an illuminated background as one of the lighting conditions. In addition, we capture a clean plate of the illuminated background. The silhouetted image, divided by the clean plate image, provides a ground truth alpha matte.
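Numerically, ratio matting can be sketched as follows (our illustration, not Google's capture code): dividing the backlit silhouette frame by the clean plate gives the fraction of background light transmitted at each pixel, and taking alpha as foreground opacity, it is the complement of that ratio. The sign convention here is our assumption.

```python
def ratio_matte(silhouette, clean_plate, eps=1e-6):
    """Ratio-matting sketch for one pixel. `silhouette` is the pixel value
    with the subject backlit by the illuminated background; `clean_plate`
    is the same pixel with no subject present. Their ratio is the fraction
    of background light transmitted; alpha (foreground opacity) is taken as
    its complement, clamped to [0, 1]. Sign convention assumed for
    illustration."""
    t = silhouette / max(clean_plate, eps)   # transmitted light fraction
    return min(1.0, max(0.0, 1.0 - t))

print(ratio_matte(0.0, 0.8))  # -> 1.0  (opaque torso blocks all backlight)
print(ratio_matte(0.4, 0.8))  # -> 0.5  (hair strand lets half the light through)
print(ratio_matte(0.8, 0.8))  # -> 0.0  (pure background)
```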

Then, we extrapolate the recorded alpha mattes to all the camera viewpoints in Light Stage using a deep learning–based matting network that leverages captured clean plates as an input. This approach allows us to extend the alpha mattes computation to unconstrained backgrounds without the need for specialized time-multiplexed lighting or a clean background. This deep learning architecture was solely trained using ground truth mattes generated using the ratio matting approach.

Computed alpha mattes from all camera viewpoints at the Light Stage.

Leveraging the reflectance field for each subject and the alpha matte generated with our ground truth matte generation system, we can relight each portrait using a given HDR lighting environment. We composite these relit subjects into backgrounds corresponding to the target illumination following the alpha blending equation. The background images are then generated from the HDR panoramas by positioning a virtual camera at the center and ray-tracing into the panorama from the camera’s center of projection. We ensure that the projected view into the panorama matches its orientation as used for relighting. We use virtual cameras with different focal lengths to simulate the different fields-of-view of consumer cameras. This pipeline produces realistic composites by handling matting, relighting, and compositing in one system, which we then use to train the Portrait Matting model.

Composited images on different backgrounds (high-resolution HDR maps) using ground truth generated alpha mattes.

Training Supervision Using In-the-Wild Portraits
To bridge the gap between portraits generated using Light Stage and in-the-wild portraits, we created a pipeline to automatically annotate in-the-wild photos generating pseudo–ground truth alpha mattes. For this purpose, we leveraged the Deep Matting model proposed in Total Relighting to create an ensemble of models that computes multiple high-resolution alpha mattes from in-the-wild images. We ran this pipeline on an extensive dataset of portrait photos captured in-house using Pixel phones. Additionally, during this process we performed test-time augmentation by doing inference on input images at different scales and rotations, and finally aggregating per-pixel alpha values across all estimated alpha mattes.
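The final aggregation step — combining per-pixel alpha values across the ensemble's estimates — can be sketched as a simple mean (the exact aggregation used in the pipeline may differ):

```python
def aggregate_mattes(mattes):
    """Average per-pixel alpha values across a list of equally sized
    2D mattes (one estimate per model / test-time augmentation)."""
    n = len(mattes)
    h, w = len(mattes[0]), len(mattes[0][0])
    out = [[0.0] * w for _ in range(h)]
    for m in mattes:
        for y in range(h):
            for x in range(w):
                out[y][x] += m[y][x] / n
    return out

# Three hypothetical estimates for a 1x2 crop (e.g. from different input scales):
preds = [[[1.0, 0.2]], [[0.9, 0.4]], [[0.8, 0.0]]]
mean_matte = aggregate_mattes(preds)
print([[round(v, 3) for v in row] for row in mean_matte])  # -> [[0.9, 0.2]]
```

Averaging assumes each augmented prediction has already been mapped back to the original image frame (inverting any scaling or rotation) so that pixels line up.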

Generated alpha mattes are visually evaluated with respect to the input RGB image. The alpha mattes that are perceptually correct, i.e., following the subject's silhouette and fine details (e.g., hair), are added to the training set. During training, both datasets are sampled using different weights. Using the proposed supervision strategy exposes the model to a larger variety of scenes and human poses, improving its predictions on photos in the wild (model generalization).

Estimated pseudo–ground truth alpha mattes using an ensemble of Deep Matting models and test-time augmentation.

Portrait Mode Selfies
The Portrait Mode effect is particularly sensitive to errors around the subject boundary (see image below). For example, errors caused by the usage of a coarse alpha matte keep sharp focus on background regions near the subject boundaries or hair area. The usage of a high-quality alpha matte allows us to extract a more accurate silhouette of the photographed subject and improve foreground-background separation.

Try It Out Yourself
We have improved front-facing camera Portrait Mode on the Pixel 6 by raising alpha matte quality, resulting in fewer errors in the final rendered image and a better-looking blurred background around the hair region and subject boundary. Additionally, our ML model uses diverse training datasets that cover a wide variety of skin tones and hair styles. You can try this improved version of Portrait Mode by taking a selfie shot with the new Pixel 6 phones.

Portrait Mode effect on a selfie shot using a coarse alpha matte compared to using the new high quality alpha matte.

This work wouldn’t have been possible without Sergio Orts Escolano, Jana Ehmann, Sean Fanello, Christoph Rhemann, Junlan Yang, Andy Hsu, Hossam Isack, Rohit Pandey, David Aguilar, Yi Jinn, Christian Hane, Jay Busch, Cynthia Herrera, Matt Whalen, Philip Davidson, Jonathan Taylor, Peter Lincoln, Geoff Harvey, Nisha Masharani, Alexander Schiffhauer, Chloe LeGendre, Paul Debevec, Sofien Bouaziz, Adarsh Kowdle, Thabo Beeler, Chia-Kai Liang and Shahram Izadi. Special thanks to our photographers James Adamson, Christopher Farro and Cort Muller who took numerous test photographs for us.

Source: Google AI Blog

So you got new gear for the holidays. Now what?

The new year is here, and the holidays are (officially) over. If you were gifted a new Google gadget, that means it’s time to get your new gear out of the box and into your home or pocket.

We talked to the experts here at Google and asked for a few of their quick setup tips, so you can get straight to using your new… whatever you got… right away.

So you got a Pixel 6 Pro…

  1. Begin by setting up fingerprint unlock for quick and easy access.
  2. Prepare for future emergencies and turn on the extreme battery saver feature in the settings app. Extreme battery saver can extend your Pixel 6 Pro’s battery life by intelligently pausing apps and slowing processes, and you can preselect when you want to enable the feature — and what your priority apps are.
  3. Create a personal aesthetic with Material You by customizing your wallpaper and interface designs, giving your Pixel 6 Pro’s display a cohesive look that expresses your character.

So you got a Nest Hub Max…

  1. First, set up Face Match to ensure your Nest Hub Max can quickly identify you as the user and share a more personal experience. Then, when you walk up to the device it can do things like present your daily schedule, play your favorite playlist or suggest recommended videos, news and podcasts.
  2. Set up a Duo account for video calling and messaging with your friends and family. From there, you can ask Nest Hub Max to call anyone in your Google contacts who has Duo — just say, “Hey Google, call (your contact name).” For family members or friends who don't already have Duo, the app is free and available for download on both Android and iOS.
  3. Be sure to connect your Nest Hub Max to any other Google gear, such as the Chromecast and Nest Mini for a smart home experience.
The Nest Hub Max in front of a white background.

The Nest Hub Max.

So you got the new Nest Thermostat…

  1. Use Quick Schedule to easily and quickly get your thermostat programmed. You can go with its recommended presets or adjust the settings further to create a custom schedule. You can make changes to your schedule anytime from the Home app.
  2. Then you can opt in to Home and Away Routines, which can help you avoid heating or cooling an empty house by using motion sensing and your phone’s location to know when nobody’s home and adjust the temperature accordingly to save energy.
  3. Make sure you’ve enabled notifications, and Savings Finder will proactively suggest small tweaks to your schedule that you can accept from the Home app. For example, it might suggest a small change to your sleep temperature to save you energy.

So you got the new Pixel Buds A-Series…

  1. Check out the Pixel Buds A-Series’ latest feature, the bass customization option, to find your perfect sound. This addition doubles the bass range when connected to an Android 6.0 device, and can be adjusted on a scale from -1 to 4 by using the Pixel Buds app.
  2. Here’s a hardware tip: Try out the three different ear tip fit options to find the most comfortable fit for you.
  3. Start listening to your favorite podcasts and music right away by using Fast Pair to immediately connect your Pixel Buds to your phone.

Creator Labs artists take on the Pixel 6

“As humans we are constantly trying to understand ourselves … this is a universal experience, both socially and culturally. I find myself currently in a state of asking questions relating to my own sense of self.” This was what photographer MaryV was thinking while she was working on her latest project with Creator Labs.

Photograph of a woman sitting on a white horse. There is a pink and purple sunset in the background.

Photography by MaryV

Following the launch of Google Pixel 6 Pro in October, MaryV and 12 other lens-based artists were tasked with exploring the idea of “For All You Are,” a prompt referencing why we started the Creator Labs program: We want to give artists the tools to tell their own stories, in their own unique voices.

This year, Creator Labs artists were also able to use Real Tone on Google Pixel 6, the result of a multi-year mission to make best-in-class smartphone cameras that photograph skin more equitably. As part of this initiative, the Pixel team made a suite of improvements and changes across how Pixel’s camera and supporting imagery products work to highlight the nuances of different skin tones beautifully and authentically.

One theme we saw multiple artists focus on was “ancestry,” both from the perspective of honoring traditions and redefining what constitutes family. Anthony Prince Leslie reimagined African folklore with his piece “Spyda,” which in his words showcases “the resilience of the Black diaspora and the importance of storytelling as a method of preserving history.” Texas Isaiah paid homage to his childhood home in East New York, Brooklyn. As the first generation of his family born in the U.S., he never spent time with his extended family. So as a child, his home was filled with native Canadian and South American photographs, souvenirs and other materials his family had collected over more than 30 years.

Myles Loftin challenged “traditional” family structures with his piece by documenting and honoring his chosen family in New York City. This is an extension of a larger body of work called “In The Life,” which centers on Black Queer life.

Photograph of three people standing together, hugging and leaning into one another. There is blue sky and white cloud in the background.

Photography by Myles Loftin

While each artist’s work is unique, they all invite us to reflect and be vulnerable.

Other Creator Labs artists include Mayan Toledano, Pegah Farahmand, Kennedi Carter, Aidan Cullen, Andre Wagner, Tim Kellner, Natalia Mantini, Josh Goldenberg (glassface) and June Canedo. You can see examples of their work and more from the artists above on the Pixel Instagram page.