Tag Archives: Pixel

Ask a Techspert: How does motion sensing work?

Editor’s Note: Do you ever feel like a fish out of water? Try being a tech novice and talking to an engineer at a place like Google. Ask a Techspert is a series on the Keyword asking Googler experts to explain complicated technology for the rest of us. This isn’t meant to be comprehensive, but just enough to make you sound smart at a dinner party. 

Thanks to my allergies, I’ve never had a cat. They’re cute and cuddly for about five minutes—until the sneezing and itching set in. Still, I’m familiar enough with cats (and cat GIFs) to know that they always have a paw in the air, whether it’s batting at a toy or trying to get your attention. Whatever it is they’re trying to do, it often looks like they’re waving at us. So imagine my concern when I found out that you can now change songs, snooze alarms or silence your phone ringing on your Pixel 4 with the simple act of waving. What if precocious cats everywhere started unintentionally making us sleep late by waving their paws?

Fortunately, that’s not a problem. Google’s motion sensing radar technology—a feature called Motion Sense in the Pixel 4—is designed so that only human hands, as opposed to cat paws, can change the tracks on your favorite playlist. So how does this motion sensing actually work, and how did Google engineers design it to identify specific motions? 

To answer my questions, I found our resident expert on motion sensors, Brandon Barbello. Brandon is a product manager on our hardware team and he helped me unlock the mystery behind the motion sensors on your phone, and how they only work for humans. 

When I’m waving my hand in front of my screen, how can my phone sense something is there? 

Brandon tells me that your Pixel phone has a chip at the top with a series of antennas, some of which emit a radio signal and others that receive “bounce backs” of the same signal the other antenna emitted. “Those radio signals go out into the world, and then they hit things and bounce back. The receiver antennas read the signals as they bounce back and that’s how they’re able to sense something has happened. Your Pixel actually has four antennas: One that sends out signals, and three that receive.”
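The out-and-back geometry Brandon describes can be sketched in a few lines. This is purely illustrative (it is not how the Pixel's radar chip actually processes signals): a radar can infer distance from how long a reflected radio signal takes to return.

```python
# Illustrative sketch only -- not the Pixel's actual radar processing.
# A radio wave travels to the object and back, so the one-way distance
# is half the total path length covered in the round-trip time.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Distance to the reflecting object, in meters."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A reflection returning after roughly 6.7 nanoseconds came from about 1 m away.
```

In practice the chip works with far subtler quantities (phase and frequency shifts at very short range), but this out-and-back geometry is the starting point.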

What happens after the antenna picks up the motion? 

According to Brandon, when the radio waves bounce back, the computer in your phone begins to process the information. “Essentially, the sensor picks up that you’re around, and that triggers your phone to keep an eye out for the relevant gestures,” he says.

How does the Pixel detect that a motion is a swipe and not something else? 

With the motion sensing functions on the Pixel, Brandon and his team use machine learning to determine what happened. “Those radio waves get analyzed and reduced into a series of numbers that can be fed into the machine learning models that detect if a reach or a swipe has just happened,” Brandon says. “We collected millions of motion samples to pre-train each phone to recognize intentional swipes. Specifically, we’ve trained the models to detect motions that look like they come from a human hand, and not, for instance, a coffee mug passing over the phone as you put it down on the table.”
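The pipeline Brandon describes — reduce the reflected signal to a series of numbers, then let a trained model label the motion — can be caricatured with a toy nearest-centroid classifier. Everything here is hypothetical: the feature space, the centroid values and the labels are invented for illustration, and the real models are learned from millions of samples.

```python
import numpy as np

# Hypothetical 3-D feature space (e.g. motion speed, spatial extent,
# reflection strength) with made-up "pre-trained" class centroids.
CENTROIDS = {
    "swipe": np.array([0.8, 0.6, 0.7]),
    "not_a_gesture": np.array([0.1, 0.9, 0.2]),  # e.g. a passing coffee mug
}

def classify_motion(features: np.ndarray) -> str:
    """Label a motion by its nearest centroid in feature space."""
    return min(CENTROIDS, key=lambda label: np.linalg.norm(features - CENTROIDS[label]))
```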

What will motion sensors be capable of in the future? 

Brandon told me that he and his team plan to add more gestures to recognize beyond swiping, and that specific movements could be connected to more apps. “In the future, we want to create devices that can understand your body language, so they’re more intuitive to use and more helpful,” he tells me. 

At the moment, motion-sensing technology is focused on the practical, and there are still improvements to be made and new ground to cover, but he says this technology can also be delightful and fun—like on the Pixel’s gesture-controlled Pokémon Live Wallpaper. Overall, motion sensing technology helps you use your devices in a whole new way, and that will keep changing as the tech advances. "We're just beginning to see the potential of motion sensing," Brandon says.

Made by Google’s 20 tips for 2020

The new year is a time for resolutions and reflection, from getting organized to creating some healthy habits. And there are more than a few ways that the tech in your home and in your pocket can help you get there. 

If you received a Made by Google device over the holidays—or you’ve owned one for a while—consider these pro tips for getting the most out of them. We’re sharing 20 fun features and tricks available across a variety of devices to try, plus expert advice for adding an extra layer of protection to your accounts across the web.

  1. Turn off distractions. With the new Focus mode, found in Pixel's device settings under "Digital Wellbeing & parental controls," you can temporarily pause and silence certain apps so you can focus on the task at hand. While you’re working out, during your commute or while you’re trying to take a moment to yourself, Focus mode gives you control over which apps you need notifications from and when.

  2. Capture one-of-a-kind photos. With Pixel, you can snap great pictures year-round using features like Portrait Mode, Photobooth and even Night Sight, which allows you to shoot photos of the stars. See g.co/pixel/astrophotography to learn more about astrophotography on Pixel 4.

  3. Outsmart robocalls. U.S.-based, English-speaking Pixel owners can use Call Screen on Pixel to automatically screen spam calls, so you can avoid calls from unknown numbers and limit interruptions throughout your day (Call Screen is new and may not detect all robocalls, but it will definitely try!).

  4. Try wall-mounting your Nest Mini. Nest Mini comes with wall-mounting capabilities, which come in handy if you’re short on counter space. Wall-mounting also helps you take advantage of its improved bass and full sound.

  5. Stress-free healthy cooking. If you’re trying to eat more fresh fruits and vegetables, don’t sweat meal planning: Get easy inspiration from Nest Hub or Nest Hub Max. Say “Hey Google, show me recipes with spinach, lentils and tomatoes” and you’ll see ideas to scroll through, select, and follow step-by-step.

  6. Stay in touch. We could all do better at keeping in touch with loved ones. Nest Hub Max offers the option to make video calls using Google Duo, so you can catch up with mom face-to-face right from your display. 

  7. Get help with delegating. Create Assignable reminders for other members of your household, like reminding your partner to walk the dog. Face Match will show them any missed reminders automatically when they approach Hub Max. You can also use reminders to send someone a note of encouragement when they need it the most (“Hey Google, remind Kathy that she’ll do great in tomorrow’s interview”).

  8. View and share your favorite photos. Enjoy your favorite moments from Google Photos on Nest Hub Max’s 10-inch high definition screen. See a photo pop up that brings a smile to your face? Share it with one of your contacts: “Hey Google, share this photo with Mom.” Or if you see an old memory and can’t remember the location, just ask “Hey Google, where was this photo taken?”

  9. Check your Wi-Fi easily. You can use a Nest Wifi point the same way you use a Google Nest speaker. Simply say, “Hey Google, what’s my internet speed?” or “Hey Google, pause Wi-Fi for Daniel” to pause individual users’ devices at certain times, like during dinner.

  10. Have a worry-free work week. The Talk and Listen feature on Nest Hello makes it easy for busy families to keep in touch throughout the day. When you see Nest Hello start recording, you can share your status with your family members who have access to Nest Hello’s camera feed. It’ll become a quick video they can view on their phones.

  11. Keep track of deliveries. Nest Hello also detects packages for Nest Aware users—helpful if you’re expecting something important. 

  12. Choose when your cameras record. You can schedule your Nest cameras to automatically turn off on the weekends and back on again during the week (or during the time frame you prefer). To do this, turn off Home/Away Assist and create your schedule.

  13. Control what you save. While your Nest Cam video history automatically expires after a specific time frame depending on your Nest Aware subscription, you can also manually delete footage anytime. Simply select the “Delete video history” option in your camera’s settings.

  14. Skip the monthly gym fee. Few things are more difficult in the dead of winter than driving to a gym first thing in the morning. Choose a more manageable routine: Pull up a workout from YouTube or Daily Burn and cast it to your TV with Chromecast, so you can sweat while the coffee is brewing. 

  15. New partners, new content. Over the past few months we’ve introduced new content partners for Chromecast and displays so you have tons of movies and TV shows to choose from based on your subscriptions, including Disney+, Amazon Prime Video, Hulu and Sling TV.

  16. Attention gamers! If you own a standalone Chromecast Ultra, you can play Stadia on it if you have an existing Stadia account. Link your Stadia controller to your Chromecast Ultra and you’re ready to go. For best results, connect an Ethernet cable to your Chromecast Ultra.

  17. Save on your energy bill. On your Nest Thermostat, seeing the Nest Leaf is an easy way to know you’re saving energy, and it encourages you to continually improve your savings over time. You’ll see the Leaf on your thermostat when you set a temperature that helps save energy. The more often you see a Leaf, the more you save.

  18. Enable 2-factor authentication, or migrate to a Google account. 2-factor authentication uses a secondary confirmation to make it harder for unauthorized people to access your account. Migrating to a Google account provides automatic security protections, proactive alerts about suspicious account activity and the Security Checkup.

  19. Give your passwords a makeover. Reusing passwords makes your accounts more vulnerable to common hacks, so make sure each password you use is unique and complicated.

  20. Enlist extra protection from Chrome. When you type your credentials into a website, Chrome will now warn you if your username and password have been compromised in a data breach on some site or app. It will suggest that you change them everywhere they were used.

Cheers to a new decade—and some new gear! 

Capture your holiday on Pixel 4

Who doesn’t love holiday photos? Luckily for Pixel 4 owners, the camera on your phone is packed with all of the features you need to get the perfect picture every time, year round.

All is calm, all photos are bright

Holiday decorations and lights can make it difficult to capture that perfectly lit photo. Pixel 4’s Dual Exposure Controls ensure that no matter how decked your halls are, you always get the great photo you want by giving you control over the lighting, silhouettes and exposure in your shots.



Celebrate the festival of lights no matter how dark it is thanks to Night Sight on Pixel 4. As your family gathers around the menorah, snap a great picture of your loved ones’ faces lit by candle light by using the low-light photography mode on Pixel. And for those staying up and waiting for Santa, use Night Sight to capture the stockings hung by the chimney with care even as the fire dwindles. 


Ring in the New Year in new ways

Looking for an evening activity once the presents are unwrapped? Dec. 25 is also the start of a new moon, making it the best night for a photo of the sky. And if you’re planning to ring in the new year under the stars, Pixel 4 is the best companion, with the ability to capture astrophotography on Night Sight.


Bust out those old New Year’s Eve or family holiday photos for the perfect throwback holiday season pic, and use Portrait Blur, now available on Pixel devices in Google Photos, to give them new life, even years after they’ve been taken. 



Whether you’re gathered around the dining room table, the menorah or Christmas tree, or watching the ball drop in Times Square, your Pixel’s camera is your perfect companion. 


Happy holiday photos to all and to all a good night! 


Improvements to Portrait Mode on the Google Pixel 4 and Pixel 4 XL



Portrait Mode on Pixel phones is a camera feature that allows anyone to take professional-looking shallow depth of field images. Launched on the Pixel 2 and then improved on the Pixel 3 by using machine learning to estimate depth from the camera’s dual-pixel auto-focus system, Portrait Mode draws the viewer’s attention to the subject by blurring out the background. A critical component of this process is knowing how far objects are from the camera, i.e., the depth, so that we know what to keep sharp and what to blur.

With the Pixel 4, we have made two more big improvements to this feature, leveraging both the Pixel 4’s dual cameras and dual-pixel auto-focus system to improve depth estimation, allowing users to take great-looking Portrait Mode shots at near and far distances. We have also improved our bokeh, making it more closely match that of a professional SLR camera.
Pixel 4’s Portrait Mode allows for portrait shots at both near and far distances and has SLR-like background blur. (Photo credit: Alain Saal-Dalma and Mike Milne)
A Short Recap
The Pixel 2 and 3 used the camera’s dual-pixel auto-focus system to estimate depth. Dual-pixels work by splitting every pixel in half, such that each half pixel sees a different half of the main lens’ aperture. By reading out each of these half-pixel images separately, you get two slightly different views of the scene. While these views come from a single camera with one lens, it is as if they originate from a virtual pair of cameras placed on either side of the main lens’ aperture. As you alternate between these views, the subject stays in the same place while the background appears to move vertically.
The dual-pixel views of the bulb have much more parallax than the views of the man because the bulb is much closer to the camera.
This motion is called parallax and its magnitude depends on depth. One can estimate parallax and thus depth by finding corresponding pixels between the views. Because parallax decreases with object distance, it is easier to estimate depth for near objects like the bulb. Parallax also depends on the length of the stereo baseline, that is the distance between the cameras (or the virtual cameras in the case of dual-pixels). The dual-pixels’ viewpoints have a baseline of less than 1mm, because they are contained inside a single camera’s lens, which is why it’s hard to estimate the depth of far scenes with them and why the two views of the man look almost identical.
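The relationship between parallax, baseline and depth can be written down with a simple pinhole-stereo model. The numbers below are assumptions for illustration (the focal length in particular is invented), but they show why a 13 mm dual-camera baseline yields far more parallax than a sub-millimeter dual-pixel one:

```python
# Pinhole-stereo sketch: disparity (parallax, in pixels) is proportional to
# the baseline and inversely proportional to depth. The 3000 px focal
# length is an assumed, illustrative value.
def disparity_pixels(focal_px: float, baseline_m: float, depth_m: float) -> float:
    return focal_px * baseline_m / depth_m

dual_pixel = disparity_pixels(3000, 0.001, 3.0)   # ~1 mm baseline, subject 3 m away
dual_camera = disparity_pixels(3000, 0.013, 3.0)  # 13 mm dual-camera baseline
# The dual-camera parallax is about 13x larger, which is why far-away
# depth is much easier to estimate from the camera pair.
```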

Dual Cameras are Complementary to Dual-Pixels
The Pixel 4’s wide and telephoto cameras are 13 mm apart, much greater than the dual-pixel baseline, and so the larger parallax makes it easier to estimate the depth of far objects. In the images below, the parallax between the dual-pixel views is barely visible, while it is obvious between the dual-camera views.
Left: Dual-pixel views. Right: Dual-camera views. The dual-pixel views have only a subtle vertical parallax in the background, while the dual-camera views have much greater horizontal parallax. While this makes it easier to estimate depth in the background, some pixels to the man’s right are visible in only the primary camera’s view making it difficult to estimate depth there.
Even with dual cameras, information gathered by the dual pixels is still useful. The larger the baseline, the more pixels are visible in one view without a corresponding pixel in the other. For example, the background pixels immediately to the man’s right in the primary camera’s image have no corresponding pixel in the secondary camera’s image. Thus, it is not possible to measure the parallax to estimate the depth for these pixels when using only dual cameras. However, these pixels can still be seen by the dual-pixel views, enabling a better estimate of depth in these regions.

Another reason to use both inputs is the aperture problem, described in our previous blog post, which makes it hard to estimate the depth of vertical lines when the stereo baseline is also vertical (or when both are horizontal). On the Pixel 4, the dual-pixel and dual-camera baselines are perpendicular, allowing us to estimate depth for lines of any orientation.

Having this complementary information allows us to estimate the depth of far objects and reduce depth errors for all scenes.

Depth from Dual Cameras and Dual-Pixels
We showed last year how machine learning can be used to estimate depth from dual-pixels. With Portrait Mode on the Pixel 4, we extended this approach to estimate depth from both dual-pixels and dual cameras, using TensorFlow to train a convolutional neural network. The network first separately processes the dual-pixel and dual-camera inputs using two different encoders, a type of neural network that encodes the input into an intermediate representation. Then, a single decoder uses both intermediate representations to compute depth.
Our network to predict depth from dual-pixels and dual-cameras. The network uses two encoders, one for each input and a shared decoder with skip connections and residual blocks.
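The two-encoder, one-decoder shape can be sketched schematically like this — in plain NumPy rather than TensorFlow, with tiny made-up dimensions and random untrained weights standing in for real convolutional blocks, just to show the dataflow:

```python
import numpy as np

def encode(x, w):
    # Each encoder maps its input to an intermediate representation.
    return np.tanh(w @ x)

def decode(rep_dp, rep_dc, w):
    # The shared decoder fuses both representations into a depth estimate.
    return w @ np.concatenate([rep_dp, rep_dc])

rng = np.random.default_rng(0)
dual_pixel_input = rng.normal(size=4)    # stand-ins for the real image inputs
dual_camera_input = rng.normal(size=4)
w_dp, w_dc = rng.normal(size=(3, 4)), rng.normal(size=(3, 4))
w_dec = rng.normal(size=(1, 6))

depth = decode(encode(dual_pixel_input, w_dp),
               encode(dual_camera_input, w_dc), w_dec)
```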
To force the model to use both inputs, we applied a drop-out technique, where one input is randomly set to zero during training. This teaches the model to work well if one input is unavailable, which could happen if, for example, the subject is too close for the secondary telephoto camera to focus on.
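The drop-out trick can be sketched like this — a minimal illustration of the idea, not the actual training code:

```python
import random

def drop_one_input(dual_pixel, dual_camera, p_drop=0.2):
    """With probability p_drop, zero out one randomly chosen input so the
    model learns to produce depth even when an input is unavailable."""
    if random.random() < p_drop:
        if random.random() < 0.5:
            dual_pixel = [0.0] * len(dual_pixel)
        else:
            dual_camera = [0.0] * len(dual_camera)
    return dual_pixel, dual_camera
```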
Depth maps from our network where either only one input is provided or both are provided. Top: The two inputs provide depth information for lines in different directions. Bottom: Dual-pixels provide better depth in the regions visible in only one camera, emphasized in the insets. Dual-cameras provide better depth in the background and ground. (Photo Credit: Mike Milne)
The lantern image above shows how having both signals solves the aperture problem. Having one input only allows us to predict depth accurately for lines in one direction (horizontal for dual-pixels and vertical for dual-cameras). With both signals, we can recover the depth on lines in all directions.

With the image of the person, dual-pixels provide better depth information in the occluded regions between the arm and torso, while the large baseline dual cameras provide better depth information in the background and on the ground. This is most noticeable in the upper-left and lower-right corner of depth from dual-pixels. You can find more examples here.

SLR-Like Bokeh
Photographers obsess over the look of the blurred background or bokeh of shallow depth of field images. One of the most noticeable things about high-quality SLR bokeh is that small background highlights turn into bright disks when defocused. Defocusing spreads the light from these highlights into a disk. However, the original highlight is so bright that even when its light is spread into a disk, the disk remains at the bright end of the SLR’s tonal range.
Left: SLRs produce high contrast bokeh disks. Middle: It is hard to make out the disks in our old background blur. Right: Our new bokeh is closer to that of an SLR.
To reproduce this bokeh effect, we replaced each pixel in the original image with a translucent disk whose size is based on depth. In the past, this blurring process was performed after tone mapping, the process by which raw sensor data is converted to an image viewable on a phone screen. Tone mapping compresses the dynamic range of the data, making shadows brighter relative to highlights. Unfortunately, this also results in a loss of information about how bright objects actually were in the scene, making it difficult to produce nice high-contrast bokeh disks. Instead, the bokeh blends in with the background, and does not appear as natural as that from an SLR.

The solution to this problem is to blur the merged raw image produced by HDR+ and then apply tone mapping. In addition to the brighter and more obvious bokeh disks, the background is saturated in the same way as the foreground. Here’s an album showcasing the better blur, which is available on the Pixel 4 and the rear camera of the Pixel 3 and 3a (assuming you have upgraded to version 7.2 of the Google Camera app).
Blurring before tone mapping improves the look of the backgrounds by making them more saturated and the bokeh disks higher contrast.
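A toy calculation makes the ordering argument concrete. Here tone mapping is caricatured as a simple compressive curve, and "blurring" is just averaging a very bright highlight with a dark neighboring pixel; all values are invented for illustration:

```python
def tone_map(raw: float) -> float:
    # Caricature of a compressive tone curve: shadows are boosted
    # relative to highlights, and the result clips to a displayable 0..1.
    return min(1.0, raw ** 0.25)

highlight, shadow = 16.0, 0.01  # made-up raw scene brightness values

# Blur AFTER tone mapping: the highlight was already clipped to 1.0,
# so its blurred disk averages down toward a dull mid-gray.
disk_after = (tone_map(highlight) + tone_map(shadow)) / 2

# Blur BEFORE tone mapping: the raw highlight dominates the average,
# and the disk stays at the bright end of the tonal range.
disk_before = tone_map((highlight + shadow) / 2)

# disk_before > disk_after: blurring first keeps the bokeh disk bright.
```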
Try it Yourself
We have made Portrait Mode on the Pixel 4 better by improving depth quality, resulting in fewer errors in the final image, and by improving the look of the blurred background. Depth from dual cameras and dual-pixels only kicks in when the camera is at least 20 cm from the subject, i.e., the minimum focus distance of the secondary telephoto camera. So consider keeping your phone at least that far from the subject to get better quality portrait shots.

Acknowledgments
This work wouldn’t have been possible without Rahul Garg, Sergio Orts Escolano, Sean Fanello, Christian Haene, Shahram Izadi, David Jacobs, Alexander Schiffhauer, Yael Pritch Knaan and Marc Levoy. We would also like to thank the Google Camera team for helping to integrate these algorithms into the Pixel 4. Special thanks to our photographers Mike Milne, Andy Radin, Alain Saal-Dalma, and Alvin Li who took numerous test photographs for us.

Source: Google AI Blog


Let Google be your holiday travel tour guide

When it comes to travel, I’m a planner. I’m content to spend weeks preparing the perfect holiday getaway: deciding on the ideal destination, finding the cheapest flights and sniffing out the best accommodations. I’ve been dreaming about a trip to Greece next year, and—true story—I’ve already got a spreadsheet to compare potential destinations, organized by flight length and hotel perks.

But the thing I don’t like to do is plot out the nitty-gritty details. I want to visit the important museums and landmarks, but I don’t want to write up a daily itinerary ahead of time. I’m a vegetarian, so I need to find veggie-friendly restaurants, but I’d prefer to stumble upon a good local spot than plan in advance. And, since I don’t speak Greek, I want to be able to navigate transportation options without having to stop and ask people for help all the time.

So I’ve come to rely on some useful Google tools to make my trips work for the way I like to travel. Here’s what I’ve learned so far.

Let Maps do the talking

Getting dropped into a new city is disorienting, and all the more so when you need to ask for help but don’t know how to pronounce the name of the place you’re trying to get to. Google Maps now has a fix for this: When you’ve got a place name up in Maps, just press the new little speaker button next to it, and it will speak out a place's name and address in the local lingo. And if you want to continue the conversation, Google Maps will quickly link you to the Google Translate app.

gif of Google Translate feature in Google Maps

Let your phone be your guidebook

New cities are full of new buildings, new foods and even new foliage. But I don’t want to just see these things; I want to learn more about them. That’s where Google Lens comes in as my know-it-all tour guide and interpreter. It can translate a menu, tell me about the landmark I’m standing in front of or identify a tree I’ve never seen before. So whenever I think, “I wonder what that building is for,” I can just use my camera to get an answer in real time. 

using Google Lens to identify a flower

Photo credit: Joao Nogueira

Get translation help on the go

The Google Assistant’s real-time translation feature, interpreter mode, is now available on Android and iOS phones worldwide, enabling you to have a conversation with someone speaking a foreign language. So if I say, “Hey Google, be my Greek translator,” I can easily communicate with, say, a restaurant server who doesn’t speak English. Interpreter mode works across 44 languages, and it features different ways to communicate suited to your situation: you can type using a keyboard for quiet environments, or manually select what language to speak.

gif of Google Assistant interpreter mode

Use your voice to get things done

Typing is fine, but talking is easier, especially when I’m on vacation and want to make everything as simple as possible. The Google Assistant makes it faster to find what I’m looking for and plan what’s next, like weather forecasts, reminders and wake-up alarms. It can also help me with conversions, like “Hey Google, how much is 20 Euros in pounds?”

Using Google Assistant to answer questions

Photo credit: Joao Nogueira

Take pics, then chill

When I’m in a new place, my camera is always out. But sorting through all those pictures is the opposite of relaxing. So I offload that work onto Google Photos: It backs up my photos for free and lets me search for things in them. And when I want to see all the photos my partner has taken, I can create an album that we can both add photos to. And Photos will remind me of our vacation in the future, too, with story-style highlights at the top of the app.

photo of leafy old town street

Photo credit: Joao Nogueira

Look up

I live in a big city, which means I don’t get to see the stars much. Traveling somewhere a little less built up means I can hone my Pixel 4 astrophotography skills. It’s easy to use something stable, like a wall, as a makeshift tripod, and then just let the camera do its thing.

a stone tower at night with a starry sky in the background

Photo credit: DDay

Vacation unplugged

As useful as my phone is, I try to be mindful about putting it down and ignoring it as much as I can. And that goes double for when I’m on vacation. Android phones have a whole assortment of Digital Wellbeing features to help you disconnect. My favorite is definitely Flip to Shhh: Just place your phone screen-side down and it silences notifications until you pick it back up.

someone sitting on a boat at sunset watching the shoreline

Photo credit: Joao Nogueira

Source: Google LatLong


Interpreter mode brings real-time translation to your phone

You’ve booked your flights, found the perfect hotel and mapped out all of the must-see local attractions. Only one slight issue—you weren’t able to brush up on a new foreign language in time for your trip. The Google Assistant is here to help.


Travelers already turn to the Assistant for help researching and checking into flights, finding local restaurant recommendations and more. To give you even more help during your trip, the Assistant’s real-time translation feature, interpreter mode, is starting to roll out today on Assistant-enabled Android and iOS phones worldwide. Using your phone, you can have a back and forth conversation with someone speaking a foreign language.


To get started, just say “Hey Google, be my German translator” or “Hey Google, help me speak Spanish” and you’ll see and hear the translated conversation on your phone. After each translation, the Assistant may present Smart Replies, giving you suggestions that let you quickly respond without speaking—which can make your conversations faster and even more seamless.

Interpreter mode helps you translate across 44 languages, and since it’s integrated with the Assistant, it’s already on your Android phone. To access it on iOS, simply download the latest Google Assistant app. Interpreter mode also features different ways to communicate suited to your situation: you can type using a keyboard for quiet environments, or manually select what language to speak.


Whether you’re heading on a trip this holiday season, gearing up for international travel in the New Year, or simply want to communicate with family members who speak another language, interpreter mode is here to remove language barriers no matter where you are.


Gute Reise! Translation: “Enjoy your trip!”


Making Pixel more helpful with the first Pixel feature drop

Your phone should get better over time. Your Pixel automatically updates regularly with fixes and improvements. Now, your Pixel will also get bigger updates in new Pixel feature drops.  Our first one, coming this month, includes a new way to capture portraits, easier Duo calls and automatic call screening. 

More photo controls

Now, you can turn a photo into a portrait on Pixel by blurring the background post-snap. So whether you took the photo years ago, or you forgot to turn on portrait mode, you can easily give each picture an artistic look with Portrait Blur in Google Photos. 



Put an end to robocalls

With our latest update to Call Screen on Pixel 4 in the US, the Google Assistant now helps you automatically screen unknown callers and filter out detected robocalls before your phone ever rings, so you’re not interrupted by them. And when it’s not a robocall, your phone rings a few moments later with helpful context about who is calling and why. Call Screen works on your device and does not use Wi-Fi or data, which makes the screening fast and the content private to you.



Improved video calls on Duo 

Video calls are better on Pixel 4 with new Duo features that let you focus on conversations instead of logistics. Auto-framing keeps your face centered during your Duo video calls, even as you move around, thanks to Pixel 4’s wide-angle lens. And if another person joins you in the shot, the camera automatically adjusts to keep both of you in the frame.



Now, the playback on your Duo calls is even smoother, too. When a bad connection leads to spotty audio, a machine learning model on your Pixel 4 predicts the likely next sound and helps you to keep the conversation going with minimum disruptions. Pixel 4’s Smooth Display also reduces choppiness on your video feed, refreshing up to 90 times a second.

When you make Duo video calls on Pixel 2, 3 and 4, you can now apply a portrait filter as well. You’ll look sharper against the gentle blur of your background, while the busy office or messy bedroom behind you goes out of focus.



With the latest update to Pixel 4, you'll also get faster, more accurate positioning in Google Maps, thanks to improved on-device computing for much better location quality.

More helpful features for more Pixels

In addition to new features for Pixel 4, we’re also bringing new apps and features to Pixel 2, 3 and 3a:

  • The Recorder app is now available on older generations of Pixel.
  • Pixel 3 and 3a users will get Live Caption. 
  • Digital Wellbeing is getting updates too. Focus mode is rolling out to help you stay productive and minimize distractions by pausing apps you've selected in a single tap. You can now set an automatic schedule, take a short break or end Focus mode early without disrupting your schedule.
  • Flip to Shhh will also join the Digital Wellbeing features on Pixel 2 and 2 XL.
  • If you use a Pixel 4 in the UK, Canada, Ireland, Singapore and Australia, you’ll soon get the new Google Assistant (English only), which is even faster and more helpful.

A more efficient phone

In addition to these new experiences, all Pixel devices will also receive an update to memory management in this feature drop. With this enhancement, your phone proactively compresses cached applications so you can run more apps at the same time, like games and streaming content.
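The idea is a space-for-time trade: a cached app's memory is compressed while it sits in the background, then decompressed on demand when you switch back. Android does this at the kernel level; the snippet below just demonstrates the trade-off with stdlib compression, and is not how the phone actually implements it.

```python
import zlib

# Stand-in for a backgrounded app's cached state (real app memory is
# often similarly repetitive, which is why it compresses well).
cached_app_state = b"background app data " * 500

compressed = zlib.compress(cached_app_state)
ratio = len(compressed) / len(cached_app_state)
print(f"{len(cached_app_state)} bytes -> {len(compressed)} bytes "
      f"({ratio:.0%} of original)")

# When the user returns to the app, its state is restored intact.
restored = zlib.decompress(compressed)
assert restored == cached_app_state
```

The RAM freed by that compression is what lets more apps stay resident at once instead of being killed and cold-started later.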


Pixel phones have always received monthly updates to improve performance and make your device safe. Now, feature drops will bring more helpful and fun features to users on a regular basis to continue to make your Pixel better than ever. 


These features are already rolling out, and will hit Pixel devices in the coming weeks. To get the new features, update to the latest version of Android and go to the Play Store to start downloading your updated apps.




Closer to the stars with Pixel 4

On a clear, cool late October evening, the residents of the village of Star, U.K., turned off their lights, left their homes, and gathered in a field. The mayor of the tiny Welsh hamlet was already there, serving everyone tea and coffee, and people grouped around deck chairs set up for the occasion—quite unusual for the 70 or so inhabitants. Despite having one of the clearest night skies in all of the U.K., it turns out that residents of Wales are the least likely to pause and look up at the stars.


We thought the launch of astrophotography on Pixel 4’s Night Sight mode was a great opportunity to try and change that, and where better to start than the aptly named Star? Photos of the night sky have traditionally been best left to the experts, but Pixel 4 makes it easy for anyone to snap a stunning shot of the Milky Way. So we brought a handful of new phones, along with some chairs and tripods, to give the people of Star a new way to stargaze. Here are a few shots from the night, taken on Pixel 4.

If you’re looking to try your hand at astrophotography, our photography lead engineer Marc Levoy has a few tips for you.

  1. Hold your phone still. Use two hands to hold your device. Tuck your elbows into your sides, and hold the phone close to your chest. Spread your feet apart to create a stable base, and lean against a wall or solid object to keep from swaying back and forth.

  2. Use a makeshift tripod. When the Pixel is held still against anything stable—a tree trunk, a big rock, a car hood—the camera enters a “braced” mode. It will use longer exposures, giving you even more detail and less noise than when you hand-hold it.

  3. Be patient. In very dark environments, the Pixel 4 may need some time to gather enough light. The phone tips you off to that with a countdown timer, which may be up to four minutes if you’re using a tripod. But if you’re about to get interrupted—say a car headlamp is about to come into the picture—you can end the capture early with the stop button. 

  4. Let autofocus do its thing. For best results, we recommend just letting the camera do the work and using autofocus. But if you’re determined to strike out on your own, select a manual focus mode (“near” or “far”) on the toolbar. “Far” is what you’ll want for astro shots. If your subject is close to you (within six feet), choose “near.”

  5. Play around with the exposure compensation slider. Did your night sky photo come out too bright or too dark? Try again and adjust the exposure compensation slider: Tap on your subject, move the exposure slider up or down, and take the photo. Tap again to reset it.






Record a lecture, name that song: Pixel 4 uses on-device AI

Pixel 4 is our latest phone that can help you do a lot of stuff, like take pictures at night or multitask using the Assistant. With on-device AI, your camera can translate foreign text or quickly identify a song that’s playing around you. Everything needed to make these features happen is processed on your phone itself, which means your Pixel responds even quicker and your information stays more secure.


Lens Suggestions

When you point your camera at a phone number, a URL, or an email address using Pixel, Google Lens already helps you take action by showing you Lens Suggestions. You can call the number, visit the URL or add the email address to your contacts with a single tap. Now, there are even more Lens Suggestions on Pixel 4. If you’re traveling in a foreign country and see a language you can’t read, just open your camera and point it at the text, and you’ll see a suggestion to Translate that text using Lens. For now, this works on English, Spanish, German, Hindi, and Japanese text, with more to come soon.


There are also Lens Suggestions for copying text and scanning a document, which are processed and recognized on-device as well. So if you point your camera at a full page document, you’ll see a suggestion to scan it and save it for later using Lens. 


Lens will prompt you with a suggestion to translate foreign text, which happens on device. Then, you’ll see the translation in your native language.

Recorder

Remember that time you were in a brainstorm, and everyone had good ideas, but no one could remember them the next day? Or that meeting when you weren’t paying attention because you were too busy taking notes? With the Recorder app on Pixel 4, you can record, transcribe and search for audio clips. It automatically transcribes speech and tags sounds like applause (say your great idea was met with cheers!), music or whistling, and more, so you can find exactly what you’re looking for. You can search within a specific recording, or your entire library of recordings, and everything you record stays on your phone and private to you. We're starting with English for transcription and search, with more languages coming soon.

Now Playing

Now Playing is a Pixel feature that identifies songs playing around you. If that song gets stuck in your head and you want to play it again later, Now Playing History will play it on your favorite streaming service (just find the song you want, tap it to listen to it on Spotify, YouTube Music and more). On Pixel 4, Now Playing uses a privacy-preserving technology called Federated Analytics, which figures out the most frequently-recognized songs on Pixel devices in your region, without collecting individual audio data from your phone. This makes Now Playing even more accurate because the database will update with the songs people are most likely to hear (without Google ever seeing what you listen to).
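The core of this kind of privacy-preserving aggregation is that each phone reports only anonymous local counts, never the audio itself, and a server sums those counts to find what's popular in a region. This sketch shows only that counting step, with invented names; real Federated Analytics layers further protections on top.

```python
from collections import Counter

def local_counts(recognized_songs):
    """Runs on each device: tally the songs Now Playing recognized
    locally. Only these counts would ever leave the phone."""
    return Counter(recognized_songs)

def aggregate(reports):
    """Runs on the server: sum per-device counts. No raw audio or
    listening history is involved, only the combined tallies."""
    total = Counter()
    for report in reports:
        total += report
    return total

# Two hypothetical phones in the same region:
device_a = local_counts(["Song X", "Song Y", "Song X"])
device_b = local_counts(["Song X", "Song Z"])

regional = aggregate([device_a, device_b])
print(regional.most_common(1))  # the region's most-heard song
```

The regionally popular songs found this way are what keep the on-device recognition database fresh.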


With so much processing happening directly on your Pixel 4, it’s even faster to access the features that make you love being a #teampixel member. Pre-order Pixel 4 or head out to your local AT&T, Verizon, T-Mobile or Sprint store on October 24.