The promise of using AI to help prostate cancer care

In 2021, nearly 250,000 Americans will be diagnosed with prostate cancer, which remains the second most common cancer among men in the U.S. Even as we make advancements in cancer research and treatment, diagnosing and treating prostate cancer remains difficult. This National Prostate Cancer Awareness Month, we’re sharing how Google researchers are looking at ways artificial intelligence (AI) can improve prostate cancer care and the lessons learned along the way.  

Our AI research to date 

Currently, pathologists rely on a process called the ‘Gleason grading system’ to grade prostate cancer and inform the selection of an effective treatment option. This process involves examining tumor samples under a microscope for tissue growth patterns that indicate the aggressiveness of the cancer. Over the past few years, research teams at Google have developed AI systems that can help pathologists grade prostate cancer with more objectivity and ease. 

These AI systems can help identify the aggressiveness of prostate cancer for tumors at different steps of the clinical timeline — from smaller biopsy samples during initial diagnosis to larger samples from prostate removal surgery. In prior studies published in JAMA Oncology and npj Digital Medicine, we found that our AI system for Gleason grading prostate cancer samples agreed with subspecialists (pathologists who have specialized training in prostate cancer) more often than general pathologists did. These results suggest that AI systems have the potential to support high-quality prostate cancer diagnosis for more patients.

To understand this system's potential impact within a clinical workflow, we also studied how general pathologists could use our AI system during their assessments. In a randomized study involving 20 pathologists reviewing 240 retrospective prostate biopsies, we found that using the AI system as an assistive tool was associated with an increase in grading agreement between general pathologists and subspecialists. This indicated that AI tools may help general pathologists grade prostate biopsies with greater accuracy. The AI system also improved both pathologists' efficiency and their self-reported diagnostic confidence.

In our latest study, published in Communications Medicine, we directly examined whether the AI's grading could identify high-risk patients by comparing the system's grades against mortality outcomes. This comparison matters because mortality is one of the most clinically relevant outcomes for evaluating the value of Gleason grading, and it gives greater confidence in the AI's grading. We found that the AI's grades were more strongly associated with patient outcomes than the grades from general pathologists, suggesting that the AI could potentially help inform decision-making on treatment plans.


Contributing to reducing variability in AI research 

We first began training our AI system using Gleason grades from both general pathologists and subspecialists. As we continued to develop AI systems for assisting prostate cancer grading, we learned that both training the AI and evaluating the model's performance can be challenging, because the "ground truth" or reference standard is often based on expert opinion. Because of this subjectivity, two pathologists examining the same sample may, in some cases, arrive at different Gleason grades.

To improve the quality of the “ground truth”, we developed a set of best practices that we have shared this week in Lancet Digital Health. These recommendations include involving experienced prostate pathology experts, making sure that multiple experts look at each sample, and designing an unbiased disagreement resolution process. By sharing these learnings, we hope to encourage and accelerate further work in this area, particularly in earlier-phase research when it’s impractical to train or validate a model using patient outcomes data.

Our research has shown that AI can be most helpful when it's built to support clinicians with the right problem, in the right way, at the right time. With that in mind, we plan to further validate the role of AI and other novel technologies in helping improve prostate cancer diagnosis, treatment planning and patient outcomes. 

Upgrade your drive with Google as your copilot

Do you drive with your phone clipped to your air vent? Or does your car have the latest built-in infotainment system? No matter what kind of car you own, Google is ready to make your drive better.  We’re bringing updates to Google Assistant driving mode, Android Auto and cars with Google built-in (welcome Honda!) to help every driver find their way around, stay entertained, and keep in touch.


Google Assistant driving mode on Android phones gets a new dashboard

Millions of people in more than 12 countries use Google Assistant driving mode every day. It offers voice-activated help through your Android phone, even in older cars. We originally launched it for active navigation in Google Maps, helping drivers manage tasks like answering a call or responding to text messages with minimal distraction.

Thanks to early feedback, we heard how important it is to have your go-to apps handy for your drive, even when you don't need turn-by-turn navigation. So coming soon, you'll be able to say "Hey Google, let's drive" (or connect your phone to your car's Bluetooth) to open the new driving mode dashboard, reducing the need to fiddle with your phone and helping you stay focused on the road. With glanceable, tappable cards, the basics you'll need for the road are available with a single tap — no scrolling required: start your navigation, see who called or texted recently and quickly resume media from Amazon Music, Audible, iHeartRadio, JioSaavn, Pandora, Podcast Addict, SoundCloud, Spotify, YouTube Music and more providers. Plus, there's a new messaging update: Just say "Hey Google, turn on auto read" to hear new messages read aloud as they come in and to respond by voice.

Driving mode will be the primary driving experience for Android phones going forward, and it will fully roll out in the next few weeks in English (U.S., Australia, Canada, Ireland, India, Singapore and U.K.), German, Spanish (Spain, Mexico), French and Italian.

Image of the new Google Assistant driving mode dashboard, which features easy-to-see, tappable cards to find media, navigate and call or text.

Improvements coming to Android Auto on car displays

We’re also launching new features for the more than 100 million cars compatible with Android Auto — bringing help from Google onto your car display via your Android phone. 

You’ll now see music, news and podcast recommendations from Google Assistant, and be able to set which app launches whenever Android Auto starts. You’ll even be able to enjoy games from GameSnacks right from the car’s display while you’re parked, waiting for a to-go order or charging your vehicle. 

If you're a dual-SIM Android phone user, you can now choose which SIM card to use when making calls through Android Auto. And great news for commuters: Android Auto will support your "Work profile," which lets you see upcoming work meetings and messages on your car's display.

When it's time to fill up at the gas station, you can now put away your credit card or cash and say, "Hey Google, pay for gas" on Android Auto or from your Android phone. Select your pump number and complete contactless payment with Google Pay. This will be available at over 32,500 gas stations across the U.S., starting with Exxon and Mobil, Conoco, Phillips 66 and 76 stations.


The best of Google apps and services built-in to more cars

In the coming years, millions of cars will have Google fully built-in to their infotainment systems, so you can get around with Google Maps, use Google Assistant to turn on the A/C, download your favorite apps on Google Play and much more, even without a smartphone.

Image of Honda's brand logo

We’re excited to share that our newest partner, Honda, will be launching future models with Google built-in starting in 2022. In addition to Honda, this experience will be available on cars from top brands including Ford, General Motors, Polestar, Renault and Volvo Cars. Today, you can test drive or purchase cars with Google built-in —  like the Polestar 2 and Volvo XC40 Recharge — and it’s coming to many more cars soon, like the new Chevrolet Silverado and Renault Mégane E-TECH Electric.

Image of a user asking Google to help find the nearest charging station from a car with Google built-in

If you drive an electric vehicle with Google built in, we make it easy to find charging stations and minimize charging time with Google Maps. Just say, "Hey Google, find me a charging station" to instantly see nearby stations compatible with your car, payment type and speed preferences, along with real-time information about whether a charger is available. And with new support for thermal battery management, Google Maps helps your car's battery heat up or cool down before you charge, reducing the time you need to spend at a charger.

No matter what car you drive, we’re working hard to make sure you have the help you need from Google to get things done while keeping your hands on the wheel and eyes on the road. 


New Android features coming this season

With Android, you can look forward to your device getting better and better throughout the year. From accessibility to Android Auto to features that make your life just a little easier, like Assistant and Gboard, we’re rolling out new features that help you do more, stay secure and have more fun with your Android phone.

Animation of Camera Switches and Project Activate in use

Camera Switches [left] and Project Activate [right]

Control your phone with your facial gestures

Inspired by people with motor and speech disabilities, Camera Switches and Project Activate are two new accessibility features that let you use your phone by making facial gestures. Beginning to roll out this week, Camera Switches is a feature within Android Accessibility Suite that turns your front-facing camera into a switch (an adaptive tool that replaces a keyboard, mouse, or touching the phone screen) so that you can navigate your phone. Project Activate is a new app that makes it easier to communicate and express yourself in the moment. You can use facial gestures and eye movements to activate preset actions like speaking a phrase (like "Wait!"), playing audio (like a laugh) or sending a text message (like "Please come here"). 

Also new in accessibility, we're bringing handwriting recognition to Lookout, an app that uses your phone's camera to help people who are blind or have low vision get things done faster and more easily. In Documents mode, Lookout will now read out both handwritten and printed text for Latin-based languages. And in response to Lookout's growing global audience, we are adding Euro and Indian Rupee recognition within Currency mode, with more on the way.

Control your TV with your phone

Starting today, you can find something great to watch on your Google TV even when the couch has eaten your remote. We’ve built remote-control features directly into your Android phone so you can power on your TV, navigate through your recommendations or even start up your favorite show right from your phone. And you can use your phone’s keyboard to quickly type complicated passwords, movie names or search terms. Try it on your Google TV or other Android TV OS devices by adding the remote tile to quick settings on your Android phone, or by visiting the Google TV app — coming to 14 more countries over the next few weeks.

Manage day-to-day tasks using Reminders from Assistant

Keeping track of everyday to-dos is even easier with Reminders. You can now manage all your reminders in one place by saying, "Hey Google, open my reminders," where you'll also see helpful suggestions for recurring reminders that you can activate with a tap. And of course, you can continue to use your voice to create and automate your to-dos. Just say, "Hey Google, remind me to water the plants every morning." Once set, Google will notify you at the perfect moment across your devices, whether you're at home or on the go.

Stay entertained, connected and on track during your drive

With Android Auto, you can stay entertained by quickly launching and listening to your favorite music, news and podcasts with personalized recommendations from Google Assistant. You can also play a variety of games from GameSnacks while you’re parked waiting for a to-go order or charging your car. 

And for commuters, Android Auto can help you stay on top of important work meetings and messages with new support for your work profile. Plus, if you're a dual-SIM Android phone user, you can now choose which SIM card to use when making calls through Android Auto.

To help you stay on track, Waze on Android Auto is also getting a refresh to create a more streamlined navigation experience. The new design includes touchpad support, night mode and lane guidance support, and puts the map and directions at the forefront so other elements aren’t in the way. With Waze, Google Maps and many more navigation apps, Android Auto makes it easy to get to where you need to be.   

These Android Auto features will be available soon on Android phones when connected to a compatible car. If you don’t have a compatible car, you can check out other ways Google can help on your drive including new updates to Google Assistant driving mode and more on your Android phone.

Add photos and videos to a passcode-protected space

Previously on Pixel only and rolling out soon to Android, Locked Folder in Google Photos gives you a passcode-protected space to save photos and videos separately, so they won’t show up as you scroll through Google Photos or any other apps on your device.

Animation of the crying laughing emoji and the owl emoji being combined into a laughing owl sticker

Express yourself with Gboard

Express how you really feel with new additions to Emoji Kitchen on Gboard. With over 1,500 stickers coming this fall, you'll be able to create even more combinations of your favorite emoji.

In addition to making your messages more fun, Gboard is also helping you communicate faster and more fluidly with new features.

gif of Smart Compose being used to auto-complete a message someone is sending

First, when you copy text that includes several pieces of information, like phone numbers, email addresses and URLs, Gboard will automatically extract and separate them into multiple pasting options, so you can choose the information that is most important to you. Second, when you open a messaging app right after taking a screenshot, Gboard will now show that screenshot as a suggestion to share. And finally, for devices running Android 11 or newer, Smart Compose uses machine learning to let you quickly complete your sentences with just a swipe.


Control who shares with you

With improved visibility settings in Nearby Share, you can take full control of who can discover your device and send files. Choose between everyone, your contacts, or no one, and you can easily change your preference through your phone’s Quick Settings space anytime.

And wait, there’s more

With the Heads Up feature, you can get reminders to look up and stay alert when you’re walking and using your phone. Launched first on Pixel earlier this year, Heads Up is now available through the Digital Wellbeing setting on devices running Android 9 and newer. 


We can’t wait for you to try out all these features. Learn more about each at https://www.android.com/google-features-on-android/fall-2021/.

Two new tools that make your phone even more accessible

Every day, people use voice commands, like “Hey Google,” or their hands to navigate their phones. However, that’s not always possible for people with severe motor and speech disabilities. 

To make Android more accessible for everyone, we’re introducing two new tools that make it easier to control your phone and communicate using facial gestures: Camera Switches and Project Activate. Built with feedback from people who use alternative communication technology, both of these tools use your phone’s front-facing camera and machine learning technology to detect your face and eye gestures. We’ve also expanded our existing accessibility tool, Lookout, so people who are blind or low-vision can get more things done quickly and easily. 

Camera Switches: navigate Android with facial gestures 

In 2015, we launched Switch Access for Android, which lets people with limited dexterity navigate their devices more easily using adaptive buttons called physical switches. Camera Switches, a new feature in Switch Access, turns your phone's camera into a new type of switch that detects facial gestures. Now it's possible for anyone to use eye movements and facial gestures to navigate their phone — sans hands and voice! Camera Switches begins rolling out within the Android Accessibility Suite this week and will be fully available by the end of the month.

You can choose from one of six gestures — look right, look left, look up, smile, raise eyebrows or open your mouth — to scan and select on your phone. There are different scanning methods you can choose from — so no matter your experience with switch scanning, you can move between items on your screen with ease. You can also assign gestures to open notifications, jump back to the home screen or pause gesture detection. Camera Switches can be used in tandem with physical switches. 

We heard from people who have varying speech and motor impairments that customization options would be critical. With Camera Switches, you or a caregiver can select how long to hold a gesture and how big it has to be to be detected. You can use the test screen to confirm what works best for you. 

A gif showing the customization options in Camera Switches.

An individual and their caregiver customize Camera Switches. The setup process, shown through a finger on the screen, showcases customization for the size of gestures and assigning the gesture to a scanning action.

To get started, head to the Android Accessibility settings on your Android phone under Switch Access, or download the app. For more information, go to g.co/cameraswitches.

Project Activate: making communication more accessible

Project Activate, a new Android application, lets people use these same facial gestures to quickly activate customized actions with a single gesture — like speaking a preset phrase, sending a text, making a phone call or playing audio. 

To understand how face gestures could allow for communication and personal expression, we worked with numerous people with motor and speech impairments and their caregivers. Darren Gabbert is an expert at using assistive technology and communicates using a speech-generating device. He uses physical switches to type letters that his computer speaks aloud. It's a slow process that makes fully participating in conversations difficult. With Project Activate, Darren has a quick and portable way to respond in the moment — using just his phone. He can answer yes or no to questions, ask for a minute to type something into his speech-generating device, or shoot a text to his wife asking her to come in from another room. 

Customization is built into all areas of the application — from the particular actions you’d like to trigger, to the facial gestures you want to use, to how sensitive the application is to your facial gestures. So whatever your facial mobility, you can use Project Activate to express yourself.

Project Activate is available in English in the U.S., U.K., Canada and Australia, and can be downloaded from the Google Play Store.

Lookout: Expanding to new currencies and modes

We’re always updating our accessibility features and tools so that more people can benefit. In 2019, we launched Lookout for people who are blind or low-vision. Using a person’s smartphone camera, Lookout recognizes objects and text in the physical world and announces them aloud. Lookout has several modes to make a variety of everyday tasks easier — from identifying food products to describing objects in your surroundings. 

Last year, we introduced Documents mode for capturing text on a page. Starting today, Documents mode can now read handwritten text,  including sticky notes and birthday cards from friends and family. Lookout supports handwriting in Latin-based languages, with more coming. Additionally, with more people around the world discovering Lookout, we’ve expanded Currency mode to recognize Euros and Indian Rupees, with more currencies on the way. 

Building a more accessible Android

We believe in building truly helpful products with and for people with disabilities, and we hope these features make Android even more accessible. If you have questions about how these features can be helpful, visit our Help Center, connect with our Disability Support team or learn more about our accessibility products on Android.

Bring performance and privacy together with Server-Side Tagging

It’s important for businesses to have the insights they need to drive more conversions on their websites. But rising expectations and regulations around user privacy can make it hard to meet both performance and privacy needs. We’re continuing to invest in solutions to help you find that balance.

Server-Side Tagging in Google Tag Manager allows you to move measurement and advertising tags off your website and into a secure server container. This helps protect your customers by restricting access to their information, and helps increase conversion rates on your site by reducing page load times.

To ensure all businesses can use this feature, Server-Side Tagging now works with any cloud or server provider that supports Docker — an open source platform for developing and running applications. We’ve also integrated Server-Side Tagging into more Google products and services to help you move more tags off your website and achieve better site performance. With these improvements, we're moving Server-Side Tagging out of beta and making it generally available to all customers in Tag Manager and Tag Manager 360.

Support for more Google advertising products

Server-Side Tagging now supports Google Ads and Google Marketing Platform products, including Campaign Manager 360, Display & Video 360 and Search Ads 360. Previously, you had to keep a separate client-side tag for each marketing product you used, with all of them running directly on your site.

Now, when customers interact with your site, a single client-side tag can activate multiple tags for these products directly in your server container. This means you’ll have fewer tags on your site, which can help improve your site's page load time.

Integration with other privacy solutions

Marketers often ask us how to use Server-Side Tagging with other privacy solutions like Consent Mode and enhanced conversions. Consent Mode helps you customize how Google tags behave before and after users make their consent decisions, and enhanced conversions help you use consented, first-party, user-provided data to better understand how users convert after engaging with your ads.

We're now making it simpler to use these products together. Advertisers with Google Analytics 4 on their sites will soon be able to use enhanced conversions in Google Ads without needing to add additional tags to their site. And once you’ve set up Consent Mode, any Google tags implemented in your server container will automatically respect consent choices that users have made on your website.
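As a rough illustration of the page-side piece, the sketch below shows the documented Consent Mode calls a site might make before and after a visitor's consent decision, written as TypeScript against the standard gtag.js snippet. The onUserConsented hook is a hypothetical function your own consent banner would call, and the two storage types shown are only the most common ones; your configuration may use others.

    // Minimal Consent Mode sketch (illustrative, not a complete setup).
    // Assumes the standard gtag.js snippet is already installed on the page.
    declare function gtag(...args: unknown[]): void;

    // Set default consent to denied before any measurement tags fire.
    gtag("consent", "default", {
      ad_storage: "denied",
      analytics_storage: "denied",
    });

    // Hypothetical hook your consent banner would call once the user accepts.
    function onUserConsented(): void {
      gtag("consent", "update", {
        ad_storage: "granted",
        analytics_storage: "granted",
      });
    }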

We're also making it easier for you to ensure that user data is handled according to your security preferences. Server-Side Tagging automatically anonymizes your users’ IP addresses before the information is shared with Google’s reporting tools. And in cases where you need more control, you have the option to eliminate users’ IP addresses from your data completely before they’re shared.

Success with Server-Side Tagging

Since launching Server-Side Tagging last year, we’ve seen businesses around the world use this feature to uphold higher expectations around user privacy and drive better marketing performance.

Nemlig, Denmark’s leading online grocer, saw a large rise in visitors to its site as people turned to online shopping and home delivery for their daily essentials last year. This resulted in longer page load times, which negatively impacted conversion rates on Nemlig’s site. After adopting Server-Side Tagging, the company was able to move tags from the browser into its secure server container, improving its page load time by 7%. Read the full story here.

Square has also found success with Server-Side Tagging. The San Francisco-based company helps businesses of all sizes reach buyers online and in person, manage their business and access financing. Since implementing Server-Side Tagging, Square has seen a 46% increase in reported conversions.

"Server-Side Tagging is our preferred method for sending measurement data to our marketing partners. It allows us to collect data from the website in a secure manner while improving data collection and enabling event enrichment." - Doug Logue, Sr. Product Manager of Marketing Technology, Square

With Server-Side Tagging, you can improve both user trust and website performance. As we continue to work on new features and updates, our goal is to help you achieve your privacy and performance goals across all of your measurement needs.

What this Cloud Googler learned from the military

Welcome to the latest edition of “My Path to Google,” where we talk to Googlers, interns and alumni about how they got to Google, what their roles are like and even some tips on how to prepare for interviews.

Today we spoke with Dennis James, Director of Cloud Customer Experience for the US East Region and a veteran of the United States Army. Dennis talks to us about his time in the military, his transition to Google and why it’s important to keep trying — even if you don’t succeed the first time.

Can you tell us a bit about yourself?

I grew up in Long Island, New York. Both of my parents were educators, and my father was also a volunteer (and eventually Chief) firefighter and paramedic. There was always a strong theme of leadership, academics and service in our household. 

That environment undoubtedly influenced my decision to attend the United States Military Academy — otherwise known as "West Point." Once I got there, I participated in many physical activities while also pursuing my passion for electronics. I majored in electrical engineering and spent most of my downtime tinkering with gadgets at West Point’s computer lab. 

After graduation, I served as an infantry officer in the US Army with the 25th Infantry Division and deployed to Iraq from December 2007 to February 2009. When I returned, I left active duty to become an IT strategy consultant in Washington DC, while also serving as a Military Intelligence Officer in the Army Reserves. I attended Columbia Business School two years later, where I was accepted to the Google MBA internship program. I started full time at Google in 2013, and have been here ever since!

What do you do at Google?

I'm on the Google Cloud Customer Experience team, which provides consulting, training, technical account management and support services to our customers and partners. One example of our work that I’m particularly proud of is how we helped the New York City Department of Education support a quick transition to remote teaching and learning with Google Classroom. 

What made you decide to apply to Google?

During my deployment to Iraq, I realized I was ready for a new challenge outside of the military — ideally in the technology world. I started looking through a directory of former service members who now worked at tech companies, and connected with a Naval Academy graduate and Aviator who worked at Google. He shared helpful advice about his own journey, and helped me think about jobs I might like and what skills they required. Through his ongoing coaching and support, he became an important mentor and part of my path to joining Google.

I loved the idea of working at Google, but I hesitated to apply at first. I was worried that I wouldn’t be considered a good fit because of my background, and that it would be hard to convey my experiences to someone outside of the military. It took me a lot of time (and work!) to overcome these feelings. But by continuing to meet with my Google mentor, growing my skills in the military, and earning my MBA, I ultimately built up my confidence to apply for an internship.

Dennis smiling in his military uniform and holding his helmet

Dennis while serving in Iraq

Do you have any tips you’d like to share with aspiring Googlers?

Show up with enthusiasm and, most importantly, be yourself. In my case, I embraced my military background and channeled those leadership skills into the business world. And when I reflect on the reasons behind my success at Google, the vast majority tie back to my military experience. 

And finally, don’t get discouraged if you don’t succeed at first. If you’re passionate about what you’re doing, keep at it.

Dennis and his wife, Tiffany, standing and smiling in front of a Google building, while holding their twins, Gabriella and Mason

Dennis with his wife, Tiffany, and twins, Gabriella and Mason

Spanish arrives on Nest Hub and Hub Max in the U.S.

Ver abajo versión en español

Whether you're learning, practicing or a native Spanish speaker, getting help around the house just got a little easier with the latest updates en español on Nest Hub and Hub Max in the U.S.

Starting today, Spanish on Nest Hub and Hub Max gets even better, with more queries and display text in Spanish. To add or switch to Spanish, just go to Languages in Assistant settings in the Google Home app. Now you can do more in Spanish – whether that’s enjoying music, video chatting with your family, watching your favorite sports teams or controlling your smart home devices. 

Rock out to your favorite music legends by simply saying “Hey Google, reproduce Rock en Español.” Choose to listen from several free and subscription-based music services. Just ask Google for your favorite artists, songs, albums or genres — todo en español.

Now, with a YouTube TV subscription, you can also stream Univision to watch your favorite live shows and sports. Just say, "Hey Google, quiero ver Univision en YouTube TV" and enjoy.
Loteria Don Clemente on Google Nest Hub

Game night? Gather the family around your Nest display to try out the popular Mexican Bingo game called "Lotería" — in English or Spanish. Just say "Hey Google, habla con Lotería Don Clemente" to get started. With music, sound effects and a fully recorded game show host, the whole experience comes to life.

Discover new arroz con leche recipes by saying, "Hey Google, muéstrame cómo preparar arroz con leche." Then follow along with step-by-step instructions in Spanish. Or learn something new by saying, "Muéstrame cómo hacer pasta en YouTube."

When it’s time to catch up with loved ones, Google Duo makes it easy. If Friday night plans require a quick huddle with friends, or you need an impromptu gut check with Mom, just say, "Hey Google, llama a mamá" and you'll be instantly connected.

Google Duo group video call on Google Nest Hub Max

Finally, we could all use a hand around the house. Now, get all the help you need in Spanish by simply saying:

  • "Hey Google, agrega huevos a mi lista de compras" and items will be automatically added to a centralized shopping list

  • "Hey Google, muéstrame la cámara de la puerta principal" and you'll quickly see who or what is at your front door

  • "Hey Google, anuncia es hora de cenar'' when it's time to bring everyone together for dinner.

You can do all this and more in Spanish across your Google Nest products, including Nest Audio, Nest Mini and now, Nest Hub and Nest Hub Max.

El español llega a Nest Hub y Hub Max en EE. UU.

Pantalla de Tu Dia en Google Nest Hub

Ya sea que estés aprendiendo, practicando o el español sea tu lengua materna, obtener ayuda en casa ahora es un poco más fácil con la última actualización en español al Nest Hub y Hub Max.

A partir de hoy, la experiencia en español en Nest Hub y Hub Max es aún mejor, con más consultas y visualizador de texto en español. Para agregar o cambiar a español, simplemente ve a Idiomas en la configuración del Asistente en la aplicación Google Home. Ahora puedes hacer más en español, ya sea disfrutar de la música, charlar con tu familia virtualmente por video, ver a tus equipos favoritos o controlar dispositivos inteligentes en la casa. 

Muévete al ritmo de rock con tus leyendas musicales favoritas simplemente diciendo "Hey Google, reproduce Rock en Español". Simplemente pregúntale a Google por tus artistas, canciones, álbumes o géneros favoritos: todo en español.

Ahora también puedes transmitir Univision y tus programas y deportes en vivo favoritos con una suscripción a YouTube TV, simplemente diciendo: "Hey Google, quiero ver Univision en YouTube TV" y disfruta.
Lotería Don Clemente en Google Nest Hub

¿Noche familiar? Reúne a la familia alrededor de tu pantalla Nest y prueba el popular juego mexicano titulado "Lotería", ¡en inglés o español! Simplemente di "Hey Google, habla con Lotería Don Clemente" para comenzar. Con un narrador, música y sonidos completamente grabados, la experiencia completa cobra vida.

Descubre una nueva receta de arroz con leche al decir "Hey Google, muéstrame cómo preparar arroz con leche". Luego, sigue las instrucciones paso a paso en español. O aprende algo nuevo diciendo "Muéstrame cómo hacer pasta en YouTube".

Cuando te tengas que poner al día con familiares y amigos, Google Duo lo hace más fácil. Si los planes del viernes por la noche requieren una reunión rápida con amigos o si necesitas una cita espontánea con mamá, sólo di "Hey Google, llama a mamá" y estarás conectado al instante.

Videollamada en grupo con Google Duo en el Google Nest Hub Max

Finalmente, todos podemos utilizar una ayudadita de más en la casa.  Ahora puedes obtener toda la ayuda que necesitas en español simplemente diciendo:

  • "Hey Google, agrega huevos a mi lista de compras" y el producto se agregará automáticamente a una lista de compras centralizada.

  • "Hey Google, muéstrame la cámara de la puerta principal" y verás rápidamente quién o qué está en la puerta de tu casa.

  • "Hey Google, anuncia es hora de cenar" cuando sea el momento de reunir a todos para cenar.

Puedes hacer todo esto y más en español en todos tus productos Google Nest: Nest Audio, Nest Mini y ahora, Nest Hub y Nest Hub Max.

Financially motivated actor breaks certificate parsing to avoid detection

Introduction

Google's Threat Analysis Group tracks actors involved in disinformation campaigns, government-backed hacking, and financially motivated abuse. Understanding the techniques used by attackers helps us counter these threats effectively. This blog post highlights a new evasion technique we identified, which a financially motivated threat actor is currently using to avoid detection.

Attackers often rely on varying behaviors between different systems to gain access. For instance, attackers may bypass filtering by convincing a mail gateway that a file is a benign document even though the recipient's computer will treat it as an executable program. In the attack outlined below, the attackers created malformed code signatures that are treated as valid by Windows but cannot be decoded or checked by OpenSSL code, which is used in a number of security scanning products. We believe this is a technique the attacker is using to evade detection rules.

Technical Details

Code signatures on Windows executables provide guarantees about the integrity of a signed executable, as well as information about the identity of the signer. Attackers who are able to obscure their identity in signatures without affecting the integrity of the signature can avoid detection longer and extend the lifetime of their code-signing certificates to infect more systems.

OpenSUpdater, a known family of unwanted software that violates our policies and is harmful to the user experience, is used to download and install other suspicious programs. The actor behind OpenSUpdater tries to infect as many users as possible and, while they do not have specific targeting, most targets appear to be within the United States and prone to downloading game cracks and grey-area software.

Groups of OpenSUpdater samples are often signed with the same code-signing certificate, obtained from a legitimate certificate authority. Since mid-August, OpenSUpdater samples have carried an invalid signature, and further investigation showed this was a deliberate attempt to evade detection. In these new samples, the signature was edited such that an End of Content (EOC) marker replaced a NULL tag for the 'parameters' element of the SignatureAlgorithm signing the leaf X.509 certificate.

EOC markers terminate indefinite-length encodings, but in this case an EOC is used within a definite-length encoding (l = 13).

Bytes: 30 0D 06 09 2A 86 48 86 F7 0D 01 01 0B 00 00

These bytes decode to the following elements:

SEQUENCE (2 elem)
  OBJECT IDENTIFIER 1.2.840.113549.1.1.11 sha256WithRSAEncryption (PKCS #1)
  EOC


Security products using OpenSSL to extract signature information will reject this encoding as invalid. However, to a parser that permits these encodings, the digital signature of the binary will otherwise appear legitimate and valid. This is the first time TAG has observed actors using this technique to evade detection while preserving a valid digital signature on PE files. 
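To make the parsing difference concrete, the short sketch below (not TAG's detection tooling, and not a full DER parser) walks the tag-length-value structure of the AlgorithmIdentifier bytes shown above and flags the EOC marker that a strict decoder such as OpenSSL would reject. It assumes short-form lengths only, which is all this 15-byte example needs.

    // Walk the TLV structure of the malformed AlgorithmIdentifier and flag
    // the EOC (0x00 0x00) that appears inside a definite-length SEQUENCE,
    // where strict DER parsing expects a NULL (05 00) element instead.
    const algorithmIdentifier = Uint8Array.from(
      "300D06092A864886F70D01010B0000".match(/../g)!.map((b) => parseInt(b, 16))
    );

    function walk(buf: Uint8Array, start: number, end: number, depth = 0): void {
      let offset = start;
      while (offset < end) {
        const tag = buf[offset];
        const length = buf[offset + 1]; // short-form lengths only
        const indent = "  ".repeat(depth);
        if (tag === 0x00 && length === 0x00) {
          console.log(`${indent}offset ${offset}: EOC inside a definite-length encoding (invalid DER)`);
          offset += 2;
          continue;
        }
        console.log(`${indent}offset ${offset}: tag 0x${tag.toString(16).padStart(2, "0")}, length ${length}`);
        if (tag === 0x30) {
          // SEQUENCE: recurse into its contents
          walk(buf, offset + 2, offset + 2 + length, depth + 1);
        }
        offset += 2 + length;
      }
    }

    walk(algorithmIdentifier, 0, algorithmIdentifier.length);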

As shown in the following screenshot, the signature is considered to be valid by the Windows operating system. This issue has been reported to Microsoft.

Image of digital signatures settings

Since we first discovered this activity, OpenSUpdater's authors have tried other variations on invalid encodings to further evade detection.

The following are samples using this evasion:

https://www.virustotal.com/gui/file/5094028a0afb4d4a3d8fa82b613c0e59d31450d6c75ed96ded02be1e9db8104f/detection

New variant:

https://www.virustotal.com/gui/file/5c0ff7b23457078c9d0cbe186f1d05bfd573eb555baa1bf4a45e1b79c8c575db/detection

Our team is working in collaboration with Google Safe Browsing to protect users from downloading and executing this family of unwanted software. Users are encouraged to only download and install software from reputable and trustworthy sources.