Tag Archives: Google Assistant

Celebrate Native American artists in Chrome and ChromeOS

It’s Native American Heritage Month in the U.S., a time when we honor the history, traditions and contributions of Native Americans. As a citizen of the Cherokee Nation, I celebrate this month by taking time to reflect and express gratitude for my ancestors, the resilience of my tribe and other Indigenous people, and future generations carrying our tribal traditions forward.

As a product manager at Google, I’m also proud of how we’re celebrating across our products. On Google Assistant, for example, just say “Happy Native American Heritage Month” or “Give me a fact about Native American Heritage” throughout the month of November to hear a collection of historical facts and stories from the Native American community. Meanwhile, a recent Doodle on Google’s homepage celebrated the history of Stickball, a traditional sport created by Indigenous tribes.

An image of a recent Doodle on Google’s homepage with 5 abstract characters playing Stickball, a traditional sport created by Indigenous tribes.

We also commissioned five Native American artists to create a collection of themes for Chromebooks and Chrome browser. This collection has a special meaning to me because it showcases important traditions and reminds me of home. Richard D. York’s piece “ᎤᎧᏖᎾ (Uktena, or Horned Serpent)” in particular brings me back to my childhood listening to the stories of Uktena and other tales from my elders. A more solemn work, “A Lot Meant,” reminded me of growing up in Oklahoma and how historical policies like allotment impacted my family and so many others.

Now available globally, these themes reflect the unique experiences and identities of each artist. Here’s what they shared about their work:

To apply one of these themes (or others from Black, Latino and LGBTQ+ artists) to your Chrome browser, visit the Chrome Web Store collection, select a theme and click "Add to Chrome." You can also open a new tab and click the “Customize Chrome” button on the bottom right to explore background collections. To apply one of these wallpapers to your Chromebook, right-click your desktop, choose "Set wallpaper and style," then select "Native American Artists."

Source: Google Chrome


New features for parents and kids on Google Assistant

Earlier this week, I was in the kitchen watching my kids — at the (very fun) ages of seven and 11 — engaged in a conversation with our Google Assistant. My son, who has recently discovered a love of karaoke, asked it to play music so he could practice singing along to his favorite band, BTS. He and his sister ask it all kinds of questions: “How tall is the Eiffel Tower?” “How much do elephants weigh?” “Where was the Declaration of Independence signed?”

Whether we’re dictating a text message in the car or starting a timer while cooking at home, one thing is true: Voice plays an increasingly important role in the way we get things done — not just for us, but for our kids, too. It allows them to indulge their curiosities, learn new things and tap into their creative, inquisitive minds — all without having to look at a screen. As a mom, I see firsthand how kids’ relationship with technology starts by discovering the power of their own voice. And as today’s kids grow up in a world surrounded by technology, we want to help them have safer, educational and natural conversational experiences with Assistant. Here’s how we’re doing it.

Parental controls for safer, age-appropriate content

Since we know kids — like my own — tend to use their families’ shared devices, we take very seriously our responsibility to help parents protect them from harmful and inappropriate content. Building on that long-standing commitment, we’re rolling out a number of new features that will make it safer for your kids to interact with Assistant.

To give parents more control and peace of mind over the interactions their children have on Google speakers and smart displays, we’re introducing parental controls for Google Assistant. In the coming weeks, you’ll be able to modify media settings, enable or disable certain Assistant functionality and set up downtime for your kids through the Google Home, Family Link and Google Assistant apps on Android and iOS.

The home screen of Google Assistant parental controls displaying different options, including Media, Assistant features, Downtime and Assistant devices.

After selecting your child’s account, you can choose the music and video providers they can access — such as YouTube Kids, YouTube and YouTube Music — and your kids will only be able to explore content from those pre-selected providers. You can also decide whether you want your children to listen to news and podcasts on their devices.

Through parental controls, you can also control the specific Assistant features your kids can use — like restricting them from making phone calls or choosing what kind of answers they get from Assistant. And to encourage a healthy relationship between kids and technology, just say, “Hey Google, open Assistant settings.” From there, navigate to parental controls, and you can block off time when they shouldn’t use their devices, just like you can do on personal Android devices and Chromebooks. Whether you have parental controls turned on or not, we always make sure you’re in control of your privacy settings.

Educational and fun conversations with Kids Dictionary

“What does telescope mean?” “What is the definition of ‘fluorescent’?”

Kids are naturally inquisitive and often turn to their Assistant to define words like these when they’re not sure what they mean. To help make those interactions even more engaging, we're introducing Kids Dictionary, which gives simplified and age-appropriate answers across speakers, smart displays and mobile devices.

With Kids Dictionary, children’s interactions with Assistant can be both educational and fun, allowing them to fuel their interests and learn new things. When your child is voice matched and Assistant detects their voice asking for a definition, it will automatically respond with the Kids Dictionary experience.

A text bubble asks “Hey Google, what does telescope mean?” A Nest Hub Max is shown next to the text bubble, displaying a picture of a telescope and its definition.

Whether they’re doing their homework or simply curious about a new word they saw in a book, they’re only a “Hey Google” away from a little more help.

Kid-friendly voices for more engaging interactions

Kids today are growing up with technology, so it’s important that their experiences are developmentally appropriate. In addition to our increased efforts around safety and education, we’re also introducing four new kid-friendly voices. These new voices, which we designed alongside kids and parents, were developed with a diverse range of accents to reflect different communities and ways of speaking. And like a favorite teacher, these voices speak in slower and more expressive styles to help with storytelling and aid comprehension.

To activate one of Assistant’s new kid-friendly voices, kids can simply say, “Hey Google, change your voice!” Parents can also help their child navigate to Assistant settings, where they can select a new voice from the options available.

Like all parents, I’m always amazed by my kids’ insatiable curiosity. And every day, I see that curiosity come to life in the many questions they ask our Assistant. We’re excited to not only provide a safer experience, but an educational and engaging one, too — and to continue our work to truly build an Assistant for everyone.

Google Assistant offers information and hope for Breast Cancer Awareness Month

It has been nearly 15 years since that otherwise ordinary Thursday afternoon when my mom came home with a diagnosis that would change our lives, irrevocably and forever: Stage II breast cancer. Despite the visceral and all-consuming fear that accompanies a cancer diagnosis, the oncologists reassured us hers was treatable, that she’d be there to dance at our weddings, that she’d live to grow old.

But she died instead.

Her cancer was too aggressive. She ran out of treatment options. Just two years after the word “cancer” cleaved our lives in half, she was gone — destroyed by a disease that could’ve been stopped had we just known sooner.

Unfortunately this experience — this painfully tragic, heartbreaking and circuitous trajectory — is shared by too many people. Every year, approximately 42,000 women in the U.S. die of breast cancer, and one of every eight women in the U.S. will be diagnosed with the disease over the course of her lifetime. These are women we love fiercely — our moms, sisters, friends, neighbors, daughters and leaders. And like my mom, for most of them — nearly 85% — this diagnosis comes with no family history whatsoever.

While we can’t stop the incidence of breast cancer, we know one thing is true: Early detection saves lives. Women who catch their cancers early — through regular screenings, checkups and mammograms — have a much higher chance of surviving. Of responding to treatments. Of living to meet their grandchildren.

That’s why, in honor of Breast Cancer Awareness Month, we’re sharing updates around how Google is helping. On top of our work building AI models that can improve the detection of breast cancer in screenings, we’re raising awareness about the importance of these checkups. For instance, we’re building features into products like Google Assistant to help people take early steps to protect themselves against breast cancer.

Breast cancer facts and resources on Assistant

Since more than 700 million people turn to Google Assistant every month as their go-to helper, it’s a great way to reach them in their everyday moments.

If you’re prone to putting off your checkups, just tell your Assistant, “Hey Google, set an annual reminder to get my breast exam on [date].” And if you say, “Hey Google, tell me about Breast Cancer Awareness Month” or “Give me a Breast Cancer Awareness fact” in the U.S., you’ll receive facts from the Centers for Disease Control and Prevention (CDC) about the critical importance of early detection and mammography in improving prognoses and saving lives. From here on out, you can always turn to Assistant as a fast and reliable source of this information, not just during Breast Cancer Awareness Month.

A graphic shows someone asking Google Assistant for a Breast Cancer Awareness fact, followed by a response referencing the CDC’s guidance to lower one’s risk of breast cancer.

To reach even more people, during the month of October we’re also sharing more information about breast cancer in response to some of the most common questions people ask their Assistant every day — including “What’s up?” and “How are you?” Give it a try today in the U.S. on your home or mobile device.

Like so many, I’ve learned firsthand that our lives can change in a single, ordinary moment — with the discovery of a tumor you pray is benign, a diagnosis for which you hope there is a cure, the fear that the person you love may not celebrate another birthday or live to become a grandmother. While breast cancer took my mom's life far too soon, I can think of no greater gift to share in her memory than the reminder for other women to detect and treat their diseases early. Before they’re too aggressive to cure. Before they can circumvent even the strongest treatments. Before it’s too late.

For the millions who use Google Assistant, we want to make this information as easy to find as a simple, “Hey Google, how are you?” And by doing that, provide something just as meaningful: a place to start, and a glimmer of hope.

Ask a Techspert: How does Google Assistant understand your questions?

Talking to Google Assistant is a real “wow, we’re officially in the future” moment for me, often to the point that it makes me wonder: How do voice-activated virtual assistants work? Specifically, how do they understand what someone is asking, then provide a correct, useful and even delightful response? For instance, a few weeks ago, I was playing around with Assistant before getting to my actual question, which was, naturally, food-related. I said, “Hey Google, what’s your favorite food?” Assistant’s answer was swift: “I’m always hungry for knowledge,” it said. As the cherry on top, the written version that appeared as Assistant spoke had a fork and knife emoji at the end of the sentence.

Assistant can respond to so many different types of queries. Whether you’re curious about the biggest mammal in the world or if your favorite ice cream shop is open, chances are Assistant can answer that for you. And the team that works on Assistant is constantly thinking about how to make its responses better, faster and more helpful than ever. To learn more, I spoke with Distinguished Scientist Françoise Beaufays, an engineer and researcher on Google’s speech team, for a primer on how Assistant understands voice queries and then delivers satisfying (and often charming) answers.

Françoise, what exactly do you do at Google?

I lead the speech recognition team at Google. My job is to build speech recognition systems for all the products at Google that are powered by voice. The work my team does allows Assistant to hear its users, try to understand what its users want and then take action. It also lets us write captions on YouTube videos and in Meet as people speak and allows users to dictate text messages to their friends and family. Speech recognition technology is behind all of those experiences.

Why is it so key for speech recognition to work as well as possible with Assistant?

Assistant is based on understanding what someone said and then taking action based on that understanding. It’s critical that the interaction is smooth. You only choose to do something by voice, rather than with your fingers, if it provides a benefit. If you speak to a machine and you’re not confident it can understand you quickly, the delight disappears.

So how does the machine understand what you're asking? How did it learn to recognize spoken words in the first place?

Everything in speech recognition is machine learning. Machine learning is a type of technology where an algorithm is used to help a “model” learn from data. The way we build a speech recognition system is not by writing rules like: If someone is speaking and makes a sound “k” that lasts 10 to 30 milliseconds and then a sound “a” that lasts 50 to 80 milliseconds, maybe the person is about to say “cat.” Machine learning is more intelligent than that. So, instead, we would present a bunch of audio snippets to the model and tell the model, here, someone said, “This cat is happy.” Here, someone said, “That dog is tired.” Progressively, the model will learn the difference. And it will also understand variations of the original snippets, like “This cat is tired” or “This dog is not happy,” no matter who says it.

The models we use nowadays in Assistant to do this are deep neural networks.

What’s a deep neural network?

It’s a kind of model inspired by how the human brain works. Your brain uses neurons to share information and cause the rest of your body to act. In artificial neural networks, the “neurons” are what we call computational units, or bits of code that communicate with each other. These computational units are grouped into layers. These layers can stack on top of each other to create more complex possibilities for understanding and action. You end up with these “neural networks” that can get big and involved — hence, deep neural networks.
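The “layers stacked on top of each other” idea can be illustrated with a toy sketch in Python with NumPy (purely illustrative — nothing like Google’s production models): each layer is just a transformation whose output feeds the next layer.

```python
import numpy as np

def relu(x):
    # A common activation: each "neuron" fires only on positive input.
    return np.maximum(0, x)

rng = np.random.default_rng(0)

# Three stacked layers of "computational units": each layer is a weight
# matrix that transforms the previous layer's output into a new one.
layers = [rng.standard_normal((16, 8)),
          rng.standard_normal((8, 4)),
          rng.standard_normal((4, 2))]

def forward(features, layers):
    out = features
    for w in layers:
        out = relu(out @ w)  # each layer's output feeds the next layer
    return out

audio_features = rng.standard_normal(16)  # stand-in for one audio frame
print(forward(audio_features, layers).shape)  # (2,)
```

Stacking more layers between the 16-value input and the 2-value output is what makes the network “deep”: each added layer lets the model represent more complex patterns in the audio.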

For Assistant, a deep neural network can receive an input, like the audio of someone speaking, and process that information across a stack of layers to turn it into text. This is what we call “speech recognition.” Then, the text is processed by another stack of layers to parse it into pieces of information that help the Assistant understand what you need and help you by displaying a result or taking an action on your behalf. This is what we call “natural language processing.”

Got it. Let’s say I ask Assistant something pretty straightforward, like, “Hey Google, where's the closest dog park?” — how would Assistant understand what I'm saying and respond to my query?

The first step is for Assistant to process that “Hey Google” and realize, “Ah, it looks like this person is now speaking to me and wants something from me.”

Assistant picks up the rest of the audio, processes the question and gets text out of it. As it does that, it tries to understand what your sentence is about. What type of intention do you have?

To determine this, Assistant will parse the text of your question with another neural network that tries to identify the semantics, i.e. the meaning, of your question.

In this case, it will figure out that it's a question it needs to search for — it's not you asking to turn on your lights or anything like that. And since this is a location-based question, if your settings allow it, Assistant can send the geographic data of your device to Google Maps to return the results of which dog park is near you.

Then Assistant will sort its possible answers based on things like how sure it is that it understood you correctly and how relevant its various potential answers are. It will decide on the best answer, then provide it in the appropriate format for your device. It might be just a speaker, in which case it can give you spoken information. If you have a display in front of you, it could show you a map with walking directions.
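The flow Françoise describes — recognize the speech, parse the intent, then route to the right backend — can be sketched in a heavily simplified way. The function names below are hypothetical, and simple keyword rules stand in for the neural networks a real system would use.

```python
# Toy "Hey Google, where's the closest dog park?" flow: speech recognition
# turns audio into text, then a (here, rule-based) parser finds the intent.

def recognize_speech(audio: bytes) -> str:
    # Placeholder for the neural speech recognizer.
    return "where's the closest dog park"

def parse_intent(text: str) -> dict:
    if text.startswith(("where is", "where's")):
        # Location question: route to a maps-style lookup.
        return {"intent": "find_place", "query": text.split("closest ")[-1]}
    if text.startswith("turn on"):
        # Smart-home command: route to device control instead.
        return {"intent": "device_control", "target": text[len("turn on "):]}
    # Anything else falls through to general search.
    return {"intent": "search", "query": text}

text = recognize_speech(b"...")
print(parse_intent(text))  # {'intent': 'find_place', 'query': 'dog park'}
```

The branch taken here is the sketch’s version of “identifying the best interpretation”: a location question goes to a Maps-style lookup, a command goes to device control, and everything else falls back to Search.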

To make it a little more complicated: If I were to ask something a bit more ambiguous, like, “Hey Google, what is the most popular dog?” — how would it know if I meant dog breed, dog name or the most popular famous dog?

In the first example, Assistant has to understand that you’re looking for a location ("where is") and what you’re looking for ("a dog park"), so it makes sense to use Maps to help. In this case, Assistant would recognize it's a more open-ended question and call upon Search instead. What this really comes down to is identifying the best interpretation. One thing that is helpful is that Assistant can rank how satisfied previous users were with similar responses to similar questions — that can help it decide how certain it is of its interpretation. Ultimately, that question would go to Search, and the results would be proposed to you with whatever formatting is best for your device.

It’s also worth noting that there’s a group within the Assistant team that works on developing its personality, including by writing answers to common get-to-know-you questions like the one you posed about Assistant’s favorite food.

One other thing I’ve been wondering about is multi-language queries. If someone asks a question that has bits and bobs of different languages, how does Assistant understand them?

This is definitely more complicated. Roughly half of the world speaks more than one language. I’m a good example of this. I’m Belgian, and my husband is Italian. At home with my family, I speak Italian. But if I'm with just my kids, I may speak to them in French. At work, I speak English. I don't mind speaking English to my Assistant, even when I'm home. But I wouldn't speak to my husband in English because our language is Italian. Those are the kinds of conventions established in multilingual families.

The simplest way of tackling a case where the person speaks two languages is for Assistant to listen to a little bit of what they say and try to recognize which language they’re speaking. Assistant can do this using different models, each dedicated to understanding one specific language. Another way to do it is to train a model that can understand many languages at the same time. That’s the technology we're developing. In many cases, people switch from one language to the other within the same sentence. Having a single model that understands what those languages are is a great solution to that — it can pick up whatever comes to it.
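The two approaches can be sketched like this (all the “models” below are stand-in stubs; a real system would run neural recognizers on the audio):

```python
# (a) Detect the language first, then route to a dedicated per-language
#     recognizer.
# (b) A single multilingual model handles everything, including sentences
#     that switch languages midway.

PER_LANGUAGE_MODELS = {
    "en": lambda audio: "set a timer for ten minutes",
    "it": lambda audio: "imposta un timer di dieci minuti",
}

def detect_language(audio: bytes) -> str:
    # Placeholder: a real classifier listens to a short prefix of the audio.
    return "en"

def recognize_routed(audio: bytes) -> str:
    # Approach (a): language ID, then a dedicated model per language.
    lang = detect_language(audio)
    return PER_LANGUAGE_MODELS[lang](audio)

def recognize_multilingual(audio: bytes) -> str:
    # Approach (b): one model trained on many languages can transcribe
    # code-switched speech without an up-front language decision.
    return "set a timer per dieci minuti"

print(recognize_routed(b""))  # set a timer for ten minutes
```

The weakness of approach (a) is visible in the structure itself: the router commits to one language before transcribing, so a sentence that switches languages midway defeats it — which is why a single multilingual model is the better fit for that case.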


Google Assistant’s new updates make it easier than ever to get things done across devices

When we launched Google Assistant six years ago, we envisioned a world in which you could control lights and thermostats with your voice, naturally communicate with your devices in multiple languages, and simplify your daily tasks with voice controls and proactive reminders. Fast forward to today, and every month more than 700 million people in over 95 countries – and across 29 languages! – get things done reliably with their Assistant. As voice has become a primary way we engage with technology, Assistant makes it easy to get things done across different devices, whether you're at home or on the go.

Today at Made by Google, we shared some of our latest improvements to Assistant’s powerful AI capabilities, including new ways to interact more naturally with Assistant across Pixel 7 and the Google Pixel Watch. Here’s a look at a few of our favorites.

Use your voice to get things done faster and more easily

Send messages faster with Assistant voice typing

With Assistant voice typing, you can easily talk to Google to type, edit and send messages on average 2.5x faster than typing on the keyboard, and now in more languages: Spanish, Italian and French. It’s also getting more fun! When you’re writing a message, Assistant voice typing can now suggest emojis relevant to your message, and it lets you search by voice for the emoji you want to insert, even if you don’t know its exact name. Just say “LOL emoji” and Assistant will know what you mean.

Assistant voice typing with emoji suggestions and search

Discover more delightful calling and messaging experiences powered by Assistant

At Made by Google, we shared some ways Google is creating a more delightful calling and texting experience on Android-powered devices and how Assistant’s speech models are helping to make it even easier to communicate:

  • Call Screen helps you avoid unwanted calls, and it handled over 600 million calls for users last year.
  • Direct My Call, powered by Duplex, now shows call menu options right away, so you can tap to get where you need to go without waiting through lengthy recorded menus. It has helped Pixel users navigate over 50 million calls with businesses.
  • Voice Message Transcription lets you read audio messages in Google’s Messages app when you’re not in a great place to listen to one.

Launched last year on Pixel 6, quick phrases give you hands-free help on specific tasks, without needing to say “Hey Google.” On Pixel 7, you can now say “Silence” to dismiss incoming calls when you are not ready to pick up. And soon the Recorder app on Pixel 7 will include Speaker labels to differentiate and transcribe each speaker’s words separately, allowing you to capture meeting and interview notes with ease.

GIF of smartphone using Direct My Call feature

Direct My Call shows you call menu options right away, before they’re spoken.

Bringing proactive intelligence directly to your screen

At a Glance on Pixel helps you get what you need, when you need it – before you have to ask – right on your home or lock screen. If it’s going to rain or snow in your area in the next hour, At a Glance can proactively show you an update right on your phone so you can plan accordingly. Wondering if your package arrived? Get a video feed preview from your Nest doorbell. Traveling? Simply see flight or baggage information and your destination’s weather forecast.

Experience the best of Google Assistant on your new Pixel devices

With Assistant, you can use your voice in new and exciting ways on the sleek, new Pixel Watch. Take quick actions like sending messages, setting a timer, controlling your connected home devices and starting your run. Or when you’re wearing your Pixel Buds, you can say “Hey Google, play my workout playlist” to power through your cardio session. We also announced the Google Pixel Tablet, coming next year, which is designed to be helpful in your hand and in your home. With the Pixel Tablet and the Charging Speaker Dock, you can enjoy hands-free help from Assistant or a photo frame of your memories.

An image of Assistant working on a watch dial

Whether you’re using Assistant to send a message from your Pixel Watch, glancing at useful information on your Pixel 7 lock screen, or asking “Hey Google, find my phone” right from your watch, we want Assistant to be your go-to conversation helper. One that moves with you throughout your day – whether you’re at home or on the go – to make life easier and give you time to focus on what matters most.


Work Diary: a Google Assistant marketer in San Francisco

In our new Work Diary series, we show you what a day on the job is really like for Googlers with all sorts of roles and interests around the world. In this installment you'll hear from Seonah, who works on privacy and trust marketing for Google Assistant. Follow along with her day below, and be sure to watch her video diary, too.

Name: Seonah Iverson
Location: San Francisco
Time at Google: 1 year
Job title: Google Assistant Privacy & Trust Product Marketing Lead
What that role actually does: I help make Google Assistant more trusted and safe for the people who use our products.
What’s your favorite part of a typical work day? When I get the chance to hear from real users during research calls. It always inspires me to keep pushing our product to be the best that it can be.

7:15 a.m. → “Hey Google, good morning”

Yes, you guessed it, Google Assistant wakes me up with some alternative R&B playing from my Nest Hub Max Smart Display. I think the first words out of my mouth most mornings are “Hey Google, good morning,” which I’ve set up so my Assistant tells me the weather forecast, what’s on my calendar and news highlights from my favorite news outlets. I really try not to pull out my phone right away in the morning and dive straight into work emails, and this seriously helps.

8 a.m. → E-bike commute with a view

I work from home two days a week and from the office the rest; I love this flexible schedule. On the days I go into the office, I’m ready to get out of my apartment for a while and connect with my coworkers in person. Google food and coffee don’t hurt either. On the mornings I go in, I grab an e-bike in my neighborhood — North Beach — and take the Embarcadero cycling path to the office. The view is so nice (especially when it’s sunny!).

Two photos side-by-side: The first is a hand holding a cup of coffee in front of a window revealing the San Francisco bay and the Bay Bridge; the other is taken from the perspective of someone riding in the bike lane down a street lined with palm trees. On the right of the frame is an icon of a clock that says 8:00 a.m.; on the left there is an icon of a bicycle.

Seonah’s morning consists of an e-bike commute and coffee — both with great views.

8:30 a.m. → Prep for projects focused on protecting user privacy

When I get to the office, I grab an oat milk latte from the Flora Hub coffee bar on the 13th floor and start looking at my emails and calendar for the day. (This spot has the best views of the Bay — you can see all the way to the Bay Bridge.)

I start this part of the workday by taking inventory of my inbox and calendar and making any adjustments I need to — moving meetings or booking conference rooms, things like that. I also make sure I’ve blocked off at least one part of the day for me to go heads-down and get things done on my top projects. I think of this as my mental prep time.

This week, I’m focused on gathering key user insights from research and prepping for a product and messaging review — this helps our product team address top user concerns and explain things in a simple way that makes sense to everyone.

9 a.m. → Down to business!

My meeting blocks tend to start around 9 a.m., so I head to a conference room. For most of the morning, I’m in Google Meet calls with user focus groups to hear from real users on the privacy controls and settings they use most often or would like to better understand.

When I finish up with user research calls, I meet with Assistant product managers, engineers and other teammates located in New York, Atlanta and Mountain View. We talk about the upcoming privacy and safety settings improvements we’re planning to launch and how we can introduce the updates without disrupting our users’ experience — these meetings always spark good ideas and are key to moving projects forward.

12 p.m. → Cafe with teammates, by way of dooglers <3

Around noon I meet up with some of my fellow Assistant marketing teammates and we walk outside to get to the Maritime Social cafe. The best part is passing by the doogler area and seeing the pups playing!

I have a huge sweet tooth, so I always get dessert with lunch, whatever it is.

Two side-by-side photos: The first shows a grassy field with dogs and people; skyscrapers are in the background. The second shows a lunch table with a few people visible in the background and, in the foreground, multiple plates and bowls of different kinds of food. On the right side of the frame is an icon of a clock that reads 12:00 p.m.; on the left there is an icon of a plate of food and a salt and pepper shaker.

Seonah’s walk to lunch takes her past fellow Googlers — and a few dooglers.

1 p.m. → Boba and brainstorm

In the afternoon, I grab boba from a nearby cafe and meet up with the Google Asians in Marketing group. We get together in a conference room to talk about creating more representation and inclusion not just at Google, but also in the marketing industry in general. This is a volunteer project that I always enjoy participating in.

2 p.m. → Back into top privacy priorities and checking off tasks

I head to my desk and get back to daily tasks for the next couple of hours. Today, one thing I want to cross off my list is completing some writing and design work that explains the latest updates to Assistant privacy controls that will appear on our website and in our email newsletter to users. Part of this process includes making sure our explainer videos are up to date and translated appropriately for users worldwide.

I usually listen to music when I’m doing this — that’s how I get into a flow mindset and get the most done. I really try to balance meetings with tasks to make each day as productive as possible; I love creating Calendar tasks and crossing them off my list. Sometimes I'll even retroactively add them and cross them off! The mix of completed tasks and meetings on my calendar is my source of truth, keeping me accountable to the projects I’m spending time on and telling me whether I have room to take on any other stretch projects.

4 p.m. → Ahhhh, a quick chair massage break

When I feel like I need a pick-me-up, I take a break with a chair massage on the second floor, near the gym.

Two side-by-side photos: The first shows Seonah sitting in a chair working at her laptop. The second shows a laptop on a desk, open to a Google Calendar. There is an illustrated frame featuring a boba tea drink and a clock that reads 4 p.m.

Seonah finishes up her day and looks at her calendar for tomorrow.

4:30 p.m. → Time for inbox zero

Back to my laptop one last time to finish up daily work and respond to those last few emails. Personally, I subscribe to the inbox zero way of life, so I make sure to check that box before I head out for the day! Oh, and I water my desk plant if it’s looking wilty.

5 p.m. → Barre class and a walk home

Before heading home, I like to take a barre class nearby. Afterward, I’ll walk home and run some errands on the way. I’ll usually listen to my most recent playlist or a current events podcast, or call my family. I’m always listening to something if I’m walking around or sitting at my desk; part of my daily attire is a pair of headphones.

7 p.m. → Dinner and a show — or a Korean language lesson

To close out the day, it’s either date night, dinner with friends or cooking at home. Afterwards, I’ll usually watch some TV — or if I want to do something more engaging, I’ll practice my Korean (I just started taking language classes) or practice piano on my keyboard.

Two side-by-side photographs: the first of a Nest Hub sitting on a desk and the second of a person's hands over a keyboard. There is an illustrated frame featuring icons of an alarm clock and a stack of books.

Time to relax with some music and get ready for tomorrow.

The last to-do of the day: using Look and Talk, I look at my Nest Hub Max Smart Display and ask my Assistant to “set my alarm for 7:15 a.m. tomorrow.”

9 years later, Chromecast has way more — at a lower price

Since we launched it in 2013, we’ve introduced three generations of the original Chromecast, plus Chromecast Ultra for 4K and HDR support. Most recently, in 2020, we took our biggest leap yet with Chromecast with Google TV (4K).

Chromecast with Google TV comes with your favorite Chromecast features from over the years — plus the Google TV experience, which brings together movies, shows, live TV and more from across your apps and subscriptions and organizes them just for you.

Today, we’re expanding this lineup with the new Chromecast with Google TV (HD). We built this product with affordability in mind and to help bring all our favorite features of Chromecast and Google TV to more people than ever.

We’ve come pretty far since the original Chromecast launched 9 years ago. Here’s how we’ve evolved — and what you can expect from the new Chromecast with Google TV (HD).

More affordable than our original Chromecast

Chromecast originally launched as a $35 dongle that made it easy and inexpensive to bring your online entertainment to your TV. At the time, this was a big deal: you could browse the web, watch TV shows and movies, and listen to music simply by plugging it into the back of your TV and connecting the device to Wi-Fi.

Chromecast with Google TV (HD) is just $29.99 in the US, making it even more affordable than our original Chromecast. It brings more capabilities and intelligence to the Chromecast experience that people have loved for years.

More streaming options than ever, organized for you

In the beginning, Chromecast launched with just a handful of partners: Netflix, YouTube, Google Play Movies and Google Play Music. Nine years later, there are 10,000+ apps to choose from with Google TV, from HBO Max and Disney+ to Prime Video, and we continue to add new content all the time. Google TV’s content offerings go beyond entertainment as well — you can work out with Peloton right from their app, for example.

And with more content choices than ever before, Chromecast with Google TV has helped reinvent what simple and easy content discovery on your TV looks like. It’s the home for your entertainment, bringing together movies, shows, live TV and more so you can find what to watch without jumping from app to app.

Three friends sit on a couch in the living room watching soccer on the TV with Chromecast with Google TV.

Google smarts built-in

In 2013, you could use your phone, tablet or laptop with Chromecast to browse and cast content to your TV, play and pause, control the volume and more. Chromecast brought a broad range of content to your big screen, from sharing your family photos to enjoying a video clip from your favorite news site with the press of a button.

Today, with new streaming services and apps launching all the time, there is so much content, and choosing something to watch has become harder than ever. That’s why we built Google TV.

And since launching Google TV two years ago, we’ve continued to make updates to the experience, like introducing profiles for everyone in your family to help resolve common complaints like, “Why am I getting recommendations for cartoons when I only want thrillers?” Profiles also give parents a dedicated place for their kids to watch family-friendly content.

Plus, Google Assistant has a dedicated button on the included remote, so you can easily find something to watch — “Play 'House of the Dragon' on HBO Max” — or ask everyday questions like, “How’s the traffic to work?” Or, when it’s movie night, you can see your front door on the big screen with a Nest Doorbell to keep tabs on your pizza delivery.

Two people use Chromecast with Google TV to see a live view of their front door where someone waits with a delivery.

Plug in and play

The original Chromecast was a small dongle for your TV that was designed to get out of the way. It introduced the foundational casting experience through the apps people already knew on their smartphones: just open a supported app, press the Cast icon, and sit back and enjoy.

We’ve kept that same spirit with all of our Chromecast devices, and Chromecast with Google TV (HD) is no exception. It comes in the same compact and thin design as the Chromecast with Google TV (4K), tucks neatly behind your TV, and setup is fast and simple.

And of course, you’ll still have access to your favorite Chromecast features like casting from your phone, sharing your Google Photos to your TV, and casting your Google Meet video calls to TV, so you can join the team meeting or lecture from the comfort of your couch.

Chromecast with Google TV (HD) streams in high definition with 1080p HDR, and we’ve made software optimizations behind the scenes to make sure you get a smooth and snappy experience no matter what TV you’re watching on.

Starting today, Chromecast with Google TV (HD) is available for $29.99 in our classic Snow color, and is available in 19 countries now, with more regions coming soon. For full details on availability, check out our Google Store help page.

And right now, people who buy Chromecast with Google TV (HD) will get 6 months of Peacock Premium, so you can watch hit movies and shows, exclusive Originals, WWE, extended live sports, and more.

From day one with Chromecast, we wanted to create an easy solution that worked for everyone, for every TV in the house. Nine years later, that mission hasn’t changed.