What’s new with Google TV & Android TV OS

Shobana Radhakrishnan, Senior Director of Engineering - Google TV

Paul Lammertsma, Developer Relations Engineer

Image of Android and Google TV Iconography

Today, there is more entertainment content available than ever before. In fact, our research shows a third of U.S. households now watch more than 25 hours of TV every week. As the role of TV continues to evolve, it’s our goal to build a tailored TV experience that gives users easy access to the entertainment they love.

We’re excited about the future of Android TV OS, now with over 110 million monthly active devices, including millions of Google TVs. Android TV and Google TV are available through over 300 partners worldwide, including 7 of the 10 largest smart TV OEMs and over 170 pay TV operators. And thanks to the hard work of our developer community, there are more than 10,000 apps available on TV, with more being added every day.

Since last year’s I/O, we’ve continued our commitment to enable you to build better and more engaging experiences on Android TV OS. In addition to platform updates, new features, like expanded integrations with the Live tab, offer opportunities for users to better engage with your content. And if you haven’t begun using the Watch Next API, take a moment to learn how to add it to your app to make your content more discoverable and accessible.

Today, we are introducing new features and tools in Android 13 that focus on performance & quality, accessibility, and multitasking.

  • Performance & quality: To help you build for the next generation of TVs, we’re introducing new APIs to help you better detect a user’s settings and give them the best experience for their device. AudioManager allows your app to anticipate audio routes and precisely understand which playback mode is available. Integrating your app correctly with MediaSession allows Android TV to react to HDMI state changes in order to save power and signal that content should be paused.
  • Accessibility: To improve how users interact with their TV, we’ve added support for different keyboard layouts in the InputDevice API. Game developers can also reference keys by their physical location to support different layouts of physical keyboards, such as QWERTZ and AZERTY keyboards. A new system-wide accessibility preference also allows users to enable audio descriptions across apps.
  • Multitasking: TVs are now used for more than just watching media content. In fact, we often see users taking calls or monitoring cameras in a smart home. To help with multitasking, Android 13 will support picture-in-picture on TV using the same APIs as core Android. Picture-in-picture on the TV supports an expanded mode to show more videos from a group call, a docked mode to avoid overlaying content on other apps, and a keep-clear API to prevent overlays from concealing important content in full-screen apps.
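To illustrate the keyboard-layout support above: on Android 13, a game can call `InputDevice.getKeyCodeForKeyLocation()` to bind an action to a physical key position rather than a key label. Since that API needs a real device, the sketch below models the same idea in plain Kotlin with a hypothetical layout table (the `Layout` enum and `labelAtWPosition` map are illustrative, not part of the Android API): "move forward" stays on the key at the QWERTY "W" position, whatever letter that key produces on the user's layout.

```kotlin
// Sketch of location-based key binding, assuming a hypothetical layout table.
// On a real Android 13 device this lookup is done by
// InputDevice.getKeyCodeForKeyLocation(KeyEvent.KEYCODE_W).

enum class Layout { QWERTY, QWERTZ, AZERTY }

// Label printed on the key that sits at the physical QWERTY "W" position.
val labelAtWPosition = mapOf(
    Layout.QWERTY to 'W',
    Layout.QWERTZ to 'W',  // QWERTZ swaps Y and Z, so W is unchanged
    Layout.AZERTY to 'Z',  // AZERTY places Z where QWERTY has W
)

// The "move forward" action is bound to a position, not a letter.
fun forwardKeyLabel(layout: Layout): Char = labelAtWPosition.getValue(layout)

fun main() {
    Layout.values().forEach { l ->
        println("On $l, 'move forward' is the ${forwardKeyLabel(l)} key")
    }
}
```

The point of the location-based lookup is that the binding never needs per-layout configuration in the game itself; the platform (here, the table) answers "what does the key at this position produce?".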
Image of Google TV interface with picture of Dune showing

Android 13 Beta for TV is available now, allowing you to test your apps and provide feedback on the latest release. Thank you for your continued support of Android TV OS. We can’t wait to see what amazing and innovative things you continue to build for the big screen.

What’s new with Android for Cars

Posted by Jennifer Chui, Technical Program Manager and Rod Lopez, Product Manager

animated car dashboard 

At Google, our work in cars has always been guided by our vision of creating safe and seamless connected experiences. This work would not be possible without developers like you. We’re excited to share some of our combined accomplishments from this past year, and introduce new updates that will make it easier for you to provide users with an even better experience in the car.

Android Auto continues to grow and scale, with compatible vehicles now numbering over 150 million worldwide. An increasing number are also wirelessly compatible, and with the newly introduced Motorola MA1 adapter, even more drivers now have access to a wireless experience. In addition, our new design for Android Auto brings split-screen functionality to every screen, keeping navigation and media front and center while also providing room for prominent notification widgets.

View of the Android Automotive dashboard 

Android Automotive OS with Google built-in also has exciting updates. Beyond the continued expansion of carmakers bringing more car models to the market, we’ve also been hard at work enabling more parked experiences that take advantage of the large screens many AAOS cars offer. From more video streaming apps like Epix Now and Tubi to future features like browsing and cast, there’s much to look forward to. And since minimal effort is required to translate your large-screen tablet apps into a parked car experience, it’s now easier than ever to reach users in the car.

View of the Android Automotive dashboard 

We know that developing for cars can be complex, which is why we’re focused on making developing across Android for Cars as easy as possible. We’ve seen strong momentum with our Car App Library with over 200 apps published to date, and beyond enriching the navigation feature set with version 1.3, we’re also excited to share that all developers can now publish apps in supported categories directly to production for both Android Auto and Android Automotive OS. We’ve also created new templates and expanded our supported app categories, adding driver apps like Lyft to the navigation category, and replacing the parking and charging categories with a comprehensive point of interest (POI) category to include apps like MochiMochi and Fuelio.

We’re also introducing several new features to help you build more powerful media apps on Android Auto. Media recommendations, working side by side with Google Assistant, help users easily discover and quickly play relevant content based on their preferred music provider at the click of a button. To surface recommendations from your app, integrate with this API.

For long-form content such as podcasts and audiobooks, you can now introduce a progress bar that shows how much of the content the user has previously listened to. And with our new single item styling API, you can now assign content items individually as either list or grid, rather than categorically, to easily combine them in the same content space.
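As a rough sketch of the progress-bar integration: an Android Auto media app attaches completion extras to a browse item's MediaDescription, using constants from `androidx.media.utils.MediaConstants`. The key strings and status value below are placeholders standing in for those constants, and a plain map stands in for an `android.os.Bundle`, so the logic runs anywhere; only the clamping arithmetic is load-bearing.

```kotlin
// Sketch: marking a podcast episode as partially played so the car UI can
// draw a progress bar. Placeholder keys; the real integration uses
// androidx.media.utils.MediaConstants on the MediaItem's description extras.

val KEY_COMPLETION_STATUS = "completion_status"          // placeholder key
val KEY_COMPLETION_PERCENTAGE = "completion_percentage"  // placeholder key
val STATUS_PARTIALLY_PLAYED = 1                          // placeholder value

// Fraction of the episode already heard, clamped to [0.0, 1.0] so a
// stale resume position past the end can't overflow the bar.
fun completionFraction(playedMs: Long, durationMs: Long): Double =
    if (durationMs <= 0) 0.0
    else (playedMs.toDouble() / durationMs).coerceIn(0.0, 1.0)

// Extras to attach to the media item's description (a Bundle on Android).
fun progressExtras(playedMs: Long, durationMs: Long): Map<String, Any> = mapOf(
    KEY_COMPLETION_STATUS to STATUS_PARTIALLY_PLAYED,
    KEY_COMPLETION_PERCENTAGE to completionFraction(playedMs, durationMs),
)

fun main() {
    // 27 minutes heard of a 60-minute episode.
    println(progressExtras(27 * 60_000L, 60 * 60_000L))
}
```

The single-item styling mentioned above works the same way: a per-item hint in the description extras tells the template whether to render that item as a list row or a grid tile, so both can share one content space.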

View of the Android Automotive dashboard 

We’re grateful to have you on the journey with us as we seek to create safer, more seamless connected experiences in cars. Be sure to check out our Google I/O technical session above, and as always, you can get help from the developer community at Stack Overflow using the android-automotive and android-auto tags. We can’t wait to see what you build next, and where the road takes you.

Chrome Beta for Android Update

Hi everyone! We've just released Chrome Beta 102 (102.0.5005.50) for Android. It's now available on Google Play.

You can see a partial list of the changes in the Git log. For details on new features, check out the Chromium blog, and for details on web platform updates, check here.

If you find a new issue, please let us know by filing a bug.

Erhu Akpobaro
Google Chrome

YouTube receives brand safety distinction for second year

At YouTube, we’re committed to protecting our viewers, creators and advertisers. Last year, we became the first digital platform to receive content-level brand safety accreditation from the Media Rating Council (MRC). Today, the MRC has given us that accreditation again, making YouTube the only platform to hold this distinction. This is a testament to the investments we’ve made in responsibility, YouTube's top priority.

“We congratulate Google for this noteworthy achievement,” says George W. Ivie, Executive Director and CEO of the MRC. “Brand safety and suitability are critical issues in today’s digital ad environment, and MRC’s accreditation of YouTube, first granted last year and continued today, remains a landmark achievement in providing marketers with strong assurances that their advertising investments on the YouTube platform are being well protected.”

As part of this accreditation, the MRC extensively audited our content review systems, including the machine learning technology that analyzes content uploaded to our platform and the policies that determine which videos on YouTube are eligible to run ads. The MRC auditors also met with our brand safety personnel on site to review our processes and dug into how we protect our global community — including our procedures for evaluating content across different languages. The accreditation also recognized YouTube’s advertiser safety error rate, a metric authorized by the Global Alliance for Responsible Media (GARM) which evaluates the total percentage of ad impressions that run across violative content.

“We’re thrilled to see YouTube take another industry-leading step in their continued accreditation with MRC this year,” says Robert Rakowitz, Initiative Lead, GARM. “With this latest certification, YouTube fulfills a key request from advertisers and agencies in having an audit oversight body approve a core metric on the safety of their monetization practices. This is a step to celebrate and a further demonstration of YouTube’s commitment to GARM’s mission.”

Our continued accreditation confirms that our strategy and systems are keeping pace with the current environment. And it builds on our commitment to remaining at least 99% effective at ensuring brand safety of advertising placements on YouTube, in accordance with industry standards.

In addition to working with the MRC and GARM to raise the bar on brand safety, we’re also improving brand suitability. Over the past two years, we’ve worked directly with advertisers and agencies to better understand their needs and develop a set of best practices, such as anchoring on YouTube’s inventory modes and reassessing whether they should exclude certain types of content. When advertisers knew how to better navigate our suitability controls, they experienced performance benefits ranging from increased reach and view-through rates to decreased cost-per-view.

We’re now using these best practices and customer feedback to evolve our suitability offering. This will include intuitive controls, more consistency across all Google inventory and clarity on how controls may impact ad campaigns. We’ll share more details in the coming months.

“Better suitability controls allow advertisers to access and support more diverse content and audiences in a brand-safe way,” says Luis Di Como, EVP, Global Media, Unilever. “Unilever has long championed a responsible and safe online environment, and we are encouraged by YouTube’s commitment to create a positive digital ecosystem that is safe and inclusive for all.”

By extending the rigor of our brand safety systems to our suitability solutions, we hope to continue to help advertisers tap into the full scale and potential of YouTube.

New ways to stay connected and entertained in your car

Our work in cars has always been guided by our goal to help make your driving experience easier and safer. Today, we’re introducing several updates for cars compatible with Android Auto and cars with Google built-in to help you stay connected and entertained while enhancing your experience on the road.

A brand-new look for Android Auto

Since it first launched, Android Auto has expanded to support more than 150 million cars across nearly every car brand. And over the years, we’ve found there are three main functionalities that drivers prioritize in their cars: navigation, media and communication. This summer, Android Auto will roll out a brand new interface that will help you get directions faster, control your media more easily and have more functionality at your fingertips.

Car dashboard with display showcasing new Android Auto design in different screen sizes

Built to adapt to any screen size

With split screen mode, now standard across all screen types and sizes, you’ll have access to your most-used features all in one place — no need to return to your home screen or scroll through a list of apps. With your navigation and media always on, you won’t have to worry about missing your next turn while changing your favorite commute podcast. And with the new design able to adapt to different screen sizes, it looks great across widescreen, portrait and more.

New features for Android Auto

Google Assistant is bringing contextual suggestions to help you be more productive in the car. From suggested replies to messages, to sharing arrival times with a friend, or even playing recommended music, Google Assistant is helping you do more in the car, efficiently.

In addition to using your voice, you can now quickly message and call favorite contacts with just one tap, and reply to messages by simply selecting a suggested response on the screen – helping you communicate effectively, while allowing you to keep your eyes on the road. Keep an eye out for these updates to Android Auto in the coming months.

Stay connected and entertained with Google built-in

Cars with Google built-in often come with large displays, and we’re continuing to build new experiences for those displays while your car is parked. We previously announced we’re bringing YouTube to cars with Google built-in and more video streaming apps will join the queue, including Tubi and Epix Now. So, when you’re parked waiting for your car to charge or at curbside pickup, you’ll be able to enjoy video directly from your car display.

As we work to add more capabilities to cars with Google built-in in the future, you’ll be able to not only browse the web directly from your car display, but also cast your own content from your phone to your car screen.

Car dashboard with display showcasing Tubi

Enjoy video content directly from your car’s screen while parked

Across Android Auto and cars with Google built-in, we’re working hard to ensure every drive is a helpful and connected experience.

100 things we announced at I/O

And that’s a wrap on I/O 2022! We returned to our live keynote event, packed in more than a few product surprises, showed off some experimental projects and… actually, let’s just dive right in. Here are 100 things we announced at I/O 2022.

Gear news galore

Pixel products grouped together on a white background. Products include Pixel Bud Pro, Google Pixel Watch and Pixel phones.
  1. Let’s start at the very beginning — with some previews. We showed off a first look at the upcoming Pixel 7 and Pixel 7 Pro, powered by the next version of Google Tensor.
  2. We showed off an early look at Google Pixel Watch! It’s our first-ever all-Google built watch: 80% recycled stainless steel, Wear OS, Fitbit integration, Assistant access…and it’s coming this fall.
  3. Fitbit is coming to Google Pixel Watch. More experiences built for your wrist are coming later this year from apps like Deezer and SoundCloud.
  4. Later this year, you’ll start to see more devices powered with Wear OS from Samsung, Fossil Group, Montblanc and others.
  5. Google Assistant is coming soon to the Samsung Galaxy Watch 4 series.
  6. The new Pixel Buds Pro use Active Noise Cancellation (ANC), a feature powered by a custom 6-core audio chip and Google algorithms to put the focus on your music — and nothing else.
  7. Silent Seal™ helps Pixel Buds Pro adapt to the shape of your ear, for better sound. Later this year, Pixel Buds Pro will also support spatial audio to put you in the middle of the action when watching a movie or TV show with a compatible device and supported content.
  8. They also come in new colors: Charcoal, Fog, Coral and Lemongrass. Ahem, multiple colors — the Pixel Buds Pro have a two-tone design.
  9. With Multipoint connectivity, Pixel Buds Pro can automatically switch between your previously paired Bluetooth devices — including compatible laptops, tablets, TVs, and Android and iOS phones.
  10. Plus, the earbuds and their case are water-resistant.
  11. …And you can preorder them on July 21.
  12. Then there’s the brand new Pixel 6a, which comes with the full Material You experience.
  13. The new Pixel 6a has the same Google Tensor processor and hardware security architecture with Titan M2 as the Pixel 6 and Pixel 6 Pro.
  14. It also has dual rear cameras — main and ultrawide lenses.
  15. You’ve got three Pixel 6a color options: Chalk, Charcoal and Sage. The options keep going if you pair it with one of the new translucent cases.
  16. It costs $449 and will be available for pre-order on July 21.
  17. We also showed off an early look at the upcoming Pixel tablet, which we’re aiming to make available next year.

Android updates

18. In the last year, over 1 billion new Android phones have been activated.

19. You’ll no longer need to grant location to apps to enable Wi-Fi scanning in Android 13.

20. Android 13 will automatically delete your clipboard history after a short time to preemptively block apps from seeing old copied information.

21. Android 13’s new photo picker lets you select the exact photos or videos you want to grant access to, without needing to share your entire media library with an app.

22. You’ll soon be able to copy a URL or picture from your phone, and paste it on your tablet in Android 13.

23. Android 13 allows you to select different language preferences for different apps.

24. The latest Android OS will also require apps to get your permission before sending you notifications.

25. And later this year, you’ll see a new Security & Privacy settings page with Android 13.

26. Google’s Messages app already has half a billion monthly active users with RCS, a new standard that enables you to share high-quality photos, see typing indicators, message over Wi-Fi and get a better group messaging experience.

27. Messages is getting a public beta of end-to-end encryption for group conversations.

28. Early earthquake warnings are coming to more high-risk regions around the world.

29. On select headphones, you’ll soon be able to automatically switch audio between the devices you’re listening on with Android.

30. Stream and use messaging apps from your Android phone to laptop with Chromebook’s Phone Hub, and you won’t even have to install any apps.

31. Google Wallet is here! It’s a new home for things like your student ID, transit tickets, vaccine card, and credit and debit cards.

32. You can even use Google Wallet to hold your Walt Disney World park pass.

33. Google Wallet is coming to Wear OS, too.

34. Improved app experiences are coming for Android tablets: YouTube Music, Google Maps and Messages will take advantage of the extra screen space, and more apps coming soon include TikTok, Zoom, Facebook, Canva and many others.

Developer deep dive

Illustration depicting a smart home, with lights, thermostat, television, screen and mobile device.

35. The Google Home and Google Home Mobile software developer kit (SDK) for Matter will be launching in June as developer previews.

36. The Google Home SDK introduces Intelligence Clusters, which make intelligence features, like Home and Away, available to developers.

37. Developers can even create QR codes for Google Wallet to create their own passes for any use case they’d like.

38. Matter support is coming to the Nest Thermostat.

39. The Google Home Developer Center has lots of updates to check out.

40. There’s now built-in support for Matter on Android, so you can use Fast Pair to quickly connect Matter-enabled smart home devices to your network, Google Home and other accompanying apps in just a few taps.

41. The ARCore Geospatial API makes Google Maps’ Live View technology available to developers for free. Companies like Lime are using it to help people find parking spots for their scooters and save time.

42. DOCOMO and Curiosity are using the ARCore Geospatial API to build a new game that lets you fend off virtual dragons with robot companions in front of iconic Tokyo landmarks, like the Tokyo Tower.

43. AlloyDB is a new, fully-managed PostgreSQL-compatible database service designed to help developers manage enterprise database workloads — in our performance tests, it’s more than four times faster for transactional workloads and up to 100 times faster for analytical queries than standard PostgreSQL.

44. AlloyDB uses the same infrastructure building blocks that power large-scale products like YouTube, Search, Maps and Gmail.

45. Google Cloud’s machine learning cluster powered by Cloud TPU v4 Pods is super powerful — in fact, we believe it’s the world’s largest publicly available machine learning hub in terms of compute power…

46. …and it operates at 90% carbon-free energy.

47. We also announced a preview of Cloud Run jobs, which reduces the time developers spend running administrative tasks like database migration or batch data transformation.

48. We announced Flutter 3.0, which will enable developers to publish production-ready apps to six platforms at once, from one code base (Android, iOS, web, Windows, macOS and Linux).

49. To help developers build beautiful Wear apps, we announced the beta of Jetpack Compose for Wear OS.

50. We’re making it faster and easier for developers to build modern, high-quality apps with new Live edit features in Android Studio.

Help for the home

GIF of a man baking cookies with a speech bubble saying “Set a timer for 10 minutes.” His Google Nest Hub Max responds with a speech bubble saying “OK, 10 min. And that’s starting…now.”

51. Many Nest Devices will become Matter controllers, which means they can serve as central hubs to control Matter-enabled devices both locally and remotely from the Google Home app.

52. Works with Hey Google is now Works with Google Home.

53. The new home.google is your new hub for finding out everything you can do with your Google Home system.

54. Nest Hub Max is getting Look and Talk, where you can simply look at your device to ask a question without saying “Hey Google.”

55. Look and Talk works when Voice Match and Face Match recognize that it’s you.

56. And video from Look and Talk interactions is processed entirely on-device, so it isn’t shared with Google or anyone else.

57. Look and Talk is opt-in. Oh, and FYI, you can still say “Hey Google” whenever you want!

58. Want to learn more about it? Just say “Hey Google, what is Look and Talk?” or “Hey Google, how do you enable Look and Talk?”

59. We’re also expanding quick phrases to Nest Hub Max, so you can skip saying “Hey Google” for some of your most common daily tasks – things like “set a timer for 10 minutes” or “turn off the living room lights.”

60. You can choose the quick phrases you want to turn on.

61. Your quick phrases will work when Voice Match recognizes it’s you.

62. And looking ahead, Assistant will be able to better understand the imperfections of human speech without getting tripped up — including the pauses, “umms” and interruptions — making your interactions feel much closer to a natural conversation.

Taking care of business

Animated GIF demonstrating portrait light, bringing studio-quality lighting effects to Google Meet.

63. Google Meet video calls will now look better thanks to portrait restore and portrait light, which use AI and machine learning to improve quality and lighting on video calls.

64. Later this year we’re scaling the phishing and malware protections that guard Gmail to Google Docs, Sheets and Slides.

65. Live sharing is coming to Google Meet, meaning users will be able to share controls and interact directly within the meeting, whether it’s watching an icebreaker video from YouTube or sharing a playlist.

66. Automated built-in summaries are coming to Spaces so you can get a helpful digest of conversations to catch up quickly.

67. De-reverberation for Google Meet will filter out echoes in spaces with hard surfaces, giving you conference-room audio quality whether you’re in a basement, a kitchen, or a big empty room.

68. Later this year, we're bringing automated transcriptions of Google Meet meetings to Google Workspace, so people can catch up quickly on meetings they couldn't attend.

Apps for on-the-go

A picture of London in immersive view.

69. Google Wallet users will be able to check the balance of transit passes and top up within Google Maps.

70. Google Translate added 24 new languages.

71. As part of this update, Indigenous languages of the Americas (Quechua, Guarani and Aymara) and an English dialect (Sierra Leonean Krio) have also been added to Translate for the first time.

72. Google Translate now supports a total of 133 languages used around the globe.

73. These are the first languages we’ve added using Zero-resource Machine Translation, where a machine learning model only sees monolingual text — meaning, it learns to translate into another language without ever seeing an example.

74. Google Maps’ new immersive view is a whole new way to explore so you can see what an area truly looks and feels like.

75. Immersive view will work on nearly any phone or tablet; you don’t need the fanciest or newest device.

76. Immersive view will first be available in L.A., London, New York, San Francisco and Tokyo — with more places coming soon.

77. Last year we launched eco-friendly routing in the U.S. and Canada. Since then, people have used it to travel 86 billion miles, which saved more than half a million metric tons of carbon emissions — that’s like taking 100,000 cars off the road.

78. And we’re expanding eco-friendly routing to more places, like Europe.

All in on AI

Ten circles in a row, ranging from dark to light.

The 10 shades of the Monk Skin Tone Scale.

79. A team at Google Research partnered with Harvard’s Dr. Ellis Monk to openly release the Monk Skin Tone Scale, a new tool for measuring skin tone that can help build more inclusive products.

80. Google Search will use the Monk Skin Tone Scale to make it easier to find more relevant results — for instance, if you search for “bridal makeup,” you’ll see an option to filter by skin tone so you can refine to results that meet your needs.

81. Oh, and the Monk Skin Tone Scale was used to evaluate a new set of Real Tone filters for Photos that are designed to work well across skin tones. These filters were created and tested in partnership with artists like Kennedi Carter and Joshua Kissi.

82. We’re releasing LaMDA 2, as a part of the AI Test Kitchen, a new space to learn, improve, and innovate responsibly on this technology together.

83. PaLM is a new language model that can solve complex math word problems, and even explain its thought process, step-by-step.

84. Nest Hub Max’s new Look and Talk feature uses six machine learning models to process more than 100 signals in real time to detect whether you’re intending to make eye contact with your device to talk to Google Assistant, and not just giving it a passing glance.

85. We recently launched multisearch in the Google app, which lets you search by taking a photo and asking a question at the same time. At I/O, we announced that later this year, you'll be able to take a picture or screenshot and add "near me" to get local results from restaurants, retailers and more.

86. We introduced you to an advancement called “scene exploration,” where in the future, you’ll be able to use multisearch to pan your camera and instantly glean insights about multiple objects in a wider scene.

Privacy, security and information

A GIF that shows someone’s Google account with a yellow alert icon, flagging recommended actions they should take to secure their account.

87. We’ve expanded our support for Project Shield to protect the websites of 200+ Ukrainian government agencies, news outlets and more.

88. Account Safety Status will add a simple yellow alert icon to flag actions you should take to secure your Google Account.

89. Phishing protections in Google Workspace are expanding to Docs, Slides and Sheets.

90. My Ad Center is now giving you even more control over the ads you see on YouTube, Search, and your Discover feed.

91. Virtual cards are coming to Chrome and Android this summer, adding an additional layer of security and eliminating the need to enter certain card details at checkout.

92. In the coming months, you’ll be able to request removal of Google Search results that have your contact info with an easy-to-use tool.

93. We announced Protected Computing, a toolkit that helps minimize your data footprint, de-identify your data and restrict access to your sensitive data.

94. On-device encryption is now available for Google Password Manager.

95. We’re continuing to auto enroll people in 2-Step Verification to reduce phishing risks.

What else?!

Illustration of a black one-story building with large windows. Inside are people walking around wooden tables and white walls containing Google hardware products. There is a Google Store logo on top of the building.

96. A new Google Store is opening in Williamsburg.

97. This is our first “neighborhood store” — it’s in a more intimate setting that highlights the community. You can find it at 134 N 6th St., opening on June 16.

98. The store will feature an installation by Brooklyn-based artist Olalekan Jeyifous.

99. Visitors there can picture everyday life with Google products through interactive displays that show how our hardware and services work together, and even get hands-on help with devices from Google experts.

100. We showed a prototype of what happens when we bring technologies like transcription and translation to your line of sight.

Helping you build across devices, platforms, and the world

Posted by Jeanine Banks, VP & General Manager of Developer X & Head of Developer Relations

We’re thrilled to be back at the Shoreline Amphitheatre hosting Google I/O this week. It’s great to connect with you all from around the world virtually and in person.

I/O is our love letter to you, the developer. Developers are the engine that enables the information revolution. But more than that, it’s developers who turn information and ideas into code that powers the way we learn, work, communicate and play.

A few decades ago, building a digital experience meant publishing a static website and reaching thousands of people on their desktops. Today, it means a lightning-fast, interactive experience across browsers, desktops, phones, tablets, virtual assistants, TVs, gaming consoles, cars, watches, and more. People expect new features faster than ever, all while we respect and uphold the highest standards for privacy and safety.

To help you deal with the complexity and rising expectations, we want to bring simplicity to the challenges you face. This week at I/O, we shared the beginning of a long-term effort to connect our developer products to work even better together, and provide more guidance and best practices to optimize your end-to-end workflow. Here are just a few highlights of what we announced in the developer keynote:

  • The new ARCore Geospatial API, that lets you place AR content at real-world locations in 87 countries without physically being there.
  • Modern Android Development for the best experiences on any screen, including new Jetpack Compose support for Wear OS and tablets, an upgrade to Android Studio with Live Edit, and much more.
  • Chrome DevTools’ new Performance Insights panel and support coming in WebAssembly for managed programming languages like Dart, Java, and Kotlin.
  • Flutter 3, our open source multi-platform UI framework, now supports six platforms for building beautiful applications from a single code base.
  • Firebase Crashlytics seamlessly integrated across Android Studio, Flutter, and Google Play for consistent and actionable crash reporting.
  • Cloud Run jobs to execute batch data transformation, administrative tasks or scheduled jobs, and AlloyDB for PostgreSQL, our new fully managed, relational database that’s more than 4x faster than standard PostgreSQL for transactional workloads.
  • Exciting research in AI-assisted coding and the AI for Code (AI4Code) challenge on Kaggle in partnership with X, the moonshot factory.

Watch the developer keynote or this recap video for a fuller taste of what's new this year across many of our platforms, including Android, ARCore, Chrome OS, Cloud, Flutter, Firebase, Google Play, Kaggle, Machine Learning, and Web Platform.

Whether you are looking to build your first app, expand what your products can do, or leverage ML easily and responsibly, we hope you will be inspired by the vast space in front of you to make your ideas a reality and make people’s lives better.

Make connections that Matter in Google Home

We’re entering a new era of the smart home built on openness and collaboration — one where you should have no problem using devices from different smart home brands to turn on your lights, warm up your living room and set your morning alarm. All of them should work together in harmony.

Matter, the new smart home industry standard we developed with other leading technology companies, is making this possible. Whether you’re shopping for or building your own smart home devices, let’s take a closer look at how Matter can help you make more connections with Google products and beyond when it launches later this year.

Connect your favorite smart home brands

When you buy a Matter-enabled device, the set-up process will be quick and consistent. In just a few taps, you can easily link it to your home network, another smart home ecosystem and your favorite apps. Support for Matter through Fast Pair on Android makes it as easy as connecting a new pair of headphones. And because Matter devices connect and communicate locally over Wi-Fi and Thread, a wireless mesh networking technology, they’re more reliable and responsive — reducing lag and potential connection interruptions.

To help you get ready for Matter, we’ll update many Google Nest devices to be Matter controllers. This will let you connect all your Matter-enabled devices to Google Home, and control them both locally and remotely with the Google Home app, smart home controls on your Android phone or Google Assistant. Matter controllers will include the original Google Home speaker, Google Home Mini, Nest Mini, Nest Hub (1st and 2nd gen), Nest Hub Max, Nest Audio and Nest Wifi.

Meanwhile, Nest Wifi, Nest Hub Max and Nest Hub (2nd gen) will also serve as Thread border routers, allowing you to connect devices built with Thread — like backyard lights that need long-range connectivity — to your home network.

We’ve also rolled out a new Google Home site to help you explore everything you can do with your Google Home in one spot. You can discover thousands of smart home devices that work with Google Home and learn how to get the most out of your helpful home — including automated routines to make everyday life easier, safer and more convenient.

To make it easier to find products that work great with Google Home, we're updating our “Works with” partner program. Works with Hey Google is now Works with Google Home. Partner devices that carry this badge have gone the extra mile to build high-quality experiences with Google using Matter or our existing integrations. It’ll take some time for all our partners to start using the new badge — but if you spot either of these badges on a smart home product, you’ll know they easily connect with Google and our home control features like routines, voice control through Google Assistant devices and Android home controls.

Build more connected smart home devices

Developers, take note: With Matter, there’s no need to build multiple versions of a smart home device to work across different ecosystems. You’ll only have to build once, and that device will work right away with Google Home and other smart home brands. This means you can spend less time building multiple connectivity paths, and more time innovating and delivering devices and features.

To help you do that, we’ve launched a new Google Home Developer Center that brings together all our resources for developers and businesses. You can start learning today how to build smart home devices and Android apps with Matter, discover new features to integrate into your devices and explore marketing resources to help grow your business. You’ll also find new community support tools for device makers building with Google Home.

On June 30, we’ll launch the Google Home Developer Console, including two new software development kits (SDKs) to make it easy to build Matter devices and apps. The Google Home Device SDK is the fastest way to start building Matter devices. This SDK will also introduce Intelligence Clusters, which will share Google Intelligence — starting with Home & Away Routines — with developers who meet certain security and privacy requirements.

The new Google Home Mobile SDK will make it easy to build apps that connect directly with Matter devices using new built-in connectivity support in Android. This makes the set-up process simpler, more consistent and reliable for Android users. And with connectivity taken care of, developers can spend more time building unique features and experiences.

We can’t wait to see how you use Matter, Thread and Google Home to build and create the smart home experience that best suits you. Check out home.google and developers.home.google.com to learn more and sign up for future updates.

New delegated VirusTotal privilege in the Alert Center

What’s changing 

In 2021, we announced an integration between the Alert Center and VirusTotal. At that time, any admin who had the Alert Center privilege could access all VirusTotal reports. Now, we’ve added the ability for admins to control who can view VirusTotal reports.

Important note: Once this feature is rolled out in your domain, some admins may lose access to VirusTotal. If so, super admins will have to re-provision access by going to Admin Privileges > View VirusTotal Reports.


Who’s impacted 

Admins 


Why you’d use it 

This change will help ensure only those with proper privileges can view VirusTotal reports regarding sensitive data. The VirusTotal integration provides an added layer of investigation on top of existing alerts, empowering admins to take a deeper look into threats and potential abuse, helping them better protect their organization and data. Visit the Help Center to learn more about using VirusTotal reports in the Alert Center.


Additional details 

VirusTotal provides an investigation layer on top of alerts but isn’t being used directly for detection or alerting. No customer information is shared from Google to VirusTotal. 


Availability 

  • Available to Google Workspace Business Plus, Enterprise Standard, Enterprise Plus, Education Fundamentals and Education Plus customers 
  • Not available to Google Workspace Essentials, Business Starter, Business Standard, Enterprise Essentials, Frontline, and Nonprofits, as well as G Suite Basic and Business customers 


Introducing the Google Wallet API

Posted by Petra Cross, Engineer, Google Wallet and Jose Ugia, Google Developer Relations Engineer

Google Pay API for Passes is now called Google Wallet API

Formerly known as Google Pay API for Passes, the Google Wallet API lets you digitize everything from boarding passes to loyalty programs, and engage your customers with notifications and real-time updates.

New features in Google Wallet API

Support for Generic Pass Type

The Google Pay API for Passes supported 7 types of passes: offers, loyalty cards, gift cards, event tickets, boarding passes, transit tickets and vaccine cards. But what if you want to issue passes or cards that do not fit into any of these categories, such as membership cards, or insurance cards?

We are thrilled to announce support for generic passes in the Google Wallet API so you can customize your pass objects to adapt to your program characteristics. The options are endless. If it is a card and has some text, a barcode or a QR code, it can be saved as a generic card.

You now have the flexibility to control the look and design of the card itself, by providing a card template that can contain up to 3 rows with 1-3 fields per row. You can also configure a number of attributes such as the barcode, QR code, or a hero image. Check out our documentation to learn more about how to create generic passes.
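As an illustration, here is how a generic pass object might be assembled before being sent to the API. This is only a sketch: the issuer ID is hypothetical, and the field names (`cardTitle`, `header`, `barcode`) are assumptions drawn from the generic pass documentation, so check the current API reference before relying on them.

```python
import json

# Hypothetical issuer ID; yours is assigned in the Google Pay & Wallet Console.
ISSUER_ID = "3388000000000000000"

# Sketch of a generic pass object for a membership card.
# Field names are assumptions; verify against the Google Wallet API reference.
generic_object = {
    "id": f"{ISSUER_ID}.member-12345",
    "classId": f"{ISSUER_ID}.membership-card",
    "cardTitle": {"defaultValue": {"language": "en-US", "value": "Example Gym"}},
    "header": {"defaultValue": {"language": "en-US", "value": "Alex Smith"}},
    "barcode": {"type": "QR_CODE", "value": "member-12345"},
}

print(json.dumps(generic_object, indent=2))
```

The object ID and class ID pair the card with a pass class you define once per program; the barcode value here simply reuses the member number.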

While generic passes can be used to mimic the appearance of any existing supported pass type (such as a loyalty card), we recommend you continue to use specialized pass types when available. For example, when you use the boarding pass type for boarding passes, your users receive flight delay notifications.

Grouping passes and mixing pass types

With the new Google Wallet API, you can also group passes to offer a better experience to your users when multiple passes are needed. For example, you can group the entry ticket, a parking pass, and food vouchers for a concert.

In their list of passes, users see a pass tile with a badge showing the number of items in the group. When they tap this tile, a carousel with all passes appears, allowing them to easily swipe between all passes in the group.


Here is an example JSON Web Token payload showing one offer and one event ticket, mixed together and sharing the same groupingId. Later, if you need to add or remove passes to/from the group, you can use the REST API to update the grouping information.

{
  "iss": "OWNER_EMAIL_ADDRESS",
  "aud": "google",
  "typ": "savetowallet",
  "iat": "UNIX_TIME",
  "origins": [],
  "payload": {
    "offerObjects": [
      {
        "classId": "YOUR_ISSUER_ID.OFFER_CLASS_ID",
        "id": "YOUR_ISSUER_ID.OFFER_ID",
        "groupingInfo": {
          "groupingId": "groupId1",
          "sortIndex": 2
        }
      }
    ],
    "eventTicketObjects": [
      {
        "classId": "YOUR_ISSUER_ID.EVENT_CLASS_ID",
        "id": "YOUR_ISSUER_ID.EVENT_ID",
        "groupingInfo": {
          "groupingId": "groupId1",
          "sortIndex": 1
        }
      }
    ]
  }
}
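The same claim set can also be assembled programmatically. The Python sketch below mirrors the JSON above; the `grouped` helper is purely illustrative, and in a real integration you would sign these claims with your service-account key before embedding them in a Save to Google Wallet link.

```python
import json

# Placeholder, as in the payload above.
ISSUER_ID = "YOUR_ISSUER_ID"

def grouped(object_id, class_id, group_id, sort_index):
    """Build a pass object that joins a group via groupingId.

    Passes sharing the same groupingId are shown together in one carousel;
    sortIndex controls their order within it. (Illustrative helper only.)
    """
    return {
        "classId": class_id,
        "id": object_id,
        "groupingInfo": {"groupingId": group_id, "sortIndex": sort_index},
    }

# Unsigned JWT claim set mixing an offer and an event ticket in one group.
claims = {
    "iss": "OWNER_EMAIL_ADDRESS",
    "aud": "google",
    "typ": "savetowallet",
    "iat": "UNIX_TIME",
    "origins": [],
    "payload": {
        "offerObjects": [
            grouped(f"{ISSUER_ID}.OFFER_ID", f"{ISSUER_ID}.OFFER_CLASS_ID",
                    "groupId1", 2),
        ],
        "eventTicketObjects": [
            grouped(f"{ISSUER_ID}.EVENT_ID", f"{ISSUER_ID}.EVENT_CLASS_ID",
                    "groupId1", 1),
        ],
    },
}

print(json.dumps(claims, indent=2))
```

Because the event ticket carries sortIndex 1 and the offer sortIndex 2, the ticket appears first in the group's carousel even though the offer is listed first in the payload.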


A note about Google Pay API for Passes:

Although we are introducing the Google Wallet API, all existing developer integrations with the previous Google Pay API for Passes will continue to work. When the Google Wallet app launches in just a few weeks, make sure to use the new “Add to Google Wallet” button in the updated button guidelines.

We’re really excited to build a great digital wallet experience with you, and can’t wait to see how you use the Google Wallet API to enhance your user experience.

Learn more