New ways to stay connected and entertained in your car

Our work in cars has always been guided by our goal to help make your driving experience easier and safer. Today, we’re introducing several updates for cars compatible with Android Auto and cars with Google built-in to help you stay connected and entertained while enhancing your experience on the road.

A brand-new look for Android Auto

Since it first launched, Android Auto has expanded to support more than 150 million cars across nearly every car brand. And over the years, we’ve found there are three main functionalities that drivers prioritize in their cars: navigation, media and communication. This summer, Android Auto will roll out a brand-new interface that will help you get directions faster, control your media more easily and have more functionality at your fingertips.

Car dashboard with display showcasing new Android Auto design in different screen sizes

Built to adapt to any screen size

With split screen mode, now standard across all screen types and sizes, you’ll have access to your most-used features all in one place — no need to return to your home screen or scroll through a list of apps. With your navigation and media always on, you won’t have to worry about missing your next turn while changing your favorite commute podcast. And with the new design able to adapt to different screen sizes, it looks great across widescreen, portrait and more.

New features for Android Auto

Google Assistant is bringing contextual suggestions to help you be more productive in the car. From suggesting message replies, to sharing your arrival time with a friend, to playing recommended music, Google Assistant is helping you do more in the car efficiently.

In addition to using your voice, you can now quickly message and call favorite contacts with just one tap, and reply to messages by simply selecting a suggested response on the screen – helping you communicate effectively, while allowing you to keep your eyes on the road. Keep an eye out for these updates to Android Auto in the coming months.

Stay connected and entertained with Google built-in

Cars with Google built-in often come with large displays, and we’re continuing to build new experiences for those displays while your car is parked. We previously announced we’re bringing YouTube to cars with Google built-in and more video streaming apps will join the queue, including Tubi and Epix Now. So, when you’re parked waiting for your car to charge or at curbside pickup, you’ll be able to enjoy video directly from your car display.

As we work to add more capabilities to cars with Google built-in in the future, you’ll be able to not only browse the web directly from your car display, but also cast your own content from your phone to your car screen.

Car dashboard with display showcasing Tubi

Enjoy video content directly from your car’s screen while parked

Across Android Auto and cars with Google built-in, we’re working hard to ensure every drive is a helpful and connected experience.

100 things we announced at I/O

And that’s a wrap on I/O 2022! We returned to our live keynote event, packed in more than a few product surprises, showed off some experimental projects and… actually, let’s just dive right in. Here are 100 things we announced at I/O 2022.

Gear news galore

Pixel products grouped together on a white background. Products include Pixel Bud Pro, Google Pixel Watch and Pixel phones.
  1. Let’s start at the very beginning — with some previews. We showed off a first look at the upcoming Pixel 7 and Pixel 7 Pro, powered by the next version of Google Tensor.
  2. We showed off an early look at Google Pixel Watch! It’s our first-ever all-Google-built watch: 80% recycled stainless steel, Wear OS, Fitbit integration, Assistant access…and it’s coming this fall.
  3. Fitbit is coming to Google Pixel Watch. More experiences built for your wrist are coming later this year from apps like Deezer and Soundcloud.
  4. Later this year, you’ll start to see more devices powered with Wear OS from Samsung, Fossil Group, Montblanc and others.
  5. Google Assistant is coming soon to the Samsung Galaxy Watch 4 series.
  6. The new Pixel Buds Pro use Active Noise Cancellation (ANC), a feature powered by a custom 6-core audio chip and Google algorithms to put the focus on your music — and nothing else.
  7. Silent Seal™ helps Pixel Buds Pro adapt to the shape of your ear, for better sound. Later this year, Pixel Buds Pro will also support spatial audio to put you in the middle of the action when watching a movie or TV show with a compatible device and supported content.
  8. They also come in new colors: Charcoal, Fog, Coral and Lemongrass. Ahem, multiple colors — the Pixel Buds Pro have a two-tone design.
  9. With Multipoint connectivity, Pixel Buds Pro can automatically switch between your previously paired Bluetooth devices — including compatible laptops, tablets, TVs, and Android and iOS phones.
  10. Plus, the earbuds and their case are water-resistant.
  11. …And you can preorder them on July 21.
  12. Then there’s the brand new Pixel 6a, which comes with the full Material You experience.
  13. The new Pixel 6a has the same Google Tensor processor and hardware security architecture with Titan M2 as the Pixel 6 and Pixel 6 Pro.
  14. It also has dual rear cameras — main and ultrawide lenses.
  15. You’ve got three Pixel 6a color options: Chalk, Charcoal and Sage. The options keep going if you pair it with one of the new translucent cases.
  16. It costs $449 and will be available for pre-order on July 21.
  17. We also showed off an early look at the upcoming Pixel tablet, which we’re aiming to make available next year.

Android updates

18. In the last year, over 1 billion new Android phones have been activated.

19. You’ll no longer need to grant location to apps to enable Wi-Fi scanning in Android 13.

20. Android 13 will automatically delete your clipboard history after a short time to preemptively block apps from seeing old copied information.

21. Android 13’s new photo picker lets you select the exact photos or videos you want to grant access to, without needing to share your entire media library with an app.

22. You’ll soon be able to copy a URL or picture from your phone, and paste it on your tablet in Android 13.

23. Android 13 allows you to select different language preferences for different apps.

24. The latest Android OS will also require apps to get your permission before sending you notifications.

25. And later this year, you’ll see a new Security & Privacy settings page with Android 13.

26. Google’s Messages app already has half a billion monthly active users with RCS, a new standard that enables you to share high-quality photos, see typing indicators, message over Wi-Fi and get a better group messaging experience.

27. Messages is getting a public beta of end-to-end encryption for group conversations.

28. Early earthquake warnings are coming to more high-risk regions around the world.

29. On select headphones, you’ll soon be able to automatically switch audio between the devices you’re listening on with Android.

30. Stream and use messaging apps from your Android phone to laptop with Chromebook’s Phone Hub, and you won’t even have to install any apps.

31. Google Wallet is here! It’s a new home for things like your student ID, transit tickets, vaccine card, credit cards and debit cards.

32. You can even use Google Wallet to hold your Walt Disney World park pass.

33. Google Wallet is coming to Wear OS, too.

34. Improved app experiences are coming for Android tablets: YouTube Music, Google Maps and Messages will take advantage of the extra screen space, and more apps coming soon include TikTok, Zoom, Facebook, Canva and many others.

Developer deep dive

Illustration depicting a smart home, with lights, thermostat, television, screen and mobile device.

35. The Google Home and Google Home Mobile software developer kit (SDK) for Matter will be launching in June as developer previews.

36. The Google Home SDK introduces Intelligence Clusters, which make intelligence features like Home and Away available to developers.

37. Developers can even create QR codes for Google Wallet to create their own passes for any use case they’d like.

38. Matter support is coming to the Nest Thermostat.

39. The Google Home Developer Center has lots of updates to check out.

40. There’s now built-in support for Matter on Android, so you can use Fast Pair to quickly connect Matter-enabled smart home devices to your network, Google Home and other accompanying apps in just a few taps.

41. The ARCore Geospatial API makes Google Maps’ Live View technology available to developers for free. Companies like Lime are using it to help people find parking spots for their scooters and save time.

42. DOCOMO and Curiosity are using the ARCore Geospatial API to build a new game that lets you fend off virtual dragons with robot companions in front of iconic Tokyo landmarks, like the Tokyo Tower.

43. AlloyDB is a new, fully-managed PostgreSQL-compatible database service designed to help developers manage enterprise database workloads — in our performance tests, it’s more than four times faster for transactional workloads and up to 100 times faster for analytical queries than standard PostgreSQL.

44. AlloyDB uses the same infrastructure building blocks that power large-scale products like YouTube, Search, Maps and Gmail.

45. Google Cloud’s machine learning cluster powered by Cloud TPU v4 Pods is super powerful — in fact, we believe it’s the world’s largest publicly available machine learning hub in terms of compute power…

46. …and it operates at 90% carbon-free energy.

47. We also announced a preview of Cloud Run jobs, which reduces the time developers spend running administrative tasks like database migration or batch data transformation.

48. We announced Flutter 3.0, which will enable developers to publish production-ready apps to six platforms at once, from one code base (Android, iOS, web, Windows, macOS and Linux).

49. To help developers build beautiful Wear apps, we announced the beta of Jetpack Compose for Wear OS.

50. We’re making it faster and easier for developers to build modern, high-quality apps with the new Live Edit feature in Android Studio.

Help for the home

GIF of a man baking cookies with a speech bubble saying “Set a timer for 10 minutes.” His Google Nest Hub Max responds with a speech bubble saying “OK, 10 min. And that’s starting…now.”

51. Many Nest Devices will become Matter controllers, which means they can serve as central hubs to control Matter-enabled devices both locally and remotely from the Google Home app.

52. Works with Hey Google is now Works with Google Home.

53. The new home.google is your hub for finding out everything you can do with your Google Home system.

54. Nest Hub Max is getting Look and Talk, where you can simply look at your device to ask a question without saying “Hey Google.”

55. Look and Talk works when Voice Match and Face Match recognize that it’s you.

56. And video from Look and Talk interactions is processed entirely on-device, so it isn’t shared with Google or anyone else.

57. Look and Talk is opt-in. Oh, and FYI, you can still say “Hey Google” whenever you want!

58. Want to learn more about it? Just say “Hey Google, what is Look and Talk?” or “Hey Google, how do you enable Look and Talk?”

59. We’re also expanding quick phrases to Nest Hub Max, so you can skip saying “Hey Google” for some of your most common daily tasks – things like “set a timer for 10 minutes” or “turn off the living room lights.”

60. You can choose the quick phrases you want to turn on.

61. Your quick phrases will work when Voice Match recognizes it’s you.

62. And looking ahead, Assistant will be able to better understand the imperfections of human speech without getting tripped up — including the pauses, “umms” and interruptions — making your interactions feel much closer to a natural conversation.

Taking care of business

Animated GIF demonstrating portrait light, bringing studio-quality lighting effects to Google Meet.

63. Google Meet video calls will now look better thanks to portrait restore and portrait light, which use AI and machine learning to improve quality and lighting on video calls.

64. Later this year we’re scaling the phishing and malware protections that guard Gmail to Google Docs, Sheets and Slides.

65. Live sharing is coming to Google Meet, meaning users will be able to share controls and interact directly within the meeting, whether it’s watching an icebreaker video from YouTube or sharing a playlist.

66. Automated built-in summaries are coming to Spaces so you can get a helpful digest of conversations to catch up quickly.

67. De-reverberation for Google Meet will filter out echoes in spaces with hard surfaces, giving you conference-room audio quality whether you’re in a basement, a kitchen, or a big empty room.

68. Later this year, we're bringing automated transcriptions of Google Meet meetings to Google Workspace, so people can catch up quickly on meetings they couldn't attend.

Apps for on-the-go

A picture of London in immersive view.

69. Google Wallet users will be able to check the balance of transit passes and top up within Google Maps.

70. Google Translate added 24 new languages.

71. As part of this update, Indigenous languages of the Americas (Quechua, Guarani and Aymara) and an English dialect (Sierra Leonean Krio) have also been added to Translate for the first time.

72. Google Translate now supports a total of 133 languages used around the globe.

73. These are the first languages we’ve added using Zero-resource Machine Translation, where a machine learning model only sees monolingual text — meaning, it learns to translate into another language without ever seeing an example.

74. Google Maps’ new immersive view is a whole new way to explore so you can see what an area truly looks and feels like.

75. Immersive view will work on nearly any phone or tablet; you don’t need the fanciest or newest device.

76. Immersive view will first be available in L.A., London, New York, San Francisco and Tokyo — with more places coming soon.

77. Last year we launched eco-friendly routing in the U.S. and Canada. Since then, people have used it to travel 86 billion miles, which saved more than half a million metric tons of carbon emissions — that’s like taking 100,000 cars off the road.

78. And we’re expanding eco-friendly routing to more places, like Europe.

All in on AI

Ten circles in a row, ranging from dark to light.

The 10 shades of the Monk Skin Tone Scale.

79. A team at Google Research partnered with Harvard’s Dr. Ellis Monk to openly release the Monk Skin Tone Scale, a new tool for measuring skin tone that can help build more inclusive products.

80. Google Search will use the Monk Skin Tone Scale to make it easier to find more relevant results — for instance, if you search for “bridal makeup,” you’ll see an option to filter by skin tone so you can refine to results that meet your needs.

81. Oh, and the Monk Skin Tone Scale was used to evaluate a new set of Real Tone filters for Photos that are designed to work well across skin tones. These filters were created and tested in partnership with artists like Kennedi Carter and Joshua Kissi.

82. We’re releasing LaMDA 2 as part of the AI Test Kitchen, a new space to learn, improve, and innovate responsibly on this technology together.

83. PaLM is a new language model that can solve complex math word problems, and even explain its thought process step by step.

84. Nest Hub Max’s new Look and Talk feature uses six machine learning models to process more than 100 signals in real time to detect whether you’re intending to make eye contact with your device to talk to Google Assistant, rather than just giving it a passing glance.

85. We recently launched multisearch in the Google app, which lets you search by taking a photo and asking a question at the same time. At I/O, we announced that later this year, you'll be able to take a picture or screenshot and add "near me" to get local results from restaurants, retailers and more.

86. We introduced you to an advancement called “scene exploration,” where in the future, you’ll be able to use multisearch to pan your camera and instantly glean insights about multiple objects in a wider scene.

Privacy, security and information

A GIF that shows someone’s Google account with a yellow alert icon, flagging recommended actions they should take to secure their account.

87. We’ve expanded our support for Project Shield to protect the websites of 200+ Ukrainian government agencies, news outlets and more.

88. Account Safety Status will add a simple yellow alert icon to flag actions you should take to secure your Google Account.

89. Phishing protections in Google Workspace are expanding to Docs, Slides and Sheets.

90. My Ad Center is now giving you even more control over the ads you see on YouTube, Search, and your Discover feed.

91. Virtual cards are coming to Chrome and Android this summer, adding an additional layer of security and eliminating the need to enter certain card details at checkout.

92. In the coming months, you’ll be able to request removal of Google Search results that have your contact info with an easy-to-use tool.

93. We introduced Protected Computing, a toolkit that helps minimize your data footprint, de-identify your data and restrict access to your sensitive data.

94. On-device encryption is now available for Google Password Manager.

95. We’re continuing to auto-enroll people in 2-Step Verification to reduce phishing risks.

What else?!

Illustration of a black one-story building with large windows. Inside are people walking around wooden tables and white walls containing Google hardware products. There is a Google Store logo on top of the building.

96. A new Google Store is opening in Williamsburg.

97. This is our first “neighborhood store” — it’s in a more intimate setting that highlights the community. You can find it at 134 N 6th St., opening on June 16.

98. The store will feature an installation by Brooklyn-based artist Olalekan Jeyifous.

99. Visitors there can picture everyday life with Google products through interactive displays that show how our hardware and services work together, and even get hands-on help with devices from Google experts.

100. We showed a prototype of what happens when we bring technologies like transcription and translation to your line of sight.

Helping you build across devices, platforms, and the world

Posted by Jeanine Banks, VP & General Manager of Developer X & Head of Developer Relations

We’re thrilled to be back at the Shoreline Amphitheatre hosting Google I/O this week. It’s great to connect with you all from around the world virtually and in person.

I/O is our love letter to you, the developer. Developers are the engine that drives the information revolution. But more than that, it’s developers who turn information and ideas into code that powers the way we learn, work, communicate, and play.

A few decades ago, building a digital experience meant publishing a static website and reaching thousands of people on their desktops. Today, it means a lightning-fast, interactive experience across browsers, desktops, phones, tablets, virtual assistants, TVs, gaming consoles, cars, watches, and more. People expect new features faster than ever -- all while we respect and uphold the highest standards for privacy and safety.

To help you deal with the complexity and rising expectations, we want to bring simplicity to the challenges you face. This week at I/O, we shared the beginning of a long-term effort to connect our developer products to work even better together, and provide more guidance and best practices to optimize your end-to-end workflow. Here are just a few highlights of what we announced in the developer keynote:

  • The new ARCore Geospatial API, which lets you place AR content at real-world locations in 87 countries without physically being there.
  • Modern Android Development for the best experiences on any screen, including new Jetpack Compose support for WearOS and tablets, an upgrade to Android Studio with Live Edit, and much more.
  • Chrome DevTools’ new Performance Insights panel and support coming in WebAssembly for managed programming languages like Dart, Java, and Kotlin.
  • Flutter 3, our open source multi-platform UI framework, now supports six platforms for building beautiful applications from a single code base.
  • Firebase Crashlytics seamlessly integrated across Android Studio, Flutter, and Google Play for consistent and actionable crash reporting.
  • Cloud Run jobs to execute batch data transformation, administrative tasks or scheduled jobs, and AlloyDB for PostgreSQL, our new fully managed, relational database that’s more than 4x faster than standard PostgreSQL for transactional workloads.
  • Exciting research in AI-assisted coding and the AI for Code (AI4Code) challenge on Kaggle in partnership with X, the moonshot factory.

Watch the developer keynote or this recap video to get a fuller taste of what's new this year across many of our platforms including Android, ARCore, Chrome OS, Cloud, Flutter, Firebase, Google Play, Kaggle, Machine Learning, and Web Platform:

Whether you are looking to build your first app, expand what your products can do, or leverage ML easily and responsibly, we hope you will be inspired by the vast space in front of you to make your ideas a reality and make people’s lives better.

Make connections that Matter in Google Home

We’re entering a new era of the smart home built on openness and collaboration — one where you should have no problem using devices from different smart home brands to turn on your lights, warm up your living room and set your morning alarm. All of them should work together in harmony.

Matter, the new smart home industry standard we developed with other leading technology companies, is making this possible. Whether you’re shopping for or building your own smart home devices, let’s take a closer look at how Matter can help you make more connections with Google products and beyond when it launches later this year.

Connect your favorite smart home brands

When you buy a Matter-enabled device, the set-up process will be quick and consistent. In just a few taps, you can easily link it to your home network, another smart home ecosystem and your favorite apps. Support for Matter through Fast Pair on Android makes it as easy as connecting a new pair of headphones. And because Matter devices connect and communicate locally over Wi-Fi and Thread, a wireless mesh networking technology, they’re more reliable and responsive — reducing lag and potential connection interruptions.

To help you get ready for Matter, we’ll update many Google Nest devices to be Matter controllers. This will let you connect all your Matter-enabled devices to Google Home, and control them both locally and remotely with the Google Home app, smart home controls on your Android phone or Google Assistant. Matter controllers will include the original Google Home speaker, Google Home Mini, Nest Mini, Nest Hub (1st and 2nd gen), Nest Hub Max, Nest Audio and Nest Wifi.

Meanwhile, Nest Wifi, Nest Hub Max and Nest Hub (2nd gen) will also serve as Thread border routers, allowing you to connect devices built with Thread — like backyard lights that need long-range connectivity — to your home network.

We’ve also rolled out a new Google Home site to help you explore everything you can do with your Google Home in one spot. You can discover thousands of smart home devices that work with Google Home and learn how to get the most out of your helpful home — including automated routines to make everyday life easier, safer and more convenient.

To make it easier to find products that work great with Google Home, we're updating our “Works with” partner program. Works with Hey Google is now Works with Google Home. Partner devices that carry this badge have gone the extra mile to build high-quality experiences with Google using Matter or our existing integrations. It’ll take some time for all our partners to start using the new badge — but if you spot either of these badges on a smart home product, you’ll know they easily connect with Google and our home control features like routines, voice control through Google Assistant devices and Android home controls.

Build more connected smart home devices

Developers, take note: With Matter, there’s no need to build multiple versions of a smart home device to work across different ecosystems. You’ll only have to build once, and that device will work right away with Google Home and other smart home brands. This means you can spend less time building multiple connectivity paths, and more time innovating and delivering devices and features.

To help you do that, we’ve launched a new Google Home Developer Center that brings together all our resources for developers and businesses. You can start learning today how to build smart home devices and Android apps with Matter, discover new features to integrate into your devices and explore marketing resources to help grow your business. You’ll also find new community support tools for device makers building with Google Home.

On June 30, we’ll launch the Google Home Developer Console, including two new software development kits (SDKs) to make it easy to build Matter devices and apps. The Google Home Device SDK is the fastest way to start building Matter devices. This SDK will also introduce Intelligence Clusters, which will share Google Intelligence — starting with Home & Away Routines — with developers who meet certain security and privacy requirements.

The new Google Home Mobile SDK will make it easy to build apps that connect directly with Matter devices using new built-in connectivity support in Android. This makes the set-up process simpler, more consistent and reliable for Android users. And with connectivity taken care of, developers can spend more time building unique features and experiences.

We can’t wait to see how you use Matter, Thread and Google Home to build and create the smart home experience that best suits you. Check out home.google and developers.home.google.com to learn more and sign up for future updates.

New delegated VirusTotal privilege in the Alert Center

What’s changing 

In 2021, we announced an integration between the Alert Center and VirusTotal. At that time, any admin who had the Alert Center privilege could access all VirusTotal reports. Now, we’ve added the ability for admins to control who can view VirusTotal reports. 




Important note: Once this feature is rolled out in your domain, some admins may lose access to VirusTotal. If so, super admins will have to re-provision access by going to Admin Privileges > View VirusTotal Reports.


Who’s impacted 

Admins 


Why you’d use it 

This change will help ensure only those with proper privileges can view VirusTotal reports regarding sensitive data. The VirusTotal integration provides an added layer of investigation on top of existing alerts, empowering admins to take a deeper look into threats and potential abuse, helping them better protect their organization and data. Visit the Help Center to learn more about using VirusTotal reports in the Alert Center.


Additional details 

VirusTotal provides an investigation layer on top of alerts but isn’t being used directly for detection or alerting. No customer information is shared from Google to VirusTotal. 


Availability 

  • Available to Google Workspace Business Plus, Enterprise Standard, Enterprise Plus, Education Fundamentals and Education Plus customers 
  • Not available to Google Workspace Essentials, Business Starter, Business Standard, Enterprise Essentials, Frontline, and Nonprofits, as well as G Suite Basic and Business customers 


Introducing the Google Wallet API

Posted by Petra Cross, Engineer, Google Wallet and Jose Ugia, Google Developer Relations Engineer

Google Pay API for Passes is now called Google Wallet API

Formerly known as Google Pay API for Passes, the Google Wallet API lets you digitize everything from boarding passes to loyalty programs, and engage your customers with notifications and real-time updates.

New features in Google Wallet API

Support for Generic Pass Type

The Google Pay API for Passes supported 7 types of passes: offers, loyalty cards, gift cards, event tickets, boarding passes, transit tickets and vaccine cards. But what if you want to issue passes or cards that do not fit into any of these categories, such as membership cards or insurance cards?

We are thrilled to announce support for generic passes in the Google Wallet API, so you can customize your pass objects to match your program’s characteristics. The options are endless. If it is a card and has some text, a barcode or a QR code, it can be saved as a generic card.

You now have the flexibility to control the look and design of the card itself, by providing a card template that can contain up to 3 rows with 1-3 fields per row. You can also configure a number of attributes such as the barcode, QR code, or a hero image. Check out our documentation to learn more about how to create generic passes.
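To illustrate, here is a sketch of a generic pass object for a hypothetical membership card. The barcode and heroImage fields shown follow the generic object schema in the Wallet API documentation, and every value is a placeholder, not a real issuer or image:

```json
{
  "id": "ISSUER_ID.OBJECT_ID",
  "classId": "ISSUER_ID.CLASS_ID",
  "cardTitle": {
    "defaultValue": { "language": "en", "value": "Gym Membership" }
  },
  "header": {
    "defaultValue": { "language": "en", "value": "Alex McJacobs" }
  },
  "barcode": { "type": "QR_CODE", "value": "MEMBER-12345" },
  "heroImage": {
    "sourceUri": { "uri": "https://example.com/hero.png" }
  }
}
```

The row layout itself (up to 3 rows with 1-3 fields each) is configured on the pass class rather than the object, so the same class template applies to every card you issue.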

While generic passes can be used to mimic the appearance of any existing supported pass type (such as a loyalty card), we recommend you continue to use specialized pass types when available. For example, when you use the boarding pass type for boarding passes, your users receive flight delay notifications.

Grouping passes and mixing pass types

With the new Google Wallet API, you can also group passes to offer a better experience to your users when multiple passes are needed. For example, you can group the entry ticket, a parking pass, and food vouchers for a concert.

In their list of passes, your users see a pass tile with a badge showing the number of items in the group. When they tap on this tile, a carousel with all passes appears, allowing them to easily swipe between all passes in the group.


Here is an example JSON Web Token payload showing one offer and one event ticket, mixed together and sharing the same groupingId. Later, if you need to add or remove passes to/from the group, you can use the REST API to update the grouping information.

{
  "iss": "OWNER_EMAIL_ADDRESS",
  "aud": "google",
  "typ": "savetowallet",
  "iat": "UNIX_TIME",
  "origins": [],
  "payload": {
    "offerObjects": [
      {
        "classId": "YOUR_ISSUER_ID.OFFER_CLASS_ID",
        "id": "YOUR_ISSUER_ID.OFFER_ID",
        "groupingInfo": {
          "groupingId": "groupId1",
          "sortIndex": 2
        }
      }
    ],
    "eventTicketObjects": [
      {
        "classId": "YOUR_ISSUER_ID.EVENT_CLASS_ID",
        "id": "YOUR_ISSUER_ID.EVENT_ID",
        "groupingInfo": {
          "groupingId": "groupId1",
          "sortIndex": 1
        }
      }
    ]
  }
}
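Because the grouping fields are plain JSON, the claims can be assembled programmatically before being signed into a JSON Web Token with your issuer credentials. A minimal sketch in Python (the helper name is illustrative and the signing step is omitted; only the payload shape above is taken from the API):

```python
# Sketch: build save-to-wallet claims that group an offer and an event
# ticket under one groupingId. Signing these claims into a JWT with your
# issuer service-account key is a separate step, not shown here.

def grouped_pass_claims(issuer_email, issuer_id, offer_id, event_id, group_id):
    def grouping(sort_index):
        # Both objects share the same groupingId; sortIndex orders the carousel.
        return {"groupingId": group_id, "sortIndex": sort_index}

    return {
        "iss": issuer_email,
        "aud": "google",
        "typ": "savetowallet",
        "origins": [],
        "payload": {
            "offerObjects": [{
                "classId": f"{issuer_id}.OFFER_CLASS_ID",
                "id": f"{issuer_id}.{offer_id}",
                "groupingInfo": grouping(2),
            }],
            "eventTicketObjects": [{
                "classId": f"{issuer_id}.EVENT_CLASS_ID",
                "id": f"{issuer_id}.{event_id}",
                "groupingInfo": grouping(1),
            }],
        },
    }
```

Updating the group later (adding or removing passes) then only requires patching each object’s groupingInfo through the REST API, as noted above.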


A note about Google Pay API for Passes:

Although we are introducing the Google Wallet API, all existing developer integrations with the previous Google Pay API for Passes will continue to work. When the Google Wallet app launches in just a few weeks, make sure to use the new “Add to Google Wallet” button described in the updated button guidelines.

We’re really excited to build a great digital wallet experience with you, and can’t wait to see how you use the Google Wallet API to enhance your user experience.

Learn more

Simpler Google Wallet integration for Android developers

Posted by Petra Cross, Engineer, Google Wallet and Jose Ugia, Google Developer Relations Engineer

Today more than ever, consumers expect to be able to digitize their physical wallet, from payments and loyalty to tickets and IDs. At Google I/O we announced Google Wallet, which allows users to do exactly that. Consumers can securely store and manage their payment and loyalty cards, board a flight, access a gym and much more, all with just their Android phone.

For Android developers who manage their own digital passes, Google Wallet offers a fast and secure entry point, especially when quick access is needed. Google Wallet will be quickly accessible from the device lock screen on Pixel devices and from the pull-down shade. Your users will be able to quickly access their passes when they need them - all in one place.

Integrating with Google Wallet has become even easier and more flexible. We’ve summarized what you can expect as an Android developer.

New Android SDK

The existing Android SDK supports saving three types of passes: offers, loyalty cards, and gift cards. You asked us to add support for other pass types, and we’ve heard you. Today, we are announcing a new, more extensible API and Android SDK that adds event tickets, boarding passes, transit tickets and more, including the new generic pass, which lets your users store any pass or card in Google Wallet. The Android SDK lets you create passes using JSON or a JSON Web Token as the payload, without a backend integration.

Using the Android SDK is straightforward. First, you create a payload with information about the pass. You can either build it directly in your Android app or retrieve it from your backend stack. Then, you call the savePasses or savePassesJwt method on the PayClient to add the pass to Google Wallet.

Here is how you define and save a sample generic pass object:

{
  "id": "ISSUER_ID.OBJECT_ID",
  "classId": "CLASS_ID",
  "genericType": "GENERIC_TYPE_UNSPECIFIED",
  "cardTitle": {
    "defaultValue": {
      "language": "en",
      "value": "Your Program Name"
    }
  },
  "header": {
    "defaultValue": {
      "language": "en",
      "value": "Alex McJacobs"
    }
  }
}


private val addToGoogleWalletRequestCode = 1000

private val walletClient: PayClient = Pay.getClient(application)

private val jwtString = "" // Fetch a previously created JWT with pass data

walletClient.savePassesJwt(jwtString, this, addToGoogleWalletRequestCode)

Once your app calls savePassesJwt, the process guides your users through the flow of adding a pass to Google Wallet, and allows them to preview the pass before confirming the save operation.
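The JWT string itself is typically produced by your backend. As a rough illustration of the claim set savePassesJwt expects, here is a small Python helper that assembles the unsigned payload for one generic object. The issuer e-mail and IDs are placeholders, and a production service would still need to sign these claims with the issuer’s key (RS256) before handing the compact JWT string to the app.

```python
# Sketch: assembling the claim set for a "save to Google Wallet" JWT.
# Identifiers are hypothetical placeholders; real claims must be signed
# (RS256) with the issuer service account's private key before use.
import json
import time

def build_save_claims(issuer_email: str, generic_objects: list) -> dict:
    """Return the unsigned JWT claim set carrying the passes to save."""
    return {
        "iss": issuer_email,
        "aud": "google",
        "typ": "savetowallet",
        "iat": int(time.time()),
        "origins": [],
        "payload": {"genericObjects": generic_objects},
    }

pass_object = {
    "id": "ISSUER_ID.OBJECT_ID",
    "classId": "CLASS_ID",
    "genericType": "GENERIC_TYPE_UNSPECIFIED",
    "cardTitle": {"defaultValue": {"language": "en", "value": "Your Program Name"}},
}

claims = build_save_claims("issuer@example.com", [pass_object])
print(json.dumps(claims, indent=2))
```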

Developer documentation, samples and codelabs

You can find the new Wallet API documentation on developers.google.com/wallet. We customized our developer guides for each pass type to make all the information easily accessible for your specific needs. You will also find plenty of code samples demonstrating how to check for availability of the Google Wallet API on the Android device, how to handle errors, and how to add the “Add to Google Wallet” button to your app.

Don’t forget to play with our interactive passes visual demo, which lets you fill in the fields and create your own custom pass prototype without writing a single line of code. The tool also generates code samples that you can use to build this pass’s data structures, which we call “classes” and “objects”.

We’re really excited to build a great digital wallet experience with you, and can’t wait to see how you use the Google Wallet API to enrich your customer experience. Take a look at our hands-on workshop "Digitize any wallet object with the Google Wallet API" to see a full integration tutorial on Android.

Learn more

Manage your passes from Google Pay and Wallet Console

Posted by Ryan Novas, Product Manager, Google Pay’s Business Console and Jose Ugia, Developer Relations Engineer, Google Pay


Today, we are introducing the Google Pay & Wallet Console, a platform that helps developers discover, integrate with, and manage Google Pay and Google Wallet features for their businesses. Integrating Google Pay and Google Wallet products has become easier and faster, with features like a common business profile and a unified dashboard. Check out the new Google Wallet section in the console’s left-hand navigation bar, where you can manage all your tickets, loyalty programs, offers and other passes resources from one place. Google Pay & Wallet Console features a more familiar and intuitive user interface that helps you reuse common bits of information, like your business information, and lets you easily browse and discover products, such as the Online API.

The new Google Wallet section in Google Pay & Wallet Console lets you request access to the API and manage your passes alongside other Google Pay and Google Wallet resources.


You can also manage authentication keys for your Smart Tap integration directly from the console, and let customers use eligible passes saved to Google Pay by simply holding their phones to NFC point-of-sale terminals.

Visit Google Pay & Wallet Console today, and start managing your existing products, or discover and integrate with new ones.

Here is what early users are saying about managing passes in the console:

“The cleaner and consistent look of Google Pay & Wallet Console helps us manage our Google Pay and Google Wallet resources more intuitively,” said Or Maoz, Senior Director of R&D at EngagedMedia.

The user management additions also helped EngagedMedia better represent their team in the console:

“The new user roles and controls on Google Pay & Wallet Console help us handle permissions more intuitively and accurately, and allow us to assign roles that better reflect our team structure more easily.”

We are committed to continuously evolving Google Pay & Wallet Console to make it your go-to place to discover and manage Google Pay and Google Wallet integrations. We’d love to hear about your experience. You can share feedback with us from the “Feedback” section in the console. We’re looking forward to learning how we can make Google Pay and Google Wallet even more helpful for you in the future.

Learn more

Want to learn more?

Make the world your canvas with the ARCore Geospatial API

Posted by Bilawal Sidhu, Senior Product Manager, Google Maps and Eric Lai, Group Product Manager, ARCore

ARCore, our AR developer platform, works across billions of devices, providing developers with simple yet powerful tools to build immersive experiences that seamlessly blend the digital and physical worlds.

In 2019, we launched the ARCore Cloud Anchors API for developers to anchor content to specific locations and design experiences that can be shared over time by multiple people across many different devices. Since then, we’ve been listening to developer feedback on how to make it easier to create and deploy AR experiences at scale.

Today, we’re taking a leap forward by launching the ARCore Geospatial API in ARCore SDKs for Android and iOS across all compatible ARCore-enabled devices. This API is available now at no cost to download and opens up nearly 15 years of our understanding of the world through Google Maps to help developers build AR experiences that are more immersive, richer and more useful.

The Geospatial API provides access to global localization — the same technology that has been powering Live View in Google Maps since 2019, providing people with helpful AR-powered arrows and turn-by-turn directions. Built on the Visual Positioning Service (VPS) with tens of billions of images in Street View, it lets developers anchor content by latitude, longitude and altitude in over 87 countries, without being there or having to scan the physical space, saving significant time and resources.

Using machine learning to compute a 3D point-cloud of the environment from Google Street View imagery

For end users, discovering and interacting with AR is faster and more accurate as images from the scanned environment are instantaneously matched against our model of the world. This model is built using advanced machine-learning techniques, which extract trillions of 3D points from Street View images that are then used to compute the device position and orientation in less than a second. In other words, users can be anywhere Street View is available, and just by pointing their camera, their device understands exactly where it is, which way it is pointed and where the AR content should appear, almost immediately.

We’ve been working with early access partners like the NBA, Snap, Lyft, and more to explore and build applications for different industries, including education, entertainment and utility. For example, micromobility companies Bird, Lime and WeMo are using the API to remove friction from parking e-scooters and e-bikes, adding pinpoint accuracy so riders know exactly when their vehicle is in a valid parking spot. Lime has been piloting the experience in London, Paris, Tel Aviv, Bordeaux, Madrid, and San Diego.

Bird (left) and Lime (right) use the ARCore Geospatial API to enable more precise location-based AR experiences

Telstra and Accenture are using the API to help sports fans and concertgoers find their seats, concession stands and restrooms at Marvel Stadium in Melbourne, Australia. DOCOMO and Curiosity are building a new game that lets you fend off virtual dragons with robot-companions in front of iconic Tokyo landmarks.


Telstra and Accenture (left) and DOCOMO (right) use the ARCore Geospatial API to create new, entertaining AR experiences

To help you get started, we’re also releasing two open source demo apps to clone and extend into your own applications. Balloon Pop lets people place and use balloons as targets around the world, together and at the same time. Pocket Garden lets you adorn your neighborhood with a colorful AR community garden.


Balloon Pop (left) and Pocket Garden (right) are open source demo apps that showcase the ARCore Geospatial API

With the introduction of the ARCore Geospatial API we're providing the foundation for building world scale AR experiences. Get started today at g.co/ARCore. We’re excited to see what you create when the world is your canvas!

13 Things to know for Android developers at Google I/O!

Posted by Maru Ahues Bouza, Director of Android Developer Relations

Android I/O updates: Jetpack, Wear OS, etc 

There aren’t many platforms where you can build something and instantly reach billions of people around the world, not only on their phones, but on their TVs, cars, tablets, watches, and more. Today, at Google I/O, we covered a number of ways Android helps you make the most of this opportunity, and how Modern Android Development brings as much commonality as possible to make it faster and easier for you to create experiences tailored to all the different screens we use in our daily lives.

We’ve rounded up the top 13 things to know for Android developers—from Jetpack Compose to tablets to Wear OS and of course… Android 13! And stick around for Day 2 of Google I/O, when Android’s full track of 26 technical talks and 4 workshops drop. We’re also bringing back the Android fireside Q&A in another episode of #TheAndroidShow; tweet us your questions now using #AskAndroid, and we’ve assembled a team of experts to answer live on-air, May 12 at 12:30PM PT.


MODERN ANDROID DEVELOPMENT

#1: Jetpack Compose 1.2 Beta, with support for more advanced use cases

Android’s modern UI toolkit, Jetpack Compose, continues to bring the APIs you need to support more advanced use cases like downloadable fonts, LazyGrids, window insets and nested scrolling interop, along with more tooling support through features like Live Edit, Recomposition Debugging and Animation Preview. Check out the blog post for more details.

Jetpack Compose 1.2 Beta  

#2: Android Studio: introducing Live Edit

Get more done faster with Android Studio Dolphin Beta and Electric Eel Canary! Android Studio Dolphin includes new features and improvements for Jetpack Compose and Wear OS development and an updated Logcat experience. Android Studio Electric Eel comes with integrations with the new Google Play SDK Index and Firebase Crashlytics. It also offers a new resizable emulator to test your app on large screens and the new Live Edit feature to immediately deploy code changes made within composable functions. Watch the What’s new in Android Development Tools session and read the Android Studio I/O blog post here.

#3: Baseline Profiles - speed up your app load time!

The speed of your app right after installation can make a big difference on user retention. To improve that experience, we created Baseline Profiles. Baseline Profiles allow apps and libraries to provide the Android runtime with metadata about code path usage, which it uses to prioritize ahead-of-time compilation. We've seen up to 30% faster app startup times thanks to adding baseline profiles alone, no other code changes required! We’re already using baseline profiles within Jetpack: we’ve added baselines to popular libraries like Fragments and Compose – to help provide a better end-user experience. Watch the What’s new in app performance talk, and read the Jetpack blog post here.
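Under the hood, a baseline profile ships as a plain-text file of rules bundled with your library or app. The fragment below is purely illustrative (the class and method names are hypothetical): an L rule pre-loads a class, while the H, S and P flags mark a method as hot, used at startup, or used post-startup, so the Android runtime can compile those paths ahead of time.

```
# Hypothetical baseline profile rules
Lcom/example/app/MainActivity;
HSPLcom/example/app/MainActivity;->onCreate(Landroid/os/Bundle;)V
```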

Modern Android Development 

BETTER TOGETHER

#4: Going big on Android tablets

Google is all in on tablets. Since last I/O we launched Android 12L, a release focused on large screen optimizations, and Android 13 includes all those improvements and more. We also announced the Pixel tablet, coming next year. With amazing new hardware, an updated operating system & Google apps, improved guidelines and libraries, and exciting changes to the Play store, there has never been a better time to review your apps and get them ready for large screens and Android 13. That’s why at this year’s I/O we have four talks and a workshop to take you from design to implementation for large screens.


#5: Wear OS: Compose + more!

With the latest updates to Wear OS, you can rethink what is possible when developing for wearables. Jetpack Compose for Wear OS is now in beta, so you can create beautiful Wear OS apps with fewer lines of code. Health Services is also now in beta, bringing a ton of innovation to the health and fitness developer community. And last, but certainly not least, we announced the launch of The Google Pixel Watch - coming this Fall - which brings together the best of Fitbit and Wear OS. You can learn more about all the most exciting updates for wearables by watching the Wear OS technical session and reading our Jetpack Compose for Wear OS announcement.

Compose for Wear OS 

#6: Introducing Health Connect

Health Connect is a new platform built in close collaboration between Google and Samsung, that simplifies connectivity between apps making it easier to reach more users with less work, so you can securely access and share user health and fitness data across apps and devices. Today, we’re opening up access to Health Connect through Jetpack Health—read our announcement or watch the I/O session to find out more!

#7: Android for Cars & Android TV OS

Android for Cars and Android TV OS continue to grow in the US and abroad. As more users drive connected cars or tune in, we’re introducing new features to make it even easier to develop apps for cars and TV this year. Catch the “What’s new with Android for Cars” and “What's new with Google TV and Android TV” sessions on Day 2 (May 12th) at 9:00 AM PT to learn more.

#8: Add Voice Across Devices

We’re making it easier for users to access your apps via voice across devices with Google Assistant, by expanding developer access to the Shortcuts API for Android for Cars, with support for Wear OS apps coming later this year. We’re also making it easier to build those experiences with Smarter Custom Intents, enabling Assistant to better detect broader instances of user queries through ML, without any heavy NLU-training lift. Additionally, we’re introducing improvements that drive discovery of your apps via voice on mobile: first through Brandless Queries, which drive app usage even when the user hasn’t explicitly said your app’s name, and through App Install Suggestions, which appear if your app isn’t installed yet – both are automatically enabled for existing App Actions today.


AND THE LATEST FROM ANDROID, PLAY, AND MORE:

#9: What’s new in Play!

Get the latest updates from Google Play, including new ways Play can help you grow your business. Highlights include the ability to deep-link and create up to 50 custom listings; our LiveOps beta, which will allow more developers to submit content to be considered for featuring on the Play Store; and even more flexibility in selling subscriptions. Learn about these updates and more in our blog post.

#10: Google Play SDK Index

Evaluate if an SDK is right for your app with the new Google Play SDK index. This new public portal lists over 100 of the most widely used commercial SDKs and information like which app permissions the SDK requests, statistics on the apps that use them, and which version of the SDK is most popular. Learn more on our blog post and watch “What’s new in Google Play” and “What’s new in Android development tools” sessions.

#11: Privacy Sandbox on Android

Privacy Sandbox on Android provides a path for new advertising solutions to improve user privacy without putting access to free content and services at risk. We recently released the first Privacy Sandbox on Android Developer Preview so you can get an early look at the SDK Runtime and Topics API. You can conduct preliminary testing of these new technologies, evaluate how you might adopt them for your solutions, and share feedback with us.

#12: The new Google Wallet API

The new Google Wallet gives users fast and secure access to everyday essentials across Android and Wear OS. We’re enhancing the Google Wallet API, previously called Google Pay Passes API, to support generic passes, grouping and mixing passes together, for example grouping an event ticket with a voucher, and launching a new Android SDK which allows you to save passes directly from your app without a backend integration. To learn more, read the full blog post, watch the session, or read the docs at developers.google.com/wallet.

#13: And of course, Android 13!

The second Beta of Android 13 is available today! Get your apps ready for the latest features for privacy and security, like the new notification permission, the privacy-protecting photo picker, and improved permissions for pairing with nearby devices and accessing media files. Enhance your app with features like app-specific language support and themed app icons. Build with modern standards like HDR video and Bluetooth LE Audio. You can get started by enrolling your Pixel device here, or try Android 13 Beta on select phones, tablets, and foldables from our partners - visit developer.android.com/13 to learn more.

That’s just a snapshot of some of the highlights for Android developers at this year’s Google I/O. Be sure to watch the What’s New in Android talk to get the landscape on the full Android technical track at Google I/O, which includes 26 talks and 4 workshops. Enjoy!