Tag Archives: VR

Updates to the Android XR SDK: Introducing Developer Preview 2

Posted by Matthew McCullough – VP of Product Management, Android Developer

Since launching the Android XR SDK Developer Preview alongside Samsung, Qualcomm, and Unity last year, we’ve been blown away by all of the excitement we’ve been hearing from the broader Android community. Whether it's through coding live-streams or local Google Developer Group talks, it's been an outstanding experience participating in the community to build the future of XR together, and we're just getting started.

Today we’re excited to share an update to the Android XR SDK: Developer Preview 2, packed with new features and improvements to help you develop helpful and delightful immersive experiences with familiar Android APIs, tools and open standards created for XR.

At Google I/O, we have two technical sessions related to Android XR. The first, Building differentiated apps for Android XR with 3D content, covers many of the features in Jetpack SceneCore and ARCore for Jetpack XR. The second, The future is now, with Compose and AI on Android XR, covers creating XR-differentiated UI and our vision for the intersection of XR and cutting-edge AI capabilities.

Android XR sessions at Google I/O 2025

What’s new in Developer Preview 2

Since the release of Developer Preview 1, we’ve been focused on making the APIs easier to use and adding new immersive Android XR features. Your feedback has helped us shape the development of the tools, SDKs, and the platform itself.

With the Jetpack XR SDK, you can now play back 180° and 360° videos, which can be stereoscopic by encoding with the MV-HEVC specification or by encoding view-frames adjacently. The MV-HEVC standard is optimized and designed for stereoscopic video, allowing your app to efficiently play back immersive videos at great quality. Apps built with Jetpack Compose for XR can use the SpatialExternalSurface composable to render media, including stereoscopic videos.
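
For a sense of what that looks like in code, here is a minimal sketch of stereoscopic playback using Compose for XR together with Media3 ExoPlayer. Treat the exact SpatialExternalSurface parameters below (the stereoMode argument and the onSurfaceCreated callback) and the placeholder video URL as assumptions based on the description above, not the definitive API; check the Jetpack XR reference for the current surface.

    // Sketch only: the StereoMode values and onSurfaceCreated callback are assumed
    // from the description above; consult the Jetpack Compose for XR reference.
    import androidx.compose.runtime.Composable
    import androidx.media3.common.MediaItem
    import androidx.media3.exoplayer.ExoPlayer

    @Composable
    fun ImmersiveVideoPlayer(player: ExoPlayer) {
        Subspace {
            SpatialExternalSurface(
                stereoMode = StereoMode.TopBottom  // view-frames encoded adjacently (top/bottom)
            ) {
                onSurfaceCreated { surface ->
                    // Route the decoded video (e.g. MV-HEVC content) onto the spatial surface
                    player.setVideoSurface(surface)
                    player.setMediaItem(MediaItem.fromUri("https://example.com/immersive_360.mp4"))
                    player.prepare()
                    player.play()
                }
            }
        }
    }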

Using Jetpack Compose for XR, you can now also define layouts that adapt to different XR display configurations. For example, use a SubspaceModifier to specify the size of a Subspace as a percentage of the device’s recommended viewing size, so a panel effortlessly fills the space it's positioned in.
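
As a rough sketch of the idea, a panel sized relative to the recommended viewing size might look like the snippet below; the fraction-based fillMaxWidth/fillMaxHeight modifiers are assumptions drawn from the description above, and MyAppContent is a hypothetical stand-in for your own UI.

    // Sketch only: fraction-based sizing is assumed, not confirmed against the API.
    import androidx.compose.runtime.Composable

    @Composable
    fun AdaptivePanel() {
        Subspace {
            SpatialPanel(
                modifier = SubspaceModifier
                    .fillMaxWidth(0.8f)   // 80% of the recommended viewing width (assumed)
                    .fillMaxHeight(0.5f)  // 50% of the recommended viewing height (assumed)
            ) {
                MyAppContent()            // hypothetical: your existing 2D Compose UI
            }
        }
    }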

Material Design for XR now supports more component overrides for TopAppBar, AlertDialog, and ListDetailPaneScaffold, helping your large-screen enabled apps that use Material Design effortlessly adapt to the new world of XR.

An app adapts to XR using Material Design for XR with the new component overrides

In ARCore for Jetpack XR, you can now track hands after requesting the appropriate permissions. Hands are a collection of 26 posed hand joints that can be used to detect hand gestures and bring a whole new level of interaction to your Android XR apps:

Hands bring a natural input method to your Android XR experience.
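
A hedged sketch of what consuming that data could look like is below; the Hand.left entry point, the state flow, the HandJointType names, and the XR runtime Session type are assumptions based on the description above, so confirm them against the ARCore for Jetpack XR reference.

    // Sketch only: entry points and joint names below are assumed, not confirmed.
    // Session is the XR runtime session type obtained by your activity (assumed).
    suspend fun observeLeftHand(session: Session) {
        // Null if hand tracking is unsupported or the permission was not granted
        val leftHand = Hand.left(session) ?: return
        leftHand.state.collect { state ->
            // One pose per joint, 26 joints in total (wrist, palm, finger joints, tips)
            val indexTip = state.handJoints[HandJointType.INDEX_TIP] ?: return@collect
            // Use joint poses to drive gesture detection or spatial UI interaction
            println("Index fingertip at ${indexTip.translation}")
        }
    }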

For more guidance on developing apps for Android XR, check out our Android XR Fundamentals codelab, the updates to our Hello Android XR sample project, and a new version of JetStream with Android XR support.

The Android XR Emulator has also received stability updates, added support for AMD GPUs, and is now fully integrated into the Android Studio UI.

The Android XR Emulator is now integrated in Android Studio

Developers using Unity have already successfully created and ported existing games and apps to Android XR. Today, you can upgrade to the Pre-Release version 2 of the Unity OpenXR: Android XR package! This update adds many performance improvements such as support for Dynamic Refresh Rate, which optimizes your app’s performance and power consumption. Shaders made with Shader Graph now support SpaceWarp, making it easier to use SpaceWarp to reduce compute load on the device. Hand meshes are now exposed with occlusion, which enables realistic hand visualization.

Check out Unity’s improved Mixed Reality template for Android XR, which now includes support for occlusion and persistent anchors.

We recently launched Android XR Samples for Unity, which demonstrate capabilities on the Android XR platform such as hand tracking, plane tracking, face tracking, and passthrough.

Google’s open-source Unity samples demonstrate platform features and show how they’re implemented

Firebase AI Logic for Unity is now in public preview! This makes it easy for you to integrate gen AI into your apps, enabling the creation of AI-powered experiences with Gemini and Android XR. Firebase AI Logic fully supports Gemini's capabilities, including multimodal input and output, and bi-directional streaming for immersive conversational interfaces. Built with production readiness in mind, Firebase AI Logic is integrated with core Firebase services like App Check, Remote Config, and Cloud Storage for enhanced security, configurability, and data management. Learn more about this on the Firebase blog or go straight to the Gemini API using Vertex AI in Firebase SDK documentation to get started.

Continuing to build the future together

Our commitment to open standards continues with the glTF Interactivity specification, developed in collaboration with the Khronos Group, which will be supported in glTF models rendered by Jetpack XR later this year. Models using the glTF Interactivity specification are self-contained interactive assets that can have many pre-programmed behaviors, like rotating objects on a button press or changing the color of a material over time.

Android XR will be available first on Samsung’s Project Moohan, launching later this year. Soon after, our partners at XREAL will release the next Android XR device. Codenamed Project Aura, it’s a portable and tethered device that gives users access to their favorite Android apps, including those that have been built for XR. It will launch as a developer edition, specifically for you to begin creating and experimenting. The best news? With the familiar tools you use to build Android apps today, you can build for these devices too.

XREAL’s Project Aura

The Google Play Store is also getting ready for Android XR. It will list supported 2D Android apps on the Android XR Play Store when it launches later this year. If you are working on an Android XR differentiated app, you can get it ready for the big launch and be one of the first differentiated apps on the Android XR Play Store.

And we know many of you are excited for the future of Android XR on glasses. We are shaping the developer experience now and will share more details on how you can participate later this year.

To get started creating and developing for Android XR, check out developer.android.com/develop/xr where you will find all of the tools, libraries, and resources you need to work with the Android XR SDK. In particular, try out our samples and codelabs.

We welcome your feedback, suggestions, and ideas as you’re helping shape Android XR. Your passion, expertise, and bold ideas are vital as we continue to develop Android XR together. We look forward to seeing your XR-differentiated apps when Android XR devices launch later this year!

Explore this announcement and all Google I/O 2025 updates on io.google starting May 22.


Introducing Android XR SDK Developer Preview

Posted by Matthew McCullough – VP of Product Management, Android Developer

Today, we're launching the developer preview of the Android XR SDK - a comprehensive development kit for Android XR. It's the newest platform in the Android family built for extended reality (XR) headsets (and glasses in the future!). You’ll have endless opportunities to create and develop experiences that blend digital and physical worlds, using familiar Android APIs, tools and open standards created for XR. All of this means: if you build for Android, you're already building for XR! Read on to get started with development for headsets.

With the Android XR SDK you can:

    • Break free of traditional screens by spatializing your app with rich 3D elements, spatial panels, and spatial audio that bring a natural sense of depth, scale, and tangible realism
    • Transport your users to a fantastical virtual space, or engage with them in their own homes or workplaces
    • Take advantage of natural, multimodal interaction capabilities such as hands and eyes
"We believe Android XR is a game-changer for storytelling. It allows us to merge narrative depth with advanced interactive features, creating an immersive world where audiences can engage with characters and stories like never before." 
- Jed Weintrob, Partner at 30 Ninjas

Your apps on Android XR

The Android XR SDK is built on the existing foundations of Android app development. We're also bringing the Play Store to Android XR, where most Android apps will automatically be made available without any additional development effort. Users will be able to discover and use your existing apps in a whole new dimension. To differentiate your existing Compose app, you can opt in to automatically spatialize Material Design (M3) components and Compose adaptive layouts in XR.

Apps optimized for large screens take advantage of sizing capabilities in Android XR

The Android XR SDK has something for every developer:

    • Building with Kotlin and Android Studio? You'll feel right at home with the Jetpack XR SDK, a suite of familiar libraries and tools to simplify development and accelerate productivity.
    • Using Unity’s real-time 3D engine? The Android XR Extensions for Unity provides the packages you need to build or port powerful, immersive experiences.
    • Developing on the web? Use WebXR to add immersive experiences supported on Chrome.
    • Working with native languages like C/C++? Android XR supports the OpenXR 1.1 standard.

Creating with Jetpack XR SDK

The Jetpack XR SDK includes new Jetpack libraries purpose-built for XR. The highlights include:

    • Jetpack Compose for XR - enables you to declaratively create spatial UI layouts and spatialize your existing 2D UI built with Compose or Views
    • ARCore for Jetpack XR - brings powerful perception capabilities for your app to understand the real world
“With Android XR, we can bring Calm directly into your world, capturing the senses and allowing you to experience it in a deeper and more transformative way. By collaborating closely with the Android XR team on this cutting-edge technology, we’ve reimagined how to create a sense of depth and space, resulting in a level of immersion that instantly helps you feel more present, focused, and relaxed.” 
- Dan Szeto, Vice President at Calm Studios

Kickstart your Jetpack XR SDK journey with the Hello XR Sample, a straightforward introduction to the essential features of Jetpack Compose for XR.
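
In the spirit of that sample, a minimal spatial panel in Compose for XR looks roughly like the sketch below; this is an illustration rather than the sample's exact code, and the Orbiter placement API shown is an assumption worth verifying in the library reference.

    // Illustrative sketch, not the Hello XR sample itself.
    import androidx.compose.material3.Text
    import androidx.compose.runtime.Composable
    import androidx.compose.ui.unit.dp

    @Composable
    fun HelloSpatialUi() {
        Subspace {                                          // container for spatial elements
            SpatialPanel(
                modifier = SubspaceModifier.width(1024.dp).height(640.dp)
            ) {
                Orbiter(position = OrbiterEdge.Bottom) {    // assumed API: floats controls below the panel
                    Text("Navigation controls")
                }
                // Any existing 2D Compose UI can be hosted inside the panel
                Text("Hello, Android XR!")
            }
        }
    }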

Learn more about developing with the Jetpack XR SDK.

The JetNews sample app is an Android large-screen app adapted for Android XR

We're also introducing new tools and capabilities to the latest preview of Android Studio Meerkat to boost productivity and simplify your creation process for Android XR.

    • Use the new Android XR Emulator to create a virtualized XR device for deploying and testing apps built with the Jetpack XR SDK. The emulator includes XR-specific controls for using a keyboard and mouse to navigate an emulated virtual space.
    • Use the Android XR template to get a jump-start on creating an app with Jetpack Compose for XR.
    • Use the updated Layout Inspector to inspect and debug spatialized UI components created with Jetpack Compose for XR.

Learn more about the XR enabled tools in Android Studio and the Android XR Emulator.

The Android XR Emulator in Android Studio has new controls to explore 3D space within the emulator

Creating with Unity

We've partnered with Unity to natively integrate their real-time 3D engine with Android XR starting with Unity 6. Unity is introducing the Unity OpenXR: Android XR package for bringing your multi-platform XR experiences to Android XR.

Unity is also adding Android XR support to its popular XR packages.

We're also rolling out the Android XR Extensions for Unity with samples and innovative features such as mouse interaction profile, environment blend mode, personalized hand mesh, object tracking, and more.

"Having already brought Demeo to most commercially available platforms, it's safe to say we were impressed with the process of adapting the game to run on Android XR." 
– Johan Gastrin, CTO at Resolution Games

Check out our getting started guide for Unity and Unity’s blog post to learn more.

Vacation Simulator has been updated to Unity 6 and supports Android XR

Creating for the Web

Chrome on Android XR supports the WebXR standard. If you're building for the web, you can enhance existing sites with 3D content or build new immersive experiences. You can also use full-featured frameworks like three.js, A-Frame, or PlayCanvas to create virtual worlds, or you can use a simpler API like model-viewer so your users can visualize products in an e-commerce site. And because WebXR is an open standard, the same experiences you build for mobile AR devices or dedicated VR hardware seamlessly work on Android XR.

Learn more about developing with WebXR.

Moving image demonstrating virtual objects interacting with real world surfaces in Chrome on Android XR
Chrome on Android XR supports WebXR features including depth maps allowing virtual objects to interact with real world surfaces

Built on Open Standards

We’re continuing the Android tradition of building with open standards. At the heart of the Android perception stack is OpenXR - a high-performance, cross-platform API focused on portability. Android XR is compliant with OpenXR 1.1, and we’re also expanding the OpenXR standard with leading-edge vendor extensions to introduce powerful world-sensing capabilities such as:

    • AI-powered hand mesh, designed to adapt to the shape and size of hands to better represent the diversity of your users
    • Sophisticated light estimation, for lighting your digital content to match real-world lighting conditions
    • New trackables that let you bring real world objects like laptops, phones, keyboards, and mice into a virtual environment

The Android XR SDK also supports open standard formats such as glTF 2.0 for 3D models and OpenEXR for high-dynamic-range environments.

Building the future together

We couldn't be more proud or excited to announce the Developer Preview of the Android XR SDK. We’re releasing this developer preview because we want to build the future of XR together with you. We welcome your feedback and can’t wait to work with you and build your ideas and suggestions into the platform. Your passion, expertise, and bold ideas are absolutely essential as we continue to build Android XR.

We look forward to interacting with your apps, reimagined to take advantage of the unique spatial capabilities of Android XR, using familiar tools like Android Studio and Jetpack Compose. We’re eager to visit the amazing 3D worlds you build using powerful tools and open standards like Unity and OpenXR. Most of all, we can’t wait to go on this journey with all of you that make up the amazing community of Android and Unity developers.

To get started creating and developing for Android XR, check out developer.android.com/develop/xr where you will find all of the tools, libraries and resources you need to create with the Android XR SDK! If you are interested in getting access to prerelease hardware and collaborating with the Android XR team, express your interest to participate in an Android XR Developer Bootcamp in 2025 by filling out this form.

Google Cardboard XR Plugin for Unity

Late in 2019, we decided to open source Google Cardboard. Since then, our developer community has created a plethora of experiences on both iOS and Android, reaching millions of users around the world. While that release has been a success, we also promised a plugin for Unity. Our users have long preferred developing Cardboard experiences in Unity, so we made it a priority to develop a Unity SDK. Today, we have fulfilled that promise: the Google Cardboard open source plugin for Unity is now available via the Unity Asset Store.

What's Included in the Cardboard Unity SDK

Today, we’re releasing the Cardboard Unity SDK to our users so that they can continue creating smartphone XR experiences using Unity. Unity is one of the most popular 3D and XR development platforms in the world, and our release of this SDK will give our content creators a smoother workflow with Unity when developing for Cardboard.

In addition to the Unity SDK, we are also providing a sample application for iOS/Android, which will be a great aid for developers trying to debug their own creations. This release not only fulfills a promise we made to our Cardboard community, but also shows our support, as we move away from smartphone VR and leave it in the more-than-capable hands of our development community.



If you’re interested in learning how to develop with the Cardboard open source project, please see our developer documentation or visit the Google VR GitHub repo to access source code, build the project, and download the latest release.

By Jonathan Goodlow, Product Manager, AR & VR

Google and Binomial partner to open source high quality Basis Universal

Today, Google and Binomial are excited to announce the high quality update to the original Basis Universal release.

Basis Universal allows you to have state of the art web performance with your images, keeping images compressed even on the GPU. Older systems like JPEG and PNG may look small in storage size, but once they hit the GPU they are processed as uncompressed data! The original Basis Universal codec created images that were 6-8 times smaller than JPEG on the GPU while maintaining a similar storage size.

Today we release a high quality Basis Universal codec that utilizes the highest quality formats modern GPUs support, finally bringing the web up to modern GPU texture standards—with cross platform support. The textures are larger in storage size and GPU compressed size, but are still 3-4 times smaller than sending a JPEG or PNG file to be processed on the GPU, and can transcode to a lower quality format for older GPUs.
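
To put rough numbers on those ratios (an illustrative calculation, not figures from the release): a 2048×2048 JPEG or PNG must be expanded to uncompressed 32-bit RGBA on the GPU, which is about 16 MB before mipmaps. Transcoded to a 4-bits-per-pixel GPU format such as ETC1 or BC1, the same texture occupies roughly 2 MB, which is where the original codec's 6-8x savings come from; transcoded to an 8-bits-per-pixel format such as BC7 or ASTC 4x4, it occupies roughly 4 MB, in line with the 3-4x savings of the new high quality mode.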
Original Image by Erol Ahmed from Unsplash.com
Visual comparison of Basis Universal High Quality

Best of all, we are actively working on standardizing Basis Universal with the Khronos Group.

Since our original release in Summer 2019 we’ve seen widespread adoption of Basis Universal in engines like three.js, Babylon.js, Godot, and more, changing what is possible for people to create on the web. Now that a high quality option is available, we expect to see even more adoption and groundbreaking applications created with it.

Please feel free to join our community on GitHub and check out the full demo there as well. You can also follow standardization efforts via Khronos Group events and forums.

By Stephanie Hurlburt, Binomial and Jamieson Brettle, Chrome Media

New experimental features for Daydream

Posted by Jonathan Huang, Senior Product Manager, Google AR/VR

Since we first launched Daydream, developers have responded by creating virtual reality (VR) experiences that are entertaining, educational and useful. Today, we're announcing a new set of experimental features for developers to use on the Lenovo Mirage Solo—our standalone Daydream headset—to continue to push the platform forward. Here's what's coming:

Experimental 6DoF Controllers

First, we're adding APIs to support positional controller tracking with six degrees of freedom—or 6DoF—to the Mirage Solo. With 6DoF tracking, you can move your hands more naturally in VR, just like you would in the physical world. To date, this type of experience has been limited to expensive PC-based VR with external tracking.

We've also created experimental 6DoF controllers that use a unique optical tracking system to help developers start building with 6DoF features on the Mirage Solo. Instead of using expensive external cameras and sensors that have to be carefully calibrated, our system uses machine learning and off-the-shelf parts to accurately estimate the 3D position and orientation of the controllers. We're excited about this approach because it can reduce the need for expensive hardware and make 6DoF experiences more accessible to more people.

We've already put these experimental controllers in the hands of a few developers and we're excited for more developers to start testing them soon.

Experimental 6DoF controllers

See-Through Mode

We're also introducing what we call see-through mode, which gives you the ability to see what's right in front of you in the physical world while you're wearing your VR headset.

See-through mode takes advantage of our WorldSense technology, which was built to provide accurate, low latency tracking. And, because the tracking cameras on the Mirage Solo are positioned at approximately eye-distance apart, you also get precise depth perception. The result is a see-through mode good enough to let you play ping pong with your headset on.

Playing ping pong with see-through mode on the Mirage Solo.

The combination of see-through mode and the Mirage Solo's tracking technology also opens up the door for developers to blend the digital and physical worlds in new ways by building Augmented Reality (AR) prototypes. Imagine, for example, an interior designer being able to plan a new layout for a room by adding virtual chairs, tables and decorations on top of the actual space.

Experimental app using objects from Poly, see-through mode and 6DoF Controllers to design a space in our office.

Smartphone Android Apps in VR

Finally, we're introducing the capability to open any smartphone Android app on your Daydream device, so you can use your favorite games, tools and more in VR. For example, you can play the popular indie game Mini Metro on a virtual big screen, so you have more space to view and plan your own intricate public transit system.

Playing Mini Metro on a virtual big screen in VR.

With support for Android Apps in VR, developers will be able to add Daydream VR support to their existing 2D applications without having to start from scratch. The Chrome team re-used the existing 2D interfaces for Chrome Browser Sync, settings and more to provide a feature-rich browsing experience in Daydream.

The Chrome app on Daydream uses the 2D settings within VR.

Try These Features

We've really loved building with these tools and can't wait to see what you do with them. See-through mode and Android Apps in VR will be available for all developers to try soon.

If you're a developer in the U.S., click here to learn more and apply now for an experimental 6DoF controller developer kit.

Open sourcing Seurat: bringing high-fidelity scenes to mobile VR

Crossposted from the Google Developers Blog

Great VR experiences make you feel like you’re really somewhere else. To create deeply immersive experiences, there are a lot of factors that need to come together: amazing graphics, spatialized audio, and the ability to move around and feel like the world is responding to you.

Last year at I/O, we announced Seurat as a powerful tool to help developers and creators bring high-fidelity graphics to standalone VR headsets with full positional tracking, like the Lenovo Mirage Solo with Daydream. Seurat is a scene simplification technology designed to process very complex 3D scenes into a representation that renders efficiently on mobile hardware. Here’s how ILMxLAB was able to use Seurat to bring an incredibly detailed ‘Rogue One: A Star Wars Story’ scene to a standalone VR experience.

Today, we’re open sourcing Seurat to the developer community. You can now use Seurat to bring visually stunning scenes to your own VR applications and have the flexibility to customize the tool for your own workflows.

Behind the scenes: how Seurat works

Seurat works by taking advantage of the fact that VR scenes are typically viewed from within a limited viewing region, and leverages this to optimize the geometry and textures in your scene. It takes RGBD images (color and depth) as input and generates a textured mesh, targeting a configurable number of triangles, texture size, and fill rate, to simplify scenes beyond what traditional methods can achieve.


To demonstrate what Seurat can do, here’s a snippet from Blade Runner: Revelations, which launched today with the Lenovo Mirage Solo.

Blade Runner: Revelations by Alcon Interactive and Seismic Games
The Blade Runner universe is known for its stunning worlds, and in Revelations, you get to unravel a mystery around fugitive Replicants in the futuristic but gritty streets. To create the look and feel for Revelations, Seismic used Seurat to bring a scene of 46.6 million triangles down to only 307,000, improving performance by more than 100x with almost no loss in visual quality:

Original scene and Seurat-processed scene (comparison images)

If you’re interested in learning more about Seurat or trying it out yourself, visit the Seurat GitHub page to access the documentation and source code. We’re looking forward to seeing what you build!

By Manfred Ernst, Software Engineer

Open Sourcing Resonance Audio

Posted by Eric Mauskopf, Product Manager

Spatial audio adds to your sense of presence when you're in VR or AR, making it feel and sound like you're surrounded by a virtual or augmented world. And regardless of the display hardware you're using, spatial audio makes it possible to hear sounds coming from all around you.

Resonance Audio, our spatial audio SDK launched last year, enables developers to create more realistic VR and AR experiences on mobile and desktop. We've seen a number of exciting experiences emerge across a variety of platforms using our SDK. Recent examples include apps like Pixar's Coco VR for Gear VR, Disney's Star Wars™: Jedi Challenges AR app for Android and iOS, and Runaway's Flutter VR for Daydream, which all used Resonance Audio technology.

To accelerate adoption of immersive audio technology and strengthen the developer community around it, we’re opening Resonance Audio to a community-driven development model. By creating an open source spatial audio project optimized for mobile and desktop computing, any platform or software development tool provider can easily integrate with Resonance Audio. More cross-platform and tooling support means more distribution opportunities for content creators, without the worry of investing in costly porting projects.

What's Included in the Open Source Project

As part of our open source project, we're providing a reference implementation of YouTube's Ambisonic-based spatial audio decoder, compatible with the same Ambisonics format (Ambix ACN/SN3D) used by others in the industry. Using our reference implementation, developers can easily render Ambisonic content in their VR media and other applications, while benefiting from Ambisonics' open source, royalty-free model. The project also includes encoding, sound field manipulation and decoding techniques, as well as head related transfer functions (HRTFs) that we've used to achieve rich spatial audio that scales across a wide spectrum of device types and platforms. Lastly, we're making our entire library of highly optimized DSP classes and functions open to all. This includes resamplers, convolvers, filters, delay lines and other DSP capabilities. Additionally, developers can now use Resonance Audio's brand new Spectral Reverb, an efficient, high quality, constant complexity reverb effect, in their own projects.

We've open sourced Resonance Audio as a standalone library and associated engine plugins, VST plugin, tutorials, and examples with the Apache 2.0 license. Most importantly, this means Resonance Audio is yours, so you're free to use Resonance Audio in your projects, no matter where you work. And if you see something you'd like to improve, submit a GitHub pull request to be reviewed by the Resonance Audio project committers. While the engine plugins for Unity, Unreal, FMOD, and Wwise will remain open source, going forward they will be maintained by project committers from our partners, Unity, Epic, Firelight Technologies, and Audiokinetic, respectively.

If you're interested in learning more about Resonance Audio, check out the documentation on our developer site. If you want to get more involved, visit our GitHub to access the source code, build the project, download the latest release, or even start contributing. We're looking forward to building the future of immersive audio with all of you.


Announcing ARCore 1.0 and new updates to Google Lens

Posted by Anuj Gosalia, Director of Engineering, AR

With ARCore and Google Lens, we're working to make smartphone cameras smarter. ARCore enables developers to build apps that can understand your environment and place objects and information in it. Google Lens uses your camera to help make sense of what you see, whether that's automatically creating contact information from a business card before you lose it, or soon being able to identify the breed of a cute dog you saw in the park. At Mobile World Congress, we're launching ARCore 1.0 along with new support for developers, and we're releasing updates for Lens and rolling it out to more people.

ARCore, Google's augmented reality SDK for Android, is out of preview and launching as version 1.0. Developers can now publish AR apps to the Play Store, and it's a great time to start building. ARCore works on 100 million Android smartphones, and advanced AR capabilities are available on all of these devices. It works on 13 different models right now (Google's Pixel, Pixel XL, Pixel 2 and Pixel 2 XL; Samsung's Galaxy S8, S8+, Note8, S7 and S7 edge; LGE's V30 and V30+ (Android O only); ASUS's Zenfone AR; and OnePlus's OnePlus 5). And beyond those available today, we're partnering with many manufacturers to enable their upcoming devices this year, including Samsung, Huawei, LGE, Motorola, ASUS, Xiaomi, HMD/Nokia, ZTE, Sony Mobile, and Vivo.

Making ARCore work on more devices is only part of the equation. We're bringing developers additional improvements and support to make their AR development process faster and easier. ARCore 1.0 features improved environmental understanding that enables users to place virtual assets on textured surfaces like posters, furniture, toy boxes, books, cans and more. Android Studio Beta now supports ARCore in the Emulator, so you can quickly test your app in a virtual environment right from your desktop.
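
To make the placement flow concrete, here is a minimal Kotlin sketch of the usual ARCore pattern: hit-test a tap against the tracked geometry and anchor content at the hit pose. Session setup and rendering are omitted, and the surrounding code is assumed to follow the standard ARCore sample structure.

    import android.view.MotionEvent
    import com.google.ar.core.Anchor
    import com.google.ar.core.Frame
    import com.google.ar.core.Plane
    import com.google.ar.core.Point

    // Hit-test a tap against detected planes and oriented feature points,
    // then create an anchor so virtual content stays locked to the surface.
    fun placeObjectAtTap(frame: Frame, tap: MotionEvent): Anchor? {
        for (hit in frame.hitTest(tap)) {
            val trackable = hit.trackable
            val hitsSurface = when (trackable) {
                is Plane -> trackable.isPoseInPolygon(hit.hitPose)
                is Point -> trackable.orientationMode ==
                    Point.OrientationMode.ESTIMATED_SURFACE_NORMAL
                else -> false
            }
            if (hitsSurface) {
                return hit.createAnchor()
            }
        }
        return null
    }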

Everyone should get to experience augmented reality, so we're working to bring it to people everywhere, including China. We'll be supporting ARCore in China on partner devices sold there—starting with Huawei, Xiaomi and Samsung—to enable them to distribute AR apps through their app stores.

We've partnered with a few great developers to showcase how they're planning to use AR in their apps. Snapchat has created an immersive experience that invites you into a "portal"—in this case, FC Barcelona's legendary Camp Nou stadium. Visualize different room interiors inside your home with Sotheby's International Realty. See Porsche's Mission E Concept vehicle right in your driveway, and explore how it works. With OTTO AR, choose pieces from an exclusive set of furniture and place them, true to scale, in a room. Ghostbusters World, based on the film franchise, is coming soon. In China, place furniture and over 100,000 other pieces with Easyhome Homestyler, see items and place them in your home when you shop on JD.com, or play games from NetEase, Wargaming and Game Insight.

With Google Lens, your phone's camera can help you understand the world around you, and we're expanding availability of the Google Lens preview. With Lens in Google Photos, when you take a picture, you can get more information about what's in your photo. In the coming weeks, Lens will be available to all Google Photos English-language users who have the latest version of the app on Android and iOS. Also over the coming weeks, English-language users on compatible flagship devices will get the camera-based Lens experience within the Google Assistant. We'll add support for more devices over time.

And while it's still a preview, we've continued to make improvements to Google Lens. Since launch, we've added text selection features, the ability to create contacts and events from a photo in one tap, and—in the coming weeks—improved support for recognizing common animals and plants, like different dog breeds and flowers.

Smarter cameras will enable our smartphones to do more. With ARCore 1.0, developers can start building delightful and helpful AR experiences for them right now. And Lens, powered by AI and computer vision, makes it easier to search and take action on what you see. As these technologies continue to grow, we'll see more ways that they can help people have fun and get more done on their phones.

Diagnose and understand your app’s GPU behavior with GAPID

Posted by Andrew Woloszyn, Software Engineer

Developing for 3D is complicated. Whether you're using a native graphics API or enlisting the help of your favorite game engine, there are thousands of graphics commands that have to come together perfectly to produce beautiful 3D images on your phone, desktop or VR headsets.

GAPID (Graphics API Debugger) is a new tool that helps developers diagnose rendering and performance issues with their applications. With GAPID, you can capture a trace of your application and step through each graphics command one-by-one. This lets you visualize how your final image is built and isolate problematic calls, so you spend less time debugging through trial-and-error.

GAPID supports OpenGL ES on Android, and Vulkan on Android, Windows and Linux.

Debugging in action, one draw call at a time

GAPID not only enables you to diagnose issues with your rendering commands, but also acts as a tool for running quick experiments and immediately seeing how changes would affect the presented frame.

Here are a few examples where GAPID can help you isolate and fix issues with your application:

What's the GPU doing?

Why isn't my text appearing?!

Working with a graphics API can be frustrating when you get an unexpected result, whether it's a blank screen, an upside-down triangle, or a missing mesh. As an offline debugger, GAPID lets you take a trace of these applications, and then inspect the calls afterwards. You can track down exactly which command produced the incorrect result by looking at the framebuffer, and inspect the state at that point to help you diagnose the issue.

What happens if I do X?

Using GAPID to edit shader code

Even when a program is working as expected, sometimes you want to experiment. GAPID allows you to modify API calls and shaders at will, so you can test things like:

  • What if I used a different texture on this object?
  • What if I changed the calculation of bloom in this shader?

With GAPID, you can now iterate on the look and feel of your app without having to recompile your application or rebuild your assets.

Whether you're building a stunning new desktop game with Vulkan or a beautifully immersive VR experience on Android, we hope that GAPID will save you both time and frustration and help you get the most out of your GPU. To get started with GAPID and see just how powerful it is, download it, take your favorite application, and capture a trace!