
Updates to the Android XR SDK: Introducing Developer Preview 2

Posted by Matthew McCullough – VP of Product Management, Android Developer

Since launching the Android XR SDK Developer Preview alongside Samsung, Qualcomm, and Unity last year, we’ve been blown away by all of the excitement we’ve been hearing from the broader Android community. Whether it's through coding live-streams or local Google Developer Group talks, it's been an outstanding experience participating in the community to build the future of XR together, and we're just getting started.

Today we’re excited to share an update to the Android XR SDK: Developer Preview 2, packed with new features and improvements to help you develop helpful and delightful immersive experiences with familiar Android APIs, tools and open standards created for XR.

At Google I/O, we have two technical sessions related to Android XR. The first, Building differentiated apps for Android XR with 3D content, covers many features present in Jetpack SceneCore and ARCore for Jetpack XR. The second, The future is now, with Compose and AI on Android XR, covers creating XR-differentiated UI and our vision for the intersection of XR and cutting-edge AI capabilities.

Android XR sessions at Google I/O 2025
Building differentiated apps for Android XR with 3D content and The future is now, with Compose and AI on Android XR

What’s new in Developer Preview 2

Since the release of Developer Preview 1, we’ve been focused on making the APIs easier to use and adding new immersive Android XR features. Your feedback has helped us shape the development of the tools, SDKs, and the platform itself.

With the Jetpack XR SDK, you can now play back 180° and 360° videos, which can be stereoscopic when encoded with the MV-HEVC specification or with view-frames packed adjacently. The MV-HEVC standard is designed and optimized for stereoscopic video, allowing your app to play back immersive videos efficiently and at high quality. Apps built with Jetpack Compose for XR can use the SpatialExternalSurface composable to render media, including stereoscopic videos.
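To illustrate what "view-frames packed adjacently" means, here is a small Kotlin sketch that computes each eye's crop region for side-by-side and top-bottom frame layouts. The types and function are illustrative stand-ins of my own, not part of the Jetpack XR API; SpatialExternalSurface does this work for you when you pass the matching stereo mode.

```kotlin
// Hypothetical helper, not part of the Jetpack XR API: given a decoded video
// frame that packs both eye views adjacently, compute each eye's crop region.

data class Rect(val x: Int, val y: Int, val width: Int, val height: Int)

enum class FrameLayout { SIDE_BY_SIDE, TOP_BOTTOM }

fun eyeRegion(frameWidth: Int, frameHeight: Int, layout: FrameLayout, leftEye: Boolean): Rect =
    when (layout) {
        // Left eye occupies the left half of the frame, right eye the right half.
        FrameLayout.SIDE_BY_SIDE -> Rect(
            x = if (leftEye) 0 else frameWidth / 2,
            y = 0,
            width = frameWidth / 2,
            height = frameHeight,
        )
        // Left eye occupies the top half of the frame, right eye the bottom half.
        FrameLayout.TOP_BOTTOM -> Rect(
            x = 0,
            y = if (leftEye) 0 else frameHeight / 2,
            width = frameWidth,
            height = frameHeight / 2,
        )
    }
```

In practice you would not crop frames yourself: a decoder renders into the surface provided by SpatialExternalSurface, and the platform presents the correct half to each eye.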

Using Jetpack Compose for XR, you can now also define layouts that adapt to different XR display configurations. For example, use a SubspaceModifier to specify the size of a Subspace as a percentage of the device’s recommended viewing size, so a panel effortlessly fills the space it's positioned in.
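As a rough illustration of percentage-based sizing, the Kotlin sketch below resolves a panel size as a fraction of a recommended viewing size. It is an illustrative stand-in of my own, not the SubspaceModifier API, which handles this declaratively inside a Subspace.

```kotlin
// Illustrative stand-in, not the SubspaceModifier API: resolve a panel size
// as a fraction of the device's recommended viewing size, so the same code
// adapts across different XR display configurations.

data class SizePx(val width: Int, val height: Int)

fun resolvePanelSize(recommended: SizePx, widthFraction: Float, heightFraction: Float): SizePx {
    require(widthFraction in 0f..1f && heightFraction in 0f..1f)
    return SizePx(
        width = (recommended.width * widthFraction).toInt(),
        height = (recommended.height * heightFraction).toInt(),
    )
}
```

Because the size is expressed relative to what the device recommends rather than in absolute pixels, a panel sized this way fills its space proportionally on any Android XR device.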

Material Design for XR now supports more component overrides for TopAppBar, AlertDialog, and ListDetailPaneScaffold, helping your large-screen-enabled apps that use Material Design adapt effortlessly to the new world of XR.

An app adapts to XR using Material Design for XR with the new component overrides

In ARCore for Jetpack XR, you can now track hands after requesting the appropriate permissions. Each hand is represented as a collection of 26 posed hand joints that can be used to detect hand gestures and bring a whole new level of interaction to your Android XR apps:

Hands bring a natural input method to your Android XR experience.
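To give a sense of how posed joints translate into gestures, here is a minimal Kotlin sketch of pinch detection from two joint positions. The types, function, and 2 cm threshold are illustrative assumptions of mine, not the ARCore for Jetpack XR API; in the real API, joint poses come from the hand-tracking session.

```kotlin
import kotlin.math.sqrt

// Illustrative type, not from ARCore for Jetpack XR: a 3D point in meters.
data class Vector3(val x: Float, val y: Float, val z: Float) {
    fun distanceTo(other: Vector3): Float {
        val dx = x - other.x
        val dy = y - other.y
        val dz = z - other.z
        return sqrt(dx * dx + dy * dy + dz * dz)
    }
}

// A simple pinch heuristic: the thumb tip and index fingertip are closer than
// a threshold. The 2 cm default is an assumption you would tune for your app.
fun isPinching(thumbTip: Vector3, indexTip: Vector3, thresholdMeters: Float = 0.02f): Boolean =
    thumbTip.distanceTo(indexTip) < thresholdMeters
```

Real gesture recognition would typically look at several of the 26 joints and smooth over multiple frames, but the core idea is the same: gestures are functions of joint poses.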

For more guidance on developing apps for Android XR, check out our Android XR Fundamentals codelab, the updates to our Hello Android XR sample project, and a new version of JetStream with Android XR support.

The Android XR Emulator has also received stability updates and support for AMD GPUs, and is now fully integrated within the Android Studio UI.

the Android XR Emulator in Android Studio
The Android XR Emulator is now integrated in Android Studio

Developers using Unity have already successfully created and ported existing games and apps to Android XR. Today, you can upgrade to the Pre-Release version 2 of the Unity OpenXR: Android XR package! This update adds many performance improvements, such as support for Dynamic Refresh Rate, which optimizes your app's performance and power consumption. Shaders made with Shader Graph now support SpaceWarp, making it easier to reduce compute load on the device. Hand meshes are now exposed with occlusion, which enables realistic hand visualization.

Check out Unity’s improved Mixed Reality template for Android XR, which now includes support for occlusion and persistent anchors.

We recently launched Android XR Samples for Unity, which demonstrate capabilities on the Android XR platform such as hand tracking, plane tracking, face tracking, and passthrough.

Google’s open-source Unity samples demonstrate platform features and show how they’re implemented

Firebase AI Logic for Unity is now in public preview! This makes it easy for you to integrate gen AI into your apps, enabling the creation of AI-powered experiences with Gemini and Android XR. Firebase AI Logic fully supports Gemini's capabilities, including multimodal input and output, and bidirectional streaming for immersive conversational interfaces. Built with production readiness in mind, Firebase AI Logic is integrated with core Firebase services like App Check, Remote Config, and Cloud Storage for enhanced security, configurability, and data management. Learn more about this on the Firebase blog or go straight to the Gemini API using Vertex AI in Firebase SDK documentation to get started.

Continuing to build the future together

Our commitment to open standards continues with the glTF Interactivity specification, developed in collaboration with the Khronos Group, which will be supported in glTF models rendered by Jetpack XR later this year. Models using the glTF Interactivity specification are self-contained interactive assets that can have many pre-programmed behaviors, like rotating objects on a button press or changing the color of a material over time.
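The interactivity spec describes such behaviors declaratively inside the asset itself. To give a flavor of what one of the examples above amounts to, here is an imperative Kotlin sketch of changing a material color over time; the types and function are illustrative and not part of any glTF API.

```kotlin
// Illustrative type, not from any glTF API: an RGB material color.
data class Color(val r: Float, val g: Float, val b: Float)

// Linearly interpolate between two material colors over a duration: the kind
// of pre-programmed, self-contained behavior a glTF Interactivity asset can
// describe without any app-side animation code.
fun colorAt(start: Color, end: Color, elapsedSec: Float, durationSec: Float): Color {
    val t = (elapsedSec / durationSec).coerceIn(0f, 1f)
    return Color(
        r = start.r + (end.r - start.r) * t,
        g = start.g + (end.g - start.g) * t,
        b = start.b + (end.b - start.b) * t,
    )
}
```

The point of the spec is that this logic ships inside the model, so any engine that renders the asset plays the same behavior.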

Android XR will be available first on Samsung’s Project Moohan, launching later this year. Soon after, our partners at XREAL will release the next Android XR device. Codenamed Project Aura, it’s a portable and tethered device that gives users access to their favorite Android apps, including those that have been built for XR. It will launch as a developer edition, specifically for you to begin creating and experimenting. The best news? With the familiar tools you use to build Android apps today, you can build for these devices too.

product image of XREAL’s Project Aura against a nebulous black background
XREAL’s Project Aura

The Google Play Store is also getting ready for Android XR. It will list supported 2D Android apps on the Android XR Play Store when it launches later this year. If you are working on an Android XR differentiated app, you can get it ready for the big launch and be one of the first differentiated apps on the Android XR Play Store.

And we know many of you are excited for the future of Android XR on glasses. We are shaping the developer experience now and will share more details on how you can participate later this year.

To get started creating and developing for Android XR, check out developer.android.com/develop/xr where you will find all of the tools, libraries, and resources you need to work with the Android XR SDK. In particular, try out our samples and codelabs.

We welcome your feedback, suggestions, and ideas as you’re helping shape Android XR. Your passion, expertise, and bold ideas are vital as we continue to develop Android XR together. We look forward to seeing your XR-differentiated apps when Android XR devices launch later this year!

Explore this announcement and all Google I/O 2025 updates on io.google starting May 22.


Introducing Android XR SDK Developer Preview

Posted by Matthew McCullough – VP of Product Management, Android Developer

Today, we're launching the developer preview of the Android XR SDK - a comprehensive development kit for Android XR, the newest platform in the Android family, built for extended reality (XR) headsets (and glasses in the future!). You'll have endless opportunities to create and develop experiences that blend digital and physical worlds, using familiar Android APIs, tools, and open standards created for XR. All of this means: if you build for Android, you're already building for XR! Read on to get started with development for headsets.

With the Android XR SDK you can:

    • Break free of traditional screens by spatializing your app with rich 3D elements, spatial panels, and spatial audio that bring a natural sense of depth, scale, and tangible realism
    • Transport your users to a fantastical virtual space, or engage with them in their own homes or workplaces
    • Take advantage of natural, multimodal interaction capabilities such as hands and eyes
"We believe Android XR is a game-changer for storytelling. It allows us to merge narrative depth with advanced interactive features, creating an immersive world where audiences can engage with characters and stories like never before." 
- Jed Weintrob, Partner at 30 Ninjas

Your apps on Android XR

The Android XR SDK is built on the existing foundations of Android app development. We're also bringing the Play Store to Android XR, where most Android apps will automatically be made available without any additional development effort. Users will be able to discover and use your existing apps in a whole new dimension. To differentiate your existing Compose app, you can opt in to automatically spatialize Material Design (M3) components and Compose adaptive layouts in XR.

Moving image showing sizing capabilities in Android XR
Apps optimized for large screens take advantage of sizing capabilities in Android XR

The Android XR SDK has something for every developer:

    • Building with Kotlin and Android Studio? You'll feel right at home with the Jetpack XR SDK, a suite of familiar libraries and tools to simplify development and accelerate productivity.
    • Using Unity’s real-time 3D engine? The Android XR Extensions for Unity provides the packages you need to build or port powerful, immersive experiences.
    • Developing on the web? Use WebXR to add immersive experiences supported on Chrome.
    • Working with native languages like C/C++? Android XR supports the OpenXR 1.1 standard.

Creating with Jetpack XR SDK

The Jetpack XR SDK includes new Jetpack libraries purpose-built for XR. The highlights include:

    • Jetpack Compose for XR - enables you to declaratively create spatial UI layouts and spatialize your existing 2D UI built with Compose or Views
    • ARCore for Jetpack XR - brings powerful perception capabilities for your app to understand the real world
“With Android XR, we can bring Calm directly into your world, capturing the senses and allowing you to experience it in a deeper and more transformative way. By collaborating closely with the Android XR team on this cutting-edge technology, we’ve reimagined how to create a sense of depth and space, resulting in a level of immersion that instantly helps you feel more present, focused, and relaxed.” 
- Dan Szeto, Vice President at Calm Studios

Kickstart your Jetpack XR SDK journey with the Hello XR Sample, a straightforward introduction to the essential features of Jetpack Compose for XR.

Learn more about developing with the Jetpack XR SDK.

Moving image of the JetNews sample app adapted for Android XR
The JetNews sample app is an Android large-screen app adapted for Android XR

We're also introducing new tools and capabilities to the latest preview of Android Studio Meerkat to boost productivity and simplify your creation process for Android XR.

    • Use the new Android XR Emulator to create a virtualized XR device for deploying and testing apps built with the Jetpack XR SDK. The emulator includes XR-specific controls for using a keyboard and mouse to navigate an emulated virtual space.
    • Use the Android XR template to get a jump-start on creating an app with Jetpack Compose for XR.
    • Use the updated Layout Inspector to inspect and debug spatialized UI components created with Jetpack Compose for XR.

Learn more about the XR enabled tools in Android Studio and the Android XR Emulator.

Moving image of the Android XR Emulator in Android Studio
The Android XR Emulator in Android Studio has new controls to explore 3D space within the emulator

Creating with Unity

We've partnered with Unity to natively integrate their real-time 3D engine with Android XR starting with Unity 6. Unity is introducing the Unity OpenXR: Android XR package for bringing your multi-platform XR experiences to Android XR.

Unity is also adding Android XR support to its popular XR packages.

We're also rolling out the Android XR Extensions for Unity with samples and innovative features such as mouse interaction profile, environment blend mode, personalized hand mesh, object tracking, and more.

"Having already brought Demeo to most commercially available platforms, it's safe to say we were impressed with the process of adapting the game to run on Android XR." 
– Johan Gastrin, CTO at Resolution Games

Check out our getting started guide for Unity and Unity’s blog post to learn more.

Moving image of Vacation Simulator
Vacation Simulator has been updated to Unity 6 and supports Android XR

Creating for the Web

Chrome on Android XR supports the WebXR standard. If you're building for the web, you can enhance existing sites with 3D content or build new immersive experiences. You can use full-featured frameworks like three.js, A-Frame, or PlayCanvas to create virtual worlds, or a simpler API like model-viewer so your users can visualize products on an e-commerce site. And because WebXR is an open standard, the same experiences you build for mobile AR devices or dedicated VR hardware work seamlessly on Android XR.

Learn more about developing with WebXR.

Moving image demonstrating virtual objects interacting with real world surfaces in Chrome on Android XR
Chrome on Android XR supports WebXR features including depth maps allowing virtual objects to interact with real world surfaces

Built on Open Standards

We’re continuing the Android tradition of building with open standards. At the heart of the Android perception stack is OpenXR - a high-performance, cross-platform API focused on portability. Android XR is compliant with OpenXR 1.1, and we’re also expanding the OpenXR standard with leading-edge vendor extensions to introduce powerful world-sensing capabilities such as:

    • AI-powered hand mesh, designed to adapt to the shape and size of hands to better represent the diversity of your users
    • Sophisticated light estimation, for lighting your digital content to match real-world lighting conditions
    • New trackables that let you bring real world objects like laptops, phones, keyboards, and mice into a virtual environment

The Android XR SDK also supports open standard formats such as glTF 2.0 for 3D models and OpenEXR for high-dynamic-range environments.

Building the future together

We couldn't be more proud or excited to announce the Developer Preview of the Android XR SDK. We’re releasing this developer preview because we want to build the future of XR together with you. We welcome your feedback and can’t wait to work with you and build your ideas and suggestions into the platform. Your passion, expertise, and bold ideas are absolutely essential as we continue to build Android XR.

We look forward to interacting with your apps, reimagined to take advantage of the unique spatial capabilities of Android XR, using familiar tools like Android Studio and Jetpack Compose. We’re eager to visit the amazing 3D worlds you build using powerful tools and open standards like Unity and OpenXR. Most of all, we can’t wait to go on this journey with all of you that make up the amazing community of Android and Unity developers.

To get started creating and developing for Android XR, check out developer.android.com/develop/xr where you will find all of the tools, libraries and resources you need to create with the Android XR SDK! If you are interested in getting access to prerelease hardware and collaborating with the Android XR team, express your interest to participate in an Android XR Developer Bootcamp in 2025 by filling out this form.