Tag Archives: Google VR

Daydream Labs: Experiments with ARCore

ARCore brings augmented reality capabilities to millions of Android phones. It’s available as an SDK preview, and developers can start experimenting with it right now. We’ve already seen some really fun, useful and delightful experiences come through; check out thisisarcore.com for some of our favorites.

Daydream Labs has been in on the fun and experimentation, too. We’re exploring new interactions, including unique ways to learn about the world around you, different ways to navigate, and new ways to socialize and play with friends.

Here’s some of what we’ve made so far!

Using AR as a magic window into Street View

We built a prototype that lets you zoom into The British Museum and see Street View panoramas from the front of Great Russell Street.


Helping you see the future

With AR, we prototyped a way for architects to overlay models on top of construction in the real world to show how a finished home would look.


Skills training with ARCore

We brought our VR version of the Espresso Trainer into AR. You can use your phone to learn each step of making a perfect espresso. People who had never used the machine before made their first espresso from scratch, with perfect crema to boot!

Controlling virtual position through reality

We built a way to explore Street View without having to click arrows—just walk forward in physical space to adjust your virtual position.
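
For a rough sense of how a prototype like this can work, here’s a minimal, hypothetical Java sketch (not the actual experiment’s code): it watches the AR-tracked camera position and signals when the user has physically walked far enough to hop to the next panorama. The one-meter step size and the class name are illustrative only.

    public final class WalkToNavigate {
        private static final float STEP_METERS = 1.0f;  // illustrative distance per panorama hop
        private float[] lastStepPos;

        // Call each frame with the AR-tracked camera position {x, y, z} in meters.
        // Returns true when the app should advance to the next Street View panorama.
        public boolean update(float[] cameraPos) {
            if (lastStepPos == null) {
                lastStepPos = cameraPos.clone();
                return false;
            }
            float dx = cameraPos[0] - lastStepPos[0];
            float dz = cameraPos[2] - lastStepPos[2];  // ignore vertical motion
            if (Math.sqrt(dx * dx + dz * dz) >= STEP_METERS) {
                lastStepPos = cameraPos.clone();
                return true;
            }
            return false;
        }
    }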


Highlight AR content

We played around with the idea of putting floating AR content in front of the user, and controlling depth of field and desaturation of the camera feed based on user motion. This experiment allows digital assets to “pop,” directing people's attention there and encouraging them to explore and interact.

Blocks model: RAWRRR!! By Damon Pidhajecky

Share your position with VPS

We’ve been experimenting with Google’s VPS beta (Visual Positioning Service), announced at Google I/O in May. VPS enables shared world-scale AR experiences well beyond tabletops. For example, this prototype lets you share your position with a friend, and they’ll be guided right to you with VPS. We’ve played quite a few games of hide-and-seek with it!

Want to dive in further?

We’ve been having a ton of fun building with ARCore, and we encourage you to grab the Unity, Unreal or Android SDKs to see what you can create. We’ve also been playing with our new prototype AR-enabled browsers for Android and iOS—look for those experiments in the future. Don’t forget to tag your creations on social media with #ARCore.

Building for Daydream gets easier with new tools

With Daydream, our goal is to enable developers to build high-quality mobile VR experiences. We’re always trying to make the development process easier and more efficient, helping you focus on innovation by providing tools to optimize your apps, interactions, and workflow. With that in mind, we have some new updates and features for our tools.

Performance HUD

We built the Daydream Performance Heads-up Display (HUD) so you can easily monitor key performance metrics in VR, without removing your headset. With the Performance HUD, you get at-a-glance visibility into frame rate, process memory usage, thermal throttling status, and platform-specific metrics. So whether you’re an artist checking how your assets affect performance, an engineer checking how your rendering parameters affect frame rate, or a QA tester looking for issues and regressions, the Performance HUD makes it much easier to see what you need while working in VR.


Three new Daydream Elements

VR development is evolving rapidly, and we’re continuing to work on new ways to address its physiological, ergonomic, and technical challenges. Daydream Elements is a collection of tested solutions that showcases best practices for immersive design. You can check out the first six Elements here, and this release adds three more.

Great mechanics for object manipulation are key for making a VR experience feel immersive. This demo shows how developers can simulate weight on objects to make them feel lighter or heavier, and how to tune hinges and sliders so they behave in ways that feel natural for common interactions like opening doors and closing drawers.
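
The Elements source shows the real implementation; as a rough illustration of the idea only, here’s a hypothetical Java sketch in which a held object chases the controller each frame, with heavier objects responding more slowly so they feel weightier. The class name and tuning constants are invented.

    public final class WeightedGrab {
        private final float weight;              // 0 = feather-light, 1 = very heavy
        private final float[] position = new float[3];

        public WeightedGrab(float weight, float[] startPosition) {
            this.weight = weight;
            System.arraycopy(startPosition, 0, position, 0, 3);
        }

        // Move the held object toward the controller each frame; heavier objects lag more.
        public float[] update(float[] controllerPos, float dt) {
            float responsiveness = 20f * (1f - weight) + 2f * weight;  // invented tuning values
            float t = Math.min(1f, responsiveness * dt);
            for (int i = 0; i < 3; i++) {
                position[i] += (controllerPos[i] - position[i]) * t;
            }
            return position;
        }
    }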


The Constellation menu demonstrates a gesture-based interaction model that helps users navigate deep information hierarchies in a simple, responsive way. This helps with item inventories, file directories, and enterprise applications with large feature or data sets.


The Arm Model demo shows how you can use mathematical arm models to approximate the physical location of the Daydream controller in VR. You can then simulate the interactions of a fully tracked (rotation and translation) controller with a controller that only tracks rotation. Tuning custom arm models from scratch can be a complex process, but when done correctly, the arm model provides a fluid and natural interface for a wide range of different gestures. This demo includes a number of custom models specifically tuned to simulate different types of controller interactions.
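
As a rough, hypothetical sketch of the idea (not the Elements arm model itself), the Java snippet below anchors a virtual shoulder near the head and swings a fixed-length forearm with the controller’s reported orientation to estimate where the controller “is.” The offsets and class name are illustrative values only.

    public final class SimpleArmModel {
        // Assumed offsets, in meters, relative to the head position (illustrative values only).
        private static final float[] SHOULDER_OFFSET = {0.19f, -0.19f, -0.03f};  // right shoulder
        private static final float[] FOREARM = {0.0f, 0.0f, -0.35f};             // forearm pointing forward

        // Rotates vector v by unit quaternion q = {x, y, z, w}.
        private static float[] rotate(float[] q, float[] v) {
            float x = q[0], y = q[1], z = q[2], w = q[3];
            // t = 2 * cross(q.xyz, v); v' = v + w * t + cross(q.xyz, t)
            float tx = 2 * (y * v[2] - z * v[1]);
            float ty = 2 * (z * v[0] - x * v[2]);
            float tz = 2 * (x * v[1] - y * v[0]);
            return new float[] {
                v[0] + w * tx + (y * tz - z * ty),
                v[1] + w * ty + (z * tx - x * tz),
                v[2] + w * tz + (x * ty - y * tx),
            };
        }

        // Approximate controller position from the head position and controller orientation only.
        public static float[] controllerPosition(float[] headPos, float[] controllerQuat) {
            float[] forearm = rotate(controllerQuat, FOREARM);
            return new float[] {
                headPos[0] + SHOULDER_OFFSET[0] + forearm[0],
                headPos[1] + SHOULDER_OFFSET[1] + forearm[1],
                headPos[2] + SHOULDER_OFFSET[2] + forearm[2],
            };
        }
    }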


Making Instant Preview even smoother

Instant Preview lets you make changes in the Unity or Unreal editor and see them reflected instantly in VR on your device, skipping the need to compile and re-deploy a project to see each change. This enables more efficient development and tight iteration cycles. With the V1.1 release for Unity, Instant Preview is even faster, smoother and easier to use.

In addition to connecting over USB, you can now connect your phone to your PC over WiFi.


New support for Metal on macOS makes Instant Preview run even better, giving developers a noticeably smoother experience on Mac. V1.1 also includes lots of other little goodies like controller emulator compatibility, the ability to see the controller battery level on the rendered controller, and a new streamlined setup process that lets you auto-push the APK to your phone and get started with Instant Preview immediately.

You can check all these tools out on the Google VR developer site, and we look forward to your feedback and input.

Step inside of music

What if you could step inside your favorite song and get a closer look at how music is made? That’s the idea behind our new interactive experiment Inside Music.

The project is a collaboration with the popular podcast Song Exploder and some of our favorite artists across different genres—Phoenix, Perfume Genius, Natalia Lafourcade, Ibeyi, Alarm Will Sound, and Clipping. The experiment lets you explore layers of music all around you, using spatial audio to understand how a piece of music is composed. You can even turn layers on and off, letting you hear the individual pieces of a song in a new way.

It’s built using a technology called WebVR, which lets you open it in your web browser without installing any apps. You can try it on a virtual reality headset, phone or laptop. And we’ve made the code open source so that people who make music can create new interactive experiments.

Watch the video above to learn more, and check it out at g.co/insidemusic.

Daydream Elements: Foundational VR Design

Daydream Elements launched at Google I/O this year. It’s a set of best practices and reusable code for some of the most fundamental things you do in VR — like walking around and interacting with the environment. So whether you're a seasoned app developer, an immersive design enthusiast, or just curious about VR, you can take a look at the first set of interactions.

Getting around


Teleportation

Teleportation is popular in VR because it allows you to easily explore a large virtual environment. In Elements, the Daydream controller's ability to detect touch is used to make teleportation more discoverable. When you touch the touchpad, an arc is drawn to the teleportation target, so you can click to teleport to that destination point.
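
One simple way to build such an arc (a hypothetical Java sketch, not the Elements code) is to trace a ballistic path from the controller and stop where it meets the floor; the last sampled point becomes the teleport destination.

    import java.util.ArrayList;
    import java.util.List;

    public final class TeleportArc {
        // Sample a ballistic arc from the controller; the last point is the destination.
        public static List<float[]> sample(float[] origin, float[] dir, float speed,
                                           float gravity, float dt, int maxSteps) {
            List<float[]> points = new ArrayList<>();
            float[] p = origin.clone();
            float[] v = {dir[0] * speed, dir[1] * speed, dir[2] * speed};
            for (int i = 0; i < maxSteps && p[1] > 0f; i++) {
                points.add(p.clone());
                v[1] -= gravity * dt;  // gravity bends the arc downward
                p[0] += v[0] * dt;
                p[1] += v[1] * dt;
                p[2] += v[2] * dt;
            }
            points.add(p.clone());     // landing point; snap to the floor or a valid surface as needed
            return points;
        }
    }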

A simple touch on the touchpad draws the teleportation arc to a destination point.

The Teleportation Element also addresses a common downside of teleporting: becoming disoriented right after you arrive. With an instant jump or a fade to black, you need a few seconds to get your bearings again. You can avoid this with a very fast warp effect that quickly flies you to your new location. The warp effect also helps you stay oriented, and it’s fast enough to prevent any potential discomfort.

Tunneling

The field of view gets smaller and the background is replaced with a grid when walking.

Motion causes some people to feel uncomfortable in VR, even if they don’t feel that way when they play video games on a TV. That’s because the living room (with its chairs, lamps, and mid-century modern asymmetrical coffee table with hairpin legs) helps keep you grounded. The Tunneling Element takes advantage of this same living-room effect, and can be used during rotation or when virtually moving around. In the Tunneling Element, a stable grid proved the most effective at reducing discomfort. This interaction model was also used and tested in Earth VR and is now available for use in any app.
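
As a hypothetical sketch of how an app might drive the effect (the thresholds, easing rate, and class name are invented), the vignette strength can simply ease toward 1 whenever the user is moving or turning and back toward 0 when they stop:

    public final class TunnelingVignette {
        private float strength;  // 0 = full field of view, 1 = fully tunneled

        // Call once per frame with the current virtual motion; feed the result to the vignette shader.
        public float update(float linearSpeed, float angularSpeedDeg, float dt) {
            boolean moving = linearSpeed > 0.1f || angularSpeedDeg > 15f;  // invented thresholds
            float target = moving ? 1f : 0f;
            float easeRate = 8f;                                           // invented easing rate
            strength += (target - strength) * Math.min(1f, easeRate * dt);
            return strength;
        }
    }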

Bringing in tunneling during rotation helps improve comfort for users who are highly sensitive to major movement.

Chase Camera

Sometimes you need to follow a character in a game, but if the camera moves automatically and isn’t under your control, it can be uncomfortable. Our Chase Camera Element is fine-tuned to avoid these issues. It emphasizes control and expectations: you determine your target destination with a touchpad click, so moving in that direction feels natural and expected.

The user is in control of the camera movement by selecting the target destination for the fox.

It also avoids automatic camera rotation. You can still manually rotate the camera with the touchpad, and the Tunneling Element will turn on during the rotation or when you look in a different direction from the camera movement—which is one of the biggest reasons that chase cameras are so hard to get right in VR. Also note that tunneling affects the environment, but not the character—the fox can run into the tunnel, so the user never loses it.
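
A minimal, hypothetical Java sketch of the core idea (not the Elements implementation): the camera only translates toward the destination the user picked, and never rotates on its own. The follow speed and class name are illustrative.

    public final class ChaseCamera {
        private static final float FOLLOW_SPEED = 2.0f;  // meters per second, illustrative
        private final float[] cameraPos;

        public ChaseCamera(float[] startPos) {
            cameraPos = startPos.clone();
        }

        // Glide toward the user-chosen destination; orientation is left entirely to the user.
        public float[] update(float[] destination, float dt) {
            float step = FOLLOW_SPEED * dt;
            for (int i = 0; i < 3; i++) {
                float delta = destination[i] - cameraPos[i];
                cameraPos[i] += Math.abs(delta) < step ? delta : Math.signum(delta) * step;
            }
            return cameraPos;
        }
    }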


Menu Systems

Click Menu

The Daydream controller is simple and easy to use, but sometimes you need more action choices. Your touchpad-touches may be used for moving around, and your touchpad-click may be dedicated to a primary app action, but you still may want easy access to more options.

Clicking the app button reveals a radial menu of options that are hidden from the user at first but are always easily accessible.

In Elements, there’s an example of a drawing application where you click the app button to see a radial menu for access to a larger set of drawing tools and colors. That way, the touchpad can remain dedicated to a primary action (in this case, point-and-click drawing).

Swipe Menu

What if you're fighting a menacing fire-breathing dragon in the boss level of your favorite RPG, and you need to quickly switch between your sword and crossbow, not to mention replenish your health with your limited supply of potions? The swipe menu is a great way to quickly swap between tools. In Daydream Elements, no dragons were available, but you can see the swipe menu in action against some balloons (still menacing).

The swipe menu provides very fast access to actions.

Daydream Renderer

Last, there’s a new real-time rendering system for Daydream, and it's pretty awesome. You can play with dynamic light sources and textures, and it shows that mobile devices are now capable of rich visual experiences. This demo scene is just one example of how content creators can use the Daydream Renderer to make their apps look beautiful. If you’re interested in the technical details of how all this works, check out the documentation.


The goal of Daydream Elements is to make VR design and development accessible to everyone who wants to create great immersive content. Keep your eyes peeled for more Elements coming soon!

ARCore: Augmented reality at Android scale

With more than two billion active devices, Android is the largest mobile platform in the world. And for the past nine years, we’ve worked to create a rich set of tools, frameworks and APIs that deliver developers’ creations to people everywhere. Today, we’re releasing a preview of a new software development kit (SDK) called ARCore. It brings augmented reality capabilities to existing and future Android phones. Developers can start experimenting with it right now.

We’ve been developing the fundamental technologies that power mobile AR over the last three years with Tango, and ARCore is built on that work. But it works without any additional hardware, which means it can scale across the Android ecosystem. ARCore will run on millions of devices, starting today with the Pixel and Samsung’s S8, running 7.0 Nougat and above. We’re targeting 100 million devices by the end of the preview. We’re working with manufacturers like Samsung, Huawei, LG, ASUS and others to make this possible with a consistent bar for quality and high performance.


ARCore works with Java/OpenGL, Unity and Unreal and focuses on three things:

  • Motion tracking: Using the phone’s camera to observe feature points in the room, combined with IMU sensor data, ARCore determines both the position and orientation (pose) of the phone as it moves, so virtual objects remain accurately placed.

  • Environmental understanding: It’s common for AR objects to be placed on a floor or a table. ARCore can detect horizontal surfaces using the same feature points it uses for motion tracking.

  • Light estimation: ARCore observes the ambient light in the environment and makes it possible for developers to light virtual objects in ways that match their surroundings, making their appearance even more realistic.
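
To make those three capabilities concrete, here’s a minimal Java sketch of a per-frame update against the ARCore Java API. It assumes a Session that has already been configured and resumed; class and method names follow the shipping 1.x API and may differ slightly from the preview SDK, and error handling and rendering are omitted.

    import com.google.ar.core.Camera;
    import com.google.ar.core.Frame;
    import com.google.ar.core.LightEstimate;
    import com.google.ar.core.Plane;
    import com.google.ar.core.Pose;
    import com.google.ar.core.Session;
    import com.google.ar.core.TrackingState;
    import com.google.ar.core.exceptions.CameraNotAvailableException;

    public final class ArCoreBasics {
        // Call once per rendered frame with a configured and resumed Session.
        public static void onDrawFrame(Session session) throws CameraNotAvailableException {
            Frame frame = session.update();  // latest camera image and tracking state
            Camera camera = frame.getCamera();

            // Motion tracking: the phone's position and orientation (pose) in world space.
            if (camera.getTrackingState() == TrackingState.TRACKING) {
                Pose devicePose = camera.getPose();  // use this to keep virtual objects registered
            }

            // Environmental understanding: horizontal surfaces detected from feature points.
            for (Plane plane : session.getAllTrackables(Plane.class)) {
                if (plane.getTrackingState() == TrackingState.TRACKING) {
                    Pose planeCenter = plane.getCenterPose();  // somewhere to place virtual objects
                }
            }

            // Light estimation: scale virtual lighting to match the camera feed.
            LightEstimate light = frame.getLightEstimate();
            float ambientIntensity = light.getPixelIntensity();
        }
    }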

Alongside ARCore, we’ve been investing in apps and services which will further support developers in creating great AR experiences. We built Blocks and Tilt Brush to make it easy for anyone to quickly create great 3D content for use in AR apps. As we mentioned at I/O, we’re also working on Visual Positioning Service (VPS), a service which will enable world scale AR experiences well beyond a tabletop. And we think the Web will be a critical component of the future of AR, so we’re also releasing prototype browsers for web developers so they can start experimenting with AR, too. These custom browsers allow developers to create AR-enhanced websites and run them on both Android/ARCore and iOS/ARKit.

ARCore is our next step in bringing AR to everyone, and we’ll have more to share later this year. Let us know what you think through GitHub, and check out our new AR Experiments showcase where you can find some fun examples of what’s possible. Show us what you build on social media with #ARCore; we’ll be resharing some of our favorites.

Google and 826 Valencia invite you to a “planet ruled by love”

In 2015, 826 Valencia—an organization that helps under-resourced students develop their writing skills—won a Google.org Impact Challenge grant to expand their programs in San Francisco’s Tenderloin neighborhood. After receiving the grant, 826 turned to a group of volunteer Googlers to figure out how to use technology to amplify students’ voices. The result was a story about a “planet ruled by love,” written by young students and told through a new medium—virtual reality.

The Keyword team sat down with two of the Googlers behind the project—Rebecca Sills and Ryan Chen—as well as Lauren Hall, Director of Grants and Evaluation at 826 Valencia.

Documentary about the "planet ruled by love."

Keyword: How did everyone get involved in this project?

Lauren: I first walked into 826 Valencia 12 years ago and couldn’t believe what I stumbled upon—it was the perfect wedding of my passions, writing and social justice. I’ve worked there ever since. Technology has changed a lot in the last 12 years, and though 826 will always make books, we’re exploring storytelling mediums that are more technologically relevant for younger generations. So we tapped into Google’s creative brainpower to incorporate technology into our programs and the way our kids tell stories.

Rebecca: The magic of 826 is the simple act of an adult sitting down with a kid to unleash the power of their voice. I wanted to be a part of that magic—and the effort at Google was scrappy from the get-go. I recruited Googlers with different skill sets to get involved and it got mightier and mightier. Our guiding principle was to use Google’s technology to empower students to tell their stories in new ways. And we thought that VR was an exciting way to do that.

Ryan: I wanted to get involved in this project in a way that only Google could, so when Rebecca and team came up with the idea of telling the story using Tilt Brush (a virtual reality app that lets you draw and paint in three-dimensional space), I jumped on it.

How did you come up with the “planet ruled by love” idea?

Lauren: The Google team proposed creating a story in Tilt Brush as a totally new experience for the kids, and our immediate reaction was “what the heck is Tilt Brush?” But the idea had so much energy that it was an emphatic “Yes!” on our end. Leading up to the election in the U.S., we felt a division in the country, in our communities, even on school campuses. Someone suggested that we prompt the kids to write a story about a planet ruled by love, and we immediately went for it. It felt like an antidote to the division and drama around us.

How did the kids write the story?

Rebecca: We wanted to honor what already works at 826—helping kids express themselves through writing—and add a new layer. Students worked with volunteer tutors to develop, write and edit their own stories about the planet ruled by love. So many creative ideas came out of that! And then we worked with 826 staff to pull a line from each of the kids' stories—homes made of marshmallows, unicorn wolves, and love spread by nice words, to name a few—to make a version that represented all of their visions. From there, we turned the combined story into a 360-degree experience that they could watch in Cardboard.

Ryan, what was it like working in Tilt Brush?

Ryan: Prior to this project, I had been a 3D animator and illustrator working on screens and tablets. With Tilt Brush, you are creating in VR—it’s a cross between drawing and sculpture. When you first do it, you’re like, “OMG this is crazy. I’m inside the drawing.” After the students wrote the story, I drew rough storyboards and thumbnail sketches, then created the color palette of the planet ruled by love. I wanted viewers to feel like they had one foot in Google world and one foot in another world. Then, I moved into Tilt Brush and created the final scenes.

How has 826’s approach to incorporating technology changed? Will you incorporate VR and AR into storytelling projects in the future?

Lauren: Technology has helped us create a wider audience for students’ stories. For example, we’ve started a program for kids to make their own podcasts. We put them on SoundCloud and the links get tweeted and forwarded, and now thousands of people can hear these students’ voices. But in terms of VR, we’d love to keep exploring—we think of it as a 21st-century version of storytelling. VR allows viewers to experience a story in a way that builds greater empathy, context and understanding.

What aspect of the project are you most proud of?

Rebecca: The moment when the kids first put on their Cardboards and stepped into their imaginary world—it was a definite career high! Most of the kids and their parents were experiencing Cardboard for the first time. We watched as they were transported to a new world, and it was so sweet to see the kids recognize their own voices and contributions.

Lauren: I agree! I loved watching them reach out to touch the homes made of marshmallows and the families spending time together. It was magical.

Dance Tonite, an ever-changing VR collaboration by LCD Soundsystem and fans

Sometimes beautiful things happen when worlds collide. In 2002, LCD Soundsystem mashed together electronic dance music and punk rock—an unlikely pairing that brought fun and humanity to two genres that had moved away from their experimental beginnings. I’ve always admired the band’s combination of minimalism, honesty, and contagious energy—and today I’m pleased to introduce Dance Tonite, a VR collaboration celebrating LCD Soundsystem’s latest single, “Tonite.”


Dance Tonite takes an exuberant, unexpected approach to virtual reality. It’s a dance party. And it’s also a dance viewing party. In it, you go from room to room experiencing a series of dance performances created entirely by fans. All choreography was recorded using room-scale VR setups, which use headset and controller tracking to reflect your physical movements in your virtual environment. Instead of just mirroring your movements, we turn your room-scale VR kit into a DIY motion capture tool; if you have one, you can add your own moves to the party.

This video might help to explain.

Individual performers in Dance Tonite are represented by simple moving objects—just a cone and two cylinders. Even though they’re all represented by the same basic shapes, the experience captures the idiosyncrasies of each person's movements. The constraints encourage creativity and diversity, while the overall experience expands and changes with each new contribution.


Dance Tonite was designed to work across different devices. If you have Daydream View, you’re on stage watching the performance move around you.


If you happen to have a room-scale VR headset, go on and add your moves to the experience.


And if you don’t have a VR headset, not to worry. You can still watch the experience from a bird's-eye view, and click on any performer's head to see it from their perspective.


Dance Tonite uses WebVR, a new open standard that brings high-quality VR content to the web. That means that you can enter the experience through a single URL: no apps or downloads needed. As developers, we were able to create a scalable experience using web infrastructure and a single codebase.

Dance Tonite was directed by artists Jonathan Puckey and Moniker, in collaboration with the Data Arts Team, a specialized group within Google exploring the ongoing dialog between artists and emerging technologies. If you’d like to learn more about how this project was made, we’ve released the code as open source. You can also read about our process and learnings from using an experimental technology (WebVR) in a new medium (VR).

See you at Dance Tonite. Remember to dance like nobody’s watching.*

*Millions of people are watching.

Turn around, bright eyes… and experience the total solar eclipse with Google

Move over, blue moon—there’s a rarer astronomical event in town. For the first time since 1979, a total eclipse of the sun is coming to the continental United States this Monday, August 21. Starting on the west coast around 9 a.m., the moon will begin to block the face of the sun. Not long after, the moon will completely cover the sun, leaving only the bright corona visible for as long as two minutes and 40 seconds.

Whether you’re traveling to see the “totality,” catching a glimpse of the partial eclipse from another location, or simply curious, Google can help you learn more about this unique moment. Grab your solar glasses and peep what we’ve got in store:

Live from the solar eclipse

Even if you’re not in the path of the solar eclipse, you can tune in to YouTube to watch the magic unfold live as it crosses over the U.S. Catch livestreams from NASA, The Weather Channel, Exploratorium, Discovery's Science Channel, and Univision.

Sun, moon and Google Earth

With a new Voyager story in Google Earth, you can learn more about the science behind the eclipse. You can also see what it will look like where you live.

Futures made of virtual totality

If you’re not in the 70-mile-wide path of totality, fret not. Travel to Mt. Jefferson, OR in Google Earth VR (on Rift and Vive) and view it in virtual reality. From the menu, select Total Solar Eclipse to get a view from the center of the action.

Lights, camera, astronomical action

We’re working with UC Berkeley, other partners and volunteer photographers to capture images of the sun’s corona at the moment of totality for use in scientific research. We’re also using our technology to algorithmically align these images into the Eclipse Megamovie, a continuous view of the eclipse. Read about some of the people involved in this project, and stay tuned for the complete Megamovie soon after the eclipse on https://eclipsemega.movie.

It’s a bird, it’s a plane, it’s Android O!

People worldwide have explained solar eclipses through the lens of myth and legend for centuries. This year, there’s a new supernatural being whose identity will be revealed as the sun and the moon do their celestial dance. Get ready to meet Android O at android.com/o.

While a solar eclipse is a pretty rare astronomical event, don’t worry: it’s not too early to start planning for the next one passing over the United States on October 14, 2023. You can always set a Google Calendar reminder to make sure you don’t forget.

Source: Android


Daydream brings you inside Vogue Supermodel Closets

Everyone has items of clothing that hold sentimental value. For Kendall Jenner, it could be that pair of boots that Kanye got for her or the matching snuggies that the Jenner/Kardashian clan wore on Christmas morning. Supermodels, they’re just like us! (Minus the boots gifted by Kanye part).

In partnership with Condé Nast Entertainment and Vogue, we created a VR series to give you a peek into the closets of models and hear about the stories (and sentimental value) behind their favorite articles of clothing. “Supermodel Closets” was created to celebrate Vogue’s 125th anniversary and their upcoming September issue. In the first of five episodes, you’ll hear from Kendall Jenner and see the Christmas snuggies for yourself.

This is one of the first productions to use YI HALO cameras, the next generation of Jump cameras for high-quality, professional VR capture. You can look around (and even up!) thanks to the upward-facing camera and immersive 4K stereoscopic capture. Julina Tatlock, executive producer for 30 Ninjas, was able to use Jump easily, even in the tight spaces of each closet. Combined with unique graphics and post-production elements, this brings you even closer to the clothes.


If you’ve got Cardboard or Daydream View at home, check out the first episode of Supermodel Closet Secrets on Vogue’s YouTube channel, with more episodes available in the coming weeks. Stay tuned for more Daydream and Jump productions coming this fall.

Daydream Summer Sale

Take a break from the summer heat and jump into virtual reality. From today until August 17, you can grab some of our favorite Daydream apps for up to 60% off.

No matter where you are this summer, make your own adventure with two of our favorite adventure apps:  

Lola and the Giant: Embark on a journey full of puzzles and fantastic creatures, and download the companion app to play with a friend.


Along Together: Explore extraordinary worlds and use your Daydream controller to make new paths when there are none to follow.


Want to stay sharp over summer? Check out three of our top puzzle apps:

Mekorama: Help B, a little robot, find his way home by solving different puzzles.

Claro: Travel to a zen-like world where you can manipulate the sun to grow a tree in each of 38 different puzzles.


Keep Talking & Nobody Explodes: Race against the clock to defuse bombs with a friend.


Feeling competitive? Battle your enemies in beautiful, dynamic and otherworldly settings:

Battle Planet: You're alone on a micro-planet. Defend yourself against a gigantic army of enemies.

Wands: Challenge other Wielders with your own set of spells and skills.


Toy Clash: Your desk has been invaded! Use your toys and magic to defend your towers.


Feeling artsy? Grab a comfy chair and let your inner sculptor shine with SculptrVR. You can build incredible worlds in VR with an entire 3D canvas at your disposal.


Check out all the apps on sale on Google Play or in your Daydream app.