Experience Virtual Reality on the web with Chrome

Virtual reality (VR) lets you tour the Turkish palace featured in “Die Another Day,” learn about life in a Syrian refugee camp firsthand, and walk through your dream home right from your living room. With the latest version of Chrome, we’re bringing VR to the web—making it as easy to step inside Air Force One as it is to access your favorite webpage.

For a fully immersive experience, use Chrome with your Daydream-ready phone and Daydream View—just browse to a VR experience you want to view, choose to enter VR, and put the phone in your Daydream View headset. If you don’t have a headset you can view VR content on any phone or desktop computer and interact using your finger or mouse.

You can already try out some great VR-enabled sites, with more coming soon. For example, explore the intersection of humans, nature and technology in the interactive documentary Bear 71. Questioning how we see the world through the lens of technology, this story blurs the lines between the wild world and the wired one.

Bear 71: The intersection between humans, animals and technology.

Tour Matterport’s library of 300,000+ celebrity homes, museums, canyons, iconic architecture and other real places.

Matterport VR: The largest library of real world places in VR

Watch more than two dozen award-winning VR films with Within—from gripping tales set in worlds of pure imagination to documentaries taking you further inside the news than ever before.

Within: Extraordinary stories in virtual reality

Discover more than a million stunning 3D scenes in VR with Sketchfab, from your favorite anime and video game characters to famous works of art. Join the community and contribute your own creations, or just enjoy and share your favorites.

Sketchfab VR: enter new dimensions

Experiment and play in the WebVR Lab from PlayCanvas. Try teleporting around the space or playing a record with your Daydream controller.

Explore the WebVR Lab from PlayCanvas

We want to bring VR to everyone on any device, and in the coming months we’ll add support for more headsets, including Google Cardboard. Try out these VR-enabled sites to be one of the first to experience the magic of VR on the web.

Source: Google Chrome


Get ready for Google Developer Day at GDC 2017

Posted by Noah Falstein, Chief Game Designer at Google

The Game Developers Conference (GDC) kicks off on Monday, February 27th with our annual Google Developer Day. Join us as we demonstrate how new devices, platforms, and tools are helping developers build successful businesses and push the limits of mobile gaming on Android.

Expect exciting announcements, best practices, and tips covering a variety of topics including Google Play, Daydream VR, Firebase, Cloud Platform, machine learning, monetization, and more. In the afternoon, we'll host panels to hear from developers first-hand about their experiences launching mobile games, building thriving communities, and navigating the successes and challenges of being an indie developer.

Visit our site for more info and the Google Developer Day schedule. These events are part of the official Game Developers Conference, so you will need a pass to attend. For those who can't make it in person, watch the live stream on YouTube starting at 10am PST on Monday, February 27th.



Introducing Draco: compression for 3D graphics

3D graphics are a fundamental part of many applications, including gaming, design and data visualization. As graphics processors and creation tools continue to improve, larger and more complex 3D models will become commonplace and help fuel new applications in immersive virtual reality (VR) and augmented reality (AR). This increased model complexity forces storage and bandwidth requirements to keep pace with the explosion of 3D data.

The Chrome Media team has created Draco, an open source compression library to improve the storage and transmission of 3D graphics. Draco can be used to compress meshes and point-cloud data. It also supports compressing points, connectivity information, texture coordinates, color information, normals and any other generic attributes associated with geometry.

With Draco, applications using 3D graphics can be significantly smaller without compromising visual fidelity. For users this means apps can now be downloaded faster, 3D graphics in the browser can load quicker, and VR and AR scenes can now be transmitted with a fraction of the bandwidth, rendered quickly and look fantastic.
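Much of that size reduction comes from quantizing geometry attributes before entropy coding (the benchmark footnote in this post mentions positions at 14-bit precision). As a rough illustration of the idea only, here is a minimal Python sketch of uniform scalar quantization; it is not Draco's actual code, which is a C++ library that also entropy-codes the resulting integers:

```python
def quantize(values, bits, lo, hi):
    """Map floats in [lo, hi] onto integer codes of the given bit depth."""
    levels = (1 << bits) - 1
    scale = levels / (hi - lo)
    return [round((v - lo) * scale) for v in values]

def dequantize(codes, bits, lo, hi):
    """Recover approximate floats from the integer codes."""
    levels = (1 << bits) - 1
    step = (hi - lo) / levels
    return [lo + c * step for c in codes]

# Vertex x-coordinates: each 32-bit float becomes a 14-bit code (before
# any entropy coding), with a worst-case error of half a quantization step.
positions = [0.0, 0.125, 0.5, 0.999]
codes = quantize(positions, 14, 0.0, 1.0)
recovered = dequantize(codes, 14, 0.0, 1.0)
max_error = max(abs(a - b) for a, b in zip(positions, recovered))
```

Fewer bits per attribute means smaller files at the cost of a bounded approximation error, which is why the choice of precision is left to the application.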


Sample Draco compression ratios and encode/decode performance*

Transmitting 3D graphics for web-based applications is significantly faster using Draco's JavaScript decoder, which can be tied to a 3D web viewer. The following video shows how efficiently 3D objects can be transmitted and decoded in the browser, even over poor network connections.



Video and audio compression have shaped the internet over the past 10 years with streaming video and music on demand. With the emergence of VR and AR, on the web and on mobile (and the increasing proliferation of sensors like LIDAR) we will soon be swimming in a sea of geometric data. Compression technologies, like Draco, will play a critical role in ensuring these experiences are fast and accessible to anyone with an internet connection. More exciting developments are in store for Draco, including support for creating multiple levels of detail from a single model to further improve the speed of loading meshes.

We look forward to seeing what people do with Draco now that it's open source. Check out the code on GitHub and let us know what you think. Also available is a JavaScript decoder with examples on how to incorporate Draco into the three.js 3D viewer.

By Jamieson Brettle and Frank Galligan, Chrome Media Team

* Specifications: Tests ran with textures and positions quantized at 14-bit precision and normal vectors at 7-bit precision, on a single core of a 2013 MacBook Pro. JavaScript decoding used Chrome 54 on Mac OS X.

ETC2Comp: fast texture compression for games and VR

For mobile game and VR developers, the ETC2 texture format has become an increasingly valuable tool for texture compression. It produces good on-GPU sizes (it stays compressed in memory) and higher-quality textures than its ETC1 counterpart.

These benefits come with a significant downside, however: ETC2 textures take significantly longer to compress than their ETC1 counterparts. As adoption of the ETC2 format increases in a project, so do build times. As such, developers have had to make the classic choice between quality and time.

We wanted to eliminate the need for developers to make that choice, so we’ve released ETC2Comp, a fast and high quality ETC2 encoder for games and VR developers.

ETC2 takes a long time to compress textures because the format defines a large number of possible combinations for encoding a block in the texture. Finding the highest-quality compressed image means brute-forcing this incredibly large set of combinations, which is clearly not time-efficient.

We designed ETC2Comp to get the same visual results at much faster speeds by deploying a few optimization techniques:

Directed Block Search. Rather than a brute-force search, ETC2Comp uses a much more limited, targeted search for the best encoding for a given block. ETC2Comp comes with a precomputed set of archetype blocks, where each archetype is associated with a sorted list of the ETC2 block format types that provide its best encodings. During the actual compression of a texture, each block is initially assigned an archetype, and multiple passes are done to test the block against its block format list to find the best encoding. As a result, the best option can be found much quicker than with a brute-force method.
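The shape of a directed search can be sketched in a few lines of Python. Everything here is invented for illustration: the archetype names, the per-archetype format lists, the classifier, and the error metric; the real encoder searches actual ETC2 block modes with a far richer quality measure:

```python
# Hypothetical archetypes, each with a pre-sorted list of block formats
# that tend to encode it best (names are illustrative, not real ETC2 modes).
ARCHETYPE_FORMATS = {
    "flat":     ["differential", "individual"],   # smooth blocks
    "gradient": ["planar", "differential", "T"],  # slow ramps
    "edge":     ["T", "H", "individual"],         # sharp features
}

def classify(block):
    """Assign a block (here: a flat list of gray values) to an archetype
    using a stand-in heuristic based on its value spread."""
    spread = max(block) - min(block)
    if spread < 8:
        return "flat"
    if spread < 64:
        return "gradient"
    return "edge"

def directed_search(block, error_fn):
    """Try only the formats pre-sorted for the block's archetype instead
    of brute-forcing every possible mode, and keep the best one found."""
    formats = ARCHETYPE_FORMATS[classify(block)]
    return min(formats, key=lambda fmt: error_fn(block, fmt))

def toy_error(block, fmt):
    """Stand-in quality metric so the sketch runs end to end."""
    return {"individual": 5, "differential": 3, "planar": 1, "T": 2, "H": 4}[fmt]

smooth = [100] * 16                 # classified as "flat"
sharp = list(range(0, 160, 10))     # classified as "edge"
best_smooth = directed_search(smooth, toy_error)
best_sharp = directed_search(sharp, toy_error)
```

The win comes from testing a handful of likely formats per block rather than the full combinatorial space.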

Full effort setting. During each pass of the encoding process, all the blocks of the image are sorted by their visual quality (worst-looking to best-looking). ETC2Comp takes an effort parameter whose value specifies what percentage of the blocks to update during each pass of encoding. An effort value of 25, for instance, means that on each pass, only the 25% worst looking blocks are tested against the next format in their archetypes' format-chains. The result is a tradeoff between optimizing blocks that already look good, and the time it takes to do it.
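The selection step of that effort pass can be sketched as follows; the error values are invented, and the actual re-encoding of each selected block is omitted:

```python
def worst_blocks(errors, effort_percent):
    """Pick the indices of the worst-looking blocks to re-encode this
    pass: an effort of 25 means only the 25% highest-error blocks are
    tested against the next format in their archetype's chain."""
    n = max(1, len(errors) * effort_percent // 100)
    ranked = sorted(range(len(errors)), key=lambda i: errors[i], reverse=True)
    return ranked[:n]

# Per-block visual error after the previous pass (invented numbers).
errors = [0.9, 0.1, 0.5, 0.7, 0.2, 0.3, 0.8, 0.4]
to_update = worst_blocks(errors, 25)   # the 25% worst: blocks 0 and 6
```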

Highly multi-threaded code. Since blocks can be evaluated independently during each pass, it's straightforward to parallelize the work. During encoding, ETC2Comp can take advantage of available parallel threads, and it even accepts a jobs parameter that lets you define exactly how many threads you'd like it to use... in case you have a 256-core machine.
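The blocks-are-independent parallelism can be sketched in Python (ETC2Comp itself is a C++ tool; the jobs parameter here just mirrors the idea, and the per-block "encoder" is a stand-in):

```python
from concurrent.futures import ThreadPoolExecutor

def encode_pass(blocks, encode_block, jobs=4):
    """Encode each block on a pool of `jobs` worker threads; blocks are
    independent, so they can be processed in parallel in any order
    while results come back in input order."""
    with ThreadPoolExecutor(max_workers=jobs) as pool:
        return list(pool.map(encode_block, blocks))

# Stand-in per-block "encoder" (summing pixel values) so the sketch runs.
encoded = encode_pass([[1, 2], [3, 4], [5, 6]], sum, jobs=2)
```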

Check out the code on GitHub to get started with ETC2Comp and let us know what you think. You can use the tool from the command line or embed the C++ library in your project. If you want to know more about what’s going on under the hood, check out this blog post.

By Colt McAnlis, Developer Advocate

Learn by doing with the Udacity VR Developer Nanodegree

Posted by Nathan Martz, Product Manager, Google VR

With Google Cardboard and Daydream, our Google VR team is working to bring virtual reality to everyone. In addition to making VR more accessible by using the smartphone in your pocket, we recently launched the Google VR SDK out of beta, with native integration for Unity and UE4, to help make it easier for more developers to join the fold.

To further support and encourage new developers to build VR experiences, we’ve partnered with Udacity to create the VR Developer Nanodegree. Students will learn how to create 3D environments, define behaviors, and make VR experiences comfortable, immersive, and performant.


Even with more than 50 million installs of Google Cardboard apps on Google Play, these are still the early days of VR. Students who complete the VR Developer Nanodegree learn by doing, and will graduate having completed a portfolio of VR experiences.

Learn more and sign up to receive VR Developer Nanodegree program updates at https://www.udacity.com/vr.

Google VR SDK graduates out of beta

Posted by Nathan Martz, Product Manager, Google VR

At Google I/O, we announced Daydream—Google's platform for high quality, mobile virtual reality—and released early developer resources to get the community started with building for Daydream. Since then, the team has been hard at work, listening to feedback and evolving these resources into a suite of powerful developer tools.

Today, we are proud to announce that the Google VR SDK 1.0 with support for Daydream has graduated out of beta, and is now available on the Daydream developer site. Our updated SDK simplifies common VR development tasks so you can focus on building immersive, interactive mobile VR applications for Daydream-ready phones and headsets, and supports integrated asynchronous reprojection, high fidelity spatialized audio, and interactions using the Daydream controller.

To make it even easier to start developing with the Google VR SDK 1.0, we’ve partnered with Unity and Unreal so you can use the game engines and tools you’re already familiar with. We’ve also updated the site with full documentation, reference sample apps, and tutorials.

Native Unity integration

This release marks the debut of native Daydream integration in Unity, which enables Daydream developers to take full advantage of all of Unity’s optimizations in VR rendering. It also adds support for features like head tracking, deep linking, and easy Android manifest configuration. Many Daydream launch apps are already working with the newest integration features, and you can now download the new Unity binary here and the Daydream plugin here.

Native UE4 integration

We’ve made significant improvements to our UE4 native integration that will help developers build better production-quality Daydream apps. The latest version introduces Daydream controller support in the editor, a neck model, new rendering optimizations, and much more. UE4 developers can download the source here.

Get started today

While the first Daydream-ready phones and headset are coming this fall, you can start developing high-quality Daydream apps right now with the Google VR SDK 1.0 and the DIY developer kit.

We’re also opening applications to our Daydream Access Program (DAP) so we can work closely with even more developers building great content for Daydream. Submit your Daydream app proposal to apply to be part of our DAP.

When you create content for the Daydream platform, you know your apps will work seamlessly across every Daydream-ready phone and headset. Daydream is just getting started, and we’re looking forward to working together to help you build new immersive, interactive VR experiences. Stay tuned for more information about Daydream-ready phones and the Daydream headset and controller coming soon.

Daydream Labs: positive social experiences in VR

Posted by Robbie Tilton, UX Designer, Google VR

At Daydream Labs, we have experimented with social interactions in VR. Just like in real reality, people naturally want to share and connect with others in VR. As developers and designers, we are excited to build social experiences that are fun and easy to use—but it’s just as important to make it safe and comfortable for all involved. Over the last year, we’ve learned a few ways to nudge people towards positive social experiences.

What can happen without clear social norms

People are curious and will test the limits of your VR experience. For example, when some people join a multiplayer app or game, they might wonder if they can reach their hand through another player’s head or stand inside another avatar’s body. Even with good intentions, this can make other people feel unsafe or uncomfortable.

For example, in a shopping experiment we built for the HTC Vive, two people could enter a virtual storefront and try on different hats, sunglasses, and accessories. There was no limit to how or where they could place a virtual accessory, so some people stuck hats on friends anywhere they would stick—like in front of their eyes. This had the unfortunate effect of blocking their vision. If they couldn’t remove the hat in front of their eyes with their controllers, they had no other recourse than to take off their headset and end their VR experience.


Protecting user safety

Everyone should feel safe and comfortable in VR. If we can anticipate the actions of others, then we may be able to discourage negative social behavior before it starts. For example, by designing personal space around each user, you can prevent other people from invading that personal space.

We built an experiment around playing poker where we tried new ways to discourage trolling. If someone left their seat at the poker table, their environment desaturated to black and white and their avatar disappeared from the other players' view. A glowing blue personal space bubble would guide the person back to their seat. We found this was enough to prevent a player from approaching their opponents to steal chips or invade personal space.
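Reduced to its essentials, a personal-space check like the one in the poker experiment might look like this sketch; the bubble radius, positions, and function names are all invented for illustration:

```python
import math

def in_bubble(a, b, radius=0.8):
    """True when positions a and b (x, y, z in meters) are closer than
    the personal-space radius; 0.8 m is an invented value."""
    return math.dist(a, b) < radius

def should_hide_avatar(player, others, radius=0.8):
    """The experiment's policy reduced to a boolean: hide a player's
    avatar (and desaturate their view) while they intrude on anyone
    else's personal space bubble."""
    return any(in_bubble(player, o, radius) for o in others)

opponent = (0.0, 0.0, 0.0)
intruding = should_hide_avatar((0.3, 0.0, 0.0), [opponent])  # too close
seated = should_hide_avatar((1.5, 0.0, 0.0), [opponent])     # comfortable
```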


Reward positive behavior

If you want people to interact in positive ways—like high-fiving—try giving them an incentive. In one experiment, we detected when two different avatars “touched” each other’s hands at high speed. This triggered a loud slapping sound and a fireworks animation. It sounds simple, but people loved it. Meanwhile, if you tried to do something more aggressive, like punching an avatar’s body, nothing would happen. You can guess which behavior people naturally preferred.
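The reward trigger can be sketched as a simple predicate on hand distance and speed; both thresholds below are invented values, not the experiment's actual tuning:

```python
def high_five_triggered(hand_distance, closing_speed,
                        touch_radius=0.12, min_speed=2.0):
    """Fire the reward (slap sound + fireworks) only when the avatars'
    hands are touching AND moving toward each other fast; slow or
    incidental contact does nothing."""
    return hand_distance <= touch_radius and closing_speed >= min_speed

fast_slap = high_five_triggered(hand_distance=0.05, closing_speed=3.1)
gentle_touch = high_five_triggered(hand_distance=0.05, closing_speed=0.2)
```

Gating the reward on speed as well as proximity is what separates a deliberate high-five from hands merely drifting past each other.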


Explore India and the world, with Google Arts & Culture

Just as the world’s precious artworks and monuments need a touch-up to look their best, the home we’ve built to host the world’s cultural treasures online needs a lick of paint every now and then. We’re ready to pull off the dust sheets and introduce the new Google Arts & Culture website and app, by the Google Cultural Institute.


Our new tools will help you discover works and artifacts, allowing you to become immersed in cultural experiences across art, history and wonders from India and the world, drawn from more than a thousand museums across 70 countries.
With a virtual reality viewer like Google Cardboard, you can use the Google Arts & Culture app on iOS and Android to take a virtual tour of the Sanskriti Museum, which was established in 1978 and is home to one of the largest collections of Indian arts and crafts. You can also subscribe to the new Google Arts & Culture YouTube channel, where you’ll find original content dedicated to culture, hosted by YouTubers.

We’re sure you’ll want to see some of the artworks in real life too—and the Google Arts & Culture app is there to help. Click “Visit” on a museum’s page to get opening times, find out what’s on that day and navigate there in one click. We’ve also been experimenting with a new feature. The Art Recognizer is now available in London’s Dulwich Picture Gallery, Sydney’s Art Gallery of New South Wales and the National Gallery of Art in Washington DC. Just pull up the app, point your phone’s camera to a painting on display and find all the information you want to know about the artwork. We’re planning to roll this out to museums around the world and in India—so stay tuned.


Posted by Duncan Osborn, Product Manager, Google Cultural Institute

Daydream Labs: animating 3D objects in VR

Posted by Rob Jagnow, Software Engineer, Google VR

Whether you're playing a game or watching a video, VR lets you step inside a new world and become the hero of a story. But what if you want to tell a story of your own?

Producing immersive 3D animation can be difficult and expensive. It requires complex software to set keyframes with splined interpolation or costly motion capture setups to track how live actors move through a scene. Professional animators spend considerable effort to create sequences that look expressive and natural.

At Daydream Labs, we've been experimenting with ways to reduce technical complexity and even add a greater sense of play when animating in VR. In one experiment we built, people could bring characters to life by picking up toys, moving them through space and time, and then replay the scene.
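Record-and-replay animation of this kind reduces to sampling a toy's transform while it is held, then interpolating between samples on playback. A minimal sketch of that idea in Python (positions only, ignoring rotation; the function names are illustrative):

```python
import bisect

def record(track, t, position):
    """Append a (time, position) sample as the toy is carried through the scene."""
    track.append((t, position))

def sample(track, t):
    """Replay: linearly interpolate the recorded positions at time t."""
    times = [s[0] for s in track]
    i = bisect.bisect_right(times, t)
    if i == 0:
        return track[0][1]      # before the first sample: hold it
    if i == len(track):
        return track[-1][1]     # after the last sample: hold it
    (t0, p0), (t1, p1) = track[i - 1], track[i]
    u = (t - t0) / (t1 - t0)
    return tuple(a + u * (b - a) for a, b in zip(p0, p1))

track = []
record(track, 0.0, (0.0, 0.0, 0.0))   # pick the toy up at the origin
record(track, 1.0, (1.0, 0.0, 0.0))   # carry it one meter along x
midpoint = sample(track, 0.5)
```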


As we saw people play with the animation experiment we built, we noticed a few things:

The need for complex metaphors goes away in VR: What can be complicated in 2D can be made intuitive in 3D. Instead of animating with graph editors or icons representing location, people could simply reach out, grab a virtual toy, and carry it through the scene. These simple animations had a handmade charm that conveyed a surprising degree of emotion.

The learning curve drops to zero: People were already familiar with how to interact with real toys, so they jumped right in and got started telling their stories. They didn't need a lengthy tutorial, and they were able to modify their animations and even add new characters without any additional help.

People react to virtual environments the same way they react to real ones: When people entered a playful VR environment, they understood it was a safe space to play with the toys around them. They felt comfortable performing and speaking in funny voices. They took more risks knowing the virtual environment was designed for play.

To create more intricate animations, we also built another experiment that let people independently animate the joints of a single character. It let you record your character’s movement as you separately animated the feet, hands, and head — just like you would with a puppet.
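Layered puppeteering like this can be modeled as one recorded track per joint, each captured in its own pass and evaluated together at playback time. A minimal sketch with held-value (nearest-sample) playback; the joint names and sample data are invented:

```python
def sample_nearest(track, t):
    """Held-value playback: use the recorded sample closest in time."""
    return min(track, key=lambda s: abs(s[0] - t))[1]

def combine_tracks(tracks, t):
    """Merge independently recorded joint tracks into a single pose at
    time t; each joint (head, hands, feet) was animated in its own pass."""
    return {joint: sample_nearest(track, t) for joint, track in tracks.items()}

tracks = {
    "head": [(0.0, (0.0, 1.7, 0.0)), (1.0, (0.0, 1.6, 0.0))],
    "hand": [(0.0, (0.3, 1.0, 0.0)), (1.0, (0.3, 1.4, 0.0))],
}
pose = combine_tracks(tracks, 0.9)
```

Because each track is independent, re-recording one joint leaves the others untouched, which is what makes the pass-by-pass workflow feel like puppetry.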


VR allows us to rethink software and make certain use cases more natural and intuitive. While this kind of animation system won’t replace professional tools, it can allow anyone to tell their own stories. There are many examples of using VR for storytelling, especially with video and animation, and we’re excited to see new perspectives as more creators share their stories in VR.