The world as you see it with VR180

Virtual reality helps creators bring their audiences to new, amazing, and even impossible-to-visit places. As a viewer, you get a whole new angle on shows, sports, and concerts you care about. You can walk around the Eiffel Tower, dive to the bottom of the Great Barrier Reef, or get a new perspective by meeting people face-to-face in a way that isn’t possible with a flat view of the world.

We know that virtual reality videos can be really powerful, which is why we have invested in supporting 360 and VR formats for over two years. And today, VR video is the most popular way to experience VR. But, we’ve heard from creators and viewers who want to make and see even more immersive videos on YouTube. So, we’ve been working with Google’s Daydream team on a brand new video format, called VR180, that we believe will make VR content even easier to create.

VR180 videos focus on what’s in front of you, are high resolution, and look great on desktop and on mobile. They transition seamlessly to a VR experience when viewed with Cardboard, Daydream, and PSVR, which allow you to view the images stereoscopically in 3-D, where near things look near, and far things appear far. VR180 also supports livestreaming videos so creators and fans can be together in real time.

For creators, you’ll be able to set up and film your videos the way you normally would with any other camera. And, soon, you’ll be able to edit using familiar tools like Adobe Premiere Pro. From vlogs to makeup tutorials to music videos, your videos will work great in VR.

But supporting the format is just the beginning. We want to make cameras that are easy to work with too. The Daydream team is working with several manufacturers to build cameras from the ground up for VR180. These cameras are not only great for creators looking to easily make VR content, but also for anyone who wants to capture life’s highlights in VR. They will be as easy to use as point-and-shoot cameras, for around the same price. Videos and livestreams will be easy to upload to YouTube. Cameras from YI, Lenovo, and LG are on the way, and the first ones will hit shelves this winter. For other manufacturers, we’re opening up a VR180 certification program, and Z CAM will be one of our first partners. Learn more and sign up for updates at vr.google.com/vr180. If you can’t wait to try these out, eligible creators can apply to borrow a VR180-enabled camera from one of our YouTube Spaces around the globe.

VR180 will unlock opportunities for anyone looking to easily make VR memories. We're just starting to scratch the surface of what is possible and look forward to seeing your new videos!

We wear culture: Discover why we wear what we wear with Google Arts & Culture

Are you wearing jeans today? Is there a floral tie or a black dress hanging in your wardrobe? Remember those platform shoes from the ‘90s? These have one thing in common: They all tell a story, sometimes spanning hundreds of years of history.

As the legendary Vogue editor-in-chief Diana Vreeland once said, “You can even see the approaching revolution in clothes. You can see and feel everything in clothes.” That’s one reason we’re excited to unveil “We wear culture,” a new project on Google Arts & Culture that brings you the stories behind the clothes you wear.

More than 180 museums, fashion institutions, schools, archives and other organizations from the fashion hubs of New York, London, Paris, Tokyo, São Paulo and elsewhere came together to put three millennia of fashion at your fingertips. You can browse 30,000 fashion pieces: try searching for hats and sorting them by color, or shoes by era. In 450+ exhibits, you can find stories from the ancient Silk Road to the ferocious fashion of British punk. Or meet icons and trendsetters like Coco Chanel, Cristóbal Balenciaga, Yves Saint Laurent or Vivienne Westwood.

We’ve also created virtual reality films that bring to life the stories of iconic pieces. Step inside the places where fashion history lives, on YouTube or with a virtual reality viewer.

There's more to clothes than meets the eye. See how shoemakers, jewellers, tie-dyers and bag-makers master their crafts through generations, turning design sketches and tailoring patterns into clothes you can wear. Zoom into ultra-high resolution images made with our Art Camera and see the craftsmanship in unprecedented detail, like this famous Schiaparelli evening coat, a surrealist drawing turned into a bold fashion statement. Step inside the world’s largest costume collection at the Metropolitan Museum of Art's Costume Institute Conservation Laboratory in 360 degrees, and see what it takes to preserve these objects for future generations. Explore the machinery that keeps one of the largest industries in the world in motion and meet the communities that are built on the production of textiles, like the Avani Society in India.

We also teamed up with YouTube star Ingrid Nilsen to go through the wardrobe and discover even more stories behind the clothes you wear today. Before you hide under your hoodie or put on a pair of ripped jeans, hop over to our YouTube channel to take a closer look at the historic thread running through today's fashions.

“We wear culture” is now live and online at g.co/wewearculture and through the Google Arts & Culture mobile app on iOS and Android. With this project, the world of fashion joins more than a thousand institutions of art and history that share their collections on Google Arts & Culture, letting you explore even more of our culture in one place. Click away and you’ll see how fashion is stitched into the fabric of our societies. And join in the conversation on social media with #WeWearCulture!

Daydream Labs: Locomotion in VR

Getting from Point A to Point B in real life is relatively straightforward, but in virtual reality, it can be difficult to build an experience where moving through a 3D environment feels natural. VR developers need to prevent motion sickness to keep people comfortable in VR, and the user experience for moving around—also known as locomotion—isn’t a solved problem.

There are a variety of ways to achieve effective locomotion, each with its own set of tradeoffs. Daydream Labs and teams across Google have explored ways to make locomotion comfortable, intuitive, and fun. We recently released Daydream Elements, a collection of tech demos that showcase principles and best practices for developing high-quality VR experiences. The core mechanics in each demo are built to be easily configurable and reusable for your own apps. Here are a few things we’ve learned about locomotion:

1. Constant velocity. Locomotion in VR can cause motion sickness when there’s a conflict between a person’s vision and their sense of balance. For example, if you see images showing you accelerating through space, like on a roller coaster, but you’re actually sitting stationary in a room, then your vision and vestibular system will disagree. A way to mitigate this is to use constant velocity during locomotion. Although acceleration can be used to produce more realistic transitions, constant velocity is far more comfortable than acceleration in VR.

While changing velocity is well received in mobile apps, constant velocity is far more comfortable in VR experiences.
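To make the idea concrete, here’s a minimal sketch of constant-velocity movement (the helper below is illustrative, not part of Daydream Elements): each frame, the camera advances by a fixed speed multiplied by the frame time, so the visual flow stays uniform from the first frame to the last, with no acceleration ramp for the vestibular system to disagree with.

```python
def step_constant_velocity(position, direction, speed, dt):
    """Advance `position` along a unit `direction` at a fixed speed."""
    return tuple(p + d * speed * dt for p, d in zip(position, direction))

# Moving forward at 2 m/s for one second, in 60 fps frames:
pos = (0.0, 0.0, 0.0)
forward = (0.0, 0.0, 1.0)
for _ in range(60):
    pos = step_constant_velocity(pos, forward, speed=2.0, dt=1 / 60)
# pos ends up roughly (0, 0, 2): distance covered is exactly speed * time,
# and every frame shows the same amount of optical flow.
```

Because the per-frame displacement never changes, there is no moment where the visuals report acceleration that the inner ear doesn’t feel.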

2. Tunneling. Tunneling is a technique used with first-person locomotion (such as walking) where, during movement, the camera’s field of view is cropped and a stable grid is displayed in your peripheral vision. This is analogous to watching first-person locomotion on a television set.

Even though TV shows and movies contain moving images with acceleration, most people don’t experience motion sickness while watching TV. This is perhaps because the TV only takes up a small part of your field of view and your peripheral vision is grounded by a stationary room. VR developers can simulate this by showing people a visual “tunnel” while they’re moving in a 3D environment. We also found it helps to fade the tunnel effect in and out to avoid making it a distraction. We used this approach in Google Earth VR in a feature called Comfort Mode.

Comfort Mode in Google Earth VR helps provide a constant frame of reference in your peripheral vision.

3. Teleportation. Teleportation is a locomotion technique for apps using first-person perspective that allows you to near-instantaneously move to a target location. This technique reduces the simulator sickness that many people feel when the virtual camera moves. However, it also makes it harder for people to maintain spatial context—“where am I, and how did I get here?” We found there are subtle things that can ease the transition and improve context. For example, Google Street View on Daydream fades before and after teleportation. Also, when you teleport to a new location, the app quickly moves the entire scene toward you to convey directional motion. This effect is called “implied motion.”

Displaying a fade or dissolve transition when teleporting from point to point creates implied motion in Google Street View on Daydream.

4. Rotation. It’s often tempting to design a VR experience where we assume that people will be either standing or sitting in a swivel chair. Unfortunately, hardware limitations or physical constraints may not allow for full 360-degree rotation. To make sure people can get where they want to go in a VR environment, consider giving them the ability to rotate themselves within the virtual space. Continuous and animated rotations tend to induce motion sickness. Instead, we’ve found that discrete, instantaneous rotations of about 10-20 degrees feel comfortable and provide sufficient visual context to keep people oriented.
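A discrete snap turn takes only a few lines to sketch (the 15-degree increment is an illustrative choice inside that 10-20 degree range):

```python
SNAP_DEGREES = 15.0  # illustrative value within the comfortable range

def snap_rotate(yaw_degrees, direction):
    """Turn instantly by one increment; direction is +1 (right) or -1 (left).

    No intermediate frames are rendered, so there is no continuous rotation
    to conflict with the vestibular sense.
    """
    return (yaw_degrees + direction * SNAP_DEGREES) % 360.0
```

Snapping right twice from straight ahead lands at 30 degrees, and a full circle takes 24 snaps, so the player always stays oriented in predictable steps.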

We hope this helps give you a few ways to think about locomotion in VR, and we’ll share more with the community as we continue to explore.

More on Daydream, Tango, and Developer tools for VR and AR

This morning at Google I/O, we went into more detail about the investments we’re making in the core technologies that enable VR and AR, and in platforms that make them accessible to more people. 

Tango 

Tango enables devices to track motion and understand depth and space, and it’s a fundamental enabling technology for both virtual and augmented reality. WorldSense, the positional tracking technology which makes our new Daydream standalone VR headsets work without any external sensors, is derived from Tango. 

Tango also enables smartphone AR. With it, devices can provide indoor directions and place digital objects in the space around us. You can see what furniture looks like in your bedroom before you buy it, build interactive worlds in your living room, or summon dinosaurs into your kitchen to learn more about them. And with Expeditions AR, students can have a shared experience of digital objects, like the rings of Saturn or an erupting volcano, right in the classroom.

Daydream 

Daydream is our platform for mobile VR. There are lots of Daydream-ready phones already available, with more to come this year—including Samsung’s Galaxy S8 and S8+, and devices from LG, Motorola and ASUS. Standalone headsets, a new category of devices built by our partners, are also coming to Daydream later this year. They’re easy to use, and the form factor enables partners to optimize things like sensors and displays for VR. And with more than 150 apps, there’s lots to explore, watch and do in VR—regardless of which Daydream-ready device you choose.

The upcoming 2.0 release for all headsets, Daydream Euphrates, has features that make VR more fun and easier to share with others. You’ll be able to capture what you’re seeing, as well as cast your virtual world right onto the screen in your living room. And, soon, you’ll be able to watch YouTube videos in VR with other people and share the experience in the same virtual space.

Developers and the Web

In order for immersive computing to be “for everyone,” developers need to build great apps and experiences. We’re working on tools and tech to help them do that. 

First, with Instant Preview, developers can make changes on a computer and see them reflected on a headset in seconds, not minutes, making development much faster for VR. 

Second, a new technology we’ve developed called Seurat – named after the great French painter – makes it possible to render high-fidelity scenes on mobile VR headsets in real time. It uses some clever tricks to help you achieve desktop-level graphics or better with a mobile GPU. Seurat enabled ILMxLAB, the branch of Lucasfilm focused on pioneering next generation immersive experiences, to bring the cinema-quality world of Rogue One to a mobile VR headset. We’ll have more to share on Seurat later this year, so stay tuned.

Seurat: Bringing high-fidelity rendering to mobile VR

Finally, our investments in the web mean that developers can distribute their creations to anyone, regardless of device—whether it’s desktops, phones, or VR- and AR-enabled devices. We were an early supporter of and contributor to WebVR standards. Chrome VR, which will make it possible to browse the web in virtual reality, is coming to Daydream this summer. And we’re excited to support AR for the web, too. We’re releasing an experimental build of Chromium with an AR API that you can try out now.

These are just the first steps, but we’re excited about where this leads. Be sure to catch our other I/O sessions on VR and AR to learn more.
