
Chromebook tablets for versatile learning

This past January, students in Kristine Kuwano's and Bonnie Chow's third grade classrooms at De Vargas Elementary School in Cupertino, CA were buzzing with excitement. Tasked with writing out math equations to upload to Google Classroom, the students grabbed their new tablets from the cart, pulled out the stylus, and logged into Chrome. “They love technology and they have grown up working with touch devices, so tablets are intuitive for them,” said Kuwano.

Since their debut, schools have chosen Chromebooks because they are fast, easy to use and manage, shareable, secure, and affordable. We've listened carefully to feedback from educators around the world, and one common theme is that they want all the benefits of Chromebooks in tablet form.

Starting today, with the new Acer Chromebook Tab 10, we're doing just that. It's the first education tablet made for Chrome OS, and it gives schools the easy management and shareability of Chromebook laptops. With touch and stylus functionality, this lightweight device is perfect for students creating multimedia projects, and it also opens up a world of immersive experiences with Google Expeditions AR.

The new Acer Chromebook Tab 10 is easy to pass around the room from student to student.

Shareable, secure, and easy to manage

Whether overseeing 100 or 100,000 devices, IT admins can manage these new Chromebook tablets alongside other Chrome devices with the Chrome Education license. This lets students access everything they need to learn, while giving admins control from a single, scalable console.

Because Chrome OS lets students securely share devices, Chromebook tablets are perfect for computer carts. Just like Chromebook laptops, students can quickly and securely log on to any device for a personalized learning experience and just as easily log out from all apps when class is over. Verified boot checks security at every boot and all user data is encrypted, making each Chromebook tablet secure and shareable.

“What’s awesome is we can manage these new Chromebook tablets like we manage our existing Chromebook laptops—all on one platform. We don’t have to move between different interfaces. I manage my Chromebooks here, my tablets here, all as one big fleet.”
Mark Loundy, Instructional Technology Specialist, De Vargas Elementary School

Think outside the desk(top): touch, stylus and Expeditions

These new Chromebook tablets are lightweight and durable, allowing students to collaborate, create and learn from anywhere. Each one includes a low-cost Chromebook stylus that doesn’t require charging or pairing, and the stylus uses machine learning to predict students’ writing for a natural writing experience.

A De Vargas Elementary School student upgrades from a No. 2 pencil to the wireless stylus for the Acer Chromebook Tab 10.

Coming soon, teachers can take students on Google Expeditions to the Great Barrier Reef, the Colosseum, and even to the International Space Station—all from the screens of their Chrome devices. And with Expeditions AR, students will be able to stare into the eye of a miniature Category 5 hurricane or get up close with a strand of DNA.

Apps for every subject

Learning apps come to life in new ways when students have the flexibility of touchscreens, styluses and tablets. Student scientists can collect field notes in Science Journal and aspiring podcast producers can record and edit stories on the go with Soundtrap. Here are a few more apps that educators love to use with tablets: 

  • Get hands-on with handwriting: Students can use their stylus to jot down notes in Google Keep without the hassle of keeping track of (and losing) paper. In Squid, students can write directly on PDFs and on “paper” types like blank, wide-ruled, and grid. With the annotation feature in Google Classroom, teachers can illustrate complex concepts and give visual feedback, as well as assign PDF worksheets that students can annotate by hand.
  • Use your tablet in every class: For educators, creative apps like Adobe Illustrator Draw turn the classroom into a design studio, and let students and teachers draw and create vector designs. Teaching math or science? Apps like Texthelp EquatIO let students show their work by hand writing any math expression and adding it to a Google Doc in one click. Coding apps like Scratch Jr introduce younger students to the foundations of computational thinking, while enabling them to be creators.
  • Bring ideas to life: Amplify storytelling and allow students to animate their thinking on an infinitely interactive and collaborative whiteboard with Explain Everything. Book Creator lets students create and publish multimedia books, and WeVideo turns the classroom into a movie studio with features like collaborative editing and green screen. 

The Acer Chromebook Tab 10 comes with support for these and hundreds of other learning applications from our ever-growing catalog of apps in the Play Store. See a sample of other learning apps on Google Play.

No one knows what’s needed in the classroom better than teachers. As we continue to grow the Chromebook family, we encourage educators and parents to try out new devices and apps, and let us know what you think. The Acer Chromebook Tab 10 will be on sale through education resellers this spring—check with your local reseller for more information.


Early explorations with ARCore 1.0

We recently launched ARCore 1.0 to give developers the ability to build powerful augmented reality apps that make your phone’s camera smarter. It works on over 100 million Android devices across more than a dozen device models, so now more people can use AR to interact with the world in inspiring new ways.
While it’s only been a few weeks since launch, developers are already publishing new ARCore experiences on Google Play, across gaming, shopping and home, and creativity.

Gaming

For gaming, AR weaves the action right into the world around you, making the experience more immersive and unlocking a whole new way to play. Here are new titles built with ARCore:


My Tamagotchi Forever

BANDAI NAMCO has released “My Tamagotchi Forever,” an experience in which players can raise Tamagotchi characters while building Tamatown, a virtual town you can play with in the real world.

The Walking Dead: Our World

Immerse yourself in the zombie apocalypse! Your mission, should you choose to accept it, is to defend your surroundings by fighting zombies in real-world environments. The Walking Dead: Our World is a great example of how to use the Google Maps APIs and ARCore together to build a location-based AR game. It’s currently in pre-registration on Google Play, with a broader release planned soon.

TendAR

Tender Claws created TendAR, a game that features Guppy, a virtual fish that responds to users’ facial expressions and survives by “eating” other people’s emotions. The game was created by combining ARCore with Google Cloud APIs, which provide computer vision and object recognition. You can read more about how they created the experience in this case study. TendAR will be available to download starting in July 2018.

Shopping & Home

Augmented reality can bring anything into your space, which helps when you’re trying to understand the size and scale of things before you buy or ship them. Here are a few experiences built by our retail partners to aid you in making smarter decisions:


Pottery Barn 360 Room View

With Pottery Barn’s AR app, you can view furniture in your room to see how it pairs with your existing pieces, change the color and fabric of the furniture before deciding which looks best, and purchase what you’ve picked out directly from the app.


eBay
eBay is using AR to solve a specific challenge facing their community of sellers: what size shipping container is needed to send that product? With the “Which Box” feature in eBay’s app, sellers can visualize shipping boxes to determine which container size they need to send any product.

Curate by Sotheby's International Realty, Streem
If you’re shopping for a new home or need help maintaining yours, AR can also come in handy. With ARCore, Sotheby’s International Realty is changing the way people stage furniture in the real estate world, and the Streem app connects customers with professionals to solve household maintenance requests.

Creativity

Over the last few months, we’ve been tinkering with experiments that show how AR can be used as a new creative medium for self-expression. We’ve worked with creators across different disciplines to explore what happens when AR is used by illustrators, choreographers, animators and more.

Now, we’re inviting more people to experiment with this technology through an app that lets you make simple drawings in AR, and then share your creation with a short video. The caveat: it’s “Just a Line.”

Make simple drawings in AR with Just a Line

We’re open sourcing the core code of the app so developers can use it as a starting point for their own ARCore projects, and we’re excited to see what people create with Just a Line. Download it on Google Play.

Anyone with an ARCore-enabled phone can jump into most of these experiences from the Play Store right now, and developers can get started building their own apps today.

Open sourcing Resonance Audio

Spatial audio adds to your sense of presence when you’re in VR or AR, making it feel and sound like you’re surrounded by a virtual or augmented world. And regardless of the display hardware you’re using, spatial audio makes it possible to hear sounds coming from all around you.

Resonance Audio, our spatial audio SDK launched last year, enables developers to create more realistic VR and AR experiences on mobile and desktop. We’ve seen a number of exciting experiences built with the SDK across a variety of platforms, including Pixar’s Coco VR for Gear VR, Disney’s Star Wars™: Jedi Challenges AR app for Android and iOS, and Runaway’s Flutter VR for Daydream.

To accelerate adoption of immersive audio technology and strengthen the developer community around it, we’re opening Resonance Audio to a community-driven development model. By creating an open source spatial audio project optimized for mobile and desktop computing, any platform or software development tool provider can easily integrate with Resonance Audio. More cross-platform and tooling support means more distribution opportunities for content creators, without the worry of investing in costly porting projects.

What’s included in the open source project

As part of our open source project, we’re providing a reference implementation of YouTube’s Ambisonic-based spatial audio decoder, compatible with the same Ambisonics format (AmbiX ACN/SN3D) used by others in the industry. Using our reference implementation, developers can easily render Ambisonic content in their VR media and other applications, while benefiting from Ambisonics’ open, royalty-free model. The project also includes encoding, sound field manipulation and decoding techniques, as well as head-related transfer functions (HRTFs) that we’ve used to achieve rich spatial audio that scales across a wide spectrum of device types and platforms. Lastly, we’re making our entire library of highly optimized DSP classes and functions open to all. This includes resamplers, convolvers, filters, delay lines and other DSP capabilities. Developers can also use Resonance Audio’s brand new Spectral Reverb, an efficient, high-quality, constant-complexity reverb effect, in their own projects.
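
For anyone new to the Ambisonics terms above, a quick sketch of the standard math may help; this is textbook first-order AmbiX encoding and binaural decoding, not code taken from the Resonance Audio sources. A mono signal s placed at azimuth \varphi and elevation \theta is encoded into four channels (ACN order, SN3D normalization):

\[
\begin{aligned}
W &= s,\\
Y &= s\,\sin\varphi\,\cos\theta,\\
Z &= s\,\sin\theta,\\
X &= s\,\cos\varphi\,\cos\theta.
\end{aligned}
\]

An Ambisonic binaural decoder, like the reference implementation described above, then renders the sound field by convolving each channel b_n with a pair of HRTF-derived filters and summing:

\[
\mathrm{out}_L(t) = \sum_n \big(b_n * h_n^{L}\big)(t),
\qquad
\mathrm{out}_R(t) = \sum_n \big(b_n * h_n^{R}\big)(t).
\]

Higher-order channels, the filter design and the optimized DSP that make this efficient are the parts the open source project provides.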

We’ve open sourced Resonance Audio as a standalone library and associated engine plugins, VST plugin, tutorials, and examples under the Apache 2.0 license. This means Resonance Audio is yours: you’re free to use it in your projects, no matter where you work. And if you see something you’d like to improve, submit a GitHub pull request to be reviewed by the Resonance Audio project committers. While the engine plugins for Unity, Unreal, FMOD, and Wwise will remain open source, going forward they will be maintained by project committers from our partners Unity, Epic, Firelight Technologies, and Audiokinetic, respectively.

If you’re interested in learning more about Resonance Audio, check out the documentation on our developer site. If you want to get more involved, visit our GitHub to access the source code, build the project, download the latest release, or even start contributing. We’re looking forward to building the future of immersive audio with all of you.

Experimenting with Light Fields

We’ve always believed in the power of virtual reality to take you places. That’s why we created Expeditions, to transport people around the world to hundreds of amazing, hard-to-reach or impossible-to-visit places. It’s why we launched Jump, which lets professional creators film beautiful scenes in stereoscopic 360 VR video, and it’s why we’re introducing VR180, a new format for anyone—even those unfamiliar with VR technology—to capture life’s special moments.

But to create the most realistic sense of presence, what we show in VR needs to be as close as possible to what you’d see if you were really there. When you’re actually in a place, the world reacts to you as you move your head around: light bounces off surfaces in different ways and you see things from different perspectives. To help create this more realistic sense of presence in VR, we’ve been experimenting with light fields.

Light fields are a set of advanced capture, stitching, and rendering algorithms. Much more work needs to be done, but they already produce still captures with an extremely high-quality sense of presence, thanks to motion parallax and highly realistic textures and lighting. To demonstrate the potential of this technology, we’re releasing “Welcome to Light Fields,” a free app available on Steam VR for HTC Vive, Oculus Rift, and Windows Mixed Reality headsets. Let’s take a look at how it works.

Capturing and processing a light field

With light fields, nearby objects seem near to you—as you move your head, they appear to shift a lot. Far-away objects shift less and light reflects off objects differently, so you get a strong cue that you’re in a 3D space. And when viewed through a VR headset that supports positional tracking, light fields can enable some truly amazing VR experiences based on footage captured in the real world.

This is possible because a light field records all the different rays of light coming into a volume of space. To record them, we modified a GoPro Odyssey Jump camera, bending it into a vertical arc of 16 cameras mounted on a rotating platform.

Left: A time lapse video of recording a spherical light field on the flight deck of Space Shuttle Discovery.
Right: Light field rendering allows us to synthesize new views of the scene anywhere within the spherical volume by sampling and interpolating the rays of light recorded by the cameras on the rig.

It takes about a minute for the camera rig to swing around and record about a thousand outward-facing viewpoints on a 70cm sphere. This gives us a volume of light rays about two feet in diameter, which determines the size of the headspace users can lean around in to explore the scenes once they’re processed. To render views for the headset, rays of light are sampled from the camera positions on the surface of the sphere and used to construct novel views from inside the sphere that match how the user moves their head. They’re aligned and compressed in a custom dataset file that’s read by special rendering software we’ve implemented as a plug-in for the Unity game engine.
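
To make the rendering step a little more concrete, here is how light fields are usually formalized; treat it as a sketch of the general technique rather than the exact math inside our Unity plug-in. A light field assigns a radiance value L(\mathbf{x}, \boldsymbol{\omega}) to every recorded ray, defined by its origin \mathbf{x} on the capture sphere and its direction \boldsymbol{\omega}. A pixel of a novel view for eye position \mathbf{p} and view direction \mathbf{d} is then estimated by blending the nearest recorded rays:

\[
\hat{L}(\mathbf{p}, \mathbf{d}) \;\approx\; \sum_{i \in \mathcal{N}(\mathbf{p}, \mathbf{d})} w_i\, L(\mathbf{x}_i, \boldsymbol{\omega}_i),
\qquad \sum_i w_i = 1,
\]

where \mathcal{N}(\mathbf{p}, \mathbf{d}) is the set of captured rays whose origins and directions are closest to the query ray, and the weights w_i fall off with that distance. Because the rig records roughly a thousand viewpoints, there is almost always a nearby recorded ray to interpolate from anywhere inside the two-foot volume.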


Recording the World with Light Fields

We chose a few special places to try out our light field-recording camera rig. We love the varnished teak and mahogany interiors at the Gamble House in Pasadena, the fragments of glossy ceramic and shiny mirrors adorning the Mosaic Tile House in Venice, and the sun-filled stained glass window at St. Stephen’s Church in Granada Hills. Best of all, the Smithsonian Institution’s Air and Space Museum and 3D Digitization Office gave us access to NASA’s Space Shuttle Discovery, providing an astronaut’s view inside the orbiter’s flight deck, which has never been open to the public. We closed by recording a variety of light fields of people, experimenting with how eye contact can be made to work in a six-degrees-of-freedom experience.

Try “Welcome to Light Fields”

VR video is a promising technology for exploring the world, and while light fields are still an experiment, they show a new level of how convincing virtual reality experiences can be. We hope you enjoy our “Welcome to Light Fields” experience, available now on Steam VR. Take the seven-minute Guided Tour to learn more about the technology and the locations, and then take your time exploring the spaces in the Gallery. This is only the beginning, and lots more needs to be done, but we’re excited about this step toward more realistic capture for VR.

Watch live performances at The FADER FORT from SXSW in VR180

For over 15 years, The FADER has introduced the world to new music artists at The FADER FORT, the global media company's annual live music event at South by Southwest (SXSW). FADER FORT has been the breakout, must-do gig for famous artists including Cardi B, Dua Lipa, Drake and many others. The event gives emerging and global artists an intimate stage to experiment on and allows those in attendance to experience performances up close and personal. But with such an intimate venue, only a lucky few make it into the must-see show, making it one of the most in-demand events at SXSW.


To bring The FADER FORT experience to more fans, we partnered with The FADER to livestream performances by Saweetie, Bloc Boy, Valee, Speedy Ortiz, YBN Nahmir and other special guests in VR180 on YouTube. No matter where you are, you can watch live on YouTube via your desktop or mobile device, or using Cardboard, Daydream View or PlayStation VR.

With VR180, those not in attendance at The FADER FORT in Austin will be able to experience three-dimensional, 4K video of the show, providing a more immersive experience than traditional video and making you feel like you’re there.

From March 14-16, we’ll livestream the best acts of the day in VR180, along with a cutdown of each set that can be viewed at any time.

Check out the calendar below, grab your headset and get ready to see some of the best new artists on the scene without ever setting foot in Austin. Visit The FADER for the full lineup. See you at the Fort!

Making a video game in two days with Tilt Brush and Unity

Imagine you’re playing a video game, and you’re being attacked by a gang of angry space aliens. Wouldn't it be great if you could just paint an object in 3D space and use it to defend yourself? A talented team of artists and game fanatics explored this very premise at Global Game Jam 2018, a game development hackathon. Seeing Tilt Brush as a fast, powerful and fun 3D asset creation tool, the team at Another Circus used the Tilt Brush Toolkit to create a virtual reality game in less than 48 hours.

“Pac Tac Atac” casts you as a space adventurer who has landed on an alien planet and needs to beam a rescue message into intergalactic space. But watch out, the locals are angry and in the mood to smash your transmitter. It’s up to you to keep them away!

What the aliens don’t know is that you’re armed with two cans of spray paint that let you magically draw any object in your imagination to defend yourself.


Once you’ve got your magic object, you can start fighting off the aliens with slices and dices, or by throwing your weapon and calling it back like a boomerang.


“Pac Tac Atac” was built in the Unity game engine, with art painted exclusively in Tilt Brush and exported as 3D models. Using Tilt Brush provided a number of benefits over traditional 3D modeling. For example, to make it easy to create lots of aliens, the team first drew different body parts (heads, torsos, arms and legs) in Tilt Brush, then had Unity randomly assemble complete alien bodies from those painted parts. By procedurally generating bodies this way, they could quickly produce dozens of aliens, each with its own look and movement style.
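
To show the part-assembly trick outside of Unity, here is a minimal sketch in Java (the game itself is C# in Unity, and the part names below are hypothetical stand-ins for the team's exported Tilt Brush meshes): keep one list of interchangeable parts per body slot and pick one at random per alien.

import java.util.List;
import java.util.Random;

public class AlienAssembler {

    // Hypothetical part IDs standing in for the Tilt Brush meshes exported to the game.
    static final List<String> HEADS  = List.of("head_bulbous", "head_spiky", "head_cyclops");
    static final List<String> TORSOS = List.of("torso_slim", "torso_round");
    static final List<String> ARMS   = List.of("arms_tentacle", "arms_claw", "arms_noodle");
    static final List<String> LEGS   = List.of("legs_stumpy", "legs_spider");

    // One assembled alien: a combination of one part per body slot.
    record Alien(String head, String torso, String arms, String legs) {}

    private final Random rng;

    AlienAssembler(long seed) {
        this.rng = new Random(seed);
    }

    // Pick one part per body slot at random, mirroring how the team combined painted parts in Unity.
    Alien assemble() {
        return new Alien(pick(HEADS), pick(TORSOS), pick(ARMS), pick(LEGS));
    }

    private String pick(List<String> parts) {
        return parts.get(rng.nextInt(parts.size()));
    }

    public static void main(String[] args) {
        AlienAssembler assembler = new AlienAssembler(42);
        for (int i = 0; i < 5; i++) {
            System.out.println(assembler.assemble()); // e.g. Alien[head=head_spiky, torso=torso_round, ...]
        }
    }
}

Seeding the random generator makes the same set of aliens reproducible between runs, which is handy when you're iterating during a 48-hour jam.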


One of the biggest challenges the team faced was optimizing the Tilt Brush art for in-game performance. Given the amount of detail generated by each brush stroke, they improvised by creating assets with fewer strokes (as Jonathan Yeo did for his 3D-printed bronze self-portrait) and by using Mesh Simplify, a Unity extension that lets developers reduce the poly count of their 3D models.

“Pac Tac Atac” is available for the HTC Vive now. Check out more here.

Announcing ARCore 1.0 and new updates to Google Lens

With ARCore and Google Lens, we’re working to make smartphone cameras smarter. ARCore enables developers to build apps that can understand your environment and place objects and information in it. Google Lens uses your camera to help make sense of what you see, whether that’s automatically creating contact information from a business card before you lose it, or soon being able to identify the breed of a cute dog you saw in the park. At Mobile World Congress, we're launching ARCore 1.0 along with new support for developers, and we’re releasing updates for Lens and rolling it out to more people.


ARCore, Google’s augmented reality SDK for Android, is out of preview and launching as version 1.0. Developers can now publish AR apps to the Play Store, and it’s a great time to start building. ARCore works on 100 million Android smartphones, and advanced AR capabilities are available on all of these devices. It works on 13 different models right now (Google’s Pixel, Pixel XL, Pixel 2 and Pixel 2 XL; Samsung’s Galaxy S8, S8+, Note8, S7 and S7 edge; LGE’s V30 and V30+ (Android O only); ASUS’s Zenfone AR; and OnePlus’s OnePlus 5). And beyond those available today, we’re partnering with many manufacturers to enable their upcoming devices this year, including Samsung, Huawei, LGE, Motorola, ASUS, Xiaomi, HMD/Nokia, ZTE, Sony Mobile, and Vivo.

Making ARCore work on more devices is only part of the equation. We’re also bringing developers additional improvements and support to make their AR development process faster and easier. ARCore 1.0 features improved environmental understanding that enables users to place virtual assets on textured surfaces like posters, furniture, toy boxes, books, cans and more. Android Studio Beta now supports ARCore in the Emulator, so you can quickly test your app in a virtual environment right from your desktop.
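
For developers wondering what “start building” looks like in practice, here is a bare-bones sketch of the core ARCore flow using the public Android SDK: create and configure a session, update it every frame, hit-test a tap against a detected plane, and attach an anchor where the virtual content should stay put. It leaves out rendering, camera permission and availability checks that a real app needs, so treat it as a starting point rather than a complete sample.

import android.content.Context;

import com.google.ar.core.Anchor;
import com.google.ar.core.Config;
import com.google.ar.core.Frame;
import com.google.ar.core.HitResult;
import com.google.ar.core.Plane;
import com.google.ar.core.Session;
import com.google.ar.core.Trackable;
import com.google.ar.core.exceptions.CameraNotAvailableException;
import com.google.ar.core.exceptions.UnavailableException;

public class ArPlacementHelper {

    private Session session;

    // Create and configure an ARCore session (assumes ARCore is installed and camera permission is granted).
    public void onResume(Context context) throws UnavailableException, CameraNotAvailableException {
        if (session == null) {
            session = new Session(context);
            session.configure(new Config(session));
        }
        session.resume();
    }

    // Call once per rendered frame with the screen coordinates of a user tap, or null if there was no tap.
    public Anchor updateAndPlace(Float tapX, Float tapY) throws CameraNotAvailableException {
        Frame frame = session.update();  // Pulls the latest camera image and tracking state.
        if (tapX == null || tapY == null) {
            return null;
        }
        for (HitResult hit : frame.hitTest(tapX, tapY)) {
            Trackable trackable = hit.getTrackable();
            // Only place content on a detected plane, and only inside its polygon.
            if (trackable instanceof Plane
                    && ((Plane) trackable).isPoseInPolygon(hit.getHitPose())) {
                return hit.createAnchor();  // The renderer draws the virtual object at this anchor's pose.
            }
        }
        return null;
    }
}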


Everyone should get to experience augmented reality, so we’re working to bring it to people everywhere, including China. We’ll be supporting ARCore in China on partner devices sold there—starting with Huawei, Xiaomi and Samsung—to enable them to distribute AR apps through their app stores.

We’ve partnered with a few great developers to showcase how they're planning to use AR in their apps. Snapchat has created an immersive experience that invites you into a “portal”—in this case, FC Barcelona’s legendary Camp Nou stadium. Visualize different room interiors inside your home with Sotheby’s International Realty. See Porsche’s Mission E Concept vehicle right in your driveway, and explore how it works. With OTTO AR, choose pieces from an exclusive set of furniture and place them, true to scale, in a room. Ghostbusters World, based on the film franchise, is coming soon. In China, place furniture and over 100,000 other pieces with Easyhome Homestyler, see items and place them in your home when you shop on JD.com, or play games from NetEase, Wargaming and Game Insight.

With Google Lens, your phone’s camera can help you understand the world around you, and we’re expanding availability of the Google Lens preview. With Lens in Google Photos, when you take a picture, you can get more information about what’s in your photo. In the coming weeks, Lens will be available to all Google Photos English-language users who have the latest version of the app on Android and iOS. Also over the coming weeks, English-language users on compatible flagship devices will get the camera-based Lens experience within the Google Assistant. We’ll add support for more devices over time.

And while it’s still a preview, we’ve continued to make improvements to Google Lens. Since launch, we’ve added text selection features, the ability to create contacts and events from a photo in one tap, and—in the coming weeks—improved support for recognizing common animals and plants, like different dog breeds and flowers.


Smarter cameras will enable our smartphones to do more. With ARCore 1.0, developers can start building delightful and helpful AR experiences for them right now. And Lens, powered by AI and computer vision, makes it easier to search and take action on what you see. As these technologies continue to grow, we'll see more ways that they can help people have fun and get more done on their phones.

Go behind the scenes of “Isle of Dogs” with Pixel

"Isle of Dogs" tells the story of Atari Kobayashi, 12-year-old ward to corrupt Mayor Kobayashi. When, by Executive Decree, all the canine pets of Megasaki City are exiled to a vast garbage-dump, Atari sets off alone in a miniature Junior-Turbo Prop and flies to Trash Island in search of his bodyguard-dog, Spots. There, with the assistance of a pack of newly-found mongrel friends, he begins an epic journey that will decide the fate and future of the entire Prefecture.

The film isn’t out until March 23—but Pixel owners will get an exclusive sneak peek this week.

In “Isle of Dogs Behind the Scenes (in Virtual Reality),” the audience is taken behind the scenes in a 360-degree VR experience featuring on-set interviews with the film’s voice cast (Bryan Cranston, Bill Murray, Edward Norton, Liev Schreiber, Jeff Goldblum, Scarlett Johansson, Tilda Swinton, F. Murray Abraham and Bob Balaban). Get nose-to-nose with Chief, Boss, Rex and the rest of the cast while the crew works around you, for an inside look at the unique craft of stop-motion animation.


Pixel’s powerful front-firing stereo speakers and brilliant display make it perfect for watching immersive VR content like this. Presented in 4K video with interactive spatial audio that responds to where you’re looking, “Isle of Dogs Behind the Scenes (in Virtual Reality)” is a collaboration between FoxNext VR Studio, Fox Searchlight Pictures, Felix & Paul Studios, the Isle of Dogs production team, and Google Spotlight Stories.


“Isle of Dogs Behind the Scenes (in Virtual Reality)” is available today on the Google Spotlight Stories app, exclusively for Google Pixel phones (Pixel and Pixel 2) and best watched on the Daydream View headset. To watch, download the Spotlight Stories app.

On March 2, “Isle of Dogs Behind the Scenes (in Virtual Reality)” will become available in VR, 360 and 2D on the Fox Searchlight YouTube channel and via the YouTube VR app on any platform that supports it, including Daydream and Sony PlayStation VR. “Isle of Dogs,” from Fox Searchlight, hits theaters on March 23.

Six ways Google can keep you up to speed in PyeongChang

Tomorrow thousands of athletes will come together in PyeongChang to represent their countries with the world as their audience. While the athletes are getting ready for the gold, we’re getting a few of our products ready, too. Here are six ways Google is helping you stay connected to what’s happening on the ground (and on the ice) during the PyeongChang 2018 Olympic Winter Games:

1. Stay in the snow know with Google Search

When you search for the Winter Olympics, you’ll find the latest information about your favorite events at the top of Search results. You’ll be able to see your country’s rank in the race for gold medals, or dive into a specific sport to check out which athletes have won. When you’re not tuning into the Winter Games live, you can watch a daily recap video, see top news related to the Olympic Games, and find verified updates from official broadcasters around the world.

2. Tune in with YouTube

Starting February 8, if you miss a competition, you can watch select Olympic Winter Games video highlights from official Olympic broadcasters on YouTube in more than 80 countries around the world, including from NBCUniversal (USA), BBC (UK), NHK (Japan), France TV (France), and Eurosport (rest of Europe). In the U.S., YouTube TV will also show NBCUniversal’s live coverage of the Olympic Winter Games. And in India, Pakistan, Bangladesh, Sri Lanka, Bhutan, the Maldives and Nepal, the Winter Games will be live and free for the first time on the Olympic Channel on YouTube.

3. Keep up with these apps on Google Play

Don’t miss a single jump (ski, axel, or otherwise) with these apps in the Google Play Store. Just download and follow along with the athletes and watch the action live:


4. Explore South Korea in Street View and Google Earth

Check out the new “sports” category in Google Earth Voyager with five stories about the Winter Games that take you from epic ski jumping destinations to the Olympic Torch relay. These travel itineraries will help you explore South Korea’s capital city, and on Street View, you can see new imagery of stadiums, cities and towns close to PyeongChang.

5. Get your head in the game with the Assistant

Your Google Assistant can help you stay up to date throughout the games. Curious about winners? Just say “Hey Google, who won women’s 1000 meter speed skating in the Olympics?” Rooting for a specific country? “Hey Google, how many medals does Iceland have in the Olympics?” You can even say “Hey Google, tell me a fun fact about the games in PyeongChang.” No matter how you’re asking—on your phone, speaker, TV or other enabled device—the Google Assistant can keep up with all the important Olympic details.

Plus, in the U.S., NBC is bringing an exclusive game to the Google Assistant across devices. It’s already live, so test your winter sports knowledge with dozens of trivia questions. Just say “Hey Google, play NBC Sports Trivia” to start your quest for Olympic trivia gold.

6. VR gets you closer to the action

Stream more than 50 hours of NBCUniversal’s live coverage—from the Opening Ceremony to alpine skiing, ice hockey, figure skating, snowboarding, curling and more—in virtual reality by using your YouTube TV credentials to log in to the NBC Sports VR app, powered by Intel True VR. In Europe, multi-camera live VR coverage is available via the Eurosport VR app.

Let the games begin.