Tag Archives: VR

Bringing Real-time Spatial Audio to the Web with Songbird

Posted by Jamieson Brettle and Drew Allen, Chrome Media Team

For a virtual scene to be truly immersive, stunning visuals need to be accompanied by true spatial audio to create a realistic and believable experience. Spatial audio tools allow developers to include sounds that can come from any direction, and that are associated in 3D space with audio sources, thus completely enveloping the user in 360-degree sound.

Spatial audio helps draw the user into a scene and creates the illusion of entering an entirely new world. To make this possible, the Chrome Media team has created Songbird, an open source, spatial audio encoding engine that works in any web browser by using the Web Audio API.

The Songbird library takes in any number of mono audio streams and allows developers to programmatically place them in 3D space around the user. Songbird allows you to create immersive soundscapes, realistically reproducing reflection and reverb for the space you describe. Sounds bounce off walls and reflect off materials just as they would in real life, capturing truly 360-degree sound. Songbird creates an ambisonic soundfield that can then be rendered in real-time for use in your application. We've partnered with the Omnitone project, which we blogged about last year, to add higher-order ambisonic support to Omnitone's binaural renderer to produce far more accurate-sounding audio than ever before.
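As a sketch of what "placing a mono stream in 3D space" involves, the helper below converts listener-relative spherical coordinates into the Cartesian x/y/z values a source position typically takes. The function name and coordinate convention are illustrative assumptions, not Songbird's actual API:

```javascript
// Hypothetical helper (not Songbird's API): convert listener-relative
// spherical coordinates (azimuth/elevation in degrees, distance in
// meters) into Cartesian x/y/z values for a spatial audio source.
function sphericalToCartesian(azimuthDeg, elevationDeg, distance) {
  const az = (azimuthDeg * Math.PI) / 180;
  const el = (elevationDeg * Math.PI) / 180;
  return {
    x: distance * Math.cos(el) * Math.sin(az), // to the listener's right
    y: distance * Math.sin(el),                // above the listener
    z: -distance * Math.cos(el) * Math.cos(az) // in front (WebGL-style -z)
  };
}

// A sound 2 m directly in front of the listener:
const front = sphericalToCartesian(0, 0, 2); // { x: 0, y: 0, z: -2 }
```

With coordinates like these in hand, a mono source can be positioned anywhere on the sphere around the listener and moved frame by frame for interactive scenes.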

Songbird encapsulates Omnitone, and with it developers can now add interactive, full-sphere audio to any web-based application. Songbird can scale to ambisonics of any order, thereby creating more realistic sound and higher performance than what is achievable through the standard Web Audio API.

Songbird Audio Processing Diagram

The implementation of Songbird is based on the Google spatial media specification. It expects mono input and outputs ambisonic (multichannel) audio in the ACN channel layout with SN3D normalization. Detailed documentation may be found here.

As the web emerges as an important VR platform for delivering content, spatial audio will play a vital role in users' embrace of this new medium. Songbird and Omnitone are key tools in enabling spatial audio on the web platform and establishing it as a preeminent platform for compelling VR experiences. Combining these audio experiences with 3D JavaScript libraries like three.js gives a glimpse into the future on the web.

Demo combining spatial sound in 3D environment

This project was made possible through close collaboration with Google's Daydream and Web Audio teams. This collaboration allowed us to deliver similar audio capabilities to the web as are available to developers creating Daydream applications.

We look forward to seeing what people do with Songbird now that it's open source. Check out the code on GitHub and let us know what you think. Also available are a number of demos on creating full spherical audio with Songbird.

ARCore: Augmented reality at Android scale

Posted by Dave Burke, VP, Android Engineering

With more than two billion active devices, Android is the largest mobile platform in the world. And for the past nine years, we've worked to create a rich set of tools, frameworks and APIs that deliver developers' creations to people everywhere. Today, we're releasing a preview of a new software development kit (SDK) called ARCore. It brings augmented reality capabilities to existing and future Android phones. Developers can start experimenting with it right now.

We've been developing the fundamental technologies that power mobile AR over the last three years with Tango, and ARCore is built on that work. But it works without any additional hardware, which means it can scale across the Android ecosystem. ARCore will run on millions of devices, starting today with the Pixel and Samsung's S8, running Android 7.0 Nougat and above. We're targeting 100 million devices by the end of the preview. We're working with manufacturers like Samsung, Huawei, LG, ASUS and others to make this possible with a consistent bar for quality and high performance.

ARCore works with Java/OpenGL, Unity and Unreal and focuses on three things:

  • Motion tracking: Using the phone's camera to observe feature points in the room and IMU sensor data, ARCore determines both the position and orientation (pose) of the phone as it moves. Virtual objects remain accurately placed.
  • Environmental understanding: It's common for AR objects to be placed on a floor or a table. ARCore can detect horizontal surfaces using the same feature points it uses for motion tracking.
  • Light estimation: ARCore observes the ambient light in the environment and makes it possible for developers to light virtual objects in ways that match their surroundings, making their appearance even more realistic.
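A minimal sketch of the "pose" idea from the motion-tracking bullet above: a pose pairs a rotation (a unit quaternion) with a translation, and keeping a virtual object anchored means transforming its local coordinates through the device's current pose. This is illustrative math, not ARCore's actual API:

```javascript
// Illustrative sketch of the pose concept (not ARCore's API): rotate a
// vector v by unit quaternion q = {w, x, y, z} via v' = v + w*t + q×t,
// where t = 2*(q_xyz × v).
function rotateByQuaternion(q, v) {
  const tx = 2 * (q.y * v.z - q.z * v.y);
  const ty = 2 * (q.z * v.x - q.x * v.z);
  const tz = 2 * (q.x * v.y - q.y * v.x);
  return {
    x: v.x + q.w * tx + (q.y * tz - q.z * ty),
    y: v.y + q.w * ty + (q.z * tx - q.x * tz),
    z: v.z + q.w * tz + (q.x * ty - q.y * tx)
  };
}

// A pose is a rotation plus a translation; anchoring a virtual object
// means pushing its local coordinates through the device's pose.
function applyPose(pose, point) {
  const r = rotateByQuaternion(pose.rotation, point);
  return {
    x: r.x + pose.translation.x,
    y: r.y + pose.translation.y,
    z: r.z + pose.translation.z
  };
}

// 90° rotation about the up (y) axis, then a 1 m shift along x:
const pose = {
  rotation: { w: Math.SQRT1_2, x: 0, y: Math.SQRT1_2, z: 0 },
  translation: { x: 1, y: 0, z: 0 }
};
const world = applyPose(pose, { x: 0, y: 0, z: -1 }); // ≈ { x: 0, y: 0, z: 0 }
```

As the phone moves, the tracker updates this pose every frame, which is what keeps virtual objects "accurately placed" relative to the real world.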

Alongside ARCore, we've been investing in apps and services which will further support developers in creating great AR experiences. We built Blocks and Tilt Brush to make it easy for anyone to quickly create great 3D content for use in AR apps. As we mentioned at I/O, we're also working on Visual Positioning Service (VPS), a service which will enable world scale AR experiences well beyond a tabletop. And we think the Web will be a critical component of the future of AR, so we're also releasing prototype browsers for web developers so they can start experimenting with AR, too. These custom browsers allow developers to create AR-enhanced websites and run them on both Android/ARCore and iOS/ARKit.

ARCore is our next step in bringing AR to everyone, and we'll have more to share later this year. Let us know what you think through GitHub, and check out our new AR Experiments showcase where you can find some fun examples of what's possible. Show us what you build on social media with #ARCore; we'll be resharing some of our favorites.


Experimenting with VR Ad formats at Area 120

Posted by Aayush Upadhyay and Neel Rao, Area 120

At Area 120, Google's internal workshop for experimental ideas, we work on early-stage projects and iterate quickly to test concepts. We heard from developers that they're looking at how to make money to fund their VR applications, so we started experimenting with what a native, mobile VR ad format might look like.

Developers and users have told us they want to avoid disruptive, hard-to-implement ad experiences in VR. So our first idea for a potential format presents a cube to users, with the option to engage with it and then see a video ad. By tapping on the cube or gazing at it for a few seconds, the cube opens a video player where the user can watch, and then easily close, the video. Here's how it works:

Our work focuses on a few key principles: VR ad formats should be easy for developers to implement, native to VR, flexible enough to customize, and useful and non-intrusive for users. Our Area 120 team has seen some encouraging results with a few test partners, and we would love to work with the developer community as this work evolves across Cardboard (on Android and iOS), Daydream and Samsung Gear VR.

If you're a VR developer (or want to be one) and are interested in testing this format with us, please fill out this form to apply for our early access program. We have an early-stage SDK available and you can get up and running easily. We're excited to continue experimenting with this format and hope you'll join us for the ride!

Daydream: Bringing high-quality VR to everyone

Unlike anything else, VR can make you feel like you're transported to a different world. Without a ticket, you can now visit a world-renowned museum, see extinct animals come back to life, or take a field trip to the Taj Mahal. It’s transporting.

Our goal is to make VR for everyone. And with Daydream View everyone can enjoy a comfortable, easy to use, and personalized virtual reality experience.

We’ve been working with developers, smartphone manufacturers, and content creators to make the dream a reality, and now we’re excited to announce that Daydream View is available for purchase in India starting today.

Pixel, Moto Z and more Daydream-ready phones
With Daydream View, you simply pop a Daydream-ready phone in the headset to start exploring. Daydream-ready phones are built with high-resolution displays, powerful mobile processors and high-fidelity sensors — all tuned to support great VR experiences. Google’s Pixel and Pixel XL are the first Daydream-ready phones, and you can experience Daydream on the Moto Z as well. Later this year, the Samsung Galaxy S8 and S8+ will be Daydream-ready with a software update.

Daydream View, the first Daydream-ready headset and controller

Designed and developed by Google, Daydream View is the first Daydream-ready headset and controller.



Daydream View is:
  • Comfortable. Inspired by the clothes we enjoy wearing, the headset is made with soft, breathable fabric and is designed to fit over eyeglasses.
  • Easy to use. Just drop in your Daydream-ready phone and you’ll be ready to go. The phone and headset have an auto-alignment system so you don’t have to worry about cables or connectors.

    A big part of what makes Daydream View special is the Daydream controller. This small yet powerful controller lets you interact with the virtual world the same way you do in the real world. It points where you point, and is packed with sensors to understand your movements and gestures. You can swing it like a bat or wave it like a wand. And it’s so precise that you can draw with it. The controller slides right inside the headset when not in use, so you don’t have to worry about losing it in your bag or between couch cushions.
  • Yours. Daydream View is designed with choice in mind. The headset fits phones big or small, so it’ll work with any Daydream-ready phone you choose.
Incredible experiences
From the universe of YouTube videos to a magical world where you can cast spells and levitate objects, there is a wide range of experiences available on Daydream.

The best of Google
We’ve brought your favourites into VR. With YouTube, you can watch the entire library of videos on a virtual big screen, and experience VR videos from creators all over the world. Use Street View to see curated tours of over 150 of the world’s most amazing places like the Pyramids and the Taj Mahal — fly over a city, stand at the top of the highest peaks, and even soar into space. You can also search Street View to explore everywhere else. Step inside a virtual gallery and view masterpieces from over 50 world-renowned museums with Arts & Culture. With Play Movies, you can watch shows and films on your own personal big-screen. And Google Photos displays your 360° captures in a whole new way.

Don’t just see the world, experience it
Swim with a pod of dolphins, stand at the edge of a volcano and even visit Pluto. With Daydream View, you can teleport from virtually anywhere to pretty much everywhere. These apps are available on Daydream: NYT VR, Guardian VR, The Turning Forest, Fantastic Beasts, Labster: World of Science and many more.

Your personal cinema
You can always get the best seat in the house with Daydream View. Experience sports, live events and more in full 360° panoramic view. Plus, now you can watch top shows and movies, distraction-free, on your own virtual big screen. These apps are available on Daydream: Netflix VR, Google Play Movies, Within, and more.  

Get in the game
Go from the sidelines to the center of action. Feel the adrenaline rush as you speed down the racetrack. Bowl a strike with a swing of your arm. Cast a spell with your own magic wand. The Daydream controller transforms with your imagination. These games are available on Daydream: The Arcslinger, Wonderglade, Mekorama VR, Gunjack 2: End of Shift, Need for Speed™ No Limits VR, LEGO® BrickHeadz Builder VR and dozens more titles.

Daydream View will be available for purchase from today on Flipkart for INR 6,499.


We are always introducing more apps and partners, and over the coming years we’ll continue our goal of bringing high-quality, mobile VR to everyone.

Posted by Clay Bavor, VP of VR & AR


From the courtly fashions of Versailles to the unmatched elegance of the Saree: 3000 years of fashion brought to you in a new, immersive way


What we wear tells a lot about our social identity, our customs, our habits and where we come from. It's appropriate to say that we don’t just wear clothes – we wear our culture!

Highlighting this very aspect, we at Google Arts & Culture have launched an exciting new project, “We wear culture,” that showcases 3,000 years of fashion from across 42 countries in partnership with 183 world-famous museums, fashion councils and universities. Using state-of-the-art technology, including virtual reality, 360º videos and gigapixel images, the platform enables unique online access to historic and contemporary stories that decode the various aspects of fashion for everyone. The stories, photos, videos and VR experiences will appeal to all those who are curious about fashion's many intersections with music, pop culture, dance, technology, economics and so much more.

So if you want to know more about the ancient Silk Road, the courtly fashions of Versailles, how the Vivienne Westwood corset came to be reconceived as a symbol of sexual empowerment, the origins of British punk, or the stories behind the clothes you wear today, it’s all there at g.co/wewearculture. Perhaps you’re looking for more? You can even explore the iconic pieces that changed the way generations dressed, be it Marilyn Monroe’s famous ruby slipper by Salvatore Ferragamo or the black dress by Chanel. It’s there for you to explore, at your fingertips and at your leisure.




Vivienne Westwood Corset courtesy Victoria & Albert Museum; Ruby Slipper of Marilyn Monroe courtesy Museo Salvatore Ferragamo

The richness and diversity of Indian fashion has always been marked by its distinctive and varied craftsmanship, its fabrics, the weaves, the natural dyes and vibrant colours, as well as the classic Indian drape, the iconic Indian saree. It would be apt to say that the saree, the most versatile garment in the world, is referenced the world over and worn by millions of women on a daily basis.

To celebrate the rich history of these iconic nine yards, Border & Fall's The Sari Project has explored 60 regional draping styles. You can also view in detail the varied weaves from across India, from Gharchola to Patola to Temple to Ikat sarees, or trace the story and importance of Indian textiles, from ancient sculptures to heirloom textiles to how events such as Gandhi's Khadi Movement influenced craftsmanship, at Chhatrapati Shivaji Maharaj Vastu Sangrahalaya (CSMVS). Available online from the CSMVS collection are the heirloom sarees of the Tagore family and of Homi J. Bhabha's family.
Various sari drapes courtesy Border&Fall

Of the people, of the land. There is plenty of regional textile and fashion heritage to be discovered. You can revisit colonial Indian fashion with the Dr. Bhau Daji Lad Mumbai City Museum, and trace the history and impact of cotton in the early textile trade. Then there are the designs from north-eastern India, including the weaves of tribes such as the Nagas and Meitis, and the traditional attire from Meghalaya called ‘Dhara’ or ‘Nara’, worn by Khasi women during special occasions and made of costly Mulberry and Eri silk yarn. From down south, view the Salar Jung Museum’s exhibits capturing the royal attire of the Nizams of 19th-century Hyderabad (part of the Deccan region). Revisit the art of Brocades, Patola and Baluchari with a special exhibit by the Museum of Art and Photography.





Navjote ceremony coat of Cursetjee Vakil courtesy CSMVS; Ethnographic documentation of drape styles courtesy CSMVS; Salar Jung III in a sherwani courtesy Salar Jung Museum

The SEWA Hansiba Museum in Radhanpur is completely owned and managed by rural women artisans. The museum contains heirlooms from local communities such as the Ahir, Rabari and Harijan. The local craftwork has helped build stronger community bonds, and top fashion designers are now approaching the artisans for fashion sampling. From flamboyant stitches to regional exchanges, the women are building economic security for themselves.


If it is colour that catches your interest, then explore how indigo cultivation dates back to the Indus Valley civilisation, and how this natural dye has often been credited with opening up an extensive range of beautiful blue shades that redefined global fashion, even as the knowledge of extracting blue colour from the green leaves of indigo was closely guarded within families. You don’t have to stop at indigo or India; you can explore the colour palette of global fashion over the years.

With over 400 online exhibitions and stories sharing a total of 30,000 photos, videos and other documents; 4 virtual reality experiences of iconic fashion pieces; over 700 ultra-high-resolution gigapixel images; and over 40 venues offering backstage access on Google Street View, you could easily get lost in fashion!

We could not have done this without our partners around the globe. In India, we are very proud to partner with Chhatrapati Shivaji Maharaj Vastu Sangrahalaya (CSMVS), Dr. Bhau Daji Lad Mumbai City Museum (BDL), SEWA Hansiba Museum, Salar Jung Museum, Indian Museum Kolkata, Museum of Art & Photography, Craft Revival Trust, Avani Society, Worldview Impact Foundation and Border & Fall to celebrate this rich history of Indian fashion and bring to life its creativity, heritage and craftsmanship, for anyone around the world to see, learn, experience and cherish. The new online exhibition opens today at g.co/wewearculture, free of charge, and will also be available through the Google Arts & Culture mobile app on both iOS and Android.

Posted by Simon Rein, India Programme Manager, Google Arts & Culture


Enjoy your personal concert with VR videos on YouTube

Any music fan will tell you, there’s nothing better than seeing your favorite band live. But if you’re one of the millions of people who can’t make it to a show, YouTube is giving you the next best thing. We’re working with some amazing artists to bring you live performances and music videos in VR.

And today Gorillaz – hailed by “The Guinness Book Of World Records” as the planet’s Most Successful Virtual Act – have announced their return with the release of a new video directed by Jamie Hewlett, featuring four tracks from their highly anticipated forthcoming album “Humanz.” The epic six-minute animated film - titled “Saturnz Barz (Spirit House)” - provides an extraordinary cutting-edge VR experience to include the track “Saturnz Barz” in full, plus highlights of “Ascension,” “Andromeda” and “We Got The Power.” You can check it out here fresh from the oven.


With these new immersive experiences, you can transport yourself to top music festivals and killer concerts, without having to deal with the crowds. This weekend, you can check out Ultra Music Festival live, with a set from Hardwell live streaming in 360 degrees. And you can already experience highlights from Coachella on YouTube, without having to bear the heat of the desert.

We've also been working with some of your favorite artists to experiment with new ways to tell the story behind their songs and allow you to be immersed in the video. Sit next to Sampha on the piano bench while he performs “(No One Knows Me) Like The Piano.” Step into Hunter Hayes’ recording studio as he builds each musical component of his current single, “Yesterday’s Song.” Check out Young The Giant’s latest single, “Silvertongue,” as though you were in the audience. Watch The Naked & Famous official music video for “Higher” shot at the YouTube Space LA. And you can use VR to travel behind the scenes, too. Check out Florida Georgia Line as they shoot “May We All” at the Tennessee National Raceway in Hohenwald.


You can watch these videos using the YouTube VR app available on Daydream or with Google Cardboard. If you don’t have a headset, don’t worry, you can still get the 360-degree video experience on your mobile phone or desktop. It’ll be like you’re virtually there with your favorite band.

Vivien Lewit, Global Head Artist Relations, recently watched "The Range - Florida (Official 360° Video)."

Source: YouTube Blog


Improving VR videos

At YouTube, we are focused on enabling the kind of immersive and interactive experiences that only VR can provide, making digital video as immersive as it can be. In March 2015, we launched support for 360-degree videos, shortly followed by VR (3D 360) videos. In 2016, we brought our users 360 live streaming, spatial audio, and a dedicated YouTube VR app.

Now, in a joint effort between YouTube and Daydream, we're adding new ways to make 360 and VR videos look even more realistic.

360 videos need a large number of pixels per video frame to achieve a compelling immersive experience. In the ideal scenario, we would match human visual acuity, which is about 60 pixels per degree of immersive content. We are, however, limited by users' internet connection speeds and device capabilities. One way to bridge the gap between these limitations and human visual acuity is to use better projection methods.
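To see why this gap is so large, here is the back-of-the-envelope arithmetic behind the 60-pixels-per-degree figure: matching that acuity over a full sphere needs roughly 233 megapixels per frame, far beyond the roughly 8 megapixels of a 4K stream.

```javascript
// Back-of-the-envelope: pixels needed to match ~60 pixels per degree
// over a full 360° x 180° field of view.
const PX_PER_DEGREE = 60;
const width = 360 * PX_PER_DEGREE;          // 21600 px
const height = 180 * PX_PER_DEGREE;         // 10800 px
const megapixels = (width * height) / 1e6;  // 233.28 MP per frame
// For comparison, a 4K frame is 3840 x 2160 ≈ 8.3 MP.
```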

Better Projections

A projection is the mapping used to fit a 360-degree world view onto a rectangular video surface. The world map is a good example: the spherical Earth projected onto a rectangular piece of paper. A commonly used projection is the equirectangular projection. We initially chose this projection when we launched 360 videos because it is easy for camera software to produce and easy to edit.
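The equirectangular mapping itself is just a linear remap of longitude and latitude to texture coordinates, which is a large part of why it is so easy to produce and edit. A minimal sketch:

```javascript
// Equirectangular projection: longitude/latitude map linearly to
// normalized texture coordinates (u, v) in [0, 1].
function equirectUV(lonDeg, latDeg) {
  return {
    u: (lonDeg + 180) / 360, // -180°..180° -> 0..1, left to right
    v: (90 - latDeg) / 180   // +90° (north pole) -> 0, the top row
  };
}

// The horizon at longitude 0 lands at the center of the frame:
const center = equirectUV(0, 0); // { u: 0.5, v: 0.5 }
```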

However, equirectangular projection has some drawbacks:

  • It has high quality at the poles (top and bottom of image) where people don’t look as much – typically, sky overhead and ground below are not that interesting to look at.
  • It has lower quality at the equator or horizon where there is typically more interesting content.
  • It has fewer vertical pixels for 3D content.
  • Straight-line motion in the real world does not result in straight-line motion in an equirectangular projection, making videos hard to compress.




Drawbacks of equirectangular (EQ) projection

These drawbacks made us look for better projection types for 360-degree videos. To compare different projection types we used saturation maps. A saturation map shows the ratio of video pixel density to display pixel density. The color coding goes from red (low) to orange, yellow, green and finally blue (high). Green indicates optimal pixel density of near 1:1. Yellow and orange indicate insufficient density (too few video pixels for the available display pixels) and blue indicates wasted resources (too many video pixels for the available display pixels). The ideal projection would lead to a saturation map that is uniform in color. At sufficient video resolution it would be uniformly green.
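One simple way to quantify what these saturation maps visualize (our own illustration, not the exact tooling used to produce them): in an equirectangular frame every pixel row has the same width, but the circle of latitude it covers shrinks by the cosine of the latitude, so horizontal oversampling grows as 1/cos(latitude):

```javascript
// Horizontal oversampling of an equirectangular frame: every row has
// the same pixel count, but the circle of latitude it covers shrinks
// by cos(latitude), so pixels-per-degree on the sphere grows as
// 1 / cos(latitude).
function equirectOversampling(latDeg) {
  return 1 / Math.cos((latDeg * Math.PI) / 180);
}

equirectOversampling(0);  // 1   -> correctly sampled at the equator
equirectOversampling(60); // 2   -> twice the pixels actually needed
equirectOversampling(89); // ~57 -> huge waste near the poles
```

This is exactly the blue (wasteful) banding at the top and bottom of the equirectangular saturation map.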

We investigated cubemaps as a potential candidate. Cubemaps have been used by computer games for a long time to display the skybox and other special effects.



Equirectangular projection saturation map



Cubemap projection saturation map

In the equirectangular saturation map the poles are blue, indicating wasted pixels. The equator (horizon) is orange, indicating an insufficient number of pixels. In contrast, the cubemap has green (good) regions nearer to the equator, and the wasteful blue regions at the poles are gone entirely. However, the cubemap results in large orange regions (not good) at the equator because a cubemap samples more pixels at the corners than at the center of the faces.

We achieved a substantial improvement using an approach we call Equi-angular Cubemap or EAC. The EAC projection’s saturation is significantly more uniform than the previous two, while further improving quality at the equator:



Equi-angular Cubemap - EAC

As opposed to a traditional cubemap, which distributes equal numbers of pixels for equal distances on the cube surface, an equi-angular cubemap distributes equal numbers of pixels for equal angular change.

The saturation maps seemed promising, but we wanted to see if people could tell the difference. So we asked people to rate the quality of each without telling them which projection they were viewing. People generally rated EAC as higher quality compared to other projections. Here is an example comparison:

EAC vs EQ


Creating Industry Standards

We’re just beginning to see innovative new projections for 360 video. We’ve worked with equirectangular projection and cubemaps, and now EAC. We think a standardized way to represent arbitrary projections will help everyone innovate, so we’ve developed a Projection Independent Mesh.

A Projection Independent Mesh describes the projection by including a 3D mesh, along with its texture mapping, in the video container. The video rendering software simply renders this mesh per the specified texture mapping and does not need to understand the details of the projection used. This gives us infinite possibilities. We published our mesh format draft standard on GitHub, inviting industry experts to comment, and we hope to turn it into a widely agreed-upon industry standard.

Some 360-degree cameras do not capture the entire field of view. For example, they may not have a lens to capture the top and bottom or may only capture a 180-degree scene. Our proposal supports these cameras and allows replacing the uncaptured portions of the field of view by a static geometry and image. Our proposal allows compressing the mesh using deflate or other compression. We designed the mesh format with compression efficiency in mind and were able to fit EAC projection within a 4 KB payload.

The projection independent mesh allows us to continue improving on projections and deploy them with ease since our renderer is now projection independent.

Spherical video playback on Android now benefits from EAC projection streamed using a projection independent mesh, and it will soon be available on iOS and desktop. Our ingestion format continues to be based on equirectangular projection, as mentioned in our upload recommendations.

Anjali Wheeler, Software Engineer, recently watched "Disturbed - The Sound Of Silence."

Step into the games with new VR videos

YouTube’s become a global destination for people who love watching gaming videos. And we want to take gamers’ viewing experience a step further by exploring how VR videos can put them right at the center of the action. That’s why we partnered with gaming creators and publishers to experiment with the production of 360 and VR videos. What’s come out of those experiments, from “League of Legends” to “Minecraft,” is pretty exciting.

From Let's Play videos to trailers, there’s a really wide range of gaming content on YouTube, and many of these different styles of videos are now also becoming available in VR. You can check out gameplay from global creators as well as gaming-themed live-action videos celebrating games like “Call of Duty.”

Game publishers are also getting involved in VR gaming videos in a big way, from the immensely popular “Clash of Clans” 360-degree video by Supercell to documentaries uploaded by “World of Tanks” publisher Wargaming. Even eSports organizations are producing content for VR, uploading 360-degree content from top events like the “League of Legends” World Championship Finals.

But what if you just want to chill and watch some gaming-themed entertainment content? We got you covered with videos ranging from the classic “Red vs. Blue” series to Stampy’s “Wonder Quest.”


To give you a taste of gaming experience in VR, check out the playlist above for some of our favorite videos so far. It’s a good cross section of the kind of gaming videos we offer in VR, many of which can make you feel like you’re standing inside the game itself. You can watch these videos using the YouTube VR app available on Daydream or with Google Cardboard. If you don’t have a headset, don’t worry, you can still get the 360-degree video experience on your mobile phone or desktop.

Ryan Wyatt, Head of Gaming Content, recently watched “Clash of Clans: Hog Rider 360°.”

Source: YouTube Blog


Meet The AdMob Team at Game Developers Conference (GDC) Next Week

The Game Developers Conference (GDC) is less than one week away in San Francisco. We’re excited to give you a glimpse into how we are helping mobile game developers build businesses and improve user experiences, while monetizing successfully.

Our Google Developer Day will take place in Room 2020 of the West Hall of Moscone Center on Monday, February 27. We'll share how new devices, platforms, and tools are helping developers grow successful businesses and push the limits of mobile gaming. Then stay with us for a diverse set of lightning talks and panels covering topics including virtual reality, machine learning, monetization and more. Here’s a look at the schedule:

  • Opening Keynote || 10AM: We’ll kick off the day by sharing how Google is ramping up efforts to help game developers grow their businesses and make Android a great platform for games. From Play Store updates optimized for discovery and engagement, to performance upgrades bringing high-fidelity gaming to life on Android devices, 2017 will be another year full of exciting gaming experiences.
  • Lightning Talks || 11:10AM: Stay with us for a variety of 5-minute talks covering the latest ways to develop great games using Google's technological edge in machine learning, VR for everyone, cloud gaming, Firebase and more, and learn how to make them profitable.
  • 3 x Panels || 2:00PM: After lunch, we'll host panels to hear from developers first-hand about their experiences launching mobile games, building thriving communities, and navigating the successes and challenges of being an indie developer.

Our main keynote will take place in Room 3009 in the West Hall of Moscone Center on Wednesday, March 1.

  • Smarter Growth, powered by Google's Machine Learning || 9:30AM: Join Jane Butler, Sissie Hsiao and Brendon Kraham to discuss how the mobile games industry is becoming more sophisticated as technology, player demographics, motivations and community participation become more diverse. As a result, marketers are tasked with finding smarter and more efficient ways to engage high-value users. Drawing on over a decade of machine learning technology, Google is driving innovations in growth and monetization across the player lifecycle. Hear from Google app promotion specialists on how to combine the power of data and technology to drive profitable growth for your games business.

We’ll also be hosting two additional ads focused workshops on Thursday, March 2.

  • Find Your Biggest Fans with Google's Machine Learning || 2:00PM: Join this session, hosted by Mark Heneghan, Analytical Lead, to learn how Google can help you grow both your game and profitability with a little help from machine learning.
  • Earn more revenue from Rewarded and the AdMob platform || 3:00PM: Hosted by Apoorva Sharma, AdMob Account Manager, this advanced session will provide a technical deep dive into how the right ad technology helps you earn money and enhance the user experience.

See all the details about our Google Developer Day here and check out all the other Google talks here.

For more information on our presence at GDC, including a full list of our talks and speaker details, click here. Remember, these events are part of the official Game Developers Conference, so you will need a pass to attend. For those who can't make it in person, watch the live stream on YouTube starting at 10am PST on Monday, February 27.

We’ll also be live tweeting on Twitter and sharing on Google+ and LinkedIn, so stay in the loop with what’s happening with #Google #GDC17.

Posted by the Google AdMob Team.

Source: Inside AdMob