
Invite ancient creatures to your living room with AR

What does it feel like to stare into some of the oldest eyes on earth? With augmented reality (AR) and Google Arts & Culture, now you can find out: Meet Cambropachycope, an ancient crustacean with a distinctive pointy head covered in tiny eyes. In collaboration with institutions such as Moscow’s State Darwin Museum and London’s Natural History Museum, we’ve brought a menagerie of prehistoric animals back to digital life. Thanks to AR, you can see them up close through your phone. 


In addition to Cambropachycope, you can also meet the oldest large filter feeder, the fish that swims poorly, or the largest animal ever to live on Earth. Make sure to snap a picture or a video so you can show how these creatures compare in size to the Felis catus or Canis familiaris that roams your living room.

If unusual critters aren’t your thing, we’ve also recreated a collection of unusual cultural artifacts for you to experience in AR. Meet the pre-Inca “smiling god” Lanzón from circa 500 BCE, or see how the Apollo 11 Command Module looks in your backyard—along with a spacesuit, of course. Or, choose from among thousands of paintings to decorate your space, from Frida Kahlo’s self portraits to The Kiss.


To start learning about culture, history and nature in new dimensions, explore our collection of objects in AR and download the Google Arts & Culture app, available for free on Android and iOS.

A new wave of AR Realism with the ARCore Depth API

Posted by Rajat Paharia, Product Lead, AR Platform

Since the launch of ARCore, our developer platform for building augmented reality (AR) experiences, we've been focused on providing APIs that help developers seamlessly blend the digital and physical worlds.

At the end of last year, we announced a preview of the ARCore Depth API, which uses our depth-from-motion algorithms to generate a depth map with a single RGB camera. Since then, we’ve been working with select collaborators to explore how depth can be used across a range of use cases to enhance AR realism.

Today, we're taking a major step forward and announcing that the Depth API is available in ARCore 1.18 for Android and Unity, including AR Foundation, across hundreds of millions of compatible Android devices.

Generate a depth map without specialized hardware to unlock capabilities like occlusion

As we highlighted last year, a key capability of the Depth API is occlusion: the ability for digital objects to accurately appear behind real world objects. This makes objects feel as if they’re actually in your space, creating a more realistic AR experience.
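For developers who want to wire this up, here is a minimal Kotlin sketch of enabling depth in an ARCore session and acquiring the depth image each frame. It assumes you already have a running Session and render loop, and it leaves out the shader work that actually performs the occlusion test:

import com.google.ar.core.Config
import com.google.ar.core.Frame
import com.google.ar.core.Session
import com.google.ar.core.exceptions.NotYetAvailableException

// Opt in to depth only when the device supports it.
fun enableDepth(session: Session) {
    val config = session.config
    if (session.isDepthModeSupported(Config.DepthMode.AUTOMATIC)) {
        config.depthMode = Config.DepthMode.AUTOMATIC
    }
    session.configure(config)
}

// Call once per rendered frame.
fun updateDepth(frame: Frame) {
    try {
        frame.acquireDepthImage().use { depthImage ->
            // Upload depthImage to a GPU texture; in the fragment shader, discard
            // virtual fragments that are farther away than the real-world depth.
        }
    } catch (e: NotYetAvailableException) {
        // Depth needs a few frames of camera motion before it becomes available.
    }
}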

Illumix, the game studio behind Five Nights at Freddy’s AR: Special Delivery, uses occlusion to deepen the realism of the experience by allowing certain characters to hide behind objects for more startling jump scares.

Play Five Nights at Freddy’s AR: Special Delivery

While occlusion is an important capability, the ARCore Depth API unlocks more ways to increase realism and enables new interaction types. The ARCore Depth Lab spurred more ideas on how depth can be used, including realistic physics, surface interactions, environmental traversal, and more. Developers can now build on these ideas through the open-source GitHub project.

Experiment with ARCore Depth Lab on the Google Play Store

The designers and engineers at Snap Inc. integrated several of these ideas into a set of Snapchat Lenses including the Dancing Hotdog and a new Android exclusive Undersea World Lens.

See how depth can add a layer of realism to your Snapchat experience

Snapchat Lens Creators can now download an ARCore Depth API template to create depth-based experiences for compatible Android devices. Sam Hare, Research Engineering Manager at Snap Inc., expressed his excitement: “We’re beginning to understand what kinds of depth capabilities are exciting for developers to build with. This single integration point streamlines and simplifies the development process and enables Lens Studio developers to easily take advantage of advanced depth capabilities.”

Another app that combines occlusion with other depth capabilities is Lines of Play, an Android experiment from the Google Creative Lab. Lines of Play lets users create domino art in AR, and uses depth information to showcase both occlusion and collisions. Design elaborate domino creations, topple them over and watch them collide with the furniture and walls in your room.

Watch as domino pieces topple into each other and onto your walls with Lines of Play

In addition to gaming and self-expression, depth can also be used to unlock new utility use cases. For example, the TeamViewer Pilot app, a remote assistance solution that enables AR annotations on video calls, uses depth to better understand the environment so experts around the world can more precisely apply real time 3D AR annotations for remote support and maintenance.

3D annotations help experts accurately highlight details in the TeamViewer Pilot app

Later this year, you will be able to try more depth-enabled AR experiences such as SKATRIX by Reality Crisis and SPLASHAAR by ForwARdgames, which use surface interactions and environmental traversal to make rich use of the environment around you.

Check out surface interactions and environmental traversal in SKATRIX and SPLASHAAR

While depth sensors, such as time-of-flight (ToF) sensors, are not required for the Depth API to work, having them will further improve the quality of experiences. Dr. Soo Wan Kim, Camera Technical Product Manager at Samsung, commented on the future that the Depth API and ToF sensors unlock, saying: “Depth will enrich user's AR experience in many perspectives. It will reduce scanning time, and can detect planes fast, even low textured planes. These will bring seamless experiences to users who will be able to use AR apps more easily and frequently.” In the coming months, Samsung will update their Quick Measure app to use the ARCore Depth API on the Galaxy Note10+ and Galaxy S20 Ultra.

Accurately measure with Quick Measure

To learn more and get started with the ARCore Depth API, get the SDK and visit the ARCore developer website.

Meet humanity’s first artists through virtual reality

Editor’s Note: France’s Chauvet Cave contains some of the world’s oldest prehistoric drawings. It’s so delicate that it’s closed to the public, but thanks to our partner, the Syndicat Mixte de la Grotte Chauvet, you can now step into the world of our ancient ancestors through Google Search’s augmented reality feature as well as virtual reality. One of these ancient ancestors, who has asked to remain anonymous, has time-traveled 36,000 years to share what the cave was like back then. 

We began our journey to the big cave days ago. Today we arrive and settle near the stone arch that spans the river. We light a fire, signalling to our people up near the caves that we’re here. We’ve brought small stone tools with us to craft the arrowheads we use for hunting. Perhaps we’ll be able to trade them.

There’s plenty of moonlight, so once we’ve made camp I venture out, hiking up to the cave’s entrance to greet the others. The children are still awake, playing with their toys but also listening intently to the lions roaring in the distance. There used to be bears living here too, but they’re long gone.

The closer I get to its entrance, the more the dark cave seems to draw me in, so I light a torch and step inside. After a short walk, the fire illuminates where we—and those before us—have left our marks. Here, someone scraped the clay, exposed the limestone and painted their world, long before we arrived. My favorites are the horses—I think one is afraid, another is playing, and a third one, the curious one, has pricked up its ears inquiringly.

Near the familiar mammoth, a new image catches my eye—perhaps some of our young hunters have depicted this lion to celebrate their success.

The fresco is so enormous, it’s impossible to take it all in. I step back to try and comprehend its meaning. There are cave lions, reindeer and stags, all seeming to move in the play of light and shadow. Just a few lines, drawn by practiced hands, and somehow we have a masterpiece.

Then there are the handprints left by those who came before us. I stand on my toes and stretch to match my own hand to the imprints on the cold rock, and suddenly I feel compelled to leave my mark too. I’ve never been chosen as a painter, but I’m alone and feeling daring, so I dip my hand into the red paint that’s been left out, rise back to my toes, and add my handprint to the others on the wall. 

As it dries, I draw back and watch as the animals and the handprints fade into the darkness. Who knows how long they’ve all been here, and how long they’ll remain?

Another note from the editor: if you enjoyed hearing from our anonymous cave ancestor, check out the following images of the cave she described, or find out more in Google Arts & Culture’s latest exhibit “Chauvet: Meet our Ancestors.”


Heritage on the Edge urges action on the climate crisis

Editor’s note: Guest author Dr. Toshiyuki Kono is President of the International Council on Monuments and Sites. Distinguished Professor Kono also teaches private international law and heritage law at Japan's Kyushu University.

Preserving and protecting the past is essential for our future. This belief is at the core of the International Council on Monuments and Sites (ICOMOS), a global non-government organization dedicated to the conservation of architectural and archaeological heritage.

Our 10,000 members across the globe—including architects, archeologists, geographers, planners and anthropologists—share the same vision: to protect and promote the world’s cultural heritage. The recent youth climate demonstrations shone a spotlight on the urgency of the climate crisis, which is having a devastating effect on our cultural monuments too. We must act now to save this part of our human legacy.

That’s why, in collaboration with CyArk and Google Arts & Culture, we’re launching Heritage on the Edge, a new online experience that stresses the gravity of the situation through the lens of five UNESCO World Heritage Sites. You can join us and explore over 50 online exhibits, 3D models, Street View tours, and interviews with local professionals and communities about Rapa Nui’s (Easter Island) iconic statues, the great mosque city of Bagerhat in Bangladesh, the adobe metropolis of Chan Chan in Peru, Scotland’s Edinburgh Castle and the coastal city of Kilwa Kisiwani in Tanzania—all heritage sites that are affected by the climate crisis.

Above all, the project is a call to action. The effects of climate change on our cultural heritage mirror wider impacts on our planet, and require a strong and meaningful response. While actions at individual sites can prevent loss locally, the only sustainable solution is systemic change and the global reduction of greenhouse gas emissions. 

Heritage on the Edge collects stories of loss, but also of hope and resilience. They remind us that our cultural heritage sites, including these iconic World Heritage Sites, are more than just tourist destinations. They are places of great national, spiritual and cultural significance.

Blending Realities with the ARCore Depth API

Posted by Shahram Izadi, Director of Research and Engineering

ARCore, our developer platform for building augmented reality (AR) experiences, allows your devices to display content immersively in the context of the world around us, making that content instantly accessible and useful.

Earlier this year, we introduced Environmental HDR, which brings real world lighting to AR objects and scenes, enhancing immersion with more realistic reflections, shadows, and lighting. Today, we're opening a call for collaborators to try another tool that helps improve immersion: the new Depth API in ARCore, enabling experiences that are vastly more natural, interactive, and helpful.

The ARCore Depth API allows developers to use our depth-from-motion algorithms to create a depth map using a single RGB camera. The depth map is created by taking multiple images from different angles and comparing them as you move your phone to estimate the distance to every pixel.

Example depth map, with red indicating areas that are close by, and blue representing areas that are farther away.
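Under the hood, the depth image uses Android's DEPTH16 format, where each 16-bit sample is a distance in millimeters. As a rough Kotlin sketch, adapted from the pattern in the ARCore developer documentation, reading a single pixel looks like the following; note that the depth image has a lower resolution than the camera image, so (x, y) are coordinates within the depth image itself:

import android.media.Image
import java.nio.ByteOrder

// Returns the estimated distance in millimeters at depth-image pixel (x, y).
fun millimetersDepth(depthImage: Image, x: Int, y: Int): Int {
    val plane = depthImage.planes[0]
    val byteIndex = x * plane.pixelStride + y * plane.rowStride
    val buffer = plane.buffer.order(ByteOrder.LITTLE_ENDIAN)
    return buffer.getShort(byteIndex).toInt() and 0xFFFF
}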

One important application for depth is occlusion: the ability for digital objects to accurately appear in front of or behind real world objects. Occlusion helps digital objects feel as if they are actually in your space by blending them with the scene. We will begin making occlusion available in Scene Viewer, the developer tool that powers AR in Search, to an initial set of over 200 million ARCore-enabled Android devices today.

A virtual cat with occlusion off and with occlusion on.

We’ve also been working with Houzz, a company that focuses on home renovation and design, to bring the Depth API to the “View in My Room” experience in their app. “Using the ARCore Depth API, people can see a more realistic preview of the products they’re about to buy, visualizing our 3D models right next to the existing furniture in a room,” says Sally Huang, Visual Technologies Lead at Houzz. “Doing this gives our users much more confidence in their purchasing decisions.”
The Houzz app with occlusion is available today.
In addition to enabling occlusion, having a 3D understanding of the world on your device unlocks a myriad of other possibilities. Our team has been exploring some of these, playing with realistic physics, path planning, surface interaction, and more.

Physics, path planning, and surface interaction examples.

When applications of the Depth API are combined together, you can also create experiences in which objects accurately bounce and splash across surfaces and textures, as well as new interactive game mechanics that enable players to duck and hide behind real-world objects.
A demo experience we created where you have to dodge and throw food at a robot chef.
The Depth API is not dependent on specialized cameras and sensors, and it will only get better as hardware improves. For example, the addition of depth sensors, like time-of-flight (ToF) sensors, to new devices will help create more detailed depth maps to improve existing capabilities like occlusion, and unlock new capabilities such as dynamic occlusion—the ability to occlude behind moving objects.
We’ve only begun to scratch the surface of what’s possible with the Depth API and we want to see how you will innovate with this feature. If you are interested in trying the new Depth API, please fill out our call for collaborators form.

ARCore updates to Augmented Faces and Cloud Anchors enable new shared cross-platform experiences

Posted by Christina Tong, Product Manager, Augmented Reality

Two years ago, we launched ARCore, our developer platform for building augmented reality (AR) experiences. Since then, we’ve seen developers create thousands of AR apps across Android and iOS that transform the way people play, shop, learn and create together. To enable even more shared cross-platform AR experiences, we’re announcing new updates to ARCore’s Augmented Faces and Cloud Anchors APIs.

Augmented Faces on iOS

Earlier this year, we announced our Augmented Faces API, which offers a high-quality, 468-point 3D mesh that lets users attach fun effects to their faces — all without a depth sensor on their smartphone. With the addition of iOS support rolling out today, developers can now create effects for more than a billion users. We’ve also made the creation process easier for both iOS and Android developers with a new face effects template.
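For a sense of how little code the Android side takes, here is a hedged Kotlin sketch. It assumes a Session created with Session.Feature.FRONT_CAMERA and omits the rendering of the actual effects:

import com.google.ar.core.AugmentedFace
import com.google.ar.core.Config
import com.google.ar.core.Frame
import com.google.ar.core.Session
import com.google.ar.core.TrackingState

// Turn on the 3D face mesh.
fun enableFaceMesh(session: Session) {
    val config = Config(session)
    config.augmentedFaceMode = Config.AugmentedFaceMode.MESH3D
    session.configure(config)
}

// Each frame, read the mesh and region poses to place effects.
fun onFaceFrame(frame: Frame) {
    for (face in frame.getUpdatedTrackables(AugmentedFace::class.java)) {
        if (face.trackingState != TrackingState.TRACKING) continue
        val vertices = face.meshVertices  // the 468-point mesh, as (x, y, z) triples
        val noseTip = face.getRegionPose(AugmentedFace.RegionType.NOSE_TIP)
        // Render glasses, hats, and other effects anchored to these poses.
    }
}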

Improvements to Cloud Anchors

Last year, we introduced the Cloud Anchors API, which lets developers create shared AR experiences across Android and iOS. Cloud Anchors let devices create a 3D feature map from visual data onto which anchors can be placed. The anchors are hosted in the cloud so multiple people can use them to enable shared real world experiences. Cloud Anchors power a wide variety of cross-platform apps, like Just a Line, PHAROS AR and Spacecraft AR.
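In code, the host-and-resolve round trip is compact. The Kotlin sketch below assumes cloud anchors are enabled in the session config and that your app shares the anchor ID between devices through its own backend:

import com.google.ar.core.Anchor
import com.google.ar.core.Config
import com.google.ar.core.Session

// One-time setup: turn on cloud anchors for the session.
fun enableCloudAnchors(session: Session) {
    val config = Config(session)
    config.cloudAnchorMode = Config.CloudAnchorMode.ENABLED
    session.configure(config)
}

// Host: upload visual data around a local anchor; once hosting succeeds,
// share anchor.cloudAnchorId with the other participants.
fun host(session: Session, localAnchor: Anchor): Anchor =
    session.hostCloudAnchor(localAnchor)

// Resolve: recreate the same anchor on another device from its ID.
fun resolve(session: Session, cloudAnchorId: String): Anchor =
    session.resolveCloudAnchor(cloudAnchorId)

// Hosting and resolving are asynchronous; poll the state each frame.
fun isReady(anchor: Anchor): Boolean =
    anchor.cloudAnchorState == Anchor.CloudAnchorState.SUCCESS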

In our latest ARCore update, we’ve made some improvements to the Cloud Anchors API that make hosting and resolving anchors more efficient and robust. This is due to improved anchor creation and visual processing in the cloud. Now, when creating an anchor, more angles across larger areas in the scene can be captured for a more robust 3D feature map. Once the map is created, the visual data used to create the map is deleted and only anchor IDs are shared with other devices to be resolved. Moreover, multiple anchors in the scene can now be resolved simultaneously, reducing the time needed to start a shared AR experience.

These updates to Cloud Anchors are available for developers today.

Persistent Cloud Anchors and Call for Collaborators

As we look to the future, we’re taking steps to expand the scale and timeline of shared AR experiences with persistent Cloud Anchors. We see this as enabling a “save button” for AR, so that digital information overlaid on top of the real world can be experienced at any time.

Imagine working together on a redesign of your home throughout the year, leaving AR notes for your friends around an amusement park, or hiding AR objects at specific places around the world to be discovered by others.

Persistent Cloud Anchors are powering Mark AR, a social app being developed by Sybo and iDreamSky that lets people create, discover, and share their AR art with friends and followers in real-world locations. With persistent Cloud Anchors, users can return to their pieces again and again as they create and collaborate over time.


Mark AR is an app that lets people create and discover AR art in real-world locations.

Reliably anchoring AR content for every use case—regardless of surface, distance, and time—pushes the limits of computation and computer vision because the real world is diverse and always changing. By enabling a “save button” for AR, we’re taking an important step toward bridging the digital and physical worlds to expand the ways AR can be useful in our day-to-day lives.

We’re currently looking for more developers to help us explore and test persistent Cloud Anchors in real world apps at scale, before making the feature broadly available. If you’re interested in early access, you can apply here.

Explore art and color in our latest AR gallery

Abstract artist Wassily Kandinsky said, “Color is a power which directly influences the soul.” That’s hard to dispute when you consider the melancholy blues and greens of Picasso’s early Blue Period, or the vibrant yellows of a simple vase of Sunflowers by Van Gogh.

Color has also inspired the latest “Pocket Gallery” on Google Arts & Culture, which uses Augmented Reality to create a virtual space that you can explore using a smartphone. After the first Pocket Gallery brought together paintings by Vermeer last year, the latest collection features a variety of artists’ works, captured in high resolution and selected according to each piece’s color palette.


The “Art of Color” Pocket Gallery in the Google Arts & Culture app

In “The Art of Color,” you can explore four rooms of paintings that each represent a different color palette—you’ll also find a dark room that juxtaposes Rembrandt’s masterpiece The Night Watch with the Op art mastery of Bridget Riley. We selected the art using our Art Palette tool, which brings together a range of works through the lens of color.

The gallery also has a series of playful geometric shapes and vibrant colors that complement the paintings inside. The new Pocket Gallery features art from 33 partner institutions across four continents, and allows you to learn about works of many different eras and styles.

One of the goals of the Google Arts & Culture team is to find new or unexpected ways to bring people closer to art. From renowned masterpieces to hidden gems, “The Art of Color” brings together artworks like Georgia O’Keeffe’s Red Cannas, Amrita Sher-Gil’s Mother India and Hokusai’s South Wind, Clear Dawn.

To check it out, make sure you download the Google Arts & Culture app on your AR-enabled Android or iOS smartphone. You'll find the new gallery in the Camera Tab, and you can jump inside to explore each piece from there.

Immersive branded experiences in YouTube and display ads

As a three-dimensional, visual medium, augmented reality (AR) is a powerful tool for brands looking to tell richer, more engaging stories about their products to consumers. Recently, we brought AR to Google products like Search, and made updates to our developer platform, ARCore, to help creators build more immersive experiences. Starting this week, we’re also bringing AR to YouTube and interactive 3D assets to display ads.

Helping YouTube beauty fans pick their next lipstick

Many consumers look to YouTube creators for help when deciding on new products to purchase. And brands have long been teaming up with creators to connect with audiences. Now, brands and creators can make that experience even more personalized and useful for viewers in AR.

Today, we’re introducing AR Beauty Try-On, which lets viewers virtually try on makeup while following along with YouTube creators to get tips, product reviews, and more. Thanks to machine learning and AR technology, it offers realistic, virtual product samples that work on a full range of skin tones. Currently in alpha, AR Beauty Try-On is available through FameBit by YouTube, Google’s in-house branded content platform.

M·A·C Cosmetics is the first brand to partner with FameBit to launch an AR Beauty Try-On campaign. Using this new format, brands like M·A·C will be able to tap into YouTube’s vibrant creator community, deploy influencer campaigns to YouTube’s 2 billion monthly active users, and measure their results in real time.


Viewers will be able to try on different shades of M·A·C lipstick as their favorite beauty creator tries on the same shades. After trying on a lipstick, they can click to visit M·A·C’s website to purchase it.

We tested this experience earlier this year with several beauty brands and found that 30 percent of viewers activated the AR experience in the YouTube iOS app, spending over 80 seconds on average trying on lipstick virtually.

Bringing three-dimensional assets to display ads

We're also offering brands a new canvas for creativity with Swirl, our first immersive display format. Swirl brings three-dimensional assets to display advertising on the mobile web, which can help educate consumers before they make a purchase. Shoppers can zoom in and out, rotate a product, or play an animation directly in the ad. Swirl is available exclusively through Display & Video 360.

In this example from New Balance, people can rotate to explore the Fresh Foam 1080 running shoe. Objects like a mobile phone (right) can expand to show additional layered content.

To help brands more easily edit, configure and publish high-quality, realistic models to use in Swirl display ads, we’re introducing a new editor on Poly, Google’s 3D platform. It provides more editorial control over 3D objects, including new ways to change animation settings, customize backgrounds, and add realistic reflections.


The new Poly editor lets you easily edit photorealistic three-dimensional objects for use in Swirl display ads.

These new tools will be available to brands and advertisers this summer. We think they’ll help brands and advertisers make content more engaging, educational, and ultimately effective in driving purchase decisions. If you’re interested, check out our getting started guide for tips. We look forward to seeing you bring your products to life!

Updates to ARCore Help You Build More Interactive & Realistic AR Experiences

Posted by Anuj Gosalia

A little over a year ago, we introduced ARCore: a platform for building augmented reality (AR) experiences. Developers have been using it to create thousands of ARCore apps that help people with everything from fixing their dishwashers, to shopping for sunglasses, to mapping the night sky. Since last I/O, we've quadrupled the number of ARCore-enabled devices to an estimated 400 million.

Today at I/O, we introduced updates to Augmented Images and Light Estimation, features that let you build more interactive and realistic experiences. And to make it easier for people to experience AR, we introduced Scene Viewer, a new tool that lets users view 3D objects in AR right from your website.

Augmented Images

To make experiences appear realistic, we need to account for the fact that things in the real world don’t always stay still. That’s why we’re updating Augmented Images — our API that lets people point their camera at 2D images, like posters or packaging, to bring them to life. The updates enable you to track moving images and multiple images simultaneously. This unlocks the ability to create dynamic and interactive experiences like animated playing cards where multiple images move at the same time.
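In practice, the flow on Android is: build a database of target images, attach it to the session config, then watch for updated AugmentedImage trackables every frame. Here is a Kotlin sketch, where poster is a hypothetical Bitmap of the image you want to track:

import android.graphics.Bitmap
import com.google.ar.core.AugmentedImage
import com.google.ar.core.AugmentedImageDatabase
import com.google.ar.core.Config
import com.google.ar.core.Frame
import com.google.ar.core.Session
import com.google.ar.core.TrackingState

// Register the target images and attach the database to the session.
fun configureImageTracking(session: Session, poster: Bitmap) {
    val db = AugmentedImageDatabase(session)
    db.addImage("poster", poster)  // the name is arbitrary; add as many targets as you need
    val config = Config(session)
    config.augmentedImageDatabase = db
    session.configure(config)
}

// Per frame: multiple images can now be tracked at once, even while moving.
fun onImageFrame(frame: Frame) {
    for (image in frame.getUpdatedTrackables(AugmentedImage::class.java)) {
        if (image.trackingState == TrackingState.TRACKING) {
            // image.centerPose follows the physical image as it moves.
        }
    }
}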


An example of how the Augmented Images API can be used with moving targets by JD.com

Light Estimation

Last year, we introduced the concept of light estimation, which provides a single ambient light intensity to extend real world lighting into a digital scene. In order to provide even more realistic lighting, we’ve added a new mode, Environmental HDR, to our Light Estimation API.


Before and after Environmental HDR is applied to the digital mannequin on the left, featuring 3D printed designs from Julia Koerner

Environmental HDR uses machine learning with a single camera frame to understand high dynamic range illumination in 360°. It takes in available light data, and extends the light into a scene with accurate shadows, highlights, reflections and more. When Environmental HDR is activated, digital objects are lit just like physical objects, so the two blend seamlessly, even when light sources are moving.


Digital mannequin on left and physical mannequin on right

Environmental HDR provides developers with three APIs to replicate real world lighting:

  • Main Directional Light: helps with placing shadows in the right direction
  • Ambient Spherical Harmonics: helps model ambient illumination from all directions
  • HDR Cubemap: provides specular highlights and reflections
Main directional light + ambient spherical harmonics + HDR cubemap = Environmental HDR
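To put those three signals together in code, here is a hedged Kotlin sketch of enabling Environmental HDR and reading the estimate each frame; how you feed the values into your renderer's lights and reflection probes depends on your engine:

import com.google.ar.core.Config
import com.google.ar.core.Frame
import com.google.ar.core.LightEstimate
import com.google.ar.core.Session

// Switch light estimation to Environmental HDR mode.
fun enableEnvironmentalHdr(session: Session) {
    val config = Config(session)
    config.lightEstimationMode = Config.LightEstimationMode.ENVIRONMENTAL_HDR
    session.configure(config)
}

// Per frame: pull the estimated lighting and hand it to your renderer.
fun onLightFrame(frame: Frame) {
    val estimate = frame.lightEstimate
    if (estimate.state != LightEstimate.State.VALID) return
    val mainLightDirection = estimate.environmentalHdrMainLightDirection  // drives shadow placement
    val mainLightIntensity = estimate.environmentalHdrMainLightIntensity  // RGB intensity
    val ambientSh = estimate.environmentalHdrAmbientSphericalHarmonics    // ambient from all directions
    val cubeMap = estimate.acquireEnvironmentalHdrCubeMap()               // specular highlights and reflections
    // ...upload to your engine's light and reflection probes...
    cubeMap.forEach { it.close() }  // release the cube map images when done
}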

Scene Viewer

We want to make it easier for people to jump into AR, so today we’re introducing Scene Viewer, which lets AR experiences launch right from your website without requiring a separate app download.

To make your assets accessible via Scene Viewer, first add a glTF 3D asset to your website with the <model-viewer> web component, and then add the “ar” attribute to the <model-viewer> markup. Later this year, experiences in Scene Viewer will begin to surface in your Search results.

<script type="module" src="https://unpkg.com/@google/model-viewer/dist/model-viewer.js"></script>
<script nomodule src="https://unpkg.com/@google/model-viewer/dist/model-viewer-legacy.js"></script>

<model-viewer ar
    src="examples/assets/YOURMODEL.gltf"
    auto-rotate
    camera-controls
    alt="TEXT ABOUT YOUR MODEL"
    background-color="#455A64">
</model-viewer>

NASA.gov enables users to view the Curiosity Rover in their space

These are a few ways that improving real world understanding in ARCore can make AR experiences more interactive, realistic, and easier to access. Look for these features to roll out over the next two releases. To learn more and get started, check out the ARCore developer website.

Partner up with Detective Pikachu Playmoji in Playground

For more than 20 years, generations of fans have delved into the fantastical Pokémon universe on a mission to meet them all. Starting May 10th, you can experience the adventure in theaters in the first Pokémon live-action movie, “POKÉMON Detective Pikachu.”

But you don’t have to wait for the movie to come out to see Pokémon in the wild. We’re launching the POKÉMON Detective Pikachu Playmoji pack today in Playground, a creative mode in your smartphone camera. Now you can partner up with Detective Pikachu, Charizard, Jigglypuff and Mr. Mime to create action-packed scenes in the real world. All you have to do is point your camera and drop one of the Playmoji (or all four) into a scene to bring them to life in your photos and videos.

The pack features Pokémon from the movie, fully animated and sounding just like their film counterparts. And thanks to ARCore’s motion tracking, light estimation and ability to understand the real world, they feel like they’re really there with you. You can even take a selfie with Detective Pikachu and share a smile as he reacts to your facial expressions in real time via machine learning.

So, whether you’re singing alongside Jigglypuff or breathing fire with Charizard, partner up with our #PikaPlaymoji and start sharing your scenes with #DetectivePikachu on social today. Download the POKÉMON Detective Pikachu Playmoji pack now on Pixel and find it on select Motorola and LG devices.

Welcome to the world of Pokémon.