Tag Archives: arcore

Explore art and color in our latest AR gallery

Abstract artist Wassily Kandinsky said, “Color is a power which directly influences the soul.” That’s hard to dispute when you consider the melancholy blues and greens of Picasso’s early Blue Period, or the vibrant yellows of a simple vase of Sunflowers by Van Gogh.

Color has also inspired the latest “Pocket Gallery” on Google Arts & Culture, which uses Augmented Reality to create a virtual space that you can explore using a smartphone. After the first Pocket Gallery brought together paintings by Vermeer last year, the latest collection features a variety of artists’ works, captured in high resolution and selected according to each piece’s color palette.


The “Art of Color” Pocket Gallery in the Google Arts & Culture app

In “The Art of Color,” you can explore four rooms of paintings that each represent a different color palette—you’ll also find a dark room that juxtaposes Rembrandt’s masterpiece The Night Watch with the Op art mastery of Bridget Riley. We selected the art using our Art Palette tool, which brings together a range of works through the lens of color.

The gallery also has a series of playful geometric shapes and vibrant colors that complement the paintings inside. The new Pocket Gallery features art from 33 partner institutions across four continents, and allows you to learn about works of many different eras and styles.

One of the goals of the Google Arts & Culture team is to find new or unexpected ways to bring people closer to art. From renowned masterpieces to hidden gems, “The Art of Color” brings together artworks like Georgia O’Keeffe’s Red Cannas and Amrita Sher-Gil’s Mother India or Hokusai’s South Wind, Clear Dawn.

To check it out, make sure you download the Google Arts & Culture app on your AR-enabled Android or iOS smartphone. You'll find the new gallery in the Camera Tab, and you can jump inside to explore each piece from there.

Immersive branded experiences in YouTube and display ads

As a three-dimensional, visual medium, augmented reality (AR) is a powerful tool for brands looking to tell richer, more engaging stories about their products to consumers. Recently, we brought AR to Google products like Search, and made updates to our developer platform, ARCore, to help creators build more immersive experiences. Starting this week, we’re also bringing AR to YouTube and interactive 3D assets to display ads.

Helping YouTube beauty fans pick their next lipstick

Many consumers look to YouTube creators for help when deciding on new products to purchase. And brands have long been teaming up with creators to connect with audiences. Now, brands and creators can make that experience even more personalized and useful for viewers in AR.

Today, we’re introducing AR Beauty Try-On, which lets viewers virtually try on makeup while following along with YouTube creators to get tips, product reviews, and more. Thanks to machine learning and AR technology, it offers realistic, virtual product samples that work on a full range of skin tones. Currently in alpha, AR Beauty Try-On is available through FameBit by YouTube, Google’s in-house branded content platform.

M·A·C Cosmetics is the first brand to partner with FameBit to launch an AR Beauty Try-On campaign. Using this new format, brands like M·A·C will be able to tap into YouTube’s vibrant creator community, deploy influencer campaigns to YouTube’s 2 billion monthly active users, and measure their results in real time.


Viewers will be able to try on different shades of M·A·C lipstick as their favorite beauty creator tries on the same shades. After trying on a lipstick, they can click to visit M·A·C’s website to purchase it.

We tested this experience earlier this year with several beauty brands and found that 30 percent of viewers activated the AR experience in the YouTube iOS app, spending over 80 seconds on average trying on lipstick virtually.

Bringing three-dimensional assets to display ads

We're also offering brands a new canvas for creativity with Swirl, our first immersive display format. Swirl brings three-dimensional assets to display advertising on the mobile web, helping educate consumers before they make a purchase. People can zoom in and out, rotate a product, or play an animation directly in the ad. Swirl is available exclusively through Display and Video 360.

In this example from New Balance, people can rotate to explore the Fresh Foam 1080 running shoe. Objects like a mobile phone (right) can expand to show additional layered content.

To help brands more easily edit, configure and publish high-quality, realistic models to use in Swirl display ads, we’re introducing a new editor on Poly, Google’s 3D platform. It provides more editorial control over 3D objects, including new ways to change animation settings, customize backgrounds, and add realistic reflections.


The new Poly editor lets you easily edit photorealistic three-dimensional objects for use in Swirl display ads.

These new tools will be available to brands and advertisers this summer. We think they’ll help brands and advertisers make content more engaging, educational, and ultimately effective in driving purchase decisions. If you’re interested, check out our getting started guide for tips. We look forward to seeing you bring your products to life!

Updates to ARCore Help You Build More Interactive & Realistic AR Experiences

Posted by Anuj Gosalia

A little over a year ago, we introduced ARCore: a platform for building augmented reality (AR) experiences. Developers have been using it to create thousands of ARCore apps that help people with everything from fixing their dishwashers, to shopping for sunglasses, to mapping the night sky. Since last I/O, we've quadrupled the number of ARCore-enabled devices to an estimated 400 million.

Today at I/O, we introduced updates to Augmented Images and Light Estimation, features that let you build more interactive and realistic experiences. And to make it easier for people to experience AR, we introduced Scene Viewer, a new tool that lets users view 3D objects in AR right from your website.

Augmented Images

To make experiences appear realistic, we need to account for the fact that things in the real world don’t always stay still. That’s why we’re updating Augmented Images — our API that lets people point their camera at 2D images, like posters or packaging, to bring them to life. The updates enable you to track moving images and multiple images simultaneously. This unlocks the ability to create dynamic and interactive experiences like animated playing cards where multiple images move at the same time.

Letter cards overlaid with an example of how Augmented Images API can be used with moving targets

An example of how the Augmented Images API can be used with moving targets by JD.com

Light Estimation

Last year, we introduced the concept of light estimation, which provides a single ambient light intensity to extend real world lighting into a digital scene. In order to provide even more realistic lighting, we’ve added a new mode, Environmental HDR, to our Light Estimation API.

two mannequins with varying light

Before and after Environmental HDR is applied to the digital mannequin on the left, featuring 3D printed designs from Julia Koerner

Environmental HDR uses machine learning with a single camera frame to understand high dynamic range illumination in 360°. It takes in available light data, and extends the light into a scene with accurate shadows, highlights, reflections and more. When Environmental HDR is activated, digital objects are lit just like physical objects, so the two blend seamlessly, even when light sources are moving.

two mannequins with light diffusing from left to right

Digital mannequin on left and physical mannequin on right

Environmental HDR provides developers with three APIs to replicate real world lighting:

  • Main Directional Light: helps with placing shadows in the right direction
  • Ambient Spherical Harmonics: helps model ambient illumination from all directions
  • HDR Cubemap: provides specular highlights and reflections
Rockets showing lighting changes: Main directional light plus ambient spherical harmonics plus HDR cubemap equals environmental HDR
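The Ambient Spherical Harmonics component is a compact lighting representation: nine order-2 spherical-harmonic coefficients per color channel that can be convolved into diffuse irradiance for any surface normal. As a rough illustration of the underlying math (a minimal single-channel sketch in Python, not ARCore's actual implementation; the function name and layout are assumptions), here is how nine such coefficients turn into Lambertian irradiance at a given normal:

```python
import math

# Lambertian convolution constants per SH band (Ramamoorthi & Hanrahan)
A = [math.pi, 2.0 * math.pi / 3.0, math.pi / 4.0]

def sh_irradiance(coeffs, n):
    """Evaluate diffuse irradiance at unit normal n = (x, y, z) from
    nine order-2 spherical-harmonic coefficients (one color channel)."""
    x, y, z = n
    # Real SH basis functions Y_lm evaluated at n
    Y = [
        0.282095,                        # Y_0,0
        0.488603 * y,                    # Y_1,-1
        0.488603 * z,                    # Y_1,0
        0.488603 * x,                    # Y_1,1
        1.092548 * x * y,                # Y_2,-2
        1.092548 * y * z,                # Y_2,-1
        0.315392 * (3.0 * z * z - 1.0),  # Y_2,0
        1.092548 * x * z,                # Y_2,1
        0.546274 * (x * x - y * y),      # Y_2,2
    ]
    band = [0, 1, 1, 1, 2, 2, 2, 2, 2]   # which SH band each coefficient belongs to
    return sum(A[band[i]] * coeffs[i] * Y[i] for i in range(9))
```

Evaluating this per normal (or per vertex in a shader) is what lets a digital object pick up soft ambient shading that matches the surrounding room: a constant light field gives the same irradiance in every direction, while non-zero higher-band coefficients shade one side of an object more than the other.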

Scene Viewer

We want to make it easier for people to jump into AR, so today we're introducing Scene Viewer, which lets you launch AR experiences right from your website without having to download a separate app.

To make your assets accessible via Scene Viewer, first add a glTF 3D asset to your website with the <model-viewer> web component, and then add the “ar” attribute to the <model-viewer> markup. Later this year, experiences in Scene Viewer will begin to surface in your Search results.

<script type="module" src="https://unpkg.com/@google/model-viewer/dist/model-viewer.js"></script>
<script nomodule src="https://unpkg.com/@google/model-viewer/dist/model-viewer-legacy.js"></script>

<model-viewer ar src="examples/assets/YOURMODEL.gltf"
    auto-rotate camera-controls alt="TEXT ABOUT YOUR MODEL"
    background-color="#455A64"></model-viewer>
Mobile example of NASA.gov Curiosity Rover in use

NASA.gov enables users to view the Curiosity Rover in their space

These are a few ways that improving real world understanding in ARCore can make AR experiences more interactive, realistic, and easier to access. Look for these features to roll out over the next two releases. To learn more and get started, check out the ARCore developer website.

Partner up with Detective Pikachu Playmoji in Playground

For more than 20 years, generations of fans have delved into the fantastical Pokémon universe on a mission to meet them all. Starting May 10th, you can experience the adventure in theaters in the first Pokémon live-action movie, “POKÉMON Detective Pikachu.”

But you don’t have to wait for the movie to come out to see Pokémon in the wild. We’re launching the POKÉMON Detective Pikachu Playmoji pack today in Playground, a creative mode in your smartphone camera. Now you can partner up with Detective Pikachu, Charizard, Jigglypuff and Mr. Mime to create action-packed scenes in the real world. All you have to do is point your camera and drop one of the Playmoji (or all four) into a scene to bring them to life in your photos and videos.

The pack features Pokémon from the movie, fully animated and sounding just like their film counterparts. And thanks to ARCore’s motion tracking, light estimation and ability to understand the real world, they feel like they’re really there with you. You can even take a selfie with Detective Pikachu and share a smile as he reacts to your facial expressions in real time via machine learning.

So, whether you’re singing alongside Jigglypuff or breathing fire with Charizard, partner up with our #PikaPlaymoji and start sharing your scenes with #DetectivePikachu on social today. Download the POKÉMON Detective Pikachu Playmoji pack now on Pixel and find it on select Motorola and LG devices.

Welcome to the world of Pokémon.

Step into Childish Gambino’s world with augmented reality

Augmented reality (AR) lets you bring digital content into the real world—transforming the way you shop, learn, create and experience what’s around you. For artists and creators, AR can be used as an outlet for artistic expression and a way for fans to explore and interact with their content in a new way.

Earlier this year, we partnered with recording artist Childish Gambino to create an AR version of himself in Playground, a creative mode in the Pixel camera. The Playmoji looks and feels lifelike as it dances and reacts to you in your photos and videos. Today, Childish Gambino fans can try his new multiplayer AR app called PHAROS AR and journey through his universe to the tune of his latest sounds.

The experience begins with the opening of an AR portal. Walk through it to explore an augmented cave where you can find and interact with hidden glyphs while still being able to see out into the real world.

After finding all the hidden glyphs, your journey continues to more worlds throughout Childish Gambino’s universe. You can go on the adventure alone, or share the experience with friends as you view and interact with visual elements simultaneously.

A screenshot of a neon pink walkway within the PHAROS app.

The app is built with ARCore, Google’s developer platform for building AR experiences, and Unity, a real-time 3D development platform. With ARCore, developers can build apps that blend the digital and physical worlds—creating experiences that bring what you see on your phone into your actual surroundings. PHAROS AR uses ARCore’s Cloud Anchors API for the multiplayer experience across Android and iOS, so you can use it along with your friends regardless of your device.

A garden with palm trees and characters within the PHAROS app.

Put on your headphones and download PHAROS AR on Android now (coming soon to iOS) as you step inside Childish Gambino’s world with AR.

Suit up with Marvel Studios’ Avengers: Endgame and Pixel

Want to defeat a villain like Thanos and save the world?


Now #teampixel can, with a little help from Marvel Studios and Playground, a creative mode in the Pixel camera that gives you the power to create and play with the world around you using augmented reality. Just in time for the upcoming release of Marvel Studios’ “Avengers: Endgame,” in theaters April 26, today we’re adding to our collection of Playmoji from the Marvel Cinematic Universe with five new characters: War Machine, Thor, Black Widow, Rocket and Captain Marvel.


The heroes join Iron Man, Captain America, Hulk, Nebula and Okoye in Playground, so now you can make even more epic scenes come to life by adding the interactive characters to your photos and videos. Thanks to ARCore’s motion tracking, light estimation and ability to understand the real world, the Playmoji look and feel lifelike, and react to your facial expressions in real time.

You don’t need superhero strength or a suit of armor to unleash the power of Playground—you just need a Pixel and the newest Playmoji joining the Marvel Studios’ Avengers: Endgame pack.


For some added fun, we reimagined the Marvel Cinematic Universe by exploring what would happen if Pixel 3 was launched into a world in need of a little assistance, alongside the Avengers.

Pixel 3 + Marvel Studios’ Avengers: Endgame

So whether you’re suiting up to defeat Thanos or getting ready to supercharge your selfie, start saving the world alongside your favorite Playmoji using Playground today. Show us how you’re assembling to defeat Thanos on social with #pixelendgame.

Explore millennia of human inventions in one exhibition

New inventions have fueled fantasies and shaped human society—from the first stone tools to robotic arms, steam engines to jet propulsion, pieces of paper to the internet, and hieroglyphics to emoji. Take the telescope, for example. Today, the Hubble Space Telescope orbits 340 miles above the Earth, capturing crisp images of 10,000 galaxies that are up to 13 billion years old. The telescope was born in 1608 from Dutch spectacle-maker Hans Lippershey's design, and Galileo Galilei later improved it, then pointed it at the sky.

Today, we’re celebrating the objects dreamt up and created by inventors, scientists and dreamers. Thanks to over 110 institutions, as well as dedicated curators and archivists from 23 countries around the world, you can explore millennia of human progress in Once Upon a Try, now available on Google Arts & Culture. With over 400 interactive collections, it’s the largest online exhibition about inventions and discoveries ever created.

In addition to the exhibition, you can download a “Big Bang” augmented reality app, which we developed in collaboration with CERN, the European Organization for Nuclear Research. In the app, you’ll embark on an epic 360-degree journey through the birth and evolution of the universe. With Tilda Swinton as your guide, witness the formation of the very first stars and watch planet Earth take shape in the palm of your hand. Using Google’s machine learning, you can also explore NASA's vast archive of 127,000 historic images with a new tool called NASA's Visual Universe. See the history of discoveries and missions, or search for a term to learn more about the space agency. You can also tour the Space Shuttle Discovery—based in the Smithsonian National Air and Space Museum—in 360 degrees, with the astronauts who once called it home as your hosts.

A demonstration of the Big Bang augmented reality app, narrated by Tilda Swinton.

Within the Once Upon a Try exhibition, you can dive into Street View to tour the sites of great discoveries, from the deep underground of CERN to the high-in-the-sky International Space Station. Zoom into 200,000 artifacts in high definition, like the first map of the Americas and Saturn and its 62 moons. Get the lowdown on big inventions (from emoji to the toilet), or hear five inspirational scientists talk about superpowers—like shapeshifting—that are being created through science. Meet the Einsteins and Curies, or learn more about champions behind the scenes—like Chewang Norphel, the man single-handedly combating climate change with artificial glaciers, or Mary Anning, the pioneering female paleontologist who discovered the pterodactyl.

Woven through the exhibition are tales of lucky accidents, epic fails and even people who died for their projects—like Röntgen’s fluke discovery of x-rays, Isaac Peral’s ingenious electric submarine that never launched and Marie Curie’s quest to find polonium, which led to her own death from radioactive poisoning. Despite these setbacks, human endeavor is a never-ending journey—and few things are as exhilarating as that “eureka” moment when everything falls into place. Get all the tips you need to become an inventor, and learn why it’s important to embrace failure through the stories of pioneers like Ada Lovelace, Mae Jemison and Chien-Shiung Wu.

We hope this tribute to human discovery inspires a new generation of creators to be curious, to seek what lies beyond the known and to try something new. Explore “Once Upon a Try” on Google Arts & Culture or via our iOS or Android app and join the conversation on #OnceUponaTry.

New UI tools and a richer creative canvas come to ARCore

Posted by Evan Hardesty Parker, Software Engineer

ARCore and Sceneform give developers simple yet powerful tools for creating augmented reality (AR) experiences. In our last update (version 1.6) we focused on making virtual objects appear more realistic within a scene. In version 1.7, we're focusing on creative elements like AR selfies and animation as well as helping you improve the core user experience in your apps.

Creating AR Selfies

Example of 3D face mesh application

ARCore's new Augmented Faces API (available on the front-facing camera) offers a high quality, 468-point 3D mesh that lets users attach fun effects to their faces. From animated masks, glasses, and virtual hats to skin retouching, the mesh provides coordinates and region specific anchors that make it possible to add these delightful effects.

You can get started in Unity or Sceneform by creating an ARCore session with the "front-facing camera" and Augmented Faces "mesh" mode enabled. Note that other AR features such as plane detection aren't currently available when using the front-facing camera. AugmentedFace extends Trackable, so faces are detected and updated just like planes, Augmented Images, and other trackables.

// Create an ARCore session that supports Augmented Faces, for use in Sceneform.
public Session createAugmentedFacesSession(Activity activity) throws UnavailableException {
  // Use the front-facing (selfie) camera.
  Session session = new Session(activity, EnumSet.of(Session.Feature.FRONT_CAMERA));
  // Enable Augmented Faces.
  Config config = session.getConfig();
  config.setAugmentedFaceMode(Config.AugmentedFaceMode.MESH3D);
  session.configure(config);
  return session;
}

Animating characters in your Sceneform AR apps

Another way version 1.7 expands the AR creative canvas is by letting your objects dance, jump, spin and move around with support for animations in Sceneform. To start an animation, initialize a ModelAnimator (an extension of the existing Android animation support) with animation data from your ModelRenderable.

void startDancing(ModelRenderable andyRenderable) {
  AnimationData data = andyRenderable.getAnimationData("andy_dancing");
  animator = new ModelAnimator(data, andyRenderable);
  animator.start();
}

Solving common AR UX challenges in Unity with new UI components

In ARCore version 1.7 we also focused on helping you improve your user experience with a simplified workflow. We've integrated "ARCore Elements" -- a set of common AR UI components that have been validated with user testing -- into the ARCore SDK for Unity. You can use ARCore Elements to insert AR interactive patterns in your apps without having to reinvent the wheel. ARCore Elements also makes it easier to follow Google's recommended AR UX guidelines.

ARCore Elements includes two AR UI components that are especially useful:

  • Plane Finding - streamlining the key steps involved in detecting a surface
  • Object Manipulation - using intuitive gestures to rotate, elevate, move, and resize virtual objects

We plan to add more to ARCore Elements over time. You can download the ARCore Elements app from the Google Play Store to learn more.

Improving the User Experience with Shared Camera Access

ARCore version 1.7 also includes UX enhancements for the smartphone camera -- specifically, the experience of switching in and out of AR mode. Shared Camera access in the ARCore SDK for Java lets users pause an AR experience, access the camera, and jump back in. This can be particularly helpful if users want to take a picture of the action in your app.

More details are available in the Shared Camera developer documentation and Java sample.

Learn more and get started

For AR experiences to capture users' imaginations they need to be both immersive and easily accessible. With tools for adding AR selfies, animation, and UI enhancements, ARCore version 1.7 can help with both these objectives.

You can learn more about these new updates on our ARCore developer website.

Roses are red, violets are blue: six Pixel camera tips, just for you

No matter what your plans are this Valentine’s Day, you’ll probably end up taking a few photos to celebrate or capture the moment—and that's where Pixel's camera comes in. Pixel 3's camera has tools that can help you capture and get creative with your V-Day photos. Here are six tips for our beloved #teampixel.

1. Virtual Valentines

Playground is a creative mode in the Pixel camera that helps you create and play with the world around you. You can send a virtual Valentine, or make your photos and videos stand out with the new Love Playmoji pack and two sticker packs. Capture and celebrate the love in the air today and year-round with interactive hearts, fancy champagne glasses, animated love notes or lovebirds.


2. A V-Day Vision

Your Valentine always stands out to you. So make them the center of focus with Portrait Mode, and watch as the background fades into a beautiful blur… just like the world does when you’re together.

3. Mood Lighting

Romantic dinner date? Use Night Sight to capture the mood when the lights are dim. Pixel’s camera can capture the highlights of your Valentine’s celebrations, even in low light.

Night Sight

4. Picture Perfect Palentines

If you’re celebrating with your Palentines, Group Selfie Cam on Pixel 3 gives everyone the love they deserve in your group selfie.

5. Search at First Sight

The technology that lets you search what you see is baked right into Pixel 3’s camera. See a shirt that would look great on your Galentine? Use Google Lens to find something similar online. Want to know what that flower is in your bouquet? Use Google Lens to identify it. Making a last-minute dinner reservation at that restaurant on the corner? Use Google Lens suggestions to dial the number on their sign with just a tap in the Pixel camera.

Lens

6. Sharing is Caring

With unlimited original quality photo and video storage using Google Photos on Pixel 3, you can snap as many shots as you want. From there, you can turn them into a movie or set up a live album, so you can relive (and share) your favorite Palentines’ moments year-round with your friends.

So whether you’re celebrating Valentine’s, Palentine’s or Galentine’s day, Pixel 3’s camera can help you capture your favorite moments with your favorite people.

Childish Gambino dances into Playground on Pixel

Playground gives you the power to create and play through your Pixel camera using augmented reality. You can bring your photos and videos to life by adding interactive characters called Playmoji to what’s around you, and now there’s a new Playmoji inspired by recording artist Childish Gambino. You can add him to your photos or videos by simply pointing your camera and dropping him into the scene.


Examples of Playmoji inspired by recording artist Childish Gambino. 

We worked closely with Childish Gambino and his music video choreographer, Sherrie Silver, to make sure the Playmoji’s dance moves rival those of Childish Gambino himself. By using ARCore’s motion tracking, light estimation, and ability to understand the real world, his Playmoji looks and feels lifelike, whether he’s in front of you or in a selfie next to you. He even reacts to your facial expressions in real time thanks to machine learning—try smiling or frowning in selfie mode and see how he responds.


The Childish Gambino Playmoji pack features unique moves that map to three different songs: “Redbone,” “Summertime Magic,” and “This is America.” Pixel users can start playing with them today using the camera on their Pixel, Pixel XL, Pixel 2, Pixel 2 XL, Pixel 3 and Pixel 3 XL.

Google Pixel 3 | Playmoji Dance-Off

Think you have moves too? We want to bring you in on the fun so we're inviting #teampixel to a dance challenge. When you use the Childish Gambino Playmoji, bust out your best dance moves alongside him. The #pixeldanceoff is on—get moving!