Tag Archives: arcore

New UI tools and a richer creative canvas come to ARCore

Posted by Evan Hardesty Parker, Software Engineer

ARCore and Sceneform give developers simple yet powerful tools for creating augmented reality (AR) experiences. In our last update (version 1.6) we focused on making virtual objects appear more realistic within a scene. In version 1.7, we're focusing on creative elements like AR selfies and animation as well as helping you improve the core user experience in your apps.

Creating AR Selfies

Example of 3D face mesh application

ARCore's new Augmented Faces API (available on the front-facing camera) offers a high-quality, 468-point 3D mesh that lets users attach fun effects to their faces. From animated masks, glasses, and virtual hats to skin retouching, the mesh provides coordinates and region-specific anchors that make it possible to add these delightful effects.

You can get started in Unity or Sceneform by creating an ARCore session with the "front-facing camera" and Augmented Faces "mesh" mode enabled. Note that other AR features such as plane detection aren't currently available when using the front-facing camera. AugmentedFace extends Trackable, so faces are detected and updated just like planes, Augmented Images, and other trackables.

// Create an ARCore session that supports Augmented Faces, for use in Sceneform.
public Session createAugmentedFacesSession(Activity activity) throws UnavailableException {
  // Use the front-facing (selfie) camera.
  Session session = new Session(activity, EnumSet.of(Session.Feature.FRONT_CAMERA));
  // Enable Augmented Faces.
  Config config = session.getConfig();
  config.setAugmentedFaceMode(Config.AugmentedFaceMode.MESH3D);
  session.configure(config);
  return session;
}
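
Once the session is configured this way, faces come through the same Trackable machinery as planes. Here is a minimal per-frame sketch, assuming a render loop that has already called session.update() (the variable names are illustrative):

for (AugmentedFace face : session.getAllTrackables(AugmentedFace.class)) {
  if (face.getTrackingState() == TrackingState.TRACKING) {
    // Region poses give stable anchor points, e.g. for attaching hats or glasses.
    Pose noseTip = face.getRegionPose(AugmentedFace.RegionType.NOSE_TIP);
    // The 468-point mesh, as x/y/z triples relative to the face's center pose.
    FloatBuffer vertices = face.getMeshVertices();
  }
}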

Animating characters in your Sceneform AR apps

Another way version 1.7 expands the AR creative canvas is by letting your objects dance, jump, spin and move around with support for animations in Sceneform. To start an animation, initialize a ModelAnimator (an extension of the existing Android animation support) with animation data from your ModelRenderable.

private ModelAnimator animator;

void startDancing(ModelRenderable andyRenderable) {
  // Look up the animation by the name it was given in the source asset.
  AnimationData data = andyRenderable.getAnimationData("andy_dancing");
  animator = new ModelAnimator(data, andyRenderable);
  animator.start();
}

Solving common AR UX challenges in Unity with new UI components

In ARCore version 1.7 we also focused on helping you improve your user experience with a simplified workflow. We've integrated "ARCore Elements" -- a set of common AR UI components that have been validated with user testing -- into the ARCore SDK for Unity. You can use ARCore Elements to insert common AR interaction patterns in your apps without having to reinvent the wheel. ARCore Elements also makes it easier to follow Google's recommended AR UX guidelines.

ARCore Elements includes two AR UI components that are especially useful:

  • Plane Finding - streamlining the key steps involved in detecting a surface
  • Object Manipulation - using intuitive gestures to rotate, elevate, move, and resize virtual objects

We plan to add more to ARCore Elements over time. You can download the ARCore Elements app available in the Google Play Store to learn more.

Improving the User Experience with Shared Camera Access

ARCore version 1.7 also includes UX enhancements for the smartphone camera -- specifically, the experience of switching in and out of AR mode. Shared Camera access in the ARCore SDK for Java lets users pause an AR experience, access the camera, and jump back in. This can be particularly helpful if users want to take a picture of the action in your app.
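
A rough sketch of the setup in the Java SDK (the app-side callback, background handler, and CameraManager are assumed to exist, as in any Camera2 app):

// Create a session that shares camera access between ARCore and Camera2.
Session session = new Session(activity, EnumSet.of(Session.Feature.SHARED_CAMERA));
SharedCamera sharedCamera = session.getSharedCamera();
String cameraId = session.getCameraConfig().getCameraId();

// Wrap the app's CameraDevice.StateCallback so ARCore sees camera events too,
// then open the camera as usual. The app can now pause the AR session, issue
// its own capture requests, and resume AR afterwards.
CameraDevice.StateCallback wrappedCallback =
    sharedCamera.createARDeviceStateCallback(appCallback, backgroundHandler);
cameraManager.openCamera(cameraId, wrappedCallback, backgroundHandler);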

More details are available in the Shared Camera developer documentation and Java sample.

Learn more and get started

For AR experiences to capture users' imaginations they need to be both immersive and easily accessible. With tools for adding AR selfies, animation, and UI enhancements, ARCore version 1.7 can help with both these objectives.

You can learn more about these new updates on our ARCore developer website.

Roses are red, violets are blue: six Pixel camera tips, just for you

No matter what your plans are this Valentine’s Day, you’ll probably end up taking a few photos to celebrate or capture the moment—and that's where Pixel's camera comes in. Pixel 3's camera has tools that can help you capture and get creative with your V-Day photos. Here are six tips for our beloved #teampixel.

1. Virtual Valentines

Playground is a creative mode in the Pixel camera that helps you create and play with the world around you. You can send a virtual Valentine, or make your photos and videos stand out with the new Love Playmoji pack and two sticker packs. Capture and celebrate the love in the air today and year-round with interactive hearts, fancy champagne glasses, animated love notes or lovebirds.


2. A V-Day Vision

Your Valentine always stands out to you. So make them the center of focus with Portrait Mode, and watch as the background fades into a beautiful blur… just like the world does when you’re together.

3. Mood Lighting

Romantic dinner date? Use Night Sight to capture the mood when the lights are dim. Pixel’s camera can capture the highlights of your Valentine’s celebrations, even in low light.


4. Picture Perfect Palentines

If you’re celebrating with your Palentines, Group Selfie Cam on Pixel 3 gives everyone the love they deserve in your group selfie.

5. Search at First Sight

The technology that lets you search what you see is baked right into Pixel 3’s camera. See a shirt that would look great on your Galentine? Use Google Lens to find something similar online. Want to know what that flower is in your bouquet? Use Google Lens to identify it. Making a last-minute dinner reservation at that restaurant on the corner? Use Google Lens suggestions to dial the number on their sign with just a tap in the Pixel camera.


6. Sharing is Caring

With unlimited original quality photo and video storage using Google Photos on Pixel 3, you can snap as many shots as you want. From there, you can turn them into a movie or set up a live album, so you can relive (and share) your favorite Palentines’ moments year-round with your friends.

So whether you’re celebrating Valentine’s, Palentine’s or Galentine’s day, Pixel 3’s camera can help you capture your favorite moments with your favorite people.

Childish Gambino dances into Playground on Pixel

Playground gives you the power to create and play through your Pixel camera using augmented reality. You can bring your photos and videos to life by adding interactive characters called Playmoji to what’s around you, and now there’s a new Playmoji inspired by recording artist Childish Gambino. You can add him to your photos or videos by simply pointing your camera and dropping him into the scene.


Examples of Playmoji inspired by recording artist Childish Gambino. 

We worked closely with Childish Gambino and his music video choreographer, Sherrie Silver, to make sure the Playmoji’s dance moves rival those of Childish Gambino himself. By using ARCore’s motion tracking, light estimation, and ability to understand the real world, his Playmoji looks and feels lifelike, whether he’s in front of you or in a selfie next to you. He even reacts to your facial expressions in real time thanks to machine learning—try smiling or frowning in selfie mode and see how he responds.


The Childish Gambino Playmoji pack features unique moves that map to three different songs: “Redbone,” “Summertime Magic,” and “This is America.” Pixel users can start playing with them today using the camera on their Pixel, Pixel XL, Pixel 2, Pixel 2 XL, Pixel 3 and Pixel 3 XL.

Google Pixel 3 | Playmoji Dance-Off

Think you have moves too? We want to bring you in on the fun so we're inviting #teampixel to a dance challenge. When you use the Childish Gambino Playmoji, bust out your best dance moves alongside him. The #pixeldanceoff is on—get moving!

Six ways to take Playground home for the holidays with Pixel

In October, we launched Playground on the Pixel 3 and Pixel 3 XL, giving you the power to create and play with the world around you through your camera. Playground helps you bring more of your imagination to your photos and videos with interactive Playmoji—characters that react to each other and to you—and tell a richer story by adding animated stickers and creative captions. Starting today, Playground is available on Pixel, Pixel XL, Pixel 2 and Pixel 2 XL, so now all of #teampixel can join in on the fun.

Just in time for the holidays, we’re also introducing festive new Playmoji and stickers soon that can help bring your photos and videos to life. Whether you’re celebrating at home or hitting the road, here are six ways you can take Playground home for the holidays with the Pixel camera.

1. Share your journey. Whether your holiday travels take you away on a plane, train or automobile, getting there is all the fun with the speedy new Travel Playmoji pack. Document your adventures from the window seat, or spice up snaps from your road trip.


2. Send a virtual postcard. Wish loved ones a happy holiday from wherever you are with Playmoji, stickers and captions. Or say it all in a selfie by posing with characters who react to you. Once you’ve lined up the perfect shot, sharing with Playground is easy—just a few taps straight from the camera.


3. Let your creativity snow. Brrr! Chillier temperatures mean a chance to use snowy Weather Playmoji and the Winter Playmoji pack during your favorite seasonal activities like ice skating and hockey.


4. Put your spin on a scene. Oh, dreidel, dreidel, dreidel! We made you out of…pixels. Hanukkah may be over, but you can use these themed stickers year-round, complete with dancing menorah candles, spinning dreidels and latkes galore.


5. Deck the halls. Make an old tradition new again by challenging your family to a tree decorating contest with Playground. One tree, endless possibilities!


6. Tell an imaginative story. Add some flair to holiday scenes with the jolly new “Christmas Cheer” stickers. Throw a Santa hat on your friend, stick mistletoe where it belongs or place a one-of-a-kind “gift” under the tree.


You can spread the holiday cheer with Playground this season by sharing your creations with #teampixel. We can’t wait to see what you create!

Creating More Realistic AR experiences with updates to ARCore & Sceneform

Posted by Ashish Shah, Product Manager, Google AR & VR

The magic of augmented reality is in the way it blends the digital and the physical worlds. For AR experiences to feel truly immersive, digital objects need to look realistic -- as if they were actually there with you, in your space. This is something we continue to prioritize as we update ARCore and Sceneform, our 3D rendering library for Java developers.

Today, with the release of ARCore 1.6, we're bringing further improvements to help you build more realistic and compelling experiences, including better plane boundary tracking and several lighting improvements in Sceneform.

With 250M devices now supporting ARCore, developers can bring these experiences to an even larger and growing user base.

More Realistic Lighting in Sceneform

Previous versions of Sceneform defaulted to a yellowish ambient light; version 1.6 defaults to neutral, white light. This aligns more closely with the way light appears in the real world, making digital objects look more natural. You can see the difference below.

Left side image: Sceneform 1.5. Right side image: Sceneform 1.6.

This change will also make objects rendered with Sceneform look as if they're affected more naturally by color and lighting in the surrounding environment. For example, if you're viewing an AR object at sunset, it would appear to be illuminated by the red and orange hues, just like real objects in the scene.
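
Sceneform's ArFragment applies ARCore's light estimate to the scene automatically. If you configure the session yourself, a minimal sketch of enabling the estimate looks like this (AMBIENT_INTENSITY is ARCore's default mode, shown explicitly for clarity):

Config config = session.getConfig();
// Estimate overall intensity and color correction from the camera image so
// rendered objects can be tinted to match their surroundings.
config.setLightEstimationMode(Config.LightEstimationMode.AMBIENT_INTENSITY);
session.configure(config);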

In addition, we've updated Sceneform's built-in environmental image to provide a more neutral scene for your app. This will be most noticeable when viewing reflections in smooth metallic surfaces.

Adding screen capture and recording to the mix

To help you further improve quality and engagement in your AR apps, we're adding screen capture and recording to Sceneform. This is something a number of developers have requested to help with demo recording and prototyping. It can also be used as an external-facing feature, allowing your users to share screenshots and videos on social media more easily, which can help get the word out about your app.

You can access this functionality through the surface mirroring API for the SceneView class. The API allows you to display the Sceneform view on a device's screen at the same time it's being rendered to another surface (such as the input surface for the Android MediaRecorder).
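
A minimal recording sketch, assuming a SceneView from your layout; the MediaRecorder output path and encoder settings here are illustrative:

MediaRecorder recorder = new MediaRecorder();
recorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);
recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
recorder.setOutputFile(new File(getExternalFilesDir(null), "scene.mp4").getAbsolutePath());
recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
recorder.setVideoSize(sceneView.getWidth(), sceneView.getHeight());
recorder.prepare();
recorder.start();
// Mirror the scene to the recorder's input surface while it keeps rendering on screen.
sceneView.startMirroringToSurface(
    recorder.getSurface(), 0, 0, sceneView.getWidth(), sceneView.getHeight());

// ...later, stop mirroring before tearing down the recorder.
sceneView.stopMirroringToSurface(recorder.getSurface());
recorder.stop();
recorder.release();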

Learn more and get started

The new updates to Sceneform and ARCore are available today. With these new versions also comes support for new devices, such as the Samsung Galaxy A3 and the Huawei P20 Lite, that will join the list of ARCore-enabled devices. More information is available on the ARCore developer website.

An art gallery in your pocket: See Vermeer’s paintings in augmented reality

Over 28 years ago, two art thieves dressed as police officers made their way into the Isabella Stewart Gardner Museum in Boston and stole multiple artworks, including a Vermeer painting that was one of only 36 attributed to the artist. With an estimated value of over $200 million, “The Concert” remains one of the most expensive missing items on the FBI’s list of stolen art. With the rest of Vermeer’s masterpieces scattered across 17 collections in seven countries, people have never had the opportunity to see all of Vermeer’s works in one place. And since some of his works are now too fragile to travel, they’ll have to remain where they are indefinitely.

But now, you can experience all of Vermeer's known artworks in one place for the first time. Thanks to the Mauritshuis museum in the Netherlands and other cultural institutions guarding Vermeer’s legacy, they’re available in Pocket Gallery, a brand-new feature on the Google Arts & Culture app. Pocket Gallery uses augmented reality, so you can pull out your phone and step into a virtual exhibition space to see all of his works, curated by experts from the Mauritshuis. All 36 of his paintings—including the missing masterpiece and the famous “Girl with a Pearl Earring”—hang life-size and perfectly lit. As you step closer, you’ll see each painting in stunning detail and can learn more about each piece.

The Art Camera—our ultra-high resolution robotic camera made specifically for artworks—was deployed to several galleries around the world, creating the highest-ever resolution images of eight of Vermeer’s masterpieces for your zooming pleasure. You can also dive into “in painting tours” of each of Vermeer’s 36 works and enjoy guided insights into artworks like Girl with a Flute. In addition to Vermeer’s paintings, you’ll be able to explore several expert stories that shed light on Vermeer's art, legacy or mysterious life—for instance, you can hear from Tracy Chevalier, author of the bestseller “Girl with a Pearl Earring.”

Today, Vermeer resonates in pop culture references around the world. Justin Richburg—who recently created the character designs for Childish Gambino’s music video “Feels like Summer”—conceived an original piece of art that bridges time and cultures: “Icons” reimagines Vermeer in the 21st century, and shows how the subjects of his paintings have become icons themselves.


Icons by Justin Richburg.

You can experience Vermeer’s work in a variety of formats—whether it’s an interactive coloring book on Instagram or an original series with YouTube Creators. To see Vermeer’s paintings hanging where they currently are, you can also check out Street View photography in galleries worldwide to navigate the halls of the Frick Collection (New York) and Rijksmuseum (Amsterdam). Visit g.co/meetvermeer, join the conversation with #MeetVermeer or download the app on either iOS or Android to try out Pocket Gallery.

Bust ghosts in the newest game built with Maps—Ghostbusters World

Earlier this year, we introduced a new way for game developers to create real world games using information about the world from Google Maps. It enables game studios to easily reimagine our world as whatever they can dream up and helps them find the best places in the world for players to fuel up or start a mission.


With Ghostbusters World™, the newest game built with Google Maps, you can grab your virtual proton pack and bust ghosts—all as you explore a game world built on the Google Maps you know and love. Brought to you by Sony Pictures Consumer Products, Ghost Corps, publisher FourThirtyThree Inc. (4:33), and developer Next Age, Ghostbusters World is available for free on Google Play and the App Store now.


As a Ghostbuster, your mission is to “bust” ghosts to keep the world safe and ghost-free (just in time for Halloween, in case you’re superstitious). Lurking among 3D buildings, landmarks and parks, you’ll find hundreds of ghosts from all dimensions of the Ghostbusters franchise like Wes Pinker, Splat and Achira—in addition to fan favorites like Stay Puft and Slimer. Catch them in your proton beam to drain their energy and then capture them in your containment unit. As you advance in the game, you’ll gain access to the latest in spectral neutralization and trapping technology.


Because some ghosts are just too strong to take down on your own (would you want to face Stay Puft solo?), you can team up with nearby Ghostbusters in multiplayer boss raids. Not a team player? No problem. If competition is what you’re after, just build up your ghost team (the ghosts you capture and store in your bank) and enter battles against other Ghostbusters around the world to gain valuable resources needed to make your ghosts stronger.


For those Ghostbusters who delight in the story—not just the action—there’s an all original story mode featuring your favorite classic characters. And if you’re feeling festive (or daring) this Halloween, there’s an AR Mode (built with ARCore on Android) that lets you blur the lines between ghostly fantasy and reality.


If you want to do your part to make sure your local streets are ghost-free this Halloween, try Ghostbusters World. Download it now from Google Play and the App Store.


Source: Google LatLong


Introducing new APIs to improve augmented reality development with ARCore

Posted by Clayton Wilkinson, Developer Platforms Engineer

Today, we're releasing updates to ARCore, Google's platform for building augmented reality experiences, and to Sceneform, the 3D rendering library for building AR applications on Android. These updates include algorithm improvements that will let your apps consume less memory and CPU during longer sessions. They also include new functionality that gives you more flexibility over content management.

Here's what we added:

Supporting runtime glTF loading in Sceneform

Sceneform will now include an API to enable apps to load glTF models at runtime. You'll no longer need to convert glTF files to the SFB format before rendering. This will be particularly useful for apps that have a large number of glTF models (like shopping experiences).

To take advantage of this new functionality -- and load models from the cloud or local storage at runtime -- use RenderableSource as the source when building a ModelRenderable.

private static final String GLTF_ASSET =
    "https://github.com/KhronosGroup/glTF-Sample-Models/raw/master/2.0/Duck/glTF/Duck.gltf";

// When you build a Renderable, Sceneform loads its resources in the background while returning
// a CompletableFuture. Call thenAccept(), handle(), or check isDone() before calling get().
ModelRenderable.builder()
    .setSource(this, RenderableSource.builder().setSource(
        this,
        Uri.parse(GLTF_ASSET),
        RenderableSource.SourceType.GLTF2).build())
    .setRegistryId(GLTF_ASSET)
    .build()
    .thenAccept(renderable -> duckRenderable = renderable)
    .exceptionally(
        throwable -> {
          Toast toast =
              Toast.makeText(this, "Unable to load renderable", Toast.LENGTH_LONG);
          toast.setGravity(Gravity.CENTER, 0, 0);
          toast.show();
          return null;
        });

Publishing the Sceneform UX Library's source code

Sceneform has a UX library of common elements like plane detection and object transformation. Instead of recreating these elements from scratch every time you build an app, you can save precious development time by taking them from the library. But what if you need to tailor these elements to your specific app needs? Today we're publishing the source code of the UX library so you can customize whichever elements you need.

An example of interactive object transformation, powered by an element in the Sceneform UX Library.
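
For reference, wiring up plane taps and gesture-based transformation with the UX library takes only a few lines. A minimal sketch, assuming an ArFragment in the layout and an already-loaded ModelRenderable:

arFragment.setOnTapArPlaneListener(
    (HitResult hitResult, Plane plane, MotionEvent motionEvent) -> {
      // Anchor content where the user tapped on a detected plane.
      AnchorNode anchorNode = new AnchorNode(hitResult.createAnchor());
      anchorNode.setParent(arFragment.getArSceneView().getScene());
      // TransformableNode adds drag, pinch-to-scale, and twist-to-rotate
      // gestures without any extra code.
      TransformableNode node = new TransformableNode(arFragment.getTransformationSystem());
      node.setParent(anchorNode);
      node.setRenderable(modelRenderable);
      node.select();
    });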

Adding point cloud IDs to ARCore

Several developers have told us that when it comes to point clouds, they'd like to be able to associate points between frames. Why? Because when a point is present in multiple frames, it is more likely to be part of a solid, stable structure rather than an object in motion.

To make this possible, we're adding an API to ARCore that will assign IDs to each individual dot in a point cloud.

These new point IDs have the following elements:

  • Each ID is unique. Therefore, when the same value shows up in more than one frame, you know that it's associated with the same point.
  • Points that go out of view are lost forever. Even if that physical region comes back into view, a point will be assigned a new ID.
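
A minimal sketch of using the IDs to spot persistent points, assuming a per-frame loop that has already called session.update() (the bookkeeping Set is illustrative):

Set<Integer> previouslySeenIds = new HashSet<>();

void onFrame(Frame frame) {
  PointCloud pointCloud = frame.acquirePointCloud();
  IntBuffer ids = pointCloud.getIds();
  for (int i = 0; i < ids.limit(); i++) {
    if (previouslySeenIds.contains(ids.get(i))) {
      // Seen in an earlier frame: more likely part of solid, stable structure.
    }
    previouslySeenIds.add(ids.get(i));
  }
  pointCloud.release(); // Release promptly to avoid exhausting native resources.
}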

New devices

Last but not least, we continue to add ARCore support to more devices so your AR experiences can reach more users across more surfaces. These include smartphones as well as -- for the first time -- a Chrome OS device, the Acer Chromebook Tab 10.

Where to find us

You can get the latest information about ARCore and Sceneform at https://developers.google.com/ar/develop

Ready to try out the samples, or running into issues? Visit our projects hosted on GitHub.

Save dinosaurs from extinction in a game world built with Google Maps


Back in March we announced a product to help developers build games using the information Google Maps knows about the world around you. It enables game studios to easily reimagine our world as a medieval fantasy, a bubble gum candy land, or a zombie-infested post-apocalyptic city. It even helps them find the best places for gameplay––whether it’s a landmark where a player refuels or a park where they must go to complete a mission––no matter where in the world players are.


Just in time for summer and the release of Jurassic World: Fallen Kingdom™, you can explore a virtual world built with Google Maps while saving dinosaurs from extinction. Ludia and Universal’s Jurassic World™ Alive is available for free on Google Play and the App Store now.


Ludia used Google Maps’ rich and accurate data to create a game world where dinosaurs have escaped Jurassic World on Isla Nublar to roam freely in cities and neighborhoods around the world. As a member of the Dinosaur Protection Group, your mission is to save dinosaurs from another extinction––and you do that by exploring the world around you. Nestled among 3D buildings, roads, landmarks, and parks, you’ll find dinosaurs, track them with drones, collect DNA samples to level up, and create hybrid dinosaurs in your lab. Once you’ve assembled your own roster of prehistoric animals, you can battle other players to defend against threats to your mission.


To perform your duties as part of the Dinosaur Protection Group, you’ll need to earn rewards like in-game currency and battery life for your drone by finding supply drops––all strategically placed in fun (and appropriate) places using Google Maps’ in-depth understanding of real world places.


The game uses ARCore, Google's platform for building augmented reality experiences, to power an exciting (and terrifying!) AR mode that lets you get up close and personal with dinosaurs in your collection. To get to the AR mode, tap the collection icon at the bottom of your screen, select a dinosaur, and then tap the round AR icon on the right hand side.

So if you’re still bummed that dinosaurs went extinct millions of years ago, try Jurassic World Alive and spend some quality time with prehistoric beasts in your own natural habitat.

Creating AR Experiences for I/O: Our Process

Posted by Karin Levi, Product Marketing, ARCore

A few weeks ago at Google I/O we released a major update to ARCore, Google's AR development platform. We added new APIs like Cloud Anchors, which enable multi-user, collaborative AR experiences, and Augmented Images, which lets apps recognize 2D images and attach 3D content to them. All of these updates are going to change the way we use AR today and enable developers to create richer, more immersive AR apps.
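
To give a flavor of the Cloud Anchors API in the Java SDK, here is a minimal hosting-and-resolving sketch; how the anchor ID travels between devices (for example, through a Firebase database) is left to the app:

// Device A: host a local anchor; the upload happens in the background.
Anchor hostedAnchor = session.hostCloudAnchor(localAnchor);
// Poll hostedAnchor.getCloudAnchorState() each frame; once it reports SUCCESS,
// share hostedAnchor.getCloudAnchorId() with the other device.

// Device B: resolve the shared ID into an anchor in its own session.
Anchor resolvedAnchor = session.resolveCloudAnchor(sharedCloudAnchorId);
// Once resolved, content attached to this anchor appears in the same
// physical spot for both users.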

With these new capabilities, we decided to put our platform to the test, so we built real experiences to showcase how they come to life. All of the demos were presented at the I/O AR & VR sandbox area, and we open-sourced them to show how simple it is to build these experiences. We're pretty happy with how they turned out and would love to share some learnings and insights from behind the scenes.

Light Board - Multiplayer game

Light Board is an AR multiplayer tabletop game where two players on floating game boards launch colored projectiles at each other.

While building Light Board, it was important for us to keep in mind who the end users are. We wanted it to be a simple, fun game for developers to try out while visiting the I/O sandbox. Visitors would only have a couple of minutes to play while passing through, so the game needed to let players (even non-gamers) pick it up and play with very little setup.

The artwork for Light Board was a major focus. Our mission for the look of the game was to align with the design and decor of I/O 2018. This way, our app would feel like an extension of everything the attendees saw around them. As a result, our design philosophy had three goals: bright accent colors, simple graphic shapes, and natural physical materials.

Left: Design for AR/VR Sandbox at I/O 2018. Right: Key art for Light Board game boards.

The artwork was created in Maya and Cinema 4D. We created physically based materials for our models using Substance Painter. Just as continuous iteration is crucial for engineering, it is also important when creating art assets. With that in mind, we kept careful track of our content pipeline, even for this relatively simple project. This allowed us to quickly try out different looks and board styles before settling on our final design.

On the engineering front we selected the Unity game engine as our dev environment. Unity gives us a couple of important advantages. First, it is easy to get great looking 3D graphics up and running right away. Second, the engine component is already complete, so we could immediately start iterating on gameplay code. As with the artwork, this allowed us to test gameplay options before we made a final decision. Additionally, Unity gave us support for both Android and iOS with only a little extra work.

To handle the multiplayer aspect we used Firebase Realtime Database. We were concerned with network performance at the event, and felt that the persistent nature of a database would make it more tolerant of poor networks. As it turned out, it worked very well and we got the ability to quit and rejoin games for free!

We had a lot of fun building Light Board and we hope people can use it as an example of how easy it can be to not only build AR apps, but to use really cool features like Cloud Anchors. Please check out our open source repo and give Light Board a try!

Just a line - Draw with your friends

In March, we released Just a Line, an Android app that lets you draw in the air with your phone. It's a simple experiment meant to showcase the power of ARCore. At Google I/O, we added Cloud Anchors to the app so that two people can draw at once in the same space, even if one of them is using Android and the other iOS.

Both apps were built natively: the Android version was written in Android Studio, and the iOS version was built in Xcode. ARCore's Cloud Anchors enable Just a Line to pair two phones, allowing users to draw simultaneously in a shared space. Pairing works across Android and iOS devices, and drawings are synchronized live through a Firebase Realtime Database. You can find the open-source code for iOS here and for Android here.

Illusive Images - Art exhibition comes to life

"Illusive Images" demo is an augmented gallery consisting of 3 artworks, each exploring a different augmented image use case and user experience. As one walks from side to side, around the object, or gazes in a specific direction, 2D artworks are married with 3D, inviting the viewer to enter into the space of the artwork spanning well beyond the physical frame.

Because our augmented images were driven by visual design, we experimented a lot with creating image databases with varying numbers of features. To get the best results, we iterated quickly by resizing the canvas for the artwork and by adjusting brightness and contrast levels. These variations helped us achieve optimal images without compromising design intent.
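
For reference, a minimal sketch of assembling such a database with ARCore's Java API (the image name, Bitmap, and 0.5f physical width are illustrative; the Unity API is analogous):

AugmentedImageDatabase imageDatabase = new AugmentedImageDatabase(session);
// Supplying the physical width in meters helps ARCore estimate pose faster.
imageDatabase.addImage("artwork_1", artworkBitmap, 0.5f);

Config config = session.getConfig();
config.setAugmentedImageDatabase(imageDatabase);
session.configure(config);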

The app was built in Unity with ARCore, with the majority of assets created in Cinema 4D. Mograph animations were imported into Unity as FBX files and driven entirely by the position of the user in relation to the artwork. An example project can be found here.

To make your development experience easier, we open sourced all the demos our team built. We hope you find this useful! You can also visit our website to learn more and start building AR experiences today.