
Shopping made simple with Tango and WayfairView

Posted by Sophie Miller, Tango Business Development

Window shopping and showrooms let us imagine what that couch might look like in our living room or if that stool is the right height, but Tango can help take out the guesswork using augmented reality. Place virtual furniture in your real room, walk around, and try different colors.

Tango-enabled apps like WayfairView make it easy to visualize and rearrange new furniture in your home. We sat down with the Wayfair team to learn more about their app and see how Tango helps power new AR shopping experiences:

Google: Please tell us about your Tango app.

Mike: Wayfair offers a massive selection of products online. We believe that the ability for customers to visualize products in their living space augments our online experience and solves real customer problems such as “Will this product fit in my space?” and “Will this match the rest of my environment?”

Why are you excited for your customers to start using WayfairView?

One of the biggest barriers that online shopping poses is the inability for a customer to get a good sense of how a product would fit in their room, and what it would look like in their living space. With WayfairView, we aim to help our customers better visualize our products - going above and beyond a flat, 2D image and providing them with an accurate 3D rendering of what the full-size item could look like in their home. Not only is this a great extension of the customer experience, it’s also a practical approach to figure out how the product fits into the user’s space before ordering it.


How did you get started developing for Tango?

I signed up to buy a dev kit in 2014 because I was personally interested in scanning 3D objects and environments. I ended up using it for a hackathon to build the first prototype of what is now WayfairView. One of my teammates, Shrenik Sadalgi, has always been interested in AR technology and had participated in Tango hackathons in years prior. He thought this particular flavor of AR, i.e. markerless AR in the form factor of a mobile device, had the potential to provide a seamless, easy user experience for Wayfair customers.

Was there something unique to the Tango platform that made it particularly appealing?

AR technology has been around for a while, but Tango is making it accessible by providing the technology in a way that is user friendly. Specifically, the Tango platform excels in accurate tracking, which allowed Wayfair’s R&D team to focus on building a great experience for our customers. No markers, no HMDs, no cords that can get tangled, but still powerful.

What were some of the challenges you faced building for Tango?

The biggest challenge Wayfair faces with AR technology is more about the experience than the device, which is in big part thanks to Tango. Our goal was to introduce an entirely new way of shopping for furniture in a way that is user friendly. Not having to worry about the inner workings of Tango helped us focus on making the furniture look as real as possible, scaling the app with our massive catalog, and getting to market in a short period of time.

What surprised you during the Tango development process?

The learning curve for Tango was minimal. We were able to get started very quickly using example code. It was pretty remarkable how the stability of the platform (primarily the tracking) kept improving over the period of time that we worked on the app.

Which platform did you build your Tango app on, and why?

We wrote the core of the app using Unity in C#, but we wanted all the 2D UI to be in native Android to match the Wayfair native Android experience. This also gave us the opportunity to re-use code from the existing Wayfair Android app. We saw significant performance improvements by using native Android to create the 2D UI as well, which also makes the UI easier to update when the next UI theme of Android comes along.

What features can customers look forward to in a future WayfairView update?

We would love to add the ability to search for products by space: imagine drawing a cube in your real space and finding all products that fit the space. We also want to allow users to stack virtual products on top of each other to help them visualize how a virtual table lamp would look on top of a virtual table. Of course, we also want to make the products look even more real and add more products that can be visualized on WayfairView.

How do you think that this will change the way people shop for household goods?

WayfairView makes it easier than ever for customers to visualize online goods in their home at full scale, giving them an extra level of confidence when making an online purchase. We believe Tango has the potential to become a ubiquitous technology, just like smartphone cameras and mobile GPS. Ultimately, we anticipate that this will further accelerate the shift from brick and mortar to online.

We also imagine that WayfairView will be a very useful tool for our designers as they share their design proposal and vision with their customers.

Tango developer workshop brings stories to life

Posted by Eitan Marder-Eppstein, Senior Software Engineer for Tango

Technology helps us connect and communicate with others -- from sharing commentary and photos on social media to posting a video with breaking news, digital tools enable us to craft stories and share them with the world.

Tango can enhance storytelling by bringing augmented reality into our surroundings. Recently, the Tango team hosted a three-day developer workshop around how to use this technology to tell incredible stories through mobile devices. The workshop included a wide range of participants, from independent filmmakers and developers to producers and creatives at major media companies. By the end of the workshop, a number of new app prototypes had been created. Here are some of the workshop highlights:

  • The New York Times experimented with ways to connect people with news stories by creating 3D models of the places where the events happened.
  • The Wall Street Journal prototyped an app called ViewPoint to bring location-based stories to life. When you’re in front of a monument, for example, you can see AR content and pictures that someone else took at that site.
  • Line experimented with bringing 3D characters to life. For example, app users could see AR superheroes in front of them, and then their friends could jump into the characters’ costumes.
  • Google’s Mobile Vision Team brought music to life by letting people point their phones at various objects and visualize the vibrations that music makes on them.

We even had an independent developer use Tango to create a real-time video stabilization tool. We’re looking forward to seeing these apps—and many more—come to life. If you want to start building your own storytelling and visual communication apps for augmented reality, check out our developer page and join our G+ community.

Adding a bit more reality to your augmented reality apps with Tango

Posted by Sean Kirmani, Software Engineering Intern, Tango

Augmented reality scenes, where a virtual object is placed in a real environment, can surprise and delight people whether they’re playing with dominoes or trying to catch monsters. But without support for environmental lighting, these virtual objects can stick out rather than blend in with their environments. Ambient lighting should bleed onto an object, real objects should be seen in reflective surfaces, and shade should darken a virtual object.

Tango-enabled devices can see the world like we do, and they’re designed to bring mobile augmented reality closer to real reality. To help bring virtual objects to life, we’ve updated the Tango Unity SDK to enable developers to add environmental lighting to their Tango apps. Here’s how to get started:

Let’s dive in!

Before we begin, you’ll need to download the Tango Unity SDK. Then you can follow the steps below to make your reality a little brighter.

Step 1: Create a new Unity project and import the Tango SDK package into the project.

Step 2: Create a new scene. If you need help with this, check out the solar system tutorial from a previous post. Then you’ll add Tango Manager and Tango AR Camera prefabs to your scene and remove the default Main Camera game object. Also remove the artificial directional light. We won’t need that anymore. After doing this, you should see the scene hierarchy like this:

Step 3: In the Tango Manager game object, you’ll want to check Enable Video Overlay and set the method to Texture and Raw Bytes.

Step 4: Under Tango AR Camera, look for the Tango Environmental Lighting component. Make sure that the Enable Environmental Lighting checkbox is checked.

Step 5: Add the game object that you’d like to be environmentally lit to the scene. In our example, we’ll be using a pool ball. So let’s add a new Sphere.

Step 6: Let’s create a new material for our sphere. Go to Create > Material. We’ll be using our environmental lighting shader on this object. Under Shader, select Tango > Environmental Lighting > Standard.

Step 7: Let’s add a texture to our pool ball and tweak our smoothness parameter. The higher the smoothness, the more reflective our object becomes. Rougher objects have softer, more diffuse lighting that spreads over the surface of the object. You can download the pool_ball_texture and import it into your project.
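To build intuition for what the smoothness slider is doing, here is a toy single-light calculation in plain Python (a Blinn-Phong-style sketch for illustration, not Tango’s actual shader): higher smoothness shifts energy out of soft diffuse shading and into a tight specular highlight.

```python
import math

def shade(normal, light_dir, view_dir, smoothness):
    """Toy single-light shading: higher smoothness sharpens the specular
    highlight and reduces the diffuse share (Blinn-Phong style).
    All vectors are unit-length 3-tuples; returns a scalar intensity."""
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    # Diffuse term: Lambert's cosine law.
    diffuse = max(0.0, dot(normal, light_dir))
    # Specular term: half-vector raised to a power that grows with smoothness.
    half = [l + v for l, v in zip(light_dir, view_dir)]
    length = math.sqrt(dot(half, half))
    half = [h / length for h in half]
    shininess = 2.0 ** (smoothness * 10.0)   # rough = wide, smooth = tight
    specular = max(0.0, dot(normal, half)) ** shininess
    return (1.0 - smoothness) * diffuse + smoothness * specular

# Viewed away from the mirror direction, a smoother ball looks darker
# because its highlight is concentrated elsewhere.
n, l, v = (0.0, 1.0, 0.0), (0.0, 1.0, 0.0), (1.0, 0.0, 0.0)
rough, smooth = shade(n, l, v, 0.1), shade(n, l, v, 0.9)
```

The same trade-off is what you see on the device: a rough ball is evenly lit from every angle, while a smooth one picks up sharp reflections of the environment.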

Step 8: Add your new material to your sphere, so you have a nicer looking pool ball.

Step 9: Compile and run the application again. You should be able to see the environmentally lit pool ball now!

You can also follow our previous post to place your pool ball on surfaces. You don’t have to worry about your sphere rolling off your surface. Here are some comparison pictures of the pool ball with a static artificial light (left) and with environment lighting (right).

We hope you enjoyed this tutorial combining the joy of environmental lighting with the magic of AR. Stay tuned to this blog for more AR updates and tutorials!

We’re just getting started!

You’ve just created a more realistically lit pool ball that lives in AR. That’s a great start, but there’s a lot more you can do to make a high-performance smartphone AR application. Check out our Unity example code on GitHub (especially the Augmented Reality example) to learn more about building a good smartphone AR application.

Schell Games gives popular games a twist with Tango

Posted by Justin Quimby, Senior Product Manager Tango

At Tech World last month, our team showed off some of the latest Tango-enabled games. One crowd favorite was Domino World by Schell Games, which will be available on the first Tango-enabled device, Lenovo’s Phab 2 Pro, coming this fall. Schell Games has adapted a few classic games, including Jenga, into smartphone augmented reality, and their developers shared their experience and the considerations they kept in mind as they gave dominoes a new twist.

Google: How did your team first hear about Tango technology?

Schell Games: The Tango team invited us to their Game Developer Workshop, where we learned about Tango and the types of apps we could develop for this platform.

Google: You took a classic game, and added AR elements. How did you come to dominoes?

Schell Games: At the Game Developer Workshop, we prototyped three games: a racing game, Jenga and a pet game. Of the three games, people connected the most with Jenga.

People loved sharing a device to play the game together—and they loved that they didn’t have to pick up all the Jenga pieces when the game was over! And from a developer perspective, Jenga was great as it highlighted Tango’s ability to recognize surfaces.

Based on how much people liked Jenga, we decided that Domino World would be our second game. Domino World gives players all the fun of dominoes, but without the setup effort or mess. We were inspired by YouTube videos where people of all ages were doing really creative things with dominoes. Our goal was to bring that experience to the phone as an immersive and fun augmented-reality experience.

Google: Which Tango features did you use in Jenga and Domino World?

Schell Games: We used motion tracking, which lets people walk around their dominoes or Jenga tower. We also used surface detection with the depth camera, so that the device recognizes when objects are placed on a surface.

Google: How does your development approach differ for AR apps versus standard mobile apps?

Schell Games: With Domino World, for example, our approach to augmented reality thrives on reinforcing the feeling that the player’s display is a “window on the world.” Toys and dominoes are (virtually) placed on the actual surfaces around the player, and the game’s controls aid players in manipulating objects in the space in front of them. As a result, the player is naturally encouraged to move around as they view, adjust and otherwise shape their ever-growing creations.

In contrast, traditional touchscreen controls largely work with metaphors of interacting with the screen’s image itself -- drawing on it, pinch-zooming it, etc. As a result, a more traditional touchscreen-controlled Domino World could have influenced players to remain more static and work with the existing view, as opposed to moving around to different vantage points.

Google: We noticed that you use a landscape orientation for Domino World. How did you decide to take that approach?

Schell Games: The decision to use landscape orientation for Domino World is the result of multiple smaller reasons all put together:

  • Many new players have a tendency to initially build wider versus deeper (possibly due to an instinctive desire to be able to more easily access their domino runs).
  • UI controls at the edges of a landscape layout minimize HUD overlap when working with wider versus deeper runs.
  • A landscape orientation naturally places players’ hands at the device’s corners, which makes for a more stable grip during gameplay.

Google: What surprised you the most while building with Tango?

Schell Games: We were quite surprised at how easy it was to build with the Tango SDK and add Tango functionality to our apps. We used the Unity Engine which made the whole process quite seamless. It took us just over two weeks to build Jenga and 10 weeks to build Domino World from beginning to end.

Google: How do you think Tango will change the way people play games?

Schell Games: Tango makes it easy to play AR games. You don’t need to print and cut out AR trackers or markers to place throughout your room to help orient the phone. Instead, your phone always knows where it is in relation to the AR objects and you can easily start playing—whether you’re in a living room or on a bus. It’s incredible to have this experience with just your mobile device.

Bringing virtual cats to your world with Project Tango

Posted by Jason Guo, Developer Programs Engineer, Project Tango

Project Tango brings augmented reality (AR) experiences to life. From the practical to the whimsical, Project Tango apps help place virtual objects -- anything from new living room furniture to a full-sized dinosaur -- into your physical world.

Last month we showed you how to quickly and easily make a simple solar system in AR. But if you are ready for something more advanced, the tutorial below describes how to use Project Tango’s depth APIs to associate virtual objects with real world objects. It also shows you how to use a Tango Support Library function to find the planar surface in an environment.

So what’s our new tutorial project? We figured that since cats rule the Internet, we’d place a virtual cat in AR! The developer experience is designed to be simple -- when you tap on the screen, the app creates a virtual cat based on real-world geometry. You then use the depth camera to locate the surface you tapped on, and register (place) the cat in the right 3D position.

Bring on the cats!

Before you start, you’ll need to download the Project Tango Unity SDK. Then you can follow the steps below to create your own cats.

Step 1: Create a new Unity project and import the Tango SDK package into the project.

Step 2: Create a new scene. If you don’t know how to do this, look back at the solar system tutorial. Just like the solar system project, you’ll use the Tango Manager and Tango AR Camera in the scene and remove the default Main Camera gameobject. After doing this, you should see the scene hierarchy like this:

Step 3: Build and run once, making sure the application shows the video feed from Tango’s camera.

Step 4: Enable the Depth checkbox on the Tango Manager gameobject.

Step 5: Drag and drop the Tango Point Cloud prefab to the scene from the TangoPrefab folder.

Tango Point Cloud includes a bunch of useful functions related to the point cloud, including finding the floor, transforming the point cloud into Unity world space, and rendering debug points. In this case, you’ll use the FindPlane function to find a plane based on the touch event.
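To get a feel for what a plane-finding function like FindPlane computes from the point cloud, here is a rough sketch in plain Python (a crude averaged-cross-product estimate for illustration, not Tango’s actual implementation): given a batch of 3D points, estimate the plane’s center and its normal.

```python
import math

def fit_plane(points):
    """Estimate a plane through 3D points: the centroid, plus a normal
    averaged from cross products of point-pair differences. A crude
    stand-in for a least-squares plane fit, not Tango's code."""
    n = len(points)
    c = (sum(p[0] for p in points) / n,
         sum(p[1] for p in points) / n,
         sum(p[2] for p in points) / n)
    nx = ny = nz = 0.0
    for i in range(0, n - 1, 2):
        a = [points[i][j] - c[j] for j in range(3)]
        b = [points[i + 1][j] - c[j] for j in range(3)]
        # Cross product of two in-plane vectors points along the normal.
        v = [a[1] * b[2] - a[2] * b[1],
             a[2] * b[0] - a[0] * b[2],
             a[0] * b[1] - a[1] * b[0]]
        if v[1] < 0:                  # flip so samples agree (floor: up)
            v = [-x for x in v]
        nx, ny, nz = nx + v[0], ny + v[1], nz + v[2]
    length = math.sqrt(nx * nx + ny * ny + nz * nz)
    return c, (nx / length, ny / length, nz / length)

# Points sampled from the plane y = 0 (a floor).
floor = [(x * 0.1, 0.0, z * 0.1) for x in range(-5, 6) for z in range(-5, 6)]
center, normal = fit_plane(floor)
```

The real depth camera gives you exactly such a batch of points around the touch location, and the recovered center and normal are what you need to place an object flush against the surface.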

Step 6: Create a UI Controller gameobject in the scene. To do this, click the “Create” button under the Hierarchy tab, then click “Create Empty.” The UI Controller will be the hosting gameobject to run your UIController.cs script (which you’ll create in the next step).

Step 7: Select the UI Controller gameobject, then click “Add Component” in the Inspector window to add a C# script named KittyUIController.cs. KittyUIController.cs will handle the touch event, call the FindPlane function, and place your kitty into the scene.

Step 8: Double click on the KittyUIController.cs file and replace its contents with the following code:

using UnityEngine;
using System.Collections;

public class KittyUIController : MonoBehaviour
{
    public GameObject m_kitten;
    private TangoPointCloud m_pointCloud;

    void Start()
    {
        m_pointCloud = FindObjectOfType<TangoPointCloud>();
    }

    void Update()
    {
        if (Input.touchCount == 1)
        {
            // Trigger the place-kitten function when a single touch ends.
            Touch t = Input.GetTouch(0);
            if (t.phase == TouchPhase.Ended)
            {
                PlaceKitten(t.position);
            }
        }
    }

    void PlaceKitten(Vector2 touchPosition)
    {
        // Find the plane.
        Camera cam = Camera.main;
        Vector3 planeCenter;
        Plane plane;
        if (!m_pointCloud.FindPlane(cam, touchPosition, out planeCenter, out plane))
        {
            Debug.Log("cannot find plane.");
            return;
        }

        // Place the kitten on the surface, and make it always face the camera.
        if (Vector3.Angle(plane.normal, Vector3.up) < 30.0f)
        {
            Vector3 up = plane.normal;
            Vector3 right = Vector3.Cross(plane.normal, cam.transform.forward).normalized;
            Vector3 forward = Vector3.Cross(right, plane.normal).normalized;
            Instantiate(m_kitten, planeCenter, Quaternion.LookRotation(forward, up));
        }
        else
        {
            Debug.Log("surface is too steep for kitten to stand on.");
        }
    }
}
Notes on the code

Here are some notes on the code above:

  • m_kitten is a reference to the Kitten gameobject (we’ll add the model in the following steps).
  • m_pointCloud is a reference to the TangoPointCloud script on the Tango Point Cloud gameobject. We need this reference to call the FindPlane method on it.
  • We assign the m_pointCloud reference in the Start() function.
  • In the Update() function, we check the touch count and its state, and act when a single touch has ended.
  • We invoke the PlaceKitten(Vector2 touchPosition) function to place the cat into 3D space. It queries the main camera’s location (in this case, the AR camera), then calls the FindPlane function based on the camera’s position and the touch position. FindPlane returns an estimated plane from the touch point, and we place the cat on the plane if it’s not too steep. As a note, the FindPlane function is provided in the Tango Support Library. You can visit TangoSDK/TangoSupport/Scripts/TangoSupport.cs to see all of its functionality.
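The cross-product math inside PlaceKitten can be checked outside Unity. The sketch below mirrors the same basis construction with generic vectors in plain Python (handedness conventions aside): up is the plane normal, right is perpendicular to both up and the camera’s forward, and the resulting forward is the camera’s heading projected into the plane, which is why the cat appears to face the viewer.

```python
def cross(a, b):
    """Standard 3D cross product of two 3-tuples."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def normalize(v):
    length = (v[0] ** 2 + v[1] ** 2 + v[2] ** 2) ** 0.5
    return (v[0] / length, v[1] / length, v[2] / length)

def kitten_basis(plane_normal, camera_forward):
    """Same construction as PlaceKitten: build an orthonormal basis
    whose up is the plane normal and whose forward follows the camera."""
    up = plane_normal
    right = normalize(cross(up, camera_forward))
    forward = normalize(cross(right, up))
    return up, right, forward

# Camera tilted down at a floor whose normal points straight up:
# the forward axis lands in the floor plane, along the camera's heading.
up, right, forward = kitten_basis((0.0, 1.0, 0.0), (0.0, -0.5, 1.0))
```

Because forward is built from two cross products, it is guaranteed to be perpendicular to the plane normal, so the cat stands upright on the surface no matter how the camera is tilted.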
Step 9: Put everything together by downloading the kitty.unitypackage, which includes a cat model with some simple animations. Double click on the package to import it into your project. In the project folder you will find a Kitty prefab, which you can drag and drop to the Kitten field on the KittyUIController.

Step 10: Compile and run the application again. You should be able to tap the screen and place kittens everywhere!

We hope you enjoyed this tutorial combining the joy of cats with the magic of AR. Stay tuned to this blog for more AR updates and tutorials!

A final note on this tutorial
So you’ve just created virtual cats that live in AR. That’s great, but from a coding perspective, you’ll need to follow some additional steps to make a truly performant AR application. Check out our Unity example code on Github (especially the Augmented Reality example) to learn more about building a good AR application. Also, if you need a refresher, check out this talk from I/O around building 6DOF games with Project Tango.

Travel through space with the Project Tango app, Solar Simulator

Posted by Jason Guo, Developer Programs Engineer, Project Tango

Since most of us haven’t been to space, it’s often hard to grasp concepts like the vastness of the Solar System or the size of the planets. To make these concepts more tangible, three graduate students at San Francisco State University (SFSU)--Jason Burmark, Moses Lee and Omar Shaikh--have created Solar Simulator, a new app for Project Tango. The app lets people take a virtual walk through space to understand the size and scale of our solar system.

Created with the Unity SDK, the application lays out our solar system’s planets in their relative distances from each other and draws 3D models of them in their relative sizes. The app leverages Project Tango’s motion-tracking API to track your movements as you walk, so you can better understand the planets and their distance in space.
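The core layout idea is easy to sketch in plain Python (a hypothetical illustration; the app’s actual scale factor isn’t published): place each planet along one axis at its real relative distance in astronomical units, multiplied by however many meters per AU fit the room you’re standing in.

```python
# Relative orbital distances in astronomical units (Earth = 1 AU).
AU_DISTANCES = {
    "Mercury": 0.39, "Venus": 0.72, "Earth": 1.0, "Mars": 1.52,
    "Jupiter": 5.20, "Saturn": 9.58, "Uranus": 19.2, "Neptune": 30.05,
}

def layout_positions(meters_per_au):
    """Place each planet along one axis at its scaled distance, the way
    a walkable AR solar system might lay them out. The scale factor is
    whatever makes the scene fit the space you have."""
    return {name: au * meters_per_au for name, au in AU_DISTANCES.items()}

# With 0.5 m per AU, Neptune ends up about 15 m away -- a short walk,
# while the four inner planets cluster within the first meter.
positions = layout_positions(0.5)
```

That clustering is exactly the intuition the app conveys: walking the same scaled distances makes the emptiness of the outer solar system tangible.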

If you like what you see, you can create your own solar system at home. Just follow the six steps below:

  1. Download the Tango Unity SDK.
  2. Create a new Unity project and import the Tango SDK package into the project. If you don’t already have the Tango SDK, you can download it here.
  3. Assuming that you are building a solar simulation, place a sphere at (0, 0, 2) to simulate a planet floating in space. The screen will look like this:
  4. Next, replace the Main Camera with the Tango AR Camera and connect the Tango Manager through the prefabs. To do this, first remove the Main Camera gameobject from the scene. Then drag in the Tango AR Camera and Tango Manager from the TangoPrefabs folder under Project. The scene hierarchy will look like this:
  5. On the Tango Manager gameobject, there are several Tango startup configurations: knobs that control how Tango runs in the application session, e.g., turning depth or motion tracking on or off. In this case, check the boxes for Auto-connect to service, Enable motion tracking (with Auto Reset), and Enable video overlay (with the TextureID method).
  6. To get your code ready for AR on a Tango-enabled device, build and run the project. To do this, follow the “Change the Build Settings” and “Build and run” sections in this tutorial.

Here is what the final scene should look like from the device:

If you want a guided tour of the planets with Solar Simulator, developers Jason, Moses, and Omar will be demoing their app at San Francisco’s California Academy of Sciences’ NightLife tonight at 6:30PM PT. You can also download Solar Simulator on your Project Tango Development Kit.

Project Tango heads to GDC with new updates and games

Posted by Alex Lee, Program Manager, Project Tango

This week we’re headed to the Game Developer Conference in San Francisco—it’s a great opportunity to get inspired, engage with new technology and learn from each other.

In fact, one of the things we’ve heard from game developers in particular is that it’d be awesome to have a plugin for Unreal. We agree, which is why we’re excited to announce that Opaque has just released the beta version of their Project Tango plugin! Stop by our booth at GDC to learn more (Zone 1, Booth 612), and to see all the newest Project Tango games in action. We’re also hosting two talks we think you’ll like:

  • Wednesday March 16th, 11-11:20 am: Jesse Schell, CEO of Schell Games, discusses building with Tango and demonstrates his company’s apps
  • Thursday March 17th, 12:30 -12:50 pm: Randall Eike, President of Eike Consulting and Daniel Winkler, Lead Programmer at Iguanabee introduce two new Tango-powered games, Raise and Slingshot Islands

In the meantime, enjoy an in-depth interview with game developers Danielle Swank and Jim Fleming of Barking Mouse Studio, and Josh Lee of Floor is Lava. The three recently met up at Gamenest, and created a brand new Project Tango game, World of Orbles.

Tell us about The World of Orbles

Danielle: The World of Orbles is a real-time strategy game that deals with fictional creatures called Orbles. Orbles are very smart and can build really amazing things, like trans-dimensional portals, but they also have a tendency to get distracted and forget to double check their math. So when their trans-dimensional portal malfunctions and leaves them stranded, it’s no surprise to anyone. Chaos ensues as you try to get the Orbles back to their home dimension. By using Project Tango AR features, you can see the Orbles right in your physical environment.

What got you excited about developing for Project Tango?

Jim: Mixed reality is really compelling. Literally placing something you’ve created into the real world and being able to walk around it feels like the future.

Josh: A lot of my work involves playing in physical spaces, and Project Tango's ability to map a real room and blend digital objects with it opens up all kinds of new possibilities for gameplay.

How do you think Project Tango enhances your user’s experience?

Josh: Giving users the ability to engage with digital games the way they would with a physical space is extremely powerful. Interacting with digital objects in a more physical way – looking at them from different angles, tracking their movement, etc. – makes it possible for users to play games much more intuitively, with less fiddling around with camera controls and such.

What is your favorite Project Tango feature?

Jim: Real-time mesh generation and out-of-the-box Unity integration.

Josh: Building a digital model of a physical room in real time is pretty hard to beat.

What was your biggest hurdle getting started with Project Tango?

Danielle: Having to write our own pathfinding was an unexpected hurdle. We normally rely on Unity and static environments to provide character and NPC movements. Since Project Tango’s area mesh generation is dynamic, you can’t use it with Unity’s pathfinding. Instead we had to rely on ray casting against the depth map that Tango generates and use a modified boids algorithm to control our NPCs.
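As an illustration of the kind of steering rule Danielle describes, here is a hypothetical minimal boids-style update in plain Python (not Barking Mouse Studio’s actual code): each frame, an NPC combines a “seek the target” force with a “separate from close flockmates” force, with damping so it settles rather than oscillating.

```python
import math

def boid_step(pos, vel, neighbors, target, dt=0.1, damping=0.9):
    """One update of a minimal boids-style rule: steer toward the target,
    push away from close flockmates, damp the velocity. 2D tuples
    throughout; a real Tango game would add raycasts against the depth
    map to avoid real-world geometry as well."""
    # Seek force, proportional to the offset from the target.
    fx, fy = target[0] - pos[0], target[1] - pos[1]
    # Separation force from each neighbor closer than one unit.
    for nx, ny in neighbors:
        dx, dy = pos[0] - nx, pos[1] - ny
        dist = math.hypot(dx, dy) or 1e-6
        if dist < 1.0:
            fx += dx / dist
            fy += dy / dist
    vx = (vel[0] + fx * dt) * damping
    vy = (vel[1] + fy * dt) * damping
    return (pos[0] + vx * dt, pos[1] + vy * dt), (vx, vy)

# One boid starting at the origin, a flockmate just behind it,
# a target five units away: it drifts over and settles there.
pos, vel = (0.0, 0.0), (0.0, 0.0)
for _ in range(400):
    pos, vel = boid_step(pos, vel, neighbors=[(-0.5, 0.0)], target=(5.0, 0.0))
```

The appeal of this approach for a dynamic AR scene is that it needs no precomputed navigation mesh: every force is evaluated fresh each frame against whatever geometry is currently visible.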

What creative solution did you come up with when using Project Tango for your app?

Danielle: We had to come up with a creative solution for what happens to our characters when they move behind a real-world object, so we decided to write a glitch shader. It transitions the character from visible to hidden in a believable manner, since we can’t partially occlude game objects.
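The dissolve idea behind such a glitch shader can be sketched on the CPU in plain Python (a hypothetical stand-in for the real GPU shader): compare per-pixel noise against how occluded the character is, and discard the pixels that fall below the threshold, so the character breaks up progressively instead of being hard-clipped.

```python
import random

def dissolve_mask(occluded_fraction, width=8, height=8, seed=42):
    """Keep a pixel only when its noise value clears the occlusion
    threshold. The grid stands in for the character's pixels; a shader
    would do the same comparison per fragment against a noise texture."""
    rng = random.Random(seed)
    return [[rng.random() >= occluded_fraction for _ in range(width)]
            for _ in range(height)]

visible = dissolve_mask(0.0)    # nothing occluded: every pixel kept
hidden = dissolve_mask(1.0)     # fully occluded: every pixel discarded
```

At intermediate occlusion values the surviving pixels form a noisy, glitchy silhouette, which reads as intentional style rather than as a rendering error.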

What resources were most helpful for you?

Danielle: The Project Tango developer docs and just reading the Unity integration. There’s a lot of good information in the source code. Also, having Unity experience was really helpful.

What tip would you give a developer who wants to get started with Project Tango?

Danielle: Getting the game characters to look like they are part of the environment was tough. It’s definitely harder to make an AR game than a normal game, so be sure to give yourself enough time. That said, I think there are all sorts of possibilities for AR that you can’t do with a normal game.

Jim: You’ll have to get creative when working with AR. Some things you would do in a normal game don’t make sense in AR.

What are you most excited to see with the launch of Lenovo’s phone with Project Tango?

Danielle: I’m excited to see the devices in a lot of people’s hands. I think it’s really awesome that I can build a world that’s different than everyday reality and share it with people around the globe.

Jim: The smaller phone form factor opens up more use cases such as Cardboard.

Josh: I want to go to an AR gaming party, where all the partiers bring their Tango-enabled devices to a special location and play crazy games together.

Bringing Museu Nacional d’Art de Catalunya to Life

Posted by Larry Yang, Senior Product Manager, Project Tango

Ever been lost indoors? Us too. All the time. We brought Project Tango to Mobile World Congress, and instead of just talking about augmented reality and indoor navigation, we let attendees experience these technologies for themselves... in a museum.

Most of us love museums, but they can be difficult to navigate. It can be hard to find the location of a painting, the story of the sculpture—and equally hard to tell your friends where you are. With Project Tango, the way we experience museums can completely change by making it easier to find what (and who) you’re seeking as well as showing you the story behind the art.

To demonstrate this, we invited conference attendees into the Museu Nacional d'Art de Catalunya, carrying Project Tango developer kits loaded with GuidiGO, a museum tour app, and Glympse, a location-sharing app.

With GuidiGO, attendees could view the museum’s floor plan and follow blue dots on their screens to navigate to specific pieces of art — all without relying on Wi-Fi, GPS or beacons. They could also learn more about the art by simply holding the tablet up to each work and tapping virtual tags to reveal additional information. If people lost track of their friends during their visit, they could easily share their indoor location with their friends with the Glympse app.


We loved seeing these creative applications of Project Tango and can’t wait for people to get their hands on them with the launch of Lenovo’s consumer-ready Project Tango smartphone this summer. To make your own Project Tango app, take a look at our developer documentation and stay tuned for more Project Tango news and applications.

Get your app featured on the first smartphone with Project Tango from Lenovo

Posted by Johnny Lee, Technical Project Lead, Project Tango

Today, at CES, Lenovo announced the development of the first consumer-ready smartphone with Project Tango. By adding a few extra sensors and some computer vision software, Project Tango transforms your smartphone into a magic lens that lets you place digital information on your physical world.


*Renderings only. Not the official Lenovo device.

To support the continued growth of the ecosystem, we’re also inviting developers from around the world to submit their ideas for gaming and utility apps created using Project Tango. We’ll pick the best ideas and provide funding and engineering support to help bring them to life, as part of the app incubator. Even better, the finished apps will be featured on Lenovo’s upcoming device. The submission period closes on February 15, 2016.

All you need to do is tell us about your idea and explain how Project Tango technologies will enable new experiences. Additionally, we’ll ask you to include the following materials:

  • Project schedule including milestones for development –– we’ll reach out to the selected developers by March 15, 2016
  • Visual mockups of your idea including concept art
  • Smartphone app screenshots and videos, such as captured app footage
  • Appropriate narrative including storyboards, etc.
  • Breakdown of your team and its members
  • One pager introducing your past app portfolio and your company profile

For some inspiration, Lowe’s is developing an app where you can point your Project Tango-enabled smartphone at your kitchen to see where a new refrigerator or dishwasher might fit virtually.


Elsewhere, developer Schell Games lets you play virtual Jenga on any surface with friends. But this time, there is no cleanup involved when the blocks topple over.


There are also some amazing featured apps for Project Tango on Google Play. You can pick up your own Project Tango Tablet Development Kit here to brainstorm new fun and immersive experiences that use the space around you. Apply now!