Dancing Tango in 3D

Hi everybody,

a new announcement video just hit the web, showing off Google's latest progress on 3D reconstruction with mobile devices.
Johnny Lee (well known for his Wiimote and mixed-reality hacks, and for his time at Microsoft before heading over to big G) starts off by introducing Project Tango, which lives inside Google's ATAP team (yes, exactly those badass research folks with the "We like epic shit" motto).


This approach lets you reconstruct your living room, or basically anything, in 3D simply by walking around with your smartphone.
This should enable developers to offer not only way more accurate tracking but also better mixed-reality experiences. Let's have a look at what they say:

Project Tango is an exploration into giving mobile devices a human-scale understanding of space and motion.

The demo looks pretty impressive (well, they are Google developers and also have a big PR team!!) and I guess we can expect many things to come. You can imagine the impact on AR, but let’s quote another bit from their page:

What if you could capture the dimensions of your home simply by walking around with your phone before you went furniture shopping? What if directions to a new location didn’t stop at the street address? What if you never again found yourself lost in a new building? What if the visually-impaired could navigate unassisted in unfamiliar indoor places? What if you could search for a product and see where the exact shelf is located in a super-store?

Imagine playing hide-and-seek in your house with your favorite game character, or transforming the hallways into a tree-lined path. Imagine competing against a friend for control over territories in your home with your own miniature army, or hiding secret virtual treasures in physical places around the world?


Google has teamed up with universities, research labs, and industrial partners over the last year to put together this package, which is now ready to try out. The prototype runs on a 5″ Android phone and includes development APIs that provide position, orientation, and depth data to standard Android applications written in Java or C/C++; the Unity Game Engine is included for rendering. Google wants to hand out 200 hardware/dev-kit demo devices to interested developers by March 14th. So everybody can apply and let Google collect our ideas.
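The actual APIs haven't been published yet, so just to make the idea concrete, here is a purely hypothetical sketch of what an app consuming pose and depth data from such a service might look like. All class and method names below (TrackingService, Pose, DepthFrame, etc.) are made up for illustration and are not the real Tango API.

```java
// Purely illustrative sketch -- NOT the actual Tango API.
// Idea: a motion-tracking service pushes 6-DoF poses and depth frames
// to the app via a listener, and the app feeds them to its renderer.

import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;

public class PoseSketch {

    /** A 6-DoF device pose: position in meters, orientation as a quaternion. */
    static class Pose {
        final double[] translation; // {x, y, z} in meters
        final double[] rotation;    // quaternion {x, y, z, w}
        final long timestampNs;     // when the pose was estimated

        Pose(double[] t, double[] r, long ts) {
            this.translation = t;
            this.rotation = r;
            this.timestampNs = ts;
        }
    }

    /** A depth frame: 3D points measured by the depth sensor. */
    static class DepthFrame {
        final float[] points;   // packed {x, y, z} triplets in the camera frame
        final long timestampNs;

        DepthFrame(float[] points, long ts) {
            this.points = points;
            this.timestampNs = ts;
        }
    }

    /** Callbacks an app would register to receive tracking data. */
    interface TrackingListener {
        void onPoseAvailable(Pose pose);
        void onDepthAvailable(DepthFrame frame);
    }

    /** Stand-in for the device-side tracking service. */
    static class TrackingService {
        private final List<TrackingListener> listeners = new CopyOnWriteArrayList<>();

        void register(TrackingListener l) { listeners.add(l); }

        // In a real system the sensor pipeline would call these; here we
        // expose them so the example can be driven by hand.
        void publishPose(Pose p)        { for (TrackingListener l : listeners) l.onPoseAvailable(p); }
        void publishDepth(DepthFrame f) { for (TrackingListener l : listeners) l.onDepthAvailable(f); }
    }

    public static void main(String[] args) {
        TrackingService service = new TrackingService();

        // An AR app would hand poses to its virtual camera and depth points to a mesh builder.
        service.register(new TrackingListener() {
            @Override public void onPoseAvailable(Pose pose) {
                System.out.printf("pose: x=%.2f y=%.2f z=%.2f%n",
                        pose.translation[0], pose.translation[1], pose.translation[2]);
            }
            @Override public void onDepthAvailable(DepthFrame frame) {
                System.out.println("depth points: " + frame.points.length / 3);
            }
        });

        // Fake one update of each kind to show the data flow.
        service.publishPose(new Pose(new double[]{0.1, 0.0, 1.5},
                                     new double[]{0, 0, 0, 1}, System.nanoTime()));
        service.publishDepth(new DepthFrame(new float[]{0.2f, 0.1f, 2.0f}, System.nanoTime()));
    }
}
```

However the real API ends up looking, the interesting bit for developers is that continuous pose plus depth streams on a phone would remove the need for external tracking hardware entirely.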

Having a 3D representation of the world and its features would dramatically help outdoor/urban augmented reality take off. As with metaio's Augmented City approach, we can imagine cool applications with accurate visual overlays and positional information. Since mobile GPUs certainly aren't getting any slower, we might finally see large-scale AR deployed sooner rather than later! Still, it would be even better to see an open standard arise along the way: just as the Open Geospatial Consortium is trying to do this for GPS-positional data in (simpler) location-based services, I'd love to see the same for 3D representations, so we aren't bound to the data centers of Google or anyone else. Obviously, many people would not want their living rooms scanned and uploaded to the web…

In any case: enjoy the cutting-edge shit!