Happy Holidays everyone! Have fun playing!

April 17th, 2014

Hey everyone! Time to drop the keyboard for a while and get out there egg hunting and enjoying some good food offline with your loved ones (if you happen to celebrate Easter).


But if your 21st-century children (or you yourself) would rather play with some digital goods, you could also take a look at the new app by Berlin start-up Toywheel: they are kicking off a virtual car game, bringing the physical and the digital world together once more:

I like the looks of it and the cutely designed elements. Racing games in AR have been done before, but I still like this concept very much (I once implemented my own AR racing game with spaceships). It is a winning combination: shrunken objects (like cars) are still convincing when declared as toys, since we already know them at miniature scale. The developers say it is meant to be a more open approach where kids can play freely, without a scripted need for winning or racing – giving them back the open-ended play of matchbox cars all those years ago. A nice twist. Let’s see how they evolve the concept.

What I would still love to see is direct manipulation of the track via separate (independent) markers: just pick up the marker for a ramp and move it around to change the track. That would create a much deeper interaction between real and virtual! Let’s wait for more!

… but what I don’t understand is why the kid in the video is so obsessed with food (prominent donut eating and pizzas…). At least it fits our Easter theme. :-)

Enjoy and happy holidays!

Transparent bonnets to safely drive the rocky road

April 11th, 2014

A nice new concept video, “Land Rover Reveals Transparent Bonnet Concept”, hit the web and received some attention: the car’s bonnet appears transparent to the driver thanks to a visual overlay via a HUD, showing the camera-recorded ground underneath. Land Rover briefly describes it as: Land Rover reveals world’s first Transparent Bonnet Concept allowing a new level of driver awareness with a ‘see-through’ augmented reality view of the terrain ahead, making the front of the car ‘virtually’ invisible from inside the cabin.


The concept video shows it nicely, so no more words are needed to show the possible (optimistic) outcome:

Why optimistic? Because it is still only a concept, and it is not proven to really look like the video from the driver’s perspective. First, the ground needs to be recorded with one or more cameras with a reasonable field of view and exposure time, to get solid visual quality while still matching the viewer’s perception of the environment. This image then needs to be warped to the viewer’s point of view. Second, the user needs a good (non-distracting) overlay on the windshield (via some HUD setup), which might sacrifice further visual quality (in contrast to the looks of the concept video). The car company might derive a solid estimate of the driver’s head position from the current seat position, etc., so it might give reasonable results, assuming no bigger head movements by the driver…
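The warping step at the heart of this is a planar homography: the ground patch seen by the bonnet camera gets re-projected into the driver’s view. Here is a minimal numpy sketch of that idea – the corner correspondences below are made-up illustrative values, not anything from Land Rover; in the real system they would come from camera calibration and the estimated head position:

```python
import numpy as np

def homography_from_points(src, dst):
    """Estimate the 3x3 homography mapping src -> dst from 4 point pairs (DLT)."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y])
        b.extend([u, v])
    h = np.linalg.solve(np.array(A, dtype=float), np.array(b, dtype=float))
    return np.append(h, 1.0).reshape(3, 3)

def warp_point(H, pt):
    """Map a single 2D point through homography H (homogeneous divide)."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return (x / w, y / w)

# Hypothetical corners of the ground patch in the bonnet camera image...
cam_corners = [(0, 0), (640, 0), (640, 480), (0, 480)]
# ...and where those corners should appear in the driver's HUD view.
hud_corners = [(120, 300), (520, 300), (600, 470), (40, 470)]

H = homography_from_points(cam_corners, hud_corners)
```

A production system would of course warp the whole camera image (and re-estimate the warp as the head moves), not single points – this only shows why a wrong head-position estimate directly distorts the overlay.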

Blair MacIntyre commented in more depth on these issues of geometric distortion and viewpoint registration in his blog, which is worth reading if you are interested in this tricky part.

So, results might not look like the video anytime soon, but in general it’s an interesting concept and idea!

Turn your world into a game

April 8th, 2014


Want to run around your house or office playing first-person shooters with your friends in a mixed reality setup?
The fellows from 13th Lab (who e.g. brought you Minecraft Reality) are pitching this idea on Kickstarter under the name “Rescape”. 8 days to go, so it is high time to take a look!

Their solution is a piece of hardware (the rifle-styled mount for your smart-device) plus their software SDK, including 3D scanning and tracking components. Quoted: Using advanced computer vision and specialized optics to track your movements, Rescape lets you digitize the real world, pull any canvas you want over it, and then have you and your friends blow it to pieces!


So, to be finicky, their approach is more of a mixed reality than a visual augmentation. But the immersive experience within your real environment looks damn promising (in their promo videos).

They want to offer the Rescape SDK plus the game controller with specialized optics (with a strap-on 180° fisheye lens) and a mount for your smartphone. Using a SLAM approach, they plan to scan and reconstruct the real building and to deliver full six-degree-of-freedom (6DOF) positional tracking in real time.

Their SDK would be 100% free, offering a C interface and Unity support for first demos. It will be available on iOS devices first, with Android to come later; a first release to Kickstarter backers is planned for September.

To make it real fun, they want to support multiplayer with friends – even replacing them with avatars. This looks a bit too good to be true to me. But let’s see how it goes and whether they can raise the missing money. Fingers crossed!

More technical videos and a multiplayer demo can be seen on their Kickstarter page. Worth a look!


Project Tango Peanut hands-on

March 27th, 2014

The first developers are receiving their Project Tango devices from Google. Omar Soubra is among the first to get one, and he has now posted his initial experiences – covering a lot for a first trial. Unboxed, this is what you get:


The device is already labeled “Project Peanut”, which seems to be the name of the updated developer prototype. Omar is immediately impressed, calling the device “like magic”, and draws a funny comparison to start off with:

PEA·NUT /ˈpiːnʌt/ Noun
1. A mobile device that will change the world of mobile computer vision forever.
2. Tasty seed of a South American plant.

Omar also seems like a good pick by Google to try it out, as he has worked with lots of 3D LIDAR scanner systems before. He already has a baseline scan of his complete house and compares it to Project Tango Peanut’s results.

In his posted video below you can also see how the live depth map works out of the box. Still pretty impressive, though Omar also mentions a few problematic areas in his post. Nevertheless, a great start for a dev kit, and we can expect more to come:


Within the Google Play store there seems to be a newly created dedicated Peanut category for device owners. Can’t wait to see more documentation from (lucky) users pop up.

Please visit Omar’s full article on Makezine via the link below:

Hands On: Project Tango, Google’s 3D-Scanning Phone for Makers


Adrenaline shock while waiting for the bus

March 21st, 2014

Hi everybody,

a new AR marketing campaign has hit the streets of London. I have to admit: I’m impressed by the simple but great idea! A fizzy drink company had the right feeling for a good tech-supported ad: the “Unbelievable Bus Shelter”. People typically doze off at bus stops while waiting, and they have time to look around from a (more or less fixed) position – sitting at just the right perspective.


The concept uses the window-to-the-world approach to convert a bus stop billboard wall into a seemingly simple glass wall (obviously by means of a camera). At first nothing crazy is going on and people see the real cars and pedestrians passing by in the “glass” window… but then it gets interesting:

Great AR story! It’s not often that I say: I hope they ship this ad over to Germany!

Have a nice weekend!

Augmented Pools for our pleasure

March 14th, 2014

I like pools. Especially in the summer, but also playing billiards indoors. There was an old AR demo way back that visualized the perfect angle and bounces as a helper guide. But now there is another neat package, enhancing the boooring static world with real-time visuals!


The cool thing about it: it is open source and can be used and extended by everybody. OpenPool uses two Kinects attached to the ceiling. These watch the locations of the balls, and a projector overlays whatever fancy graphics you might come up with.

Additional components are available to detect balls dropping into the pockets and to detect collisions via microphones.
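For illustration, locating the balls from an overhead depth camera boils down to a simple threshold-plus-blob step. This is my own toy numpy sketch of that idea, not OpenPool’s actual Kinect code, and all depth values (in millimetres) are made up:

```python
import numpy as np

def ball_centroids(depth, table_depth, ball_height=40, tol=10):
    """Find ball centers in an overhead depth image (mm): pixels sitting
    roughly one ball-height above the table are grouped into blobs."""
    mask = np.abs((table_depth - depth) - ball_height) < tol
    labels = np.zeros(depth.shape, dtype=int)
    centroids, next_label = [], 1
    for seed in zip(*np.nonzero(mask)):
        if labels[seed]:
            continue  # already part of a found blob
        labels[seed] = next_label
        stack, pixels = [seed], []
        while stack:  # flood-fill one connected blob
            y, x = stack.pop()
            pixels.append((y, x))
            for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                if (0 <= ny < depth.shape[0] and 0 <= nx < depth.shape[1]
                        and mask[ny, nx] and not labels[ny, nx]):
                    labels[ny, nx] = next_label
                    stack.append((ny, nx))
        ys, xs = zip(*pixels)
        centroids.append((sum(ys) / len(ys), sum(xs) / len(xs)))
        next_label += 1
    return centroids

# Synthetic scene: flat table at 2000 mm with two "balls" 40 mm above it.
depth = np.full((60, 60), 2000.0)
depth[10:14, 10:14] = 1960.0
depth[40:44, 30:34] = 1960.0
print(ball_centroids(depth, 2000.0))
```

The projector overlay then simply draws its graphics at those (suitably calibrated) centroid positions, frame by frame.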


Check out the different videos on YouTube; here is a short one:

Neat stuff! Looking forward to seeing it live on a night out!

New Meta 2.0 hands-on for sculpting

February 27th, 2014

News comes in from Meta and their Augmented Reality Glasses 2.0.

TechCrunch had the chance for a hands-on trial and shows us the demo session on video. Colleen Taylor talks with Meta founder and CEO Meron Gribetz and tries out the stereo AR glasses with gesture interaction:


It looks like a solid demo with stable tracking and hand interaction. Unfortunately we cannot see the user’s point of view; we can only infer some neat-looking imagery from her gasps and wows… Having a large field of view straight in the center of your vision with high opacity (for non-black parts) sounds really promising.

The glasses claim sixteen times the screen size of Google Glass and will cost $3,000. A dev kit comes for $1,667. Pre-orders are being taken now on SpaceGlasses.com.

Let’s hope for more footage soon to judge! These glasses definitely look like a solid approach for wearing with ease during work sessions, and we can imagine many cool things to build for them with gestures… Cool!

(If the embedded video does not play in your country, try the link to TechCrunch.)

Dancing Tango in 3D

February 21st, 2014

Hi everybody,

a new announcement video hit the world, showing off the latest progress from Google on 3D reconstruction on mobile devices.
Johnny Lee (well known for his Wiimote mixed reality projects and his later Microsoft activities before he went over to big G) starts off talking about Project “Tango” within the Google ATAP team (exactly: those badass research guys with the “We like epic shit” claim).


This approach lets you reconstruct your living room – or basically anything – in 3D by just walking around with your smartphone.
This will enable developers to offer far more accurate tracking, but also better mixed realities. Let’s have a look at what they say:

Project Tango is an exploration into giving mobile devices a human-scale understanding of space and motion.

The demo looks pretty impressive (well, they are Google developers, and they also have a big PR team!!), and I guess we can expect many things to come. You can imagine the impact on AR, but let’s quote another bit from their page:

What if you could capture the dimensions of your home simply by walking around with your phone before you went furniture shopping? What if directions to a new location didn’t stop at the street address? What if you never again found yourself lost in a new building? What if the visually-impaired could navigate unassisted in unfamiliar indoor places? What if you could search for a product and see where the exact shelf is located in a super-store?

Imagine playing hide-and-seek in your house with your favorite game character, or transforming the hallways into a tree-lined path. Imagine competing against a friend for control over territories in your home with your own miniature army, or hiding secret virtual treasures in physical places around the world?


Google has teamed up with universities, research labs, and industrial partners over the last year to come up with this package, which is now ready to try out. The prototype runs on a 5″ Android phone and includes development APIs that provide position, orientation, and depth data to standard Android applications written in Java or C/C++. The Unity game engine is included for rendering. Google wants to give out 200 hardware dev kit demo devices to interested developers by March 14th. So everybody can apply and let Google collect our ideas.

Having a 3D representation of the world and its features would dramatically help outdoor/urban augmented reality take off. As with metaio’s Augmented City approach, we can imagine cool applications with accurate visual overlays and positional information. Since mobile GPUs aren’t getting any slower, we might finally see large-scale AR deployed sooner rather than later! Nevertheless, it would be even better to see an open standard arise along the way… just as the Open Geospatial Consortium is trying to do for GPS positional data in (simpler) location-based services, I’d love to see the same for 3D representations, so we are not bound to the data centers of Google or others. Obviously, many people would not want their living rooms scanned and uploaded to the web…

In any case: enjoy the cutting edge shit!

No more markers

February 10th, 2014

Hi everybody,

back after some busy days out! Let’s jump into the new week with a short demo video on some of the latest tracking technology!

tracking by the 3D object itself

Currently, when talking about vision-based tracking, it’s all about 3D feature maps (built with algorithms like SLAM or PTAM, etc.) that rely on stable color details in the environment to work.

This comes with restrictions when you consider changing lighting conditions during operation or reflective surfaces. Both result in an unstable environment, making it hard for the tracking to work well (or at all).
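To see why such trackers depend on stable intensity details, here is a tiny Harris-style corner score in plain numpy – a simplified sketch of the kind of feature detection these systems build on, not any specific product’s code. On a synthetic bright square the response peaks at the corners, goes negative along edges, and vanishes on flat regions – exactly the structure that changing lighting and reflections destroy:

```python
import numpy as np

def window_sum(a):
    """3x3 box sum via shifted additions (stand-in for Gaussian smoothing)."""
    p = np.pad(a, 1)
    h, w = a.shape
    return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3))

def harris_response(img, k=0.05):
    """Harris corner response from the windowed image structure tensor."""
    gy, gx = np.gradient(img.astype(float))
    Sxx = window_sum(gx * gx)
    Syy = window_sum(gy * gy)
    Sxy = window_sum(gx * gy)
    return Sxx * Syy - Sxy ** 2 - k * (Sxx + Syy) ** 2

# Synthetic scene: a bright square on a dark background.
img = np.zeros((32, 32))
img[8:24, 8:24] = 1.0
R = harris_response(img)  # strong at the square's corners, zero on flat areas
```

If lighting shifts or reflections wash out these intensity structures between frames, the detected features (and with them the 3D feature map) become unreliable – which is the opening for edge-based approaches.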

Different companies are working on solutions to overcome these limitations, either with edge-based initialization or by doing the entire tracking based on edges rather than color features.

I saw such a solution at an early stage at ISMAR 2011 in Basel, from a research team at CEA in France. Now those results have found their way into a fresh company offering AR solutions built on this tracking technology:

Diotasoft uses CAD data of the to-be-tracked real model to initialize, align and track the object. This could free us all from an annoying initialization process and let us start off directly – with a known piece of furniture, etc. serving as reference and target object at the same time. This would be great and would boost easy AR setup by a big leap! Let’s take a look:

What I really like is the last part of the video, where a 3D representation of the object is created on the fly. The overall quality looks pretty stable, though I haven’t seen it live yet. Let’s wait for a live demo to judge!

Hope to see it soon!

Coming up

  • The Mobile World Congress will take place in Barcelona on February 24th–27th.
    Hope to see some cool mobile AR demos hitting the web from there!
  • The augmented reality Stammtisch in Munich will next take place on March 4th! This time we want to switch places and opt for the Augustiner close to Hackerbrücke (north side). It fits the logo better, and so does the beer! Details to follow on the ARMUC subpage and in the social space (Facebook, LinkedIn and Xing groups)!
AR Stammtisch next Tuesday!

January 15th, 2014

Hey everybody!

Don’t forget and feel free to join our next Stammtisch! It will happen next Tuesday (Jan 21st). Bring your AR-interested buddies, demos or the latest AR gossip along to the Wassermann at 7.30pm! Looking forward to seeing you there!

All the details can be found on the AR MUC subpage as usual.