Google’s ARCore attacks Apple’s ARKit

Google watching WWDC as Apple announced ARKit: “Ooh, they launched their AR stuff without waiting for a depth sensor? Didn’t see that coming. Oh well, let’s just //-comment out the Tango calls and quickly make it another SDK!” This is not how it happened at Google. But it feels like it. Out of the blue, Google launched their new SDK called “ARCore” today. It is dedicated to mobile phone augmented reality using standard sensors – a direct reply to Apple’s freshly presented ARKit. Wow, that came as a bit of a surprise!

…but then again, it did not come as that much of a surprise. Google developers have often discussed the tracking concept behind Tango – it turns out it mostly relies on the RGB camera and IMU input anyway. So it makes sense to ship it for as many phones as possible now, right?

Funny. Google is kind of taking the reverse approach. While Apple started without IR depth sensors and might include them later, Google ran long trials with Tango and now skips it… Will it kill Tango technology? Definitely not! It seems more like a must-have step that was needed to close the gap for all developers co-developing for Android and iOS. Since competitors couldn’t wait for “full AR”, it feels like Google joined the circle so as not to lose touch with the users.

So, is this good or bad for the users? Definitely good! Well, short-term. We love AR and want to see more of it now. But it could also lead to poor experiences and a lot of crappy, less attractive demos that could eat away at the reputation of the term AR (again). Will smartphone producers still rush to include the more expensive Tango hardware? It feels like this could heavily delay the roll-out of more Tango gadgets. Or maybe it has always been the long-term plan to fill the gap between four dots on their roadmap A-B-C-D. A to B: dumb smartphone to ARCore phone; B to C: ARCore phone to Tango phone; C to D: Tango phone to AR glasses (not Google Glass).

Let’s not speculate today, but instead enjoy two videos from Google to dive into the coming smartphone fun and to see some of the more experimental things they have already done:

But why another SDK?

Couldn’t ARCore just be a subset of Tango? Or could Tango become a subset of ARCore in the future, as a long-term roadmap connecting the four dots might suggest? Well, we will see. Let’s take a brief look at the SDK while installing everything needed – it currently only works on a Pixel or a Galaxy S8 (for now; many more devices are to come). The three core capabilities:

  • Environmental Understanding – to allow virtual objects to be placed “in a way that physically connects with the real world.”
  • Motion Tracking – to allow “users to walk around and interact with virtual content that is rendered in the 3D world.”
  • Light Estimation – to create “realistic looking objects by having its own light change dynamically according to the environment lighting.”

Oh, that reminds me of something. Which is good – so let’s write some wrapper for Android and iOS and get going! As a thought experiment, such a wrapper could look like the sketch below.
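Here is a minimal sketch of what such a wrapper interface could look like in Unity C#. To be clear: every name in it is hypothetical and mine, not taken from the ARCore or ARKit SDKs. It only illustrates that both SDKs expose roughly the same concepts (tracking, light estimation, anchors), so a thin abstraction is plausible.

```csharp
using UnityEngine;

// Hypothetical cross-platform wrapper – all names are illustrative,
// not taken from either SDK.
public interface IArSession
{
    // True while the underlying SDK (ARCore or ARKit) is tracking.
    bool IsTracking { get; }

    // Estimated scene brightness, e.g. fed by ARCore's pixelIntensity.
    float LightIntensity { get; }

    // Place virtual content: both SDKs offer an anchor concept for this.
    Transform CreateAnchor(Vector3 position, Quaternion rotation);
}
```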

Google’s approach also ships a light estimation function with a float pixelIntensity. However, in the Unity documentation they describe the “EnvironmentalLight” as a “component that automatically adjust lighting settings for the scene to be inline with those estimated by ARCore” – maybe they will do something more to my shaders automatically?
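In the meantime, you can use the estimate yourself. A minimal sketch of that idea, assuming the value is reachable via Frame.LightEstimate in the GoogleARCore namespace – that is my reading of the preview SDK, and the exact entry point and casing may differ:

```csharp
using UnityEngine;
using GoogleARCore; // assumption: namespace of the ARCore Unity SDK preview

// Sketch: feed ARCore's estimated pixel intensity into a Unity light
// every frame, so virtual objects dim and brighten with the room.
public class LightEstimationFollower : MonoBehaviour
{
    public Light SceneLight;           // directional light to modulate
    public float BaseIntensity = 1.0f;

    void Update()
    {
        // pixelIntensity is the float mentioned above: a rough average
        // brightness of the camera image, usable as a light scale.
        float pixelIntensity = Frame.LightEstimate.PixelIntensity;
        SceneLight.intensity = BaseIntensity * pixelIntensity;
    }
}
```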

Their tracked-plane object “TrackedPlane” exposes position, rotation and boundaries. The plane detection seems to be limited to horizontal 2D surfaces for now (like ARKit). They also ship the anchor concept to solidly place virtual objects in a learned environment – see the sketch below.
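Putting both together, placing content on a freshly detected plane could look roughly like this. Again a hedged sketch: Frame.GetNewPlanes and Session.CreateAnchor follow my reading of the preview SDK and may differ in detail.

```csharp
using System.Collections.Generic;
using UnityEngine;
using GoogleARCore; // assumption: namespace of the ARCore Unity SDK preview

// Sketch: react to newly detected horizontal planes and drop an anchored
// object onto each one, so it stays put in the learned environment.
public class PlaneSpawner : MonoBehaviour
{
    public GameObject Prefab;   // the virtual object to place
    private List<TrackedPlane> _newPlanes = new List<TrackedPlane>();

    void Update()
    {
        Frame.GetNewPlanes(ref _newPlanes);  // planes found this frame
        foreach (TrackedPlane plane in _newPlanes)
        {
            // Anchor the object at the plane's pose (Position/Rotation are
            // the properties the post mentions; exact casing is my guess).
            Anchor anchor = Session.CreateAnchor(plane.Position, plane.Rotation);
            Instantiate(Prefab, plane.Position, plane.Rotation, anchor.transform);
        }
    }
}
```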

Talking about the environment… You get access to the point cloud data as well and can check each point from the cloud (sketched below). Pretty neat! …so, will we see an update to this environmental package including the Tango stuff?
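For illustration, a small sketch that walks the current cloud – assuming access via Frame.PointCloud with PointCount and GetPoint, which again is my reading of the preview API:

```csharp
using UnityEngine;
using GoogleARCore; // assumption: namespace of the ARCore Unity SDK preview

// Sketch: inspect every feature point of the current frame's point cloud
// and count how many lie above an arbitrary height threshold.
public class PointCloudInspector : MonoBehaviour
{
    void Update()
    {
        int above = 0;
        for (int i = 0; i < Frame.PointCloud.PointCount; i++)
        {
            Vector3 point = Frame.PointCloud.GetPoint(i); // world-space point
            if (point.y > 0.5f)
            {
                above++;
            }
        }
        Debug.Log(above + " feature points above 0.5 m in this frame.");
    }
}
```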

In any case, it is good to see Google close the gap and support Unity and Unreal on day one. Let’s see what else comes in the next weeks to update the many AR SDKs of the world! So, back to development…

Enjoy!