Bye Bye Project Tango, Hello ARCore

Written by: Dawson Foushee on January 17, 2018

AR experiences center on integrating virtual content into the real world through a device’s camera. Weeks after Apple unveiled its SDK, ARKit, Google announced its own AR platform, ARCore.

You may be thinking of Google’s initial foray into AR with Project Tango; unlike Tango, however, ARCore doesn’t require specific devices with specialized sensors and cameras. Furthermore, Google recently announced that Tango will be unsupported after March 1, 2018, due to ARCore’s ability to reach a larger market.

Currently, ARCore is available on Pixel and Samsung Galaxy S8 devices running Android N and later. Technically, ARCore is still in what Google calls an “SDK preview,” meaning the SDK may still change before a full public release. More developer previews are expected before version 1.0 of ARCore ships in the coming months. Google has announced that it is bringing more OEMs (original equipment manufacturers) on board to support ARCore, and hopes to support more than 100 million devices at official launch.

ARCore supports Unreal and Unity and provides three main features to developers: motion tracking (tracking your position in the world), environmental understanding (detecting the size and location of horizontal planes), and light estimation (estimating light intensity at a location). Together, these features let a device track its position as it moves and build an internal representation of the world around it.

Motion Tracking

ARCore combines several kinds of measurements to map the world and track your device’s motion. The platform picks out visually distinct features in the camera image, referred to as “feature points,” and uses the device’s accelerometer to help estimate position over space and time. Based on these feature points, apps can determine where objects are relative to the device itself, which allows them to place new virtual objects into this space.
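To make the idea concrete, here is a toy Python sketch (not ARCore’s actual algorithm or API) of the inertial half of that process: naively double-integrating accelerometer readings to update a position estimate. Real motion tracking fuses readings like this with the visual feature points to correct drift.

```python
def integrate_motion(position, velocity, accel, dt):
    """Naive dead reckoning: double-integrate one accelerometer reading.

    This is only the inertial half of motion tracking; ARCore combines
    it with camera feature tracking, since pure integration drifts fast.
    """
    # v' = v + a*dt, then p' = p + v'*dt (simple Euler integration)
    new_velocity = [v + a * dt for v, a in zip(velocity, accel)]
    new_position = [p + v * dt for p, v in zip(position, new_velocity)]
    return new_position, new_velocity
```

Small acceleration errors integrate into large position errors over time, which is exactly why the visual feature points are needed as a correction.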

Environmental Understanding

The model of the world that ARCore creates is constantly updated with new information from the device, such as newly found feature points and groups of feature points that appear to lie on a common horizontal plane. Because ARCore relies on feature detection to determine planes, highly reflective surfaces or surfaces without distinguishing features are detected less accurately than others. Apps can use these planes to place virtual objects among real-life objects. Additionally, planes in ARCore carry other useful information, such as the center point, the bounding rectangle (the lengths of the edges that bound the plane), and the type of plane.
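As an illustration of the idea (a conceptual Python sketch, not ARCore’s implementation), horizontal plane detection can be imagined as grouping feature points that share roughly the same height, then summarizing each group with the center point and bounding-rectangle edge lengths described above:

```python
def summarize(cluster):
    """Summarize a group of coplanar points: center + bounding rectangle."""
    xs = [p[0] for p in cluster]
    ys = [p[1] for p in cluster]
    zs = [p[2] for p in cluster]
    center = (sum(xs) / len(xs), sum(ys) / len(ys), sum(zs) / len(zs))
    extent = (max(xs) - min(xs), max(zs) - min(zs))  # edge lengths
    return {"center": center, "extent": extent}

def detect_horizontal_planes(points, tolerance=0.05, min_points=3):
    """Group (x, y, z) feature points whose heights (y) nearly match.

    A rough stand-in for how clusters of coplanar feature points
    suggest a horizontal plane; real detection is far more robust.
    """
    planes = []
    cluster = []
    for p in sorted(points, key=lambda p: p[1]):
        # Start a new cluster when the height gap exceeds the tolerance.
        if cluster and p[1] - cluster[0][1] > tolerance:
            if len(cluster) >= min_points:
                planes.append(summarize(cluster))
            cluster = []
        cluster.append(p)
    if len(cluster) >= min_points:
        planes.append(summarize(cluster))
    return planes
```

Note how a lone point at a different height (or a featureless surface that yields no points at all) produces no plane, mirroring the limitation described above.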

Lastly, ARCore and ARKit both use sparse mapping to maintain an internal model of the environment. Sparse mapping is a technique that allows the device to efficiently track major features in the environment. When the app opens, new maps are initialized (as both ARCore and ARKit currently don’t support saving/reloading maps), and the maps are then iteratively improved and updated with new sensor data as the device moves around. ARKit uses a “sliding window” for its map, meaning it only saves data for a limited time/distance while throwing away old data. ARCore seems to shine in this regard by saving more data in a larger map. If the device’s internal tracking loses its sense of where it is, recovery is easier when more data already exists in the map.
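The difference between the two mapping strategies can be sketched in a few lines of Python (a loose analogy, not either SDK’s real data structure): a sliding-window map discards the oldest observations once it reaches capacity, while a persistent map keeps everything, giving relocalization more data to match against.

```python
from collections import deque

class SlidingWindowMap:
    """ARKit-style: keep only the most recent N observations."""
    def __init__(self, capacity):
        self.points = deque(maxlen=capacity)  # old entries fall off the front

    def add(self, landmark):
        self.points.append(landmark)

class PersistentMap:
    """ARCore-style (per the comparison above): retain the full map."""
    def __init__(self):
        self.points = []

    def add(self, landmark):
        self.points.append(landmark)
```

After observing ten landmarks, a four-slot sliding window remembers only the last four, while the persistent map remembers all ten; the trade-off is memory and lookup cost versus ease of recovering lost tracking.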

Light Estimation

Similar to ARKit, ARCore offers an estimate of lighting in an environment. ARCore provides a single intensity value in the Android Studio API or a shader in the Unity API, while Apple’s ARKit offers intensity and color temperature values to developers. These values allow developers to adjust the simulated lighting of AR objects to match the real scene, or even have objects react to lighting changes. Demos of both platforms’ lighting seem to produce similar results, so there isn’t a huge difference between the lighting values offered.
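For example, an app might apply the single intensity value roughly like this (a hypothetical sketch; the function and parameter names are illustrative, not ARCore’s API), dimming a virtual object’s color so it matches the estimated scene lighting:

```python
def apply_light_estimate(base_color, pixel_intensity):
    """Scale a virtual object's RGB color by the estimated scene
    light intensity (0.0 = dark, 1.0 = fully lit), clamped to 1.0.
    """
    return tuple(min(1.0, c * pixel_intensity) for c in base_color)
```

Re-applying this each frame is what lets a virtual object visibly react when the real-world lighting changes, as described above.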

Where Does ARCore Stand?

ARCore’s big advantage in the AR market is that Google has two years of experience in the AR department from Project Tango, much of which was pulled into ARCore. A significant amount of the research that went into Tango, including user testing and market testing, will be helpful for ARCore’s growth over the coming months and years.

Overall, ARCore offers very similar capabilities to ARKit and lets developers create nearly identical experiences in the mobile market. Technically speaking, there are a few minor differences in what the two SDKs offer and how they’re implemented, but nothing that will change the finished product of AR creations at this point. As discussed earlier, ARCore is currently available to Galaxy S8 and Pixel users, and Google plans to expand support to more than 100 million devices in the coming months. As more Android devices are supported, companies will continue to leverage both ARCore and ARKit on their corresponding platforms.

While we wait for ARCore’s full release to more devices, we can all have fun playing around with Google’s AR stickers on the Pixel and Pixel 2. For more information on ARCore’s SDK, check out Google’s official documentation.

To read more about AR, be sure to check out ARKit: Get ahead of the Enterprise Curve and How to Design an Engaging AR Experience on the POSSIBLE Mobile Insights blog.

Want to explore how your brand should be taking advantage of AR? Email us; we’d love to hear about your ideas and see how we could help.

Dawson Foushee

Dawson is a Software Developer at POSSIBLE Mobile, a leading mobile development agency. He graduated from Georgia Tech in 2015 with a bachelor’s degree in Computer Science, and has been developing mobile apps for 4+ years.
