Increasingly, we're seeing Google's AR and VR ambitions align with its developer community, so it's no surprise ARCore and Daydream were front and center at Google I/O 2018.

Instead of a few seconds at the main keynote, Google's AR efforts spanned multiple extended sessions, with a developer hands-on space and a pair of huge playgrounds for everyone to put hands and eyes on the latest things happening in ARCore. Lines to experience these new goodies wrapped around the building, but everyone walking out of the demos had a huge smile on their face.

What was announced for ARCore at Google I/O?

After announcing the latest milestone for ARCore, support for more than 100 million devices thanks to expanded availability across many more phones, we got a taste of ARCore 1.2. This update is iterative in nature, but it sets the stage for AR apps to be used in many more places and by many more people.


Augmented Images

ARCore is seen as special because it works without needing any kind of positional marker in the real world to anchor a 3D image for you to see. No clumsy QR codes or bizarre patterns for you to point a phone at; you just create an AR experience in front of you whenever and wherever you want. And while that is cool, being able to tether an AR experience to a defined place in the world has value. Developers asked Google for some kind of image recognition system for ARCore, and it delivered with Augmented Images.

In a nutshell, Augmented Images makes it possible for developers to turn a movie poster or the front of a box on a store shelf into an ARCore experience. You point your phone, and when the app recognizes the image, the experience starts on your phone. Google has made it possible for developers to either store reference points for up to 1,000 images locally for offline modes, or use a real-time image system to call out special images for AR experiences. Users could point their phones at a movie poster and get the trailer playing inside the poster, for example.
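The local-database flow described above can be sketched with the ARCore Android SDK. This is a minimal, hypothetical setup; the image name and bitmap are placeholders, and in a real app you would load the reference image from your assets:

```java
// Sketch: registering a reference image with ARCore's Augmented Images
// database, assuming the ARCore Android SDK (com.google.ar.core).
import android.graphics.Bitmap;
import com.google.ar.core.AugmentedImageDatabase;
import com.google.ar.core.Config;
import com.google.ar.core.Session;

public class AugmentedImageSetup {
    // Registers a reference image so ARCore can recognize it in the
    // camera feed and anchor an experience to it.
    static void configureSession(Session session, Bitmap posterBitmap) {
        AugmentedImageDatabase database = new AugmentedImageDatabase(session);
        // "movie_poster" is an arbitrary name used to identify the match
        // later when ARCore reports tracked AugmentedImage instances.
        database.addImage("movie_poster", posterBitmap);

        Config config = new Config(session);
        config.setAugmentedImageDatabase(database);
        session.configure(config);
    }
}
```

Once configured, each frame's tracked images can be checked by name to decide which experience to launch.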

This has some natural benefits in advertising, but Google also sees a big benefit for educators. Being able to point a phone at an image in a textbook and see it come to life would be incredibly cool, no matter how old you are.

Cloud Anchors

Creating an AR experience using the world around you is a lot of fun, but aside from recording it with your phone, there's not a great way to share that experience with others. Cloud Anchors solves this problem by creating a way to link multiple phones to the same temporary experience in the same place. You do something fun in AR, and then you can invite your friends to either play with you or see what you see in the same space.

In all of the demos we've seen so far, Cloud Anchors work when someone places something with ARCore first. Once that ARCore experience has been "hosted" in the real world, depth and positional markers from that area are saved and made available in the cloud. When someone else points their phone at that area, those markers will line up and allow the ARCore experience to be shared. No need to be on the same wireless network and, in fact, no need to even have an Android phone. Cloud Anchors are supported in Apple's ARKit for iPhone, complete with an SDK to easily add support.
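The host-then-resolve flow described above maps to two calls in the ARCore 1.2 SDK. A minimal sketch, assuming cloud anchors have been enabled on the session's `Config`; how the anchor ID gets from one user to another (your own backend, a shared room code, etc.) is up to the app:

```java
// Sketch of the Cloud Anchors flow, assuming the ARCore Android SDK.
// The session must be configured with
// Config.CloudAnchorMode.ENABLED before hosting or resolving.
import com.google.ar.core.Anchor;
import com.google.ar.core.Session;

public class CloudAnchorSketch {
    // Host: uploads visual data around the local anchor so others
    // can find the same spot. Hosting is asynchronous; poll
    // getCloudAnchorState() each frame until it reports SUCCESS,
    // then share getCloudAnchorId() with other users.
    static Anchor host(Session session, Anchor localAnchor) {
        return session.hostCloudAnchor(localAnchor);
    }

    // Guest: recreates the anchor in the same physical space from
    // the shared ID. Also asynchronous; the anchor becomes usable
    // once its cloud anchor state reaches SUCCESS.
    static Anchor resolve(Session session, String cloudAnchorId) {
        return session.resolveCloudAnchor(cloudAnchorId);
    }
}
```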

The coolest thing about this new feature by far is the way it has been implemented. By not requiring a shared network or a dedicated server, ARCore apps can support an infinite number of users in the same physical space. As soon as you connect to a Cloud Anchor, you can move around as though you had created the experience yourself, making the barrier to entry for users just about as low as you can get.

AR on the web
Asking someone to install an app to get an AR experience isn't always a guaranteed yes, so Google is building ARCore support directly into Chrome. Starting in Chrome Canary next week, WebXR will make it possible for AR experiences to be launched directly from the browser, no additional app installations required.

In the demos here at Google I/O, this tech was used to place individual objects like statues in the real world to walk around and experience with a greater sense of realism. Google plans to have more details on how this will work for Android and web developers alike very soon.

Sceneform
In an effort to lower the barrier to entry for developers with no previous 3D modeling or graphics experience, new tools have been created to make developing the ARCore parts of your app even easier. Sceneform makes it a little easier to build the AR environment in your app, and lets you easily plug into things like the Google Poly library to quickly access AR-friendly assets you would otherwise need to build yourself.
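Sceneform's main convenience is loading 3D assets without writing OpenGL. A minimal sketch, assuming the Sceneform SDK; the asset name "model.sfb" is a placeholder for a model bundled with your app:

```java
// Sketch: loading a 3D model with Sceneform
// (com.google.ar.sceneform), assuming an Android Context.
import android.content.Context;
import android.net.Uri;
import com.google.ar.sceneform.rendering.ModelRenderable;

public class SceneformSketch {
    // Asynchronously builds a renderable from a model file.
    // build() returns a CompletableFuture, so the app keeps
    // running while the asset loads.
    static void loadModel(Context context) {
        ModelRenderable.builder()
                .setSource(context, Uri.parse("model.sfb"))
                .build()
                .thenAccept(renderable -> {
                    // Attach the renderable to a Node in the
                    // Sceneform scene to place it in AR.
                });
    }
}
```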

Developers can access the Sceneform 1.0 SDK now and, in theory, dive into new experiences in no time!