When Apple released the first public version of its ARKit API, I admit that I was gushing about it. I'm not sorry in the least, however, since it was without a doubt the biggest leap in mainstream consumer AR so far. While other AR leaders, like Microsoft and Google, were messing around with expensive hardware-based solutions, Apple simply released a software update. Suddenly all the latest Apple phones and tablets could do marker-less, persistent augmented reality. It wasn't perfect, but it was honestly mind-blowing.
Apple has not been resting on its well-heeled laurels, however, and ARKit 2 brings a lot of new features to newer Apple devices, rolling out with iOS 12.
Learning to Share
One of the most exciting new features is the idea of "shared spaces" in ARKit. We've seen this sort of thing with premium solutions before. In one Microsoft HoloLens demo, for example, several students wearing headsets stand around an anatomical projection of a human body. The AR projection is seen by everyone from their correct relative angle.
ARKit can now do this too, which means multiple people with their own devices can see the same persistent object. This opens up spectator and multiplayer games, as well as collaborative applications. It's a killer feature that was sorely missed in ARKit 1.
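Under the hood, shared spaces work by capturing the session's world map on one device and handing it to the others. A minimal sketch of that hand-off, assuming you already have a connected MultipeerConnectivity session (`mcSession` is my name for it, not Apple's):

```swift
import ARKit
import MultipeerConnectivity

// Capture the current world map and send it to peers so everyone
// shares the same coordinate space. `mcSession` is an assumed,
// already-connected MCSession from your own networking setup.
func shareWorldMap(from session: ARSession, over mcSession: MCSession) {
    session.getCurrentWorldMap { worldMap, error in
        guard let map = worldMap,
              let data = try? NSKeyedArchiver.archivedData(
                  withRootObject: map, requiringSecureCoding: true)
        else { return }
        try? mcSession.send(data, toPeers: mcSession.connectedPeers,
                            with: .reliable)
    }
}

// On the receiving device, relocalize into the shared space.
func join(sharedMapData data: Data, on session: ARSession) {
    guard let map = try? NSKeyedUnarchiver.unarchivedObject(
        ofClass: ARWorldMap.self, from: data) else { return }
    let config = ARWorldTrackingConfiguration()
    config.initialWorldMap = map
    session.run(config, options: [.resetTracking, .removeExistingAnchors])
}
```

Once the second device relocalizes against the received map, anchors placed by either user appear in the same physical spot for both.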
Another very cool feature is the idea of a persistent virtual object. Having persistent objects means you could leave something on your desk in AR, come back later and still find it where you left it.
Now to be clear, ARKit 1 already had object persistence within each AR session. What Apple has done here is to create persistence across sessions. So even if you go do something else, the next time you open your AR app, your objects are still where you left them.
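Cross-session persistence uses the same `ARWorldMap` object, just written to disk instead of sent over the network. A rough sketch, assuming a single saved map file (the `"worldMap"` file name is my placeholder):

```swift
import ARKit

// Where we stash the serialized map between launches (assumed file name).
let mapURL = FileManager.default.urls(for: .documentDirectory,
                                      in: .userDomainMask)[0]
    .appendingPathComponent("worldMap")

// Save the session's world map (anchors included) before quitting.
func saveWorldMap(from session: ARSession) {
    session.getCurrentWorldMap { worldMap, _ in
        guard let map = worldMap,
              let data = try? NSKeyedArchiver.archivedData(
                  withRootObject: map, requiringSecureCoding: true)
        else { return }
        try? data.write(to: mapURL)
    }
}

// Restore it on the next launch; your anchors reappear once
// ARKit relocalizes against the saved environment.
func restoreWorldMap(on session: ARSession) {
    guard let data = try? Data(contentsOf: mapURL),
          let map = try? NSKeyedUnarchiver.unarchivedObject(
              ofClass: ARWorldMap.self, from: data) else { return }
    let config = ARWorldTrackingConfiguration()
    config.initialWorldMap = map
    session.run(config)
}
```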
An Eye for Detail
Apple has also improved image recognition and analysis. This means app developers can now more easily introduce physical objects for use in AR software. Imagine using a real chess set, for example, where the app can apply effects to the pieces and track their positions in real time. Mixing the physical and virtual makes for a very special experience.
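In API terms, this lands as object detection: you scan a real object ahead of time into an `ARReferenceObject`, and ARKit hands you an anchor whenever it spots it. A sketch, assuming an asset catalog group I've named "GameObjects":

```swift
import ARKit

// Detect pre-scanned real-world objects (e.g. chess pieces scanned into
// an asset catalog resource group named "GameObjects" — an assumed name).
func startObjectDetection(on session: ARSession) {
    let config = ARWorldTrackingConfiguration()
    config.detectionObjects = ARReferenceObject.referenceObjects(
        inGroupNamed: "GameObjects", bundle: nil) ?? []
    session.run(config)
}

// ARKit calls your ARSessionDelegate with an ARObjectAnchor when it
// recognizes one of the scanned objects in the camera feed.
func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
    for case let objectAnchor as ARObjectAnchor in anchors {
        print("Found \(objectAnchor.referenceObject.name ?? "object")")
    }
}
```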
While a few app developers had already built measuring tools with ARKit 1, Apple has now included a native app called Measure, which can be used as a virtual tape measure. Was Apple going to do this from the start, or did it steal the idea from devs? Hey, I'm just asking questions here.
Filed Under Awesome
The last major development worth mentioning is the introduction of an AR object file format. It's called "usdz" and means that you can now share AR objects with apps like Mail or Messages.
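Viewing a usdz file in your own app goes through AR Quick Look, the same system viewer Mail and Messages use. A minimal sketch, assuming a bundled model I've called "toy.usdz":

```swift
import QuickLook
import UIKit

// Present a .usdz model in AR Quick Look. "toy.usdz" is an assumed
// file name for a model shipped in the app bundle.
class ModelPreview: NSObject, QLPreviewControllerDataSource {
    let modelURL = Bundle.main.url(forResource: "toy",
                                   withExtension: "usdz")!

    func show(from viewController: UIViewController) {
        let preview = QLPreviewController()
        preview.dataSource = self
        viewController.present(preview, animated: true)
    }

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int {
        return 1
    }

    func previewController(_ controller: QLPreviewController,
                           previewItemAt index: Int) -> QLPreviewItem {
        // URL conforms to QLPreviewItem, so the file can be returned directly.
        return modelURL as QLPreviewItem
    }
}
```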
I’m pretty stoked about ARKit 2 and the extra features it’s bringing to existing iOS devices. However, I don’t think Apple is just trying to corner the mobile AR market here. It’s becoming ever more certain that Apple will bring out its own AR/VR HMD and the software work being done here is obviously going to power that device if it ever comes to market. Features like shared spaces and object recognition make way more sense when thinking about uses for an HMD, so what we should really be excited about is when ARKit and the Apple HMD meet. Until then, you can experience the best of consumer AR right on your new iPhone or iPad.