It has finally happened. Augmented Reality is, well, a reality on iOS 11, and it looks about as nice as you could expect. With the ARKit (short for Augmented Reality Kit) API, developers can now tap into the power of iOS devices to create virtual environments within real-life spaces. Apple announced the new API at its Worldwide Developers Conference (WWDC 2017) as part of iOS 11.
In the ARKit demo, Senior Vice President of Software Engineering Craig Federighi (or Hair Force One, as I will forever know him) showed some basic things that Augmented Reality would allow developers to do. Thanks to Machine Learning and Metal 2, the iPhone detected the surface in front of him, on which he placed a virtual cup of tea. He then added a light source and showed how the shadows changed dynamically according to where the light source was in relation to the cup as he moved it. It was a very impressive demonstration.
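For the curious, a minimal sketch of what that demo flow looks like in code, based on the ARKit and SceneKit APIs in Apple's iOS 11 SDK. The scene contents here (a placeholder box standing in for the cup, and the light parameters) are illustrative assumptions, not Apple's actual demo code:

```swift
import ARKit
import SceneKit
import UIKit

class ARDemoViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)

        // Ask ARKit to track the device's position and look for
        // horizontal surfaces such as tables and floors.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    // Called when ARKit anchors a newly detected surface in the scene.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor else { return }

        // Place a virtual object on the detected plane.
        // (A simple box here; the demo used a cup of tea.)
        let box = SCNBox(width: 0.1, height: 0.1, length: 0.1, chamferRadius: 0.01)
        let objectNode = SCNNode(geometry: box)
        node.addChildNode(objectNode)

        // Add a directional light that casts shadows, so the object's
        // shadow shifts as the light moves relative to it.
        let light = SCNLight()
        light.type = .directional
        light.castsShadow = true
        let lightNode = SCNNode()
        lightNode.light = light
        lightNode.eulerAngles.x = -.pi / 3  // angle the light down at the plane
        node.addChildNode(lightNode)
    }
}
```

The key idea is that ARKit handles the hard parts (motion tracking and surface detection) and hands your delegate an anchor; everything you attach to that anchor's node stays fixed to the real-world surface as the device moves.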
It basically brings together features we have seen in hit games such as Pokémon Go, where players see Pokémon overlaid on their surroundings and throw Poké Balls to catch them. ARKit opens up that kind of experience to a whole new generation of developers.
In another demo that followed right after, Apple showed how kids could project their LEGO creations onto the surfaces in front of them. In this case, a LEGO Batmobile was projected onto a table, where it was disassembled in real time. The user could then move and pan the device to inspect each LEGO piece individually. These were two quick little demos that showed fairly basic things. Imagine the possibilities when developers get hold of this!
ARKit, along with Metal 2 and Siri's new Machine Learning capabilities, will be available to all iOS users later this year. A developer preview of iOS 11 is out now, with a public beta to follow later this month.