Apple just announced an updated version of its augmented reality platform, timed to the developer beta release of its newest iOS software, and it includes a few new features that true AR fans are likely to be happy about.
The newest version of ARKit, version 1.5, adds support for vertical planes. In plain terms, that means the sensors on an iPhone (or iPad) can now recognize not only the floor you’re standing on, but also the walls and windows around you. So when developers build AR apps, they can create features that use the vertical surfaces around you as well as the horizontal ones.
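For developers, the change described above amounts to a one-line addition to the session configuration. A minimal sketch (the `sceneView` parameter is a hypothetical `ARSCNView` supplied by the hosting view controller):

```swift
import ARKit

// Sketch: configure an AR session to detect both horizontal and
// vertical planes. Requires iOS 11.3 / ARKit 1.5 or later.
func startPlaneDetection(in sceneView: ARSCNView) {
    let configuration = ARWorldTrackingConfiguration()

    // .vertical is new in ARKit 1.5; earlier versions offered
    // only .horizontal plane detection.
    configuration.planeDetection = [.horizontal, .vertical]

    sceneView.session.run(configuration)
}
```

Once detection is running, walls surface to the app as plane anchors, which is what lets virtual objects like the game balls in the demo below collide with them.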
For example, during a brief demo at Apple’s new campus yesterday, I tried out a generic ARKit app that involves throwing balls at a virtual Velcro game board. In a prior version of the app, if you missed the board, the balls would sail off into space, because the app couldn’t recognize the vertical plane the game board was hanging on. Now, ARKit apps will “understand” walls, and objects like game balls will bounce off of them.
The new version of ARKit also supports 2D image recognition, which means you can point your phone at a flat print or wall hanging and the ARKit app will show contextual information. A flat image could also trigger another action in the app. An obvious use case is museums: point your phone at a painting, the AR app identifies the image for you, and then it can also spawn a 3D object or virtual experience based on whatever image you’re looking at. In the example I saw, wall art of the Apollo 13 mission launched an entire virtual tour of a lunar module when the iPhone was pointed at it.
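The trigger-on-image behavior described above maps to ARKit 1.5’s detection-images API. A hedged sketch: the asset catalog group name `"Gallery"` is a hypothetical placeholder, and `sceneView` is again an assumed `ARSCNView`:

```swift
import ARKit

// Sketch: watch the camera feed for known 2D images (e.g. museum
// paintings) stored as AR reference images in the app's asset catalog.
func startImageDetection(in sceneView: ARSCNView) {
    let configuration = ARWorldTrackingConfiguration()

    // Load the set of reference images ARKit should look for.
    // "Gallery" is a hypothetical asset catalog group name.
    if let references = ARReferenceImage.referenceImages(
        inGroupNamed: "Gallery", bundle: .main) {
        configuration.detectionImages = references
    }

    sceneView.session.run(configuration)
}

// When ARKit recognizes one of the images, the view's delegate gets
// an ARImageAnchor, which the app can use to spawn contextual content,
// such as the lunar-module tour in the museum example:
//
// func renderer(_ renderer: SCNSceneRenderer,
//               didAdd node: SCNNode, for anchor: ARAnchor) {
//     guard anchor is ARImageAnchor else { return }
//     // place a 3D object or launch a virtual experience here
// }
```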
And lastly, ARKit now supports a 1080p HD camera view, which it didn’t before.
Apple first announced ARKit, its platform for augmented reality on iPhones and iPads, at WWDC last June; it formally rolled out in September. Before that, app makers had been using 2D or flat overlays in apps (like Pokémon Go), but the apps weren’t utilizing advanced depth sensors or computer vision technology. With frameworks like ARKit, or Google’s ARCore, they are.
Since Apple first announced ARKit, more than 2,000 App Store apps have been built out with ARKit features. That might not sound like a lot relative to the size of the App Store, but it is a lot of augmented reality apps.
The biggest feature change in this new version of ARKit is almost certainly the support for vertical planes. Google’s ARCore supports two types of horizontal planes (upward-facing and downward-facing), but as far as we know, it does not support vertical planes. Threads on GitHub indicate this is a much-requested feature for ARCore.
The consumer beta of iOS 11.3 and ARKit 1.5 is expected to go live in the coming days, though Apple didn’t specify when.