Unity Gives Developers the Keys to ARKit 2.0 Capabilities via Updated Plugin
With developers chomping at the bit to play with ARKit 2.0, Unity has updated its ARKit plugin to enable access to the new augmented reality superpowers of the toolkit.
The updated plugin is available now for developers via Bitbucket.
“Apple announced exciting news for AR developers last week at WWDC, including ARKit 2. Unity has worked closely with Apple to allow our developers instant access to all of these new features with an update to Unity ARKit Plugin,” wrote Jimmy Alamparambil, the software architect and engineering director at Unity, in a blog post reviewing the technical details of the new features.
For shared and persistent AR experiences, developers can tap into the ARWorldMap feature. ARWorldMap saves feature points of a user’s environment that can be sent to other users to establish a multiplayer session or loaded by an app to recall persistent content. An app can then match the saved feature points against the observed environment to relocalize the device within its surroundings.
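To give a sense of the shape of this API, here is a minimal sketch of the save/restore flow using the native ARKit calls the Unity plugin wraps; the function names and the serialization step are illustrative, not taken from the plugin itself:

```swift
import ARKit

// Save: ask the session for its current world map, then serialize it
// so it can be written to disk or sent to a peer.
func saveWorldMap(from session: ARSession) {
    session.getCurrentWorldMap { worldMap, error in
        guard let map = worldMap else { return }
        let data = try? NSKeyedArchiver.archivedData(
            withRootObject: map, requiringSecureCoding: true)
        // ... persist `data` or hand it to a multipeer session ...
        _ = data
    }
}

// Restore: attach the saved map to a new configuration so the device
// relocalizes against the stored feature points.
func restore(map: ARWorldMap, on session: ARSession) {
    let config = ARWorldTrackingConfiguration()
    config.initialWorldMap = map
    session.run(config, options: [.resetTracking, .removeExistingAnchors])
}
```

The same two-phase pattern (capture a map, run a new session seeded with it) underlies both the multiplayer and the persistence use cases.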
To help illustrate these new capabilities, Unity has published a sample project called Shared Spheres on GitHub.
To leverage 3D object recognition, Unity now offers ARReferenceObject and ARObjectAnchor. The pair works much like its image-recognition counterparts: a reference file provides the feature points of a scanned object, and the anchor tracks the target object based on recognized points.
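In native ARKit terms, the flow looks roughly like the sketch below; the asset group name "ScannedObjects" is an assumption for illustration:

```swift
import ARKit

// Load scanned .arobject reference files from an (assumed) asset-catalog
// group and hand them to the session for detection.
func runObjectDetection(on session: ARSession) {
    let config = ARWorldTrackingConfiguration()
    config.detectionObjects = ARReferenceObject.referenceObjects(
        inGroupNamed: "ScannedObjects", bundle: nil) ?? []
    session.run(config)
}

// When ARKit recognizes a scanned object, it adds an ARObjectAnchor
// whose transform tracks the physical object.
func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
    for case let objectAnchor as ARObjectAnchor in anchors {
        print("Recognized:", objectAnchor.referenceObject.name ?? "unnamed object")
    }
}
```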
Also arriving in ARKit 2.0 is a new anchor type called AREnvironmentProbeAnchor, which establishes an environmental map of an area, updates the map over time, and uses machine learning to anticipate changes to textures and lighting. In turn, Unity has added a new parameter to work with the anchor and three new values for specifying textures.
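Those three values map onto the native configuration's texturing modes. A brief sketch, assuming the app wants to place a probe by hand in manual mode:

```swift
import ARKit
import simd

let config = ARWorldTrackingConfiguration()
// The three native environment-texturing options are .none, .manual,
// and .automatic; in automatic mode ARKit places probe anchors itself.
config.environmentTexturing = .automatic

// In .manual mode the app adds probes where it wants reflections;
// the identity transform and 1m extent here are placeholder values.
let probe = AREnvironmentProbeAnchor(
    transform: matrix_identity_float4x4,
    extent: simd_float3(1, 1, 1))
// session.add(anchor: probe)
```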
While Apple introduced image recognition in ARKit 1.5, the latest version adds continuous tracking of recognized images, even as they move through the scene. Unity has responded in kind by adding an extra parameter for specifying how many images to track at one time.
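The native equivalent of that parameter is a single cap on the configuration; the "AR Resources" group name below is an assumed asset-catalog name:

```swift
import ARKit

let config = ARWorldTrackingConfiguration()
// Reference images loaded from an (assumed) asset-catalog group.
config.detectionImages = ARReferenceImage.referenceImages(
    inGroupNamed: "AR Resources", bundle: nil)
// How many images to track simultaneously; at the default of 0,
// images are detected but their movement is not tracked.
config.maximumNumberOfTrackedImages = 2
```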
Finally, Unity has also enabled parameters for tongue and eye gaze tracking in iPhone X apps, which Apple showcased during the Animoji update demo at its WWDC keynote.
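On the native side, that data surfaces on ARFaceAnchor via a new tongue blend shape and per-eye gaze transforms, which a delegate can read each frame; a hedged sketch:

```swift
import ARKit

// Face-tracking sketch (TrueDepth camera required). Reads the new
// tongue blend shape and eye-gaze fields from each updated face anchor.
func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    for case let face as ARFaceAnchor in anchors {
        let tongueOut = face.blendShapes[.tongueOut]?.floatValue ?? 0  // 0...1
        let gazePoint = face.lookAtPoint      // gaze target in face space
        let leftEye = face.leftEyeTransform   // per-eye pose
        _ = (tongueOut, gazePoint, leftEye)
    }
}
```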
While apps working with ARKit 2.0 won’t drop in the App Store until iOS 12 officially arrives, we can’t wait to see what developers build with these new toys.