Snap has launched a new version of its augmented reality studio that lets creators build AR effects that tap the lidar sensors on Apple’s newest iPhone 12 Pro smartphones.
The social chat company said lidar-powered Snap Lenses will usher in a new generation of AR. Lidar, short for light detection and ranging, uses lasers to illuminate objects and judges how far away they are by how long the light takes to reflect back. The iPhone 12 Pro will ship on October 23, while the iPhone 12 Pro Max will ship on November 13. Both are equipped with the lidar scanner, which lets them detect the shape of nearby objects and map AR imagery onto those surfaces more accurately, adding a new level of realism to AR effects.
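To make that time-of-flight arithmetic concrete, here is a minimal, illustrative Swift sketch (not Snap’s or Apple’s code): a pulse travels out and back, so the one-way distance is the speed of light times the round-trip time, divided by two.

```swift
// Illustrative only: the time-of-flight arithmetic behind lidar ranging.
let speedOfLight = 299_792_458.0  // meters per second

func distance(roundTripTime: Double) -> Double {
    // Halve the round trip: the pulse travels out and back.
    speedOfLight * roundTripTime / 2.0
}

// A pulse returning after 10 nanoseconds puts the surface about 1.5 m away.
print(distance(roundTripTime: 10e-9))  // ≈ 1.499 m
```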
Snap is launching Lens Studio 3.2 today so developers can build lidar-powered Lenses for the iPhone 12 Pro and iPhone 12 Pro Max. Snap said these AR experiences can be overlaid on the real world more seamlessly because Snapchat’s camera can see a metric-scale mesh of the scene and better understand the geometry and meaning of surfaces and objects.
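Snap has not published its internal pipeline, but Apple’s ARKit scene reconstruction API illustrates the underlying capability: on lidar-equipped iPhones, an app can request exactly this kind of metric-scale, semantically labeled mesh. A minimal sketch, assuming an existing ARSession:

```swift
import ARKit

// A minimal sketch, not Snap's code: request ARKit's classified scene mesh.
func startMeshing(on session: ARSession) {
    let configuration = ARWorldTrackingConfiguration()
    // Scene reconstruction is only supported on devices with a lidar scanner,
    // such as the iPhone 12 Pro, so check for support before enabling it.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.meshWithClassification) {
        configuration.sceneReconstruction = .meshWithClassification
    }
    session.run(configuration)
}
```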
Snap said this new level of scene understanding allows Lenses to interact realistically with the surrounding world. With the iPhone 12 Pro’s A14 Bionic and ARKit software, developers can render thousands of AR objects in real time and create immersive environments.
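For a sense of what that scene understanding looks like to a developer, here is a hypothetical ARSessionDelegate method, again for illustration rather than Snap’s actual code: each ARMeshAnchor that ARKit delivers carries mesh geometry measured in real-world meters, which is what lets virtual objects sit on, and be occluded by, real surfaces.

```swift
import ARKit

// Hypothetical delegate method: inspect the mesh chunks ARKit builds
// from lidar data as they stream in and are refined.
func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    for case let meshAnchor as ARMeshAnchor in anchors {
        let faceCount = meshAnchor.geometry.faces.count
        print("mesh chunk with \(faceCount) faces at \(meshAnchor.transform.columns.3)")
    }
}
```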
A new interactive preview mode in Lens Studio 3.2 lets developers create these Lenses and preview how they will appear in the world before they get access to the new iPhone 12 Pro hardware.
Author: Dean Takahashi
Source: VentureBeat