While many of Apple’s investments in innovative technologies pay off, some just don’t: Think back to the “tremendous amount” of money and engineering time it spent on force-sensitive screens, which are now in the process of disappearing from Apple Watches and iPhones, or its work on Siri, which still feels like it’s in beta nine years after it was first integrated into iOS. In some cases, Apple’s backing is enough to take a new technology into the mainstream; in others, Apple gets a feature into a lot of devices only for the innovation to go nowhere.
Lidar has the potential to be Apple’s next “here today, gone tomorrow” technology. The laser-based depth scanner was the marquee addition to the 2020 iPad Pro that debuted this March, and it has been rumored for nearly two years as a 2020 iPhone feature. Recently leaked rear glass panes for the iPhone 12 Pro and Pro Max suggest that lidar scanners will appear in both phones, though they’re unlikely to be in the non-Pro versions of the iPhone 12. Moreover, they may be the only major changes to the new iPhones’ rear camera arrays this year.
If you don’t fully understand lidar, you’re not alone. Think of it as an extra camera that rapidly captures a room’s depth data rather than creating traditional photos or videos. To users, visualizations of lidar look like black-and-white point clouds focused on the edges of objects, but when devices gather lidar data, they know the relative depth of each individual point and can use that depth information to improve augmented reality, traditional photography, and various computer vision tasks. Unlike a flat photo, a depth scan offers a finely detailed differentiation of what’s close, mid-range, and far away.
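To make that distinction concrete, here is a minimal, purely illustrative Swift sketch of what a few depth points might look like; the DepthPoint type and the sample positions are made up for this example, and the only point is that each sample carries a distance rather than a color.

```swift
import simd

// Hypothetical illustration: a lidar scan boils down to points that carry
// positions (and therefore distances) instead of colors.
struct DepthPoint {
    let position: SIMD3<Float>   // x, y, z in meters, relative to the sensor
    var distance: Float { simd_length(position) }
}

let scan: [DepthPoint] = [
    DepthPoint(position: [0.1, -0.2, 0.6]),   // e.g., a mug on a nearby desk
    DepthPoint(position: [0.4, 0.3, 2.1]),    // e.g., the back of a chair
    DepthPoint(position: [-1.0, 0.8, 4.5]),   // e.g., the far wall
]

// Unlike a flat photo, the scan can be partitioned by distance directly.
let near = scan.filter { $0.distance < 1.0 }
let far = scan.filter { $0.distance > 3.0 }
print("near points: \(near.count), far points: \(far.count)")
```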
Six months after lidar arrived in the iPad Pro, the hardware’s potential hasn’t been matched by Apple software. Rather than releasing a new user-facing app to show off the feature or conspicuously augmenting the iPad’s popular Camera app with depth-sensing tricks, Apple pitched lidar to developers as a way to instantly improve their existing AR software — often without the need for extra coding. Room-scanning and depth features previously implemented in apps would just work faster and more accurately than before. As just one example, AR content composited on real-world camera video could automatically hide partially behind depth-sensed objects, a feature known as occlusion.
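For developers, that opt-in really is minimal. The sketch below is a rough illustration rather than Apple’s own sample code: the ARViewController class and its view setup are assumptions, but supportsSceneReconstruction, sceneReconstruction, and RealityKit’s sceneUnderstanding occlusion option are the actual hooks an AR app uses to take advantage of the lidar-backed scene mesh.

```swift
import ARKit
import RealityKit
import UIKit

// Illustrative view controller; the ARKit/RealityKit calls inside viewDidLoad
// are the opt-in points for lidar meshing and occlusion.
final class ARViewController: UIViewController {
    private let arView = ARView(frame: .zero)

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.frame = view.bounds
        view.addSubview(arView)

        let config = ARWorldTrackingConfiguration()

        // Scene reconstruction is only supported on lidar hardware, such as
        // the 2020 iPad Pro; other devices skip this block and run as usual.
        if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
            config.sceneReconstruction = .mesh
            // Let RealityKit hide virtual content behind real-world geometry.
            arView.environment.sceneUnderstanding.options.insert(.occlusion)
        }

        arView.session.run(config)
    }
}
```

On devices without the scanner, the availability check simply fails and the app behaves exactly as it did before, which is why Apple could pitch the feature as a free upgrade for existing AR software.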
In short, adding lidar to the iPad Pro made a narrow category of apps a little better on a narrow slice of Apple devices. From a user’s perspective, the best Apple-provided examples of the technology’s potential were hidden in the Apple Store app, which can display 3D models of certain devices (Mac Pro, yes; iMac, no) in AR, and iPadOS’ obscure “Measure” app, which previously did a mediocre job of guesstimating real-world object lengths but became noticeably more accurate with lidar. It’s worth underscoring that those aren’t objectively good examples, and no one in their right mind — except an AR developer — would buy a device solely for such marginal AR performance improvements.
Whether lidar will make a bigger impact on iPhones remains to be seen. If it’s truly a Pro-exclusive feature this year, not only will fewer people have access to it, but developers will have less incentive to build lidar-dependent features. Even if Apple sells tens of millions of iPhone 12 Pro devices, the lineup will almost certainly follow the pattern of the iPhone 11, whose base model reportedly outsold its more expensive Pro brethren across the world. Consequently, lidar would be a comparatively niche feature rather than a baseline expectation for all iPhone 12 series users.
That said, if Apple uses the lidar hardware properly in the iPhones, it could become a bigger deal and a differentiator going forward. Industry scuttlebutt suggests that Apple will use lidar to improve the Pro cameras’ autofocus and depth-based processing effects, such as Portrait Mode, which artificially blurs photo backgrounds to create a DSLR-like “bokeh” effect. Since lidar’s invisible lasers work quickly even in pitch-black rooms, they could serve as a better low-light autofocus system than current techniques, which rely on minute differences measured by an optical camera sensor. Faux bokeh and other visual effects could, and likely will, be applied to video recordings as well. Developers such as Niantic could also use the hardware to improve Pokémon Go for a subset of iPhones, and given the massive size of that game’s user base, even a partial upgrade could be a win for AR gamers.
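As a sketch of how third-party apps can already tap the scanner, the snippet below reads ARKit’s per-pixel depth map, which lidar devices expose through the sceneDepth frame semantics added in iOS 14. The DepthReader class and the center-pixel sampling are illustrative assumptions, not a recipe Apple has published for autofocus or bokeh; sceneDepth and ARDepthData are the real ARKit pieces.

```swift
import ARKit
import CoreVideo

// Illustrative session delegate that logs the lidar-measured distance at the
// center of each camera frame.
final class DepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let config = ARWorldTrackingConfiguration()
        // sceneDepth is only offered on lidar hardware; guard so the app
        // still runs (without depth) on older iPhones and iPads.
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
            config.frameSemantics.insert(.sceneDepth)
        }
        session.delegate = self
        session.run(config)
    }

    // Called every frame; sample the depth map at the image center.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depthMap = frame.sceneDepth?.depthMap else { return }

        CVPixelBufferLockBaseAddress(depthMap, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }

        guard let base = CVPixelBufferGetBaseAddress(depthMap) else { return }
        let width = CVPixelBufferGetWidth(depthMap)
        let height = CVPixelBufferGetHeight(depthMap)
        let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)

        // The depth map stores one 32-bit float distance (in meters) per pixel.
        let row = base.advanced(by: (height / 2) * rowBytes)
            .assumingMemoryBound(to: Float32.self)
        let centerDepth = row[width / 2]
        print("Distance at frame center: \(centerDepth) meters")
    }
}
```

A camera app could feed a reading like this into focus or background-blur decisions, which is the kind of depth-assisted processing the rumors describe, though Apple would presumably do that work at a lower level than this public API.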
Apple won’t be the first company to offer a rear depth sensor in a phone. Samsung introduced a similar technology in the Galaxy S10 series last year and added it to subsequent Note 10 and S20 models, but a lack of killer apps and reported performance issues led the company to drop the feature from the Note 20 and next year’s S series. While Samsung is apparently redesigning its depth sensor to better rival the Sony-developed Lidar Scanner Apple uses in its devices, finding killer apps for the technology may remain challenging.
Though consumer and developer interest in depth-sensing technologies may have (temporarily) plateaued, there’s been no shortage of demand for higher-resolution smartphone cameras. Virtually every Android phone maker leaped forward in sensor technology this year, such that even midrange phones now commonly include at least one camera with 4 to 10 times the resolution of Apple’s 12-megapixel iPhone sensors. Lidar alone won’t help Apple bridge that resolution gap, but it may bolster the company’s longstanding claim that it does more with fewer pixels.
Ultimately, the problems with Apple-owned innovations such as 3D Touch, Force Touch, and Siri haven’t come down to whether the technologies are inherently good or bad, but whether they’ve been widely adopted by developers and users. As augmented reality hardware continues to advance — and demand fast, room-scale depth scanning for everything from object placement to gesture tracking — there’s every reason to believe that lidar will become either a fundamental technology or at least a preferred solution for those tasks. But Apple is going to need to make a better case for lidar in the iPhone than it has on the iPad, and soon, lest the technology wind up forgotten and abandoned rather than core to the next generation of mobile computing.
Author: Jeremy Horwitz
Source: VentureBeat