But it’s even more appropriate when you leave your apartment and step out into the world, which introduces complications like walking around. “This isn’t your run of the mill ‘stand still and point your phone at something’ AR experience,” says Jessica Brillhart, director of the Mixed Reality Lab at the University of Southern California’s Institute for Creative Technologies. “Now you can walk and AR in a much more seamless way. Virtual objects will now be easily occluded—meaning assets will be much more baked into the world you’re moving through, plus they’ll continually map to your perspective and position within space relative to their placement.”
That becomes even more important when you offload AR sensors to eyeglasses rather than a tablet or phone. Rather than having to reference narrow slices of a virtual world through a smartphone screen, a lidar scanner will enable experiences that truly envelop you. Apple’s lidar component also doesn’t take up much space.
“The technology is ideally suited for all small-scale devices,” says Wetzstein. “It’s low power, it’s lightweight, it’s small scale, it gives you high-quality depth. You can probably combine a number of things together so they wouldn’t interfere with one another,” because lidar works by identifying individual points in space rather than scanning an entire room all at once.
So … why is it on an iPad Pro again, the least mobile of Apple’s mobile computers? Think of it as a headset head start, for both the supply chain that provides the component and the developers who need time to figure out what to do with it. “iPad feels like an especially odd place for it, but it may just be that suppliers aren’t ready to meet iPhone demand yet,” says Troughton-Smith. “It also gives Apple a chance to get developers to build experiences now that they can eventually show off whenever the next iPhone is introduced.”
That last bit will be especially important. While Apple has spent the last several years pushing its ARKit framework—devoting big chunks of stage time to it at its annual Worldwide Developers Conference—the tech has yet to go mainstream outside of a few specific cases, most notably Pokémon Go. If and when Apple does finally push into a new product category, it needs fully baked experiences to show off along with it. Getting lidar in front of developers through the iPad Pro familiarizes them now, staving off a potential chicken-and-egg problem later.
“Immersive media has a content problem more so than a hardware problem,” says Brillhart. “Most immersive technologies fail if engaging content is lacking, which is practically every piece of hardware right now. If I were Apple, I would be trying to seed this ecosystem now so that when I did release hardware, I would be avoiding a similar fate.”
Whatever form Apple’s head-worn AR takes, and when, remains unclear, especially given the disruptive long-term effects of Covid-19. But when it does come, you can expect lidar to play a critical role. The iPad Pro is its dress rehearsal.