Apple is optimistic about lidar, a technology found in the iPhone 12 family on the iPhone 12 Pro and iPhone 12 Pro Max. According to rumors, lidar will be available on all four models of Apple’s iPhone 13 lineup. Look closely at an iPhone 12 Pro, or at any iPad Pro released since 2020, and you’ll notice a small black dot near the camera lenses, about the size of the flash. That’s the lidar sensor, and it provides a new type of depth sensing that could be useful in a variety of ways.

If Apple has its way, lidar will be a term you’ll hear a lot more often, so let’s break down what we know, what Apple plans to use it for, and where the technology could go next. And if you’re wondering what it can do right now, I’ve spent some hands-on time with it, too.

Lidar, which stands for light detection and ranging, has been around for quite some time. It fires laser pulses that bounce off objects and return to the source, measuring distance by timing the travel, or flight, of each pulse.
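The arithmetic behind that timing is simple: the pulse travels to the object and back, so the one-way distance is half the round trip at the speed of light. Here’s a minimal Swift sketch of that calculation (the function name and example numbers are just for illustration, not anything Apple has published):

```swift
import Foundation

/// Speed of light in a vacuum, in meters per second.
let speedOfLight = 299_792_458.0

/// Time-of-flight distance: the pulse goes out and comes back,
/// so the one-way distance is half the round-trip time times the speed of light.
func distance(forRoundTripTime seconds: Double) -> Double {
    return speedOfLight * seconds / 2.0
}

// Example: a round trip of about 33.3 nanoseconds corresponds to roughly 5 meters.
let meters = distance(forRoundTripTime: 33.3e-9)
print(String(format: "%.2f m", meters)) // ≈ 4.99 m
```

The tiny time scales involved are why this takes dedicated sensor hardware rather than a regular camera.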

A time-of-flight camera is a type of lidar. Other smartphones measure depth with a single light pulse; a smartphone with this type of lidar sends out waves of light pulses as a spray of infrared dots and measures each one with its sensor. The result is a field of points that maps out distances and can “mesh” the dimensions of a space and the objects in it. The light pulses are invisible to the naked eye, but a night vision camera can detect them.
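On lidar-equipped iPhones and iPads, that field of distance measurements is what ARKit exposes to developers as scene depth (iOS 14 and later). As a rough sketch, assuming a standard ARKit session, reading it could look like this:

```swift
import ARKit

// Sketch: request lidar-derived depth from ARKit (iOS 14+, lidar-equipped devices only).
final class DepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let configuration = ARWorldTrackingConfiguration()
        // Only ask for scene depth if this device can supply it.
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
            configuration.frameSemantics.insert(.sceneDepth)
        }
        session.delegate = self
        session.run(configuration)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // sceneDepth.depthMap is a CVPixelBuffer of per-pixel distances in meters;
        // sceneDepth.confidenceMap rates how reliable each measurement is.
        guard let depth = frame.sceneDepth else { return }
        let width = CVPixelBufferGetWidth(depth.depthMap)
        let height = CVPixelBufferGetHeight(depth.depthMap)
        print("Lidar depth map: \(width) x \(height) points")
    }
}
```

The confidence map matters in practice: distant, dark, or reflective surfaces return noisier measurements than nearby matte ones.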

Lidar is turning up in more and more places. It’s used in self-driving cars and assisted-driving systems, and it’s employed in robotics and drones. Augmented reality headsets such as the HoloLens 2 use similar technology to map out a room before layering 3D virtual objects into it, and there’s even a lidar-equipped VR headset. The technology also has a long history.

The Kinect, Microsoft’s old depth-sensing Xbox accessory, was also a camera with infrared depth scanning. In fact, Apple purchased PrimeSense, the company that helped develop the Kinect’s technology, in 2013. That lineage now shows up in Apple’s face-scanning TrueDepth camera and its rear lidar sensor.

Time-of-flight cameras are commonly used on smartphones to improve focus accuracy and speed, and the iPhone 12 Pro does the same: Apple promises low-light focus that is up to six times faster. The lidar depth sensing is also used to improve night portrait mode. So far, it has made an impression; for more, see our review of the iPhone 12 Pro Max.

Better focus is a plus, and there’s a chance the iPhone 12 Pro will also fold more 3D depth data into its photos. Apple hasn’t revealed that aspect yet, but its front-facing, depth-sensing TrueDepth camera has been used in similar ways by apps, and third-party developers could jump in and come up with some wild ideas. That’s already starting to happen.
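For a sense of how depth already rides along with photos, the existing TrueDepth workflow goes through AVFoundation’s depth-data delivery. Whether the rear lidar feeds the same pipeline is up to Apple, so treat this as a hedged sketch of the API apps use today, not a description of lidar photo capture:

```swift
import AVFoundation

// Sketch: opting into depth capture the way TrueDepth-based apps already do.
// Whether and how the rear lidar feeds this pipeline is Apple's call, not shown here.
let photoOutput = AVCapturePhotoOutput()

func configureDepthCapture() {
    // Depth delivery is only supported when the output is attached to a
    // capture session with a depth-capable camera.
    if photoOutput.isDepthDataDeliverySupported {
        photoOutput.isDepthDataDeliveryEnabled = true
    }
}

func makeSettings() -> AVCapturePhotoSettings {
    let settings = AVCapturePhotoSettings()
    settings.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliveryEnabled
    return settings
}

// In the AVCapturePhotoCaptureDelegate callback, photo.depthData (an AVDepthData)
// carries the per-pixel depth map alongside the image.
```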

Lidar enables the iPhone 12 Pro to launch AR apps much faster and to quickly build a mesh of a room so apps can add more detail. Many of Apple’s augmented reality (AR) updates in iOS 14 use lidar to hide virtual objects behind real ones (called occlusion) and to place virtual objects within more complex room mappings, such as on a table or chair.
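For developers, those two capabilities correspond to ARKit’s scene reconstruction and RealityKit’s scene-understanding options. A minimal sketch, assuming a RealityKit ARView, looks like this:

```swift
import ARKit
import RealityKit

// Sketch: enable lidar scene reconstruction and occlusion in a RealityKit ARView.
func configure(arView: ARView) {
    let configuration = ARWorldTrackingConfiguration()

    // Build a live triangle mesh of the room from lidar data, if the device supports it.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        configuration.sceneReconstruction = .mesh
    }

    // Let real-world geometry hide virtual objects placed behind it (occlusion).
    arView.environment.sceneUnderstanding.options.insert(.occlusion)
    // Let virtual objects rest on and collide with the reconstructed mesh.
    arView.environment.sceneUnderstanding.options.insert(.physics)

    arView.session.run(configuration)
}
```

Apps that don’t use RealityKit can work with the raw mesh anchors ARKit produces instead, but the RealityKit options are the shortest path to the occlusion effect described above.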

I’ve been testing it with Hot Lava, an Apple Arcade game that already uses lidar to scan a room and all of its obstacles. Placing virtual objects on my stairs, I watched them get hidden behind real-life objects in the room. Expect to see a lot more AR apps add lidar support like this for richer experiences.

But there’s also longer-term potential. Many companies are working on headsets that blend virtual and real objects: AR glasses being developed by Facebook, Qualcomm, Snapchat, Microsoft, Magic Leap, and most likely Apple and others will rely on advanced 3D maps of the world to layer virtual objects onto.