A few years ago, Apple shook up the mobile market by shipping its latest iPhones and iPads with a lidar sensor. So far, it is still the only phone maker to offer this feature. However, the technology has a long pedigree, and many modern phones have features based on the same principles. So what is lidar, and what can you do with it?
What is lidar?
Lidar is an acronym for light detection and ranging, and it works on the same principles as radar. In both systems, an electromagnetic wave (laser light for lidar, radio waves for radar) is emitted from a transmitter, and a sensor listens for (detects) any reflection of the wave off an object. The distance (range) between the reflecting object and the source of the wave can be determined from the time between transmission and reception.
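Since the speed of light is known, the time-of-flight principle boils down to a one-line calculation: the pulse travels to the object and back, so the range is half the round trip. A minimal sketch (the function name and the 20-nanosecond example are illustrative, not from any real system):

```python
# Converting a time-of-flight measurement to a range.
# The pulse travels out and back, so we halve the round trip.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def range_from_time_of_flight(round_trip_seconds: float) -> float:
    """Distance to the reflecting object, in meters."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A reflection arriving 20 nanoseconds after transmission
# corresponds to an object roughly 3 meters away.
print(range_from_time_of_flight(20e-9))  # ~3.0 m
```

The tiny time scales involved are why lidar sensors need specialized hardware: at a range of a few meters, the round trip takes only tens of nanoseconds.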
The first lidar systems emerged shortly after the invention of the laser in the early 1960s. By the 1970s, NASA was using lidar to measure the moon’s topography. Today, everyone uses lidar, from archaeologists discovering lost cities in the jungle to self-driving carmakers ensuring their vehicles can sense objects in front of them.

Oh, and it’s also on the latest iPhone Pro.
Phones with laser beams
Phones with lasers in them started showing up with LG’s G3 in 2014, which used a time-of-flight sensor to help with autofocusing. This is essentially the most basic use of lidar: a single infrared laser is emitted to gauge the distance to the subject of the photo, and that information is used to adjust the camera’s focus automatically.
Three years later, Apple launched the iPhone X, which had the TrueDepth camera system to enable its revolutionary Face ID functionality. Although using facial recognition to unlock a phone wasn’t pioneered by Apple (that honor should go to Android 4.0 back in 2011), its TrueDepth system completely changed the game. Whereas Android’s first attempt at the tech was notoriously insecure, such that one could use a photo to unlock a phone, Apple’s foray into the field was remarkably robust. It could distinguish between twins while still allowing for sunglasses and hats.

How did they achieve this technological marvel? Lasers. The TrueDepth system bombards the face with a patterned array of 30,000 infrared lasers, which are sensed by an infrared camera and analyzed by AI software to produce a detailed 3D scan of your face in seconds. It differs from a time-of-flight sensor because it doesn’t measure the time it takes for the laser light to return to the camera. It measures how the pattern of lasers (structured light) is deformed by objects in the scene and infers 3D data based on that information.
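The geometry behind structured light is classic triangulation: a dot projected from one position and observed by a camera a known distance (baseline) away shifts sideways by an amount, the disparity, that depends on the depth of the surface it lands on. Here is a minimal sketch of that relationship, with made-up numbers and names; Apple’s actual TrueDepth pipeline layers calibration and machine learning on top of this basic idea:

```python
# Structured-light depth recovery via triangulation.
# Assumed setup: the dot projector and infrared camera sit a known
# baseline apart; each dot's horizontal shift (disparity) from its
# reference position encodes the depth of the surface it hit.

def depth_from_disparity(baseline_m: float,
                         focal_px: float,
                         disparity_px: float) -> float:
    """Depth of the surface a projected dot landed on, in meters."""
    return baseline_m * focal_px / disparity_px

# A dot shifted 40 pixels, with a 2 cm baseline and a 600 px focal
# length, implies a surface about 0.3 m from the camera.
print(depth_from_disparity(0.02, 600, 40))  # 0.3
```

Nearby surfaces produce large disparities and distant ones small disparities, which is why this approach works best at close range, like a face held in front of a phone.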
In 2020, three years after dropping the TrueDepth bomb on the market, Apple introduced its lidar system. Like its TrueDepth system, it emits an array of dozens of lasers (again, in a known pattern), but instead of measuring how the light deforms, a specialized sensor measures how long each beam takes to reflect back to the sensor. These time-of-flight measurements are combined with data from the phone’s motion sensor and camera to produce a 3D scan.
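Because each beam in the array leaves the emitter in a known direction, a single time-of-flight reading per beam is enough to place a point in 3D. Here is a hedged sketch of that step; the spherical-coordinate conversion below is a textbook formulation, not Apple’s disclosed implementation:

```python
# Turning one beam's time-of-flight reading into a 3D point,
# given the beam's known direction in the sensor's frame.
import math

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def beam_to_point(round_trip_s: float,
                  azimuth_rad: float,
                  elevation_rad: float) -> tuple:
    """3D point (x, y, z) in meters, relative to the sensor."""
    r = SPEED_OF_LIGHT * round_trip_s / 2   # range along the beam
    x = r * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = r * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = r * math.sin(elevation_rad)
    return (x, y, z)

# A beam pointing straight ahead that returns in 20 ns hits a
# surface about 3 m directly in front of the sensor.
print(beam_to_point(20e-9, 0.0, 0.0))
```

Repeating this for every beam in the array, frame after frame, yields the point cloud that the phone then fuses with motion and camera data.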

Lidar, what is it good for?
Apple’s lidar technology is cool, but what can you do with it? Lots, as it turns out, but it’s a bit niche, and none of the use cases are the kinds of things that will change the way most of us interact with our phones. More importantly (especially for manufacturers), it’s not likely to influence our decision-making process when shopping for a new phone.
The most mundane use of Apple’s lidar system is as a simple time-of-flight sensor to speed up the autofocus on its rear-facing cameras. Most phone cameras from the past few years use either contrast-based or phase-detection autofocus to keep their images sharp, but these techniques rely on having sufficient light to work. Time-of-flight sensors don’t have that limitation, so phones with these sensors have an edge in low-light photography.

Apple also makes use of its lidar in a pair of first-party apps. The Measure app uses lidar data, augmented reality, and simple trigonometry to give accurate measurements of people and objects.
It also has a Detection Mode in its Magnifier app that uses AI and lidar to help people with vision impairments detect the presence of doors or people and how far away they are.
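As a rough illustration of the “simple trigonometry” involved, here is one way a measuring app could combine a lidar distance with tilt angles from the phone’s motion sensors to estimate an object’s height. This is a hypothetical sketch, not Apple’s actual Measure algorithm:

```python
# Estimating an object's height from the lidar distance to it and the
# phone's tilt angles when sighting its top and bottom edges.
import math

def object_height(distance_m: float,
                  angle_top_rad: float,
                  angle_bottom_rad: float) -> float:
    """Height in meters; angles are measured up (+) or down (-)
    from horizontal."""
    return distance_m * (math.tan(angle_top_rad) - math.tan(angle_bottom_rad))

# Standing 2 m away, sighting the top at 40 degrees up and the base
# at 10 degrees down gives a height of roughly 2.03 m.
print(object_height(2.0, math.radians(40), math.radians(-10)))
```

The lidar sensor supplies the one quantity a camera alone can’t easily provide, the absolute distance, which anchors all the other measurements to real-world scale.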

Most third-party apps that use Apple’s lidar fall into one of three categories: interior design, 3D scanning, and augmented reality games. Ikea Place is the current king of interior design apps. It uses augmented reality to let you visualize how Ikea products will look in situ, taking much of the guesswork out of knowing if that rug will tie the room together. If you’re more interested in taking measurements of your interior space, opt for Canvas, which scans a room and lets you export the data into CAD software.
For 3D printing enthusiasts and game makers looking for 3D models, there’s Scaniverse and Polycam. By scanning an object from multiple angles, you can build a 3D model which can be exported to common design software. Which one to choose comes down to personal preference and the size of your wallet since Scaniverse is free, and although Polycam has a free tier, some of its features are locked behind a subscription.
If you’d rather use the latest technology to play games, you’re in luck. One of the best lidar-activated games on the App Store is RC Club, which lets you drive a virtual RC car around your living room. The lidar scanner means you’ll crash into walls and drive over obstacles. Your virtual RC car even responds to the surface it’s driving on, slowing down when it hits a patch of carpet.
Where to get lidar
Apple’s lidar system is only available on the iPhone Pro and iPad Pro models released since 2020, so it doesn’t have a wide distribution. Nothing similar exists on Android phones, and standalone systems cost hundreds of dollars. Although this exclusivity is a clear win for Apple, it may also hurt software-based innovation and keep lidar from expanding out of the niche position it currently occupies. On the other hand, the work Apple has done to miniaturize these systems and make them robust enough to be carried around in your pocket means that it might not be long before lidar systems are in every new phone.