
Last week, Apple introduced several new products, including the new iPad Pro. In addition to a new (and slightly more powerful) SoC and more RAM, it also offers an updated camera system complemented by a new LIDAR sensor. A video has appeared on YouTube that clearly demonstrates what this sensor can do and how it will be used in practice.

LIDAR stands for Light Detection And Ranging, and as the name suggests, the sensor maps the area in front of the iPad's camera by scanning the surroundings with laser pulses and measuring how long their reflections take to return. That can be a bit hard to picture, which is where the newly released YouTube video comes in: it shows the real-time mapping in action.
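For illustration only (this is not Apple's actual processing pipeline, just the basic principle of direct time-of-flight ranging): the distance to an object is roughly the round-trip time of the light pulse multiplied by the speed of light, divided by two.

```swift
import Foundation

// Illustrative sketch of time-of-flight ranging math, not Apple's LiDAR pipeline.
let speedOfLight = 299_792_458.0   // metres per second
let roundTripTime = 20.0e-9        // hypothetical 20 ns echo

// The pulse travels to the object and back, so halve the path length.
let distance = speedOfLight * roundTripTime / 2.0
print(String(format: "Estimated distance: %.2f m", distance))  // ~3.00 m
```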

Thanks to the new LIDAR sensor, the iPad Pro can map its surroundings far better and "read" where everything around it is located, with the iPad itself as the center of the mapped area. That matters above all for applications and features built for augmented reality: they can understand the surrounding space better, place virtual content far more precisely, and make fuller use of the real space into which augmented-reality objects are projected.
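As a rough illustration of how developers can tap into this, ARKit exposes the LIDAR-based mapping through its scene-reconstruction option. The sketch below is only a minimal outline, assuming an existing ARKit session supplied by the app (rendering and app setup are omitted).

```swift
import ARKit

// Minimal sketch: enable LiDAR-based scene reconstruction in ARKit.
// Assumes an ARSession provided elsewhere (e.g. by an ARView or ARSCNView).
func startScanning(with session: ARSession) {
    let configuration = ARWorldTrackingConfiguration()

    // Scene reconstruction is only available on LiDAR-equipped devices,
    // such as the 2020 iPad Pro.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        configuration.sceneReconstruction = .mesh
    }

    // Plane detection still works alongside the reconstructed mesh.
    configuration.planeDetection = [.horizontal, .vertical]

    session.run(configuration)
}
```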

For now, the LIDAR sensor does not have many uses, since augmented reality is still exploited only to a limited extent in applications. It is, however, precisely this sensor that should help AR apps improve significantly and spread among ordinary users. On top of that, LIDAR sensors can be expected to make their way into the new iPhones as well, which would greatly enlarge the user base and give developers all the more motivation to build new AR applications. And that is something we can all benefit from.
