To further its augmented reality plans, Apple is reportedly developing a rear-facing 3D sensor for a 2019 iPhone that would join the front-facing TrueDepth system that premiered with the iPhone X, AppleInsider reports.
This would be a new technology, distinct from the TrueDepth camera system housed in the front notch of the iPhone X. TrueDepth relies on a structured-light technique that projects a pattern of 30,000 infrared laser dots onto a user's face and measures the distortion of that pattern to generate an accurate 3D image for authentication.
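The structured-light approach recovers depth by triangulation: a projected dot that lands on a nearer or farther surface appears shifted in the camera image, and that shift (disparity) encodes distance. The sketch below is purely illustrative, not Apple's implementation; the baseline, focal length, and disparity values are hypothetical numbers chosen for a clean result.

```python
# Illustrative sketch of structured-light triangulation (not Apple's code).
# A dot projected from one point and observed by a camera a known baseline
# away appears displaced in the image; depth follows from that disparity.

def depth_from_disparity(baseline_m: float, focal_px: float, disparity_px: float) -> float:
    """Classic triangulation: depth = baseline * focal_length / disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return baseline_m * focal_px / disparity_px

# Hypothetical numbers: a 2.5 cm projector-camera baseline, a 600 px focal
# length, and a dot shifted by 30 px correspond to a surface 0.5 m away.
print(depth_from_disparity(0.025, 600, 30))  # 0.5
```

Because depth is inversely proportional to disparity, the technique is most precise at close range, which is one reason structured light suits face-scanning distances.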
According to a Bloomberg report, the company plans to use a "time-of-flight" method in the 2019 iPhone, which gauges distance by timing how long a laser pulse takes to bounce off surrounding objects and return to the sensor.
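The time-of-flight principle reduces to simple physics: distance is half the round trip of a light pulse multiplied by the speed of light. A minimal sketch, again illustrative rather than any vendor's implementation:

```python
# Illustrative time-of-flight calculation (not Apple's code).
# A sensor emits a light pulse and times the round trip to a surface;
# distance is half the round trip multiplied by the speed of light.

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def distance_from_round_trip(round_trip_s: float) -> float:
    """Distance = (speed of light * round-trip time) / 2."""
    if round_trip_s < 0:
        raise ValueError("round-trip time must be non-negative")
    return SPEED_OF_LIGHT * round_trip_s / 2.0

# A round trip of roughly 6.67 nanoseconds corresponds to a surface
# about one metre away.
print(distance_from_round_trip(6.67e-9))
```

Because the sensor times pulses rather than analysing a projected dot pattern, time-of-flight scales more naturally to room-sized scenes, which fits the rear-facing, AR-oriented use the report describes.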
According to Apple, the TrueDepth camera that enables Face ID "brings Portrait mode with Portrait Lighting to the front camera for selfies with a depth-of-field effect and enables Animoji, which captures and analyses over 50 different facial muscle movements to bring emoji to life in a fun new way".
The laser-based 3D sensors are intended for what CEO Tim Cook believes will play a huge part in Apple's future: augmented reality. Potential suppliers include Infineon Technologies AG, Sony Corp., STMicroelectronics NV and Panasonic Corp. A new report, meanwhile, suggests that Apple will introduce a breakthrough feature in the 2019 iPhones.
What Are The Laser-Based 3D Sensors For? Apple appears to be steering the next generation of iPhones toward a more augmented-reality-focused handset. Current AR apps for iOS can place virtual objects in space relative to flat surfaces, but they cannot, for example, place a virtual object behind a real-world item.
According to Bloomberg's sources, Apple has begun contacting prospective suppliers for the new system.