The company plans to launch at least one iPhone 12 model with a VCSEL-based laser system well suited to the depth measurements that feed advanced augmented reality applications.
Citing sources familiar with Apple’s plans, Fast Company reports that the tech giant will use VCSEL lasers from San Jose-based Lumentum to power the iPhone 12’s 3D depth sensor.
Currently, Apple is relying on Lumentum for lasers in TrueDepth, the front camera system that powers Face ID, Animoji and other features on the iPhone and iPad.
Unlike TrueDepth, the next system should calculate depth with time of flight (ToF) technology.
TrueDepth, introduced with the iPhone X in 2017, deduces depth by pairing an infrared VCSEL transmitter and a specialized receiver with a conventional RGB color camera module, measuring deviations in structured light (a projected dot pattern) on a user’s face.
The new system, expected to arrive in 2020, will instead generate a depth map by timing how long pulses of laser light take to bounce off a target and return to the sensor.
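The geometry behind time-of-flight is straightforward: light travels to the target and back, so distance is the speed of light times half the round-trip time. A minimal sketch of that conversion (the function name and sample timing are illustrative, not Apple's implementation):

```python
# Sketch of the basic time-of-flight calculation: a pulse travels to the
# target and back, so the one-way distance is half the round trip.
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_distance(round_trip_seconds: float) -> float:
    """Convert a measured round-trip time into a distance in metres."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse returning after ~6.67 nanoseconds corresponds to roughly 1 metre,
# which illustrates the picosecond-scale timing precision such sensors need.
distance_m = tof_distance(6.67e-9)
```

Repeating this measurement per pixel across the sensor array is what yields a full depth map rather than a single range reading.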
Compared with structured-light systems such as TrueDepth, ToF systems are generally considered more accurate and more practical over long distances. The latter point matters for a system designed to map the user’s surroundings.
As in previous reports, Fast Company believes a rear-facing 3D system will give users more convincing photographic effects, especially in the Portrait and Portrait Lighting modes that separate a subject from its background. AR apps could also benefit from a highly precise depth map.
In a related development, a report earlier this week said Apple is planning to introduce a new augmented reality app in iOS 14, codenamed Gobi. The app is said to let users “get more information about the world around them” through AR.
These rumors suggest that the two high-end iPhones arriving this year will use ToF technology, which may mean it will not be available on lower-end models.