Apple Is Said to Target Rear-Facing 3-D Sensor for 2019 iPhone
Apple is working on a rear-facing 3-D sensor system for the iPhone in 2019, another step toward turning the handset into a leading augmented-reality device, according to Bloomberg, citing people familiar with the plan.
Apple is evaluating a different technology from the one it currently uses in the TrueDepth sensor system on the front of the iPhone X. The existing system relies on a structured-light technique that projects a pattern of 30,000 laser dots onto a user’s face and measures the distortion to generate an accurate 3-D image for authentication. The planned rear-facing sensor would instead use a time-of-flight approach that calculates the time it takes for a laser to bounce off surrounding objects to create a three-dimensional picture of the environment.
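The time-of-flight principle described above can be illustrated with a back-of-the-envelope sketch (this is illustrative only, not Apple's implementation; the function name and example delay are assumptions): the sensor emits a laser pulse, times its round trip to an object and back, and halves the light-travel distance to recover range.

```python
# Illustrative time-of-flight range calculation.
# A pulse travels out to the object and back, so the one-way
# distance is half the total path: d = c * t / 2.
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def distance_from_round_trip(t_seconds: float) -> float:
    """Distance (m) to an object given the pulse's round-trip time (s)."""
    return SPEED_OF_LIGHT * t_seconds / 2.0

# A round-trip delay of roughly 6.67 nanoseconds corresponds
# to an object about one metre away.
print(distance_from_round_trip(6.67e-9))
```

Repeating this measurement across an array of pixels yields a per-pixel depth map, which is how such a sensor builds a three-dimensional picture of a scene.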
The company is expected to keep the TrueDepth system, so future iPhones would have both front- and rear-facing 3-D sensing capabilities. Apple has started discussions with prospective suppliers of the new system, the people said. Companies manufacturing time-of-flight sensors include Infineon, Sony, STMicroelectronics and Panasonic. Testing of the technology is still in its early stages, and it may not end up in the final version of the phone.
While the structured-light approach requires lasers to be positioned very precisely, the time-of-flight technique instead relies on a more advanced image sensor. That may make time-of-flight systems easier to assemble in high volume.
Google has been working with Infineon on depth perception as part of its AR development push, Project Tango, unveiled in 2014. The Infineon chip is already used in Lenovo’s Phab 2 Pro and Asustek’s ZenFone AR, both of which run on Google’s Android operating system.