AppleInsider quotes KGI analyst Ming-Chi Kuo on the upcoming iPhone "Face ID" design. It employs four main components: a structured light transmitter and receiver, a regular front camera, and a proximity ToF sensor:
"
According to Kuo, Apple's system relies on four main components: a structured light transmitter, structure light receiver, front camera and time of flight/proximity sensor.
Kuo points out that structured light transmitter and receiver setups have distance constraints. With an estimated 50 to 100 centimeter hard cap, Apple needs to include a proximity sensor capable of performing time of flight calculations. The analyst believes data from this specialized sensor will be employed to trigger user experience alerts. For example, a user might be informed that they are holding an iPhone too far or too close to their face for optimal 3D sensing."
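A minimal sketch of the range-gating idea Kuo describes: the ToF/proximity sensor's distance estimate decides whether structured-light capture proceeds or the user is prompted to adjust. All names and threshold values below are illustrative assumptions (the upper bound is placed inside Kuo's estimated 50 to 100 cm hard cap), not Apple's implementation.

```swift
// Hypothetical feedback states driven by the ToF proximity reading.
enum FaceSensingFeedback {
    case tooClose   // prompt the user to move the phone farther away
    case tooFar     // prompt the user to bring the phone closer
    case inRange    // structured-light 3D capture can proceed
}

/// Maps a ToF distance reading (in centimeters) to user-facing feedback.
/// The 20 cm lower bound is a guess; the 75 cm upper bound sits inside
/// the estimated 50-100 cm structured-light hard cap.
func feedback(forDistanceCm distance: Double,
              minCm: Double = 20.0,
              maxCm: Double = 75.0) -> FaceSensingFeedback {
    switch distance {
    case ..<minCm:       return .tooClose
    case minCm...maxCm:  return .inRange
    default:             return .tooFar
    }
}

// Example: a 120 cm reading would trigger a "move closer" alert.
print(feedback(forDistanceCm: 120))   // tooFar
```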