Which Sensors Does the iPhone X Use for Face ID?

MADISON, Wis.— Apple is getting rid of touch-based fingerprint ID in iPhone X altogether and staking its future on Face ID.

Putting the new UI in place, Apple is “renewing the [whole] user experience of the smartphone,” observed Pierre Cambou, activity leader for imaging and sensor at market research firm Yole Développement.

With Face ID, the user simply makes “eye contact” with the new iPhone and it unlocks. The iPhone X can also help turn emojis into animated emojis, or Animoji. Sensors can capture and analyze more than 50 different facial muscle movements, enabling the user to change the expression of emoji characters such as a panda, chicken or unicorn. The iPhone X provides a range of on-screen masks, turning the user virtually into someone — or something — else.
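
Those facial-movement measurements are exposed to app developers through Apple’s ARKit face-tracking API, which reports per-muscle “blend shape” coefficients from the TrueDepth camera. The short sketch below is illustrative only; the two coefficients it reads are examples, not a complete Animoji implementation.

import ARKit

// Minimal ARKit face-tracking sketch: read a few of the 50-plus blend-shape
// coefficients (0.0 ... 1.0) that describe the user's facial muscle movements.
final class FaceTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else { return } // needs the TrueDepth camera
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            let jawOpen = face.blendShapes[.jawOpen]?.floatValue ?? 0
            let smile = face.blendShapes[.mouthSmileLeft]?.floatValue ?? 0
            // These coefficients would drive the corresponding features of an
            // animated character such as the panda or unicorn mentioned above.
            print("jawOpen: \(jawOpen), smileLeft: \(smile)")
        }
    }
}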

In short, the many sensors deployed in the iPhone X are there primarily for facial ID, but they also enable other apps, including Animoji and Augmented Reality. Cambou believes this versatility is the genius of Apple. “They know their audience well,” he said. 

A host of sensors in iPhone X
As the picture below shows, the sensors packed into a small space at the top of the iPhone X screen include an infrared camera, flood illuminator, proximity sensor, ambient light sensor, front camera, dot projector, speaker and microphone.

Sensors crammed into iPhone X
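
On the software side, that sensor cluster is presented to developers as a single “TrueDepth” capture device. The following snippet, included purely as an illustration rather than anything described in the article, shows the standard AVFoundation query for checking whether the front-facing TrueDepth camera is present.

import AVFoundation

// Ask AVFoundation for the front-facing TrueDepth camera, i.e. the packaged
// combination of IR camera, dot projector, flood illuminator and RGB camera.
if let trueDepth = AVCaptureDevice.default(.builtInTrueDepthCamera,
                                           for: .video,
                                           position: .front) {
    print("TrueDepth camera available: \(trueDepth.localizedName)")
} else {
    print("No TrueDepth camera on this device")
}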

Cambou acknowledged that he was surprised to see the solution “way more complex than initially envisioned.” Building blocks inside the iPhone X, designed to enable Apple’s TrueDepth camera, include a structured light transmitter, a structured light receiver on the front camera and a time-of-flight/proximity sensor. Cambou said, “Apple managed to get so many technologies, and the players behind those technologies, to work together for a very impressive result.”

Because all these building blocks are interdependent, an active alignment process must take place among the modules before final assembly to ensure accurate operation. Cambou said, “Well done indeed, if they were able to do such complex assembly.”

The Yole analyst suspects that STMicroelectronics is supplying the infrared camera and the proximity sensor. Apple might have sourced the front camera and the dot projector from AMS, he added.

While admitting that Apple isn’t, after all, using “ST’s SPAD imager as I dreamed” in the iPhone X, Cambou conceded, “Apple combined admirably all the available technologies.”

How it works
3D sensing in the iPhone X starts with the ToF (time-of-flight) sensor. Describing the ToF sensor as “more or less a presence detector,” Cambou explained that it powers up the other sensors once it detects motion. Next comes structured light, which calculates the depth and surface information of the objects in the scene.
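
The staged sequence Cambou describes, in which a low-power presence detector wakes the rest of the TrueDepth stack, can be sketched roughly as follows. The types and function names are hypothetical; they only illustrate the ordering of the stages, not Apple’s actual implementation.

// Hypothetical sketch of the staged wake-up flow; none of these types exist in Apple's SDKs.
struct DepthFrame {
    let depthMap: [[Float]]   // per-pixel distance estimates, in meters
}

final class TrueDepthPipeline {
    // Stage 1: the time-of-flight/proximity sensor acts as a presence detector.
    func presenceDetected() -> Bool {
        // On the real device this is a low-power ToF measurement.
        return true
    }

    // Stage 2: flood illuminator plus IR camera capture a reference infrared image.
    func captureInfraredImage() {
        // Power up the flood illuminator and read out the IR camera.
    }

    // Stage 3: dot projector plus IR camera recover depth via structured light.
    func captureStructuredLightDepth() -> DepthFrame {
        // Project the IR dot pattern, image it, and triangulate every dot.
        return DepthFrame(depthMap: [])
    }

    func unlockAttempt() {
        guard presenceDetected() else { return }   // stay in low power until motion is seen
        captureInfraredImage()
        let depth = captureStructuredLightDepth()
        _ = depth   // hand the depth map and IR image to the face-matching step (not shown)
    }
}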

Asked about the role of the dot projector, Cambou explained, “One needs to project infrared dots in the scene for a structured light camera… so that the infrared camera from ST can pick up the image of the projected dots.”
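
Depth recovery from those dots comes down to triangulation: each projected dot shifts sideways in the infrared image by a disparity that depends on how far away the reflecting surface is. The sketch below uses assumed values for the projector-to-camera baseline and the IR camera’s focal length; Apple does not publish these figures.

// Minimal structured-light triangulation sketch with assumed, illustrative parameters.
// Standard relation: depth = focalLength * baseline / disparity.
let focalLengthPixels = 1400.0   // assumed IR camera focal length, in pixels
let baselineMeters = 0.02        // assumed projector-to-camera baseline, roughly 2 cm

func depth(forDisparityPixels disparity: Double) -> Double? {
    guard disparity > 0 else { return nil }   // dot not detected or effectively at infinity
    return focalLengthPixels * baselineMeters / disparity
}

// Example: a dot displaced by 70 pixels maps to a surface about 0.4 m away.
if let d = depth(forDisparityPixels: 70) {
    print(String(format: "Estimated depth: %.2f m", d))
}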
