Apple’s AR headset has been at the center of a whirlwind of rumors lately. Apple analyst Ming-Chi Kuo has now offered additional details on the device, this time predicting that it could feature advanced hand gesture detection.
Kuo claims that Apple’s mixed reality headset will be equipped with four sets of 3D sensors, as opposed to the iPhone’s single unit, enabling the device to detect objects and capture gestures more accurately. Its precision could even rival that of the iPhone’s TrueDepth camera, which powers Face ID. Kuo also says the headset’s sensors will be able to detect objects at greater distances thanks to an increased field of view, reportedly up to 200 percent wider than that of the current Face ID sensors.
The Apple analyst says that the headset will be able not only to detect the general position of objects, but also to capture their dynamic details. He notes:
We predict that the structured light of the AR/MR headset can detect not only the position change of the user or other people’s hand and object in front of the user’s eyes but also the dynamic detail change of the hand (just like the iPhone’s Face ID/structured light/Animoji can detect user’s dynamic expression change). Capturing the details of hand movement can provide a more intuitive and vivid human-machine UI (for example, detecting the user’s hand from a clenched fist to open and the balloon [image] in hand flying away).
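For context, this kind of fist-to-open classification is already possible in software on current Apple devices via the Vision framework’s hand-pose request. The sketch below is purely illustrative and has nothing to do with the headset’s unannounced hardware: the fingertip-to-wrist distance heuristic and the 0.25 cutoff are assumptions chosen for the example, not tuned values.

```swift
import Vision
import CoreGraphics

/// Illustrative open-hand vs. clenched-fist classifier built on Apple's
/// existing Vision hand-pose API (VNDetectHumanHandPoseRequest, iOS 14+).
enum HandState { case open, fist, unknown }

func classifyHand(in image: CGImage) -> HandState {
    let request = VNDetectHumanHandPoseRequest()
    request.maximumHandCount = 1

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    guard (try? handler.perform([request])) != nil,
          let hand = request.results?.first,
          let points = try? hand.recognizedPoints(.all),
          let wrist = points[.wrist], wrist.confidence > 0.3 else {
        return .unknown
    }

    // Average normalized distance from each fingertip to the wrist:
    // large when the hand is open, small when it is clenched.
    let tips: [VNHumanHandPoseObservation.JointName] =
        [.thumbTip, .indexTip, .middleTip, .ringTip, .littleTip]
    let distances = tips.compactMap { name -> CGFloat? in
        guard let tip = points[name], tip.confidence > 0.3 else { return nil }
        return hypot(tip.location.x - wrist.location.x,
                     tip.location.y - wrist.location.y)
    }
    guard !distances.isEmpty else { return .unknown }

    let mean = distances.reduce(0, +) / CGFloat(distances.count)
    return mean > 0.25 ? .open : .fist   // 0.25 is an assumed cutoff
}
```

Running a classifier like this on successive frames is one simple way an app could trigger an effect, such as the balloon animation Kuo mentions, the moment the detected state flips from fist to open.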
Besides hand gesture detection, the device is rumored to feature iris recognition, eye tracking, voice control, skin detection, spatial detection, and even facial expression detection. We previously reported that the headset could sport more than a dozen cameras for tracking eye and hand movements.
Yesterday, Kuo suggested that Apple’s headset could weigh less than one pound. He also hinted that a lighter second-generation AR headset is in the works, with a purported launch date of 2024.
[Via 9to5Mac]