Apple is reportedly set to present its AR/VR headset in 2022, but it won't be released to the public before early 2023.
Analyst Ming-Chi Kuo has presented more details about the upcoming system in an investor note.
The glasses will be able to detect hand gestures precisely, in a way similar to how the TrueDepth camera works (when it maps facial expressions to Animoji, for example).
However, the system in the glasses will be far more capable; for example, it will be able to measure distance much more precisely.
Instead of a single TrueDepth system like in the iPhone, Apple intends to install four 3D-sensing modules in its glasses. These will recognize not only the exact position but also the movement of objects in front of them, ideal for tracking hand movements.
In addition to the 3D sensors, Apple's future glasses are expected to track eye movements, detect pupils, and distinguish skin. Technologies such as voice control, room recognition (like the lidar scanner in recent iPads), and facial-expression recognition (as with Animoji) will also feature.
According to Kuo's information, the first generation of AR glasses will weigh up to 400 grams. Apple plans to reduce the weight and improve the battery and processors with the second generation; however, it will not appear before 2024.
That Apple may rely on gesture control for the upcoming AR glasses sounds futuristic, but it is nothing new for the company. watchOS 8 already includes gesture-based operation: with the setting activated, you can use hand gestures to dismiss alarms, scroll forwards and backwards on the display with your thumb and index finger, and much more.
Apple doesn't rely on the TrueDepth system for this, but rather on the motion and acceleration sensors of the Apple Watch.
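As a rough illustration of the principle behind sensor-based gesture detection (not Apple's actual algorithm), a simple threshold on accelerometer magnitude can already distinguish a sudden hand movement from a resting wrist. The sample values and threshold below are invented for the sketch:

```python
import math

def detect_gesture(samples, threshold=1.5):
    """Flag a gesture when the accelerometer magnitude (in g)
    spikes above a threshold. Purely illustrative; real systems
    use far more sophisticated models over sensor streams."""
    return any(math.sqrt(x * x + y * y + z * z) > threshold
               for x, y, z in samples)

# Synthetic (x, y, z) accelerometer readings in g
resting = [(0.0, 0.0, 1.0)] * 5          # gravity only
clench = resting + [(0.9, 1.1, 1.4)]     # brief spike from a hand clench

print(detect_gesture(resting))  # False
print(detect_gesture(clench))   # True
```

In practice, Apple's gesture recognition on the Watch is understood to combine accelerometer and gyroscope data with on-device machine learning rather than a fixed threshold, but the flow of reading motion samples and classifying them is the same.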
For more info read: What will Apple's AR/VR glasses do?