Apple Vision Pro’s incredible engineering effort to craft a whole new AR operating environment starts and ends with your eyes. From highly precise eye-tracking that works like a virtual mouse pointer to external sensors that track your hand movements for input, Apple’s spatial computing future puts a lot of focus on where you look. Check out Tim Cook’s intro of the Vision Pro in the video below…
Not just this: the following three patented technologies are central to making the Vision Pro experience come to life like nothing you’ve seen in any other AR/XR headset to date:
1) Optic ID: This patented technology uses a person’s iris, which is unique to every individual, as part of a new secure authentication system. The Vision Pro headset’s internal, invisible eye-tracking LEDs and cameras capture the wearer’s “Optic ID” and encrypt it on-device — very similar to how Face ID works on most iPhones.
This data remains encrypted on the Vision Pro headset and is unreadable by anyone else, whether that’s apps and websites you access through the Vision Pro or even Apple itself.
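For developers, on-device biometric checks like this are typically exposed through Apple’s LocalAuthentication framework rather than raw iris data. Below is a minimal, hypothetical sketch of how a visionOS app might gate a feature behind Optic ID; the prompt string and the surrounding function are illustrative assumptions, and the app never sees the biometric data itself — only a success or failure result:

```swift
import LocalAuthentication

// Hypothetical helper: ask the system to verify the wearer via biometrics
// (Optic ID on Vision Pro). The OS handles the scan; the app only gets a
// pass/fail callback and never touches the underlying iris data.
func unlockSecureNotes(completion: @escaping (Bool) -> Void) {
    let context = LAContext()
    var error: NSError?

    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                                    error: &error) else {
        // Biometrics unavailable (e.g. not enrolled); fail closed here.
        completion(false)
        return
    }

    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your secure notes") { success, _ in
        completion(success)
    }
}
```

The design point is the same one Apple makes about Face ID: the sensitive template stays inside the device’s secure hardware, and apps only ever receive the authentication verdict.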
2) EyeSight: This is another incredible innovation on the Apple Vision Pro headset. The headset isn’t actually see-through, yet it can sense the user’s surroundings and detect when another person in their field of vision tries to interact with them. When that happens, the Vision Pro displays a rendering of the wearer’s eyes on the external display on the front of the headset.
Basically, it’s engineered to such a level that the front display recreates the wearer’s eye movements realistically, making anyone else in the room feel like they’re looking directly into the Vision Pro wearer’s eyes even though they aren’t.
3) visionOS: In true Apple style, the Vision Pro comes with its own spatial computing operating system, called visionOS. It’s a sophisticated 3D environment that makes screens, interfaces, and digital actions inside the Vision Pro’s extended-reality canvas feel like they’re present in the user’s actual physical space.
How does it do that? By constantly adapting to changes in natural light and to the user’s viewing position and angle, and by casting virtual shadows on nearby walls and floors to enhance the depth effect.
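From a developer’s perspective, visionOS builds on familiar frameworks like SwiftUI, where an app can opt into a 3D presentation in the user’s space. The sketch below is a minimal, assumed example (the app and text content are made up) showing how a visionOS app might request a volumetric window, i.e. a piece of UI that occupies real 3D space rather than a flat pane:

```swift
import SwiftUI

// Hypothetical minimal visionOS app. The volumetric window style asks the
// system to present this content as a 3D volume in the wearer's room,
// where visionOS handles lighting and shadow integration automatically.
@main
struct HelloSpatialApp: App {
    var body: some Scene {
        WindowGroup {
            Text("Hello, spatial world")
                .font(.extraLargeTitle)
        }
        .windowStyle(.volumetric)
    }
}
```

The app just declares its content; visionOS takes care of anchoring it in the room and rendering the depth cues described above.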
Of course, this article doesn’t even begin to scratch the surface of all the engineering breakthroughs packed inside the Apple Vision Pro headset, but hopefully it gives you some idea of how Apple is taking eye-tracking technology to a whole new level.
Are you excited to try the Apple Vision Pro when it launches next year? Let us know in the comments below.