Apple Vision Pro Headset: what does it do and will it deliver?
Apple recently unveiled its Vision Pro headset at the Worldwide Developers Conference in California. With it, Apple is venturing into the market for head-mounted devices (HMDs) – which are usually just displays but, in this case, amount to a complete computer attached to your head – as well as the worlds of virtual reality (VR), augmented reality (AR) and mixed reality (MR).
The new Apple product will fuel the hopes of many working on these technologies that they will some day be routinely used by the public, just as the iPhone, iPad and Apple Watch helped bring smartphones, tablets and wearable tech into mainstream use.
But what does the Vision Pro actually do, and how much mass appeal will it have?
VR immerses users in an entirely computer-generated world, isolating them to a large degree from their physical surroundings. AR superimposes computer-generated elements onto the real world while the real world remains visible, with the purpose of enhancing the context of our physical surroundings.
A term often used interchangeably with AR is mixed reality, referring to a set of immersive technologies, including AR, that provide different “blends” of physical and virtual worlds. These three technologies are often collectively referred to as XR (extended reality).
The blending of VR and AR seems to be a key part of Apple’s thinking, with the Vision Pro allowing users to adjust their level of immersion by deciding how much of the real world they can see. This transitioning between the two experiences will probably be a trend for future HMDs.
Apple’s CEO Tim Cook was at the unveiling at the Apple Worldwide Developers Conference (WWDC) in California. JOHN G. MABANGLO / EPA IMAGES
The physical world is “seen” through an array of 12 cameras located behind a ski-goggle-like glass fascia, acting as a lens. When the Vision Pro is in VR mode, people approaching you in the real world are automatically detected and displayed as they get close.
A feature called EyeSight also displays the wearer’s eyes through the glass lens when needed, to enable more natural interaction with the people around them – a challenge for many HMDs.
In terms of technical specifications, the Vision Pro is impressive. It uses a combination of the M2 microchip and a new chip called the R1. The M2 runs visionOS – which Apple calls its first spatial operating system – along with computer vision algorithms and computer graphics generation.
The R1 processes information from the cameras, an array of microphones and a LiDAR scanner – which uses a laser to measure distances to different objects – in order to make the headset aware of its surroundings.
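The distance measurement a LiDAR scanner performs rests on a simple time-of-flight calculation: a laser pulse travels to an object and back, so the distance is half the round-trip time multiplied by the speed of light. The sketch below illustrates only that principle – it is not Apple’s R1 pipeline, and the example timing value is purely illustrative.

```python
# Time-of-flight distance estimation, the principle behind LiDAR ranging.
# Illustrative sketch only -- not Apple's actual R1 implementation.

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Distance to an object given the laser pulse's round-trip time.

    The pulse travels to the object and back, so the one-way
    distance is half the total path length.
    """
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A pulse returning after roughly 13.3 nanoseconds corresponds to an
# object about two metres away -- typical indoor range for a headset.
print(round(distance_from_round_trip(13.34e-9), 2))  # → 2.0
```

Because light covers about 30 cm per nanosecond, indoor ranging demands timing precision on the order of nanoseconds or better, which is why this is done in dedicated hardware rather than software.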
More importantly, the Vision Pro boasts an impressive display system with “more pixels than a 4K TV to each eye”. Its ability to track where the wearer’s eyes are looking allows users to interact with graphical elements just by looking at them. The headset can receive gesture and voice commands, and features a form of 360-degree sound called spatial audio. The quoted unplugged operating time is two hours.
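The gaze-based interaction described above amounts to a targeting problem: the eye tracker reports where the user is looking, and the system works out which on-screen element that gaze point falls on, ready for a gesture or voice command to activate it. The following sketch shows the idea with simple 2D bounding boxes; every name in it is hypothetical, and it bears no relation to the actual visionOS API.

```python
# Sketch of gaze-targeted selection: find which element the user is
# looking at, so a subsequent gesture can activate it.
# All names are hypothetical -- this is not the visionOS API.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Element:
    name: str
    x: float        # left edge of bounding box
    y: float        # top edge
    width: float
    height: float

    def contains(self, gx: float, gy: float) -> bool:
        """True if the gaze point (gx, gy) falls inside this element."""
        return (self.x <= gx <= self.x + self.width
                and self.y <= gy <= self.y + self.height)

def element_under_gaze(elements: list, gx: float, gy: float) -> Optional[Element]:
    """Return the first element whose bounding box contains the gaze point."""
    for element in elements:
        if element.contains(gx, gy):
            return element
    return None

buttons = [Element("Photos", 0, 0, 100, 40), Element("Safari", 120, 0, 100, 40)]
target = element_under_gaze(buttons, 150, 20)  # eye tracker reports gaze here
if target is not None:
    print(f"Looking at {target.name}; a pinch gesture would activate it.")
```

A real system would cast a 3D gaze ray against scene geometry and smooth the noisy eye-tracking signal, but the core lookup – gaze position in, targeted element out – is the same.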