Apple recently unveiled its Vision Pro headset at the Worldwide Developers Conference in California. With it, Apple is venturing into the market for head-mounted devices (HMDs) – usually just displays, but in this case more like a complete computer attached to your head – as well as the worlds of virtual reality (VR), augmented reality (AR) and mixed reality (MR).
The new Apple product will fuel the hopes of many working on these technologies that they will one day be routinely used by the public, just as the iPhone, iPad, and Apple Watch helped push smartphones, tablets, and wearable technology into the mainstream.
But what does the Vision Pro actually do and how much appeal will it have?
VR immerses users in a completely computer-generated world and greatly isolates them from their physical environment. AR superimposes computer-generated elements on the real world while keeping it visible, aiming to reinforce the context of our physical environment.
A term often used interchangeably with AR is mixed reality, referring to a range of immersive technologies, including AR, that provide different “blends” of the physical and virtual worlds. These three technologies are often collectively referred to as XR.
The blending of VR and AR seems to be an important part of Apple’s thinking, with the Vision Pro allowing users to adjust their level of immersion by deciding how much of the real world they can see. This ability to transition between the two experiences is likely to become a trend for future HMDs.
Apple CEO Tim Cook attended the unveiling at the Apple Worldwide Developers Conference (WWDC) in California. Photo: EPA Images via The Conversation/John G Mabanglo
The physical world is “seen” through an array of 12 cameras located behind a goggle-like glass fascia, acting as a lens. When the Vision Pro is in VR mode, people approaching you in the real world will be automatically detected and displayed when they get close.
A feature called EyeSight also displays the wearer’s eyes through the glass lens when needed, to allow for more natural interaction with those around them – a challenge for many HMDs.
In terms of technical specifications, the Vision Pro is impressive. It uses a combination of the M2 microchip and a new chip called the R1. The M2 runs visionOS, which Apple calls its first spatial operating system, along with algorithms for computer vision and computer graphics generation.
The R1 processes information from the cameras, an array of microphones and a LiDAR scanner – which uses a laser to measure distances to various objects – to make the headset aware of its surroundings.
More importantly, the Vision Pro boasts an impressive display system with “more pixels than a 4K TV for each eye”. The ability to track where the wearer’s eyes are looking allows users to interact with graphical elements simply by looking at them.
The headset can receive gesture and voice commands and features a form of 360-degree sound called spatial audio. The stated operating time on battery, without being plugged in, is two hours.
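To make this concrete, here is a minimal, hypothetical sketch of how that look-and-pinch input could surface to an app built with Apple’s SwiftUI framework. The view name and labels are illustrative assumptions rather than anything from Apple’s documentation; the point is that the system turns the wearer’s gaze plus a pinch into an ordinary tap, so a standard control responds without the developer handling raw eye-tracking data.

```swift
import SwiftUI

// Hypothetical example view: on the headset, looking at the button highlights it,
// and a pinch of the fingers reaches the app as a normal tap.
struct GreetingView: View {
    @State private var tapCount = 0

    var body: some View {
        VStack(spacing: 20) {
            Text("Looked-and-pinched \(tapCount) times")
            Button("Look at me and pinch") {
                tapCount += 1   // runs when the gaze-plus-pinch "tap" lands on the button
            }
            .hoverEffect()      // highlights whichever control the wearer is currently looking at
        }
        .padding()
    }
}
```

Because the system resolves the gaze and delivers only the resulting tap, the app reportedly never sees where the wearer is actually looking – a design choice that keeps raw eye-tracking data away from developers.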
Wearable ‘ecosystem’
Wrapped, in typical Apple fashion, in curved aluminum and glass, the headset carries an eye-watering price of US$3,499 and packs in a host of premium features. But Apple has a history of developing products with increasingly versatile capabilities for sensing what is happening in their real-world environment.
Tim Cook (L) and Apple Senior VP of Software Engineering Craig Federighi speak at the conference keynote address. Photo: John G Mabanglo/EPA Images via The Conversation
Apple also focuses on making its devices interoperable – meaning they work easily with other Apple devices – creating a wearable “ecosystem”. This is what promises to be really disruptive about the Vision Pro. It is also akin to what pioneers of wearable computing promised and hoped for in the 1990s.
By combining the headset with the iPhone, which is still the backbone of the Apple ecosystem, and the Apple Watch, new applications for augmented reality can be created. Similarly, the headset’s compatibility with many of Apple’s existing programming tools demonstrates the company’s desire to tap into an established community of augmented reality application developers.
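As a rough illustration of what those tools look like, the sketch below shows a minimal headset app using SwiftUI and RealityKit, the frameworks Apple points developers towards. The app name, scene identifier and sphere content are illustrative assumptions, not code from Apple’s samples.

```swift
import SwiftUI
import RealityKit

// Hypothetical skeleton of a headset app: a conventional 2D window plus an
// immersive scene that places 3D content in the wearer's room.
@main
struct DemoApp: App {
    var body: some Scene {
        // A familiar 2D window, much like an iPad app's.
        WindowGroup {
            Text("Hello, Vision Pro")
                .padding()
        }

        // An immersive scene for 3D content around the wearer.
        ImmersiveSpace(id: "DemoImmersiveSpace") {
            RealityView { content in
                // Place a small sphere roughly at eye height, one metre ahead.
                let sphere = ModelEntity(
                    mesh: .generateSphere(radius: 0.1),
                    materials: [SimpleMaterial(color: .white, isMetallic: false)]
                )
                sphere.position = [0, 1.2, -1]
                content.add(sphere)
            }
        }
    }
}
```

In practice the immersive scene would typically be opened from the 2D window via SwiftUI’s openImmersiveSpace action; the broader point is that the same tools used for iPhone and iPad apps extend to the headset.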
However, many questions remain. For example, will it be able to access mixed reality applications through a web browser? What will it be like from an ergonomic point of view?
It’s also unclear when the Vision Pro will be available outside the US or if there will be a non-Pro version – as the “Pro” part of the title implies a more “expert” or developer market.
The Vision Pro is a gamble, as XR is often seen as a technology that promises much but rarely delivers. Still, companies like Apple and its likely main competitors in the XR domain, Meta and Microsoft, have the power to bring XR to the general public.
More importantly, devices such as the Vision Pro and its ecosystem, along with those of its competitors, could form the basis for developing the metaverse. This is an immersive world, facilitated by headsets, that aims to offer more natural social interaction than previous products have managed.
Skeptics will say that Vision Pro and EyeSight make you look like a diver in your living room. But now could finally be the time to dive into the deep waters of XR.
Panagiotis Ritsos, Senior Lecturer in Visualization, Bangor University, and Peter Butcher, Lecturer in Human-Computer Interaction, Bangor University
This article is republished from The Conversation under a Creative Commons license. Read the original article.