Decoding the Science behind Apple’s Vision Pro Mixed-Reality Headset



7 June 2023, Bengaluru, India

Apple unveiled its long-awaited “Vision Pro” mixed-reality headset on Monday, the tech giant’s first major product launch since the Apple Watch in 2014. When it ships in early 2024, the headset will cost $3,499, and it is aimed at developers and content creators rather than everyday users. Futuristic as it may sound, the device could usher in a new era for Apple and the entire industry.

Simply put, Apple’s Vision Pro adds a digital layer to your real-world surroundings, bringing the digital into the physical. The headset, which resembles a pair of ski goggles, carries the Apple experience you know from iPhones and Mac computers into the world around you.

The Vision Pro follows in the footsteps of many previous Apple devices: a great deal of complex technology wrapped in a simple user interface and experience.

“Creating our first spatial computer required innovation in practically every facet of the system. Through the integration of hardware and software, we built a stand-alone spatial computer in a compact wearable form factor that is the finest personal electronics device ever,” said Mike Rockwell, vice president of Apple’s Technology Development Group, in a news release.

According to TechCrunch, the Vision Pro will feature 23 sensors and input devices in total: 12 cameras, five other sensors, and six microphones. Together with its ground-breaking R1 chip, two internal displays (one for each eye), and an advanced lens system, they create the illusion that the user is looking at the real world when they are in fact watching a “live feed” of their surroundings with digital content overlaid on top.
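To make the idea of a “live feed” with an overlay on top more concrete, here is a minimal Swift sketch of a single compositing step, in which one camera frame is blended with rendered app content before being shown to an eye. The types and the simple alpha blend are illustrative assumptions, not Apple’s visionOS APIs; the real headset performs this per eye, per frame, in dedicated hardware.

```swift
import Foundation

// Hypothetical sketch of a passthrough compositing step: a live camera frame
// is blended with rendered app content before being shown on a per-eye display.
// These types are illustrative only, not Apple's actual visionOS APIs.

struct CameraFrame {
    let timestamp: TimeInterval   // when the outward-facing camera captured the frame
    let pixels: [UInt8]           // placeholder for raw image data
}

struct OverlayLayer {
    let pixels: [UInt8]           // placeholder for rendered app content
    let opacity: Double           // 0 = pure passthrough, 1 = overlay fully covers the feed
}

/// Blend a live camera frame with rendered app content (simple alpha blend).
func composite(frame: CameraFrame, overlay: OverlayLayer) -> [UInt8] {
    zip(frame.pixels, overlay.pixels).map { pair in
        let (cam, ui) = pair
        return UInt8(Double(cam) * (1 - overlay.opacity) + Double(ui) * overlay.opacity)
    }
}
```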


Apple claims the R1 chip was created to “eliminate lag” and, with it, motion sickness. The device also includes the more familiar M2 chip for the remaining computing tasks, powering the apps you use with it.

To imitate how your view of your surroundings changes as you move, infrared cameras built into the headset monitor your eyes, and the internal displays are adjusted accordingly.
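One plausible use of that eye tracking is gaze-dependent (“foveated”) rendering: full detail where the eyes are pointed, less in the periphery. The sketch below is only an illustration of the idea, with made-up tile coordinates and thresholds rather than anything Apple has documented.

```swift
import Foundation

// Illustrative sketch of gaze-dependent ("foveated") rendering: screen tiles near
// where the eyes are looking get full detail, the periphery gets less.
// The coordinate scheme and thresholds are assumptions, not Apple's implementation.

struct GazePoint { let x: Double; let y: Double }   // normalized 0...1 screen coordinates

enum RenderQuality { case full, half, quarter }

/// Choose how finely to render a screen tile, given where the user is looking.
func quality(forTileAt tileX: Double, _ tileY: Double, gaze: GazePoint) -> RenderQuality {
    let dx = tileX - gaze.x
    let dy = tileY - gaze.y
    let distance = (dx * dx + dy * dy).squareRoot()
    switch distance {
    case ..<0.1: return .full      // fovea: full resolution
    case ..<0.3: return .half      // near periphery
    default:     return .quarter   // far periphery
    }
}
```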

The headset also has external cameras that point downward. These track your hands so that you can use gestures to interact with what you see. Additionally, the Vision Pro’s external LiDAR sensors continuously track the positions of surrounding objects.
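As a rough illustration of how tracked hands become gestures, the sketch below recognizes a “pinch” when the thumb tip and index fingertip come within about a centimetre of each other. The joint structure and the threshold are assumptions made for this example, not Apple’s hand-tracking API.

```swift
import Foundation

// Minimal sketch of gesture detection from tracked hand joints: a "pinch" is
// recognized when the thumb tip and index fingertip come close together.
// The joint data and threshold are hypothetical, not Apple's hand-tracking API.

struct Joint3D { let x: Double; let y: Double; let z: Double }   // position in metres

func distance(_ a: Joint3D, _ b: Joint3D) -> Double {
    let dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z
    return (dx * dx + dy * dy + dz * dz).squareRoot()
}

/// Returns true when the two fingertips are within ~1 cm of each other.
func isPinching(thumbTip: Joint3D, indexTip: Joint3D, threshold: Double = 0.01) -> Bool {
    distance(thumbTip, indexTip) < threshold
}
```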

Though we perceive and live in a three-dimensional world, each of our eyes captures only a two-dimensional image. Our brains have simply learned to infer depth: they take the two slightly different images from the left and right eyes and process them into what we experience as depth.

The two screens in the Vision Pro most likely exploit this processing by showing each eye a slightly different image, deceiving the brain into believing it is viewing a three-dimensional scene. Once the brain has been fooled, so has the person, and presto! The user is now experiencing 3D.
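A quick worked example shows why the offset between the two images encodes distance: the angle between the left- and right-eye lines of sight to a point shrinks as that point moves farther away. The sketch below assumes a typical ~63 mm eye separation, a human average rather than a Vision Pro specification.

```swift
import Foundation

// Worked example of binocular depth cues: the vergence angle between the two
// eyes' lines of sight to a point decreases with the point's distance.
// The 63 mm eye separation is a typical human value, not a Vision Pro figure.

/// Angle (in radians) between the two eyes' views of a point `depth` metres away,
/// for eyes `ipd` metres apart.
func vergenceAngle(depth: Double, ipd: Double = 0.063) -> Double {
    2 * atan((ipd / 2) / depth)
}

for depth in [0.5, 1.0, 2.0, 4.0] {
    let degrees = vergenceAngle(depth: depth) * 180 / Double.pi
    print("Point at \(depth) m -> about \(String(format: "%.2f", degrees))° between the two views")
}
```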

[Source of Information: indianexpress.com]



Suraj Verma
