
The Vision Pro introduces a new piece of Apple silicon, the R1 chip, dedicated to processing real-time data from all onboard sensors. It's responsible for eye, hand, and head tracking, lag-free rendering of the user's environment in video passthrough mode, and other visionOS features.

By offloading the processing burden from the main processor and optimizing performance, the R1 cuts lag to imperceptible levels, minimizing motion sickness whether you use the headset in augmented reality or virtual reality mode. Let's explore how the Apple R1 chip works, how it compares to the main M2 chip, which Vision Pro features it enables, and more.

What Is Apple's R1 Chip? How Does It Work?

The Apple R1, not the main M2 chip, processes the continuous stream of real-time data fed to the Vision Pro by its twelve cameras, five sensors, and six microphones.

Highlights of the external sensors, cameras and microphones on the Vision Pro headset
Image Credit: Apple

Two main external cameras record your world, pushing over a billion pixels to the headset's 4K screens each second. On top of that, a pair of side cameras, along with two bottom-mounted cameras and two infrared illuminators, track hand movement from a wide range of positions—even in low‑light conditions.
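For a sense of scale, a quick back-of-the-envelope calculation shows how two cameras can reach that figure. The per-camera resolution and frame rate below are illustrative assumptions for the arithmetic, not published specifications:

```swift
// Rough throughput estimate. The resolution and frame rate here are
// illustrative guesses, not Apple's published camera specs.
let megapixelsPerCamera = 6.5      // assumed sensor resolution
let framesPerSecond = 90.0         // assumed capture rate
let cameraCount = 2.0

let pixelsPerSecond = megapixelsPerCamera * 1_000_000 * framesPerSecond * cameraCount
print("≈ \(pixelsPerSecond / 1_000_000_000) billion pixels per second")  // ≈ 1.17
```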

The outward-facing sensors also include the LiDAR Scanner and Apple's TrueDepth camera that capture a depth map of your surroundings, enabling Vision Pro to position digital objects accurately in your space. On the inside, a ring of LEDs around each screen and two infrared cameras track your eye movement, which forms the basis of visionOS navigation.

The R1 is tasked with processing data from all those sensors, as well as the inertial measurement units, with imperceptible delay. This is critical to making the spatial experience smooth and believable.
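Apple hasn't published the R1's fusion algorithms, but the core idea behind sensor fusion can be sketched with a textbook complementary filter, which blends fast-but-drifting gyroscope readings with noisy-but-stable accelerometer angles. This is a simplified illustration, not Apple's implementation:

```swift
import Foundation

// Toy complementary filter for one axis of head rotation. Gyro rates
// integrate quickly but drift over time; the accelerometer-derived angle
// is noisy but drift-free. Blending the two gives a stable estimate.
struct ComplementaryFilter {
    var angle = 0.0      // estimated orientation, radians
    let alpha = 0.98     // how much to trust the integrated gyro signal

    mutating func update(gyroRate: Double, accelAngle: Double, dt: Double) {
        let integrated = angle + gyroRate * dt
        angle = alpha * integrated + (1 - alpha) * accelAngle
    }
}

var filter = ComplementaryFilter()
// One fused sample: gyro reports 0.1 rad/s, the accelerometer reads
// 0.02 rad, and 1 ms has passed since the last sample (1,000Hz updates).
filter.update(gyroRate: 0.1, accelAngle: 0.02, dt: 0.001)
print(filter.angle)
```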

How Does the Apple R1 Compare to M1 and M2?

Woman talking to coworkers on FaceTime while viewing a presentation in augmented reality on Apple Vision Pro headset
Image Credit: Apple

The M1 and M2 are general-purpose processors optimized for Mac computers. The R1 is a narrow-focus coprocessor designed to support smooth AR experiences, and it does that job faster than either the M1 or M2 could, enabling a lag-free experience.

Apple hasn't specified how many CPU and GPU cores the R1 has, nor detailed its clock speed or RAM, making a direct comparison between the R1, M1, and M2 difficult.

The R1's primary domains are eye and head tracking, hand gestures, and real-time 3D mapping via the LiDAR sensor. Offloading those computationally intensive operations lets the M2 run the various visionOS subsystems, algorithms, and apps efficiently.
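As a loose software analogy (and only an analogy; the R1 is dedicated hardware), the division of labor resembles moving time-critical work off an app's main thread onto a dedicated high-priority queue:

```swift
import Foundation

// The dedicated "tracking" queue plays the R1's role here, churning through
// sensor samples at high priority so the main thread (the "M2") stays free
// for the app. Purely a software analogy, not a description of the hardware.
let trackingQueue = DispatchQueue(label: "tracking", qos: .userInteractive)

func handleSensorSample(_ reading: Double, completion: @escaping (Double) -> Void) {
    trackingQueue.async {
        let pose = reading * 0.5      // stand-in for real pose computation
        completion(pose)              // hand the result back for rendering
    }
}

let done = DispatchSemaphore(value: 0)
handleSensorSample(0.8) { pose in
    print("Render frame with pose \(pose)")
    done.signal()
}
done.wait()    // keep the script alive until the sample is processed
```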

The Key Features of the Vision Pro's R1 Chip

The R1 has these key capabilities:

  • Fast processing: The specialized algorithms and image signal processing in the R1 are optimized for understanding sensor, camera, and microphone inputs.
  • Low latency: An optimized hardware architecture keeps the delay between sensor input and display output exceptionally low.
  • Power efficiency: The R1 handles a particular set of tasks while using minimal energy, thanks to its effective memory architecture and TSMC's 5nm manufacturing process.

On the downside, the Vision Pro's dual-chip design and the R1's sophistication contribute to the headset's high price and two-hour battery life.

What Perks Does the R1 Bring to the Vision Pro?

The R1 enables precise eye and hand tracking that "just works." To navigate visionOS, for example, you direct your gaze at buttons and other elements.

Closeup of a dotted mapping on the human iris
Image Credit: Apple

The Vision Pro uses hand gestures to select items, scroll, and more. The sophistication and precision of eye and hand tracking have allowed Apple's engineers to create a mixed-reality headset requiring no physical controllers.
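This controller-free model carries over to developers: standard SwiftUI controls on visionOS respond to gaze and pinch automatically. Here's a minimal sketch, with an illustrative view name and labels:

```swift
import SwiftUI

// On visionOS, the system highlights this button when you look at it and
// activates it when you pinch -- no controller or tracking code required.
struct SelectionDemo: View {
    @State private var count = 0

    var body: some View {
        VStack(spacing: 24) {
            Text("Selected \(count) times")
            Button("Select Me") { count += 1 }
                .hoverEffect(.highlight)   // feedback while the user's gaze rests here
        }
        .padding(40)
        .glassBackgroundEffect()           // standard visionOS window treatment
    }
}
```

Notably, visionOS keeps raw gaze data private from apps: the system draws the hover highlight itself and only delivers the final tap.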

The R1's tracking precision and minimal delay enable additional features, like air typing on the virtual keyboard. The R1 also powers reliable head tracking—critical to creating a spatial computing canvas surrounding the user. Again, precision is key here—you want all AR objects to maintain their position no matter how you tilt and turn your head.

Man in Yellow Jacket Using Apple Vision Pro to Browse the Internet
Image Credit: Apple

Spatial awareness is another cornerstone of the experience. The R1 takes depth data from the LiDAR sensor and the TrueDepth camera and performs real-time 3D mapping. That depth information lets the headset understand its environment, such as walls and furniture.

This, in turn, is important for AR persistence, which refers to the fixed placement of virtual objects. It also helps Vision Pro notify the user before they bump into physical objects, helping reduce the risk of accidents in AR applications.
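Apps can tap into this real-time mapping through ARKit on visionOS. Here's a sketch of subscribing to the live scene mesh; the function name is illustrative, and authorization prompts and error handling are omitted:

```swift
import ARKit

// Sketch: observing the headset's live room mesh via ARKit on visionOS.
func observeSceneMesh() async throws {
    let session = ARKitSession()
    let sceneReconstruction = SceneReconstructionProvider()
    try await session.run([sceneReconstruction])

    for await update in sceneReconstruction.anchorUpdates {
        // Each MeshAnchor describes a patch of real-world geometry that an
        // app can use for occlusion, collisions, or anchoring virtual objects.
        print("Mesh \(update.anchor.id): \(update.event)")
    }
}
```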

How Does the R1's Sensor Fusion Mitigate AR Motion Sickness?

The Vision Pro's dual‑chip design offloads sensor processing from the main M2 chip, which runs the visionOS operating system and apps. According to the Vision Pro press release, the R1 streams images from the external cameras to the internal displays within 12 milliseconds, or eight times faster than the blink of an eye, minimizing lag.

Young guy holding the palm of his hand on his forehead, having a headache

Lag refers to the latency between what the cameras see and the images displayed on the headset's 4K screens. The shorter the lag, the better.

Motion sickness happens when there's a perceptible lag between the input your brain receives from your eyes and what your inner ear senses. It can occur in many situations: on amusement park rides, on a boat or cruise, or while using a VR device.

VR can make people sick due to this sensory conflict, triggering motion sickness symptoms such as disorientation, nausea, dizziness, headaches, eye strain, sweating, and vomiting.

VR can also be hard on your eyes because of eye strain, whose symptoms include sore or itchy eyes, double vision, headaches, and a sore neck. Some people feel one or more of these symptoms for several hours after taking the headset off.

As a rule of thumb, a VR device should refresh its display at least 90 times per second (90Hz), and the motion-to-photon delay should stay below 20 milliseconds to avoid causing motion sickness.
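Plugging in the numbers shows how the Vision Pro's stated figures sit against those rules of thumb (the roughly 100 ms blink duration is a common approximation):

```swift
// Checking the stated figures against the rule-of-thumb comfort thresholds.
let frameTimeMs = 1_000.0 / 90.0        // ≈ 11.1 ms between frames at 90Hz
let passthroughLatencyMs = 12.0         // Apple's stated camera-to-display lag
let comfortBudgetMs = 20.0
let blinkMs = 100.0                     // a blink lasts roughly 100 ms

print("Frame time at 90Hz: \(frameTimeMs) ms")
print("Under the 20 ms budget: \(passthroughLatencyMs < comfortBudgetMs)")  // true
print("Blinks per lag window: \(blinkMs / passthroughLatencyMs)")           // ≈ 8.3
```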

At the stated 12 milliseconds, the R1 brings lag down to an imperceptible level. Even so, while the R1 helps minimize the effects of motion sickness, some Vision Pro testers reported motion sickness symptoms after wearing the headset for over 30 minutes.

Specialized Apple Silicon Coprocessors Bring Major Advantages

Apple is no stranger to specialized processors. Over the years, its silicon team has produced mobile and desktop chips that are the envy of the industry.

Apple silicon chips rely heavily on specialized coprocessors to handle specific features. The Secure Enclave securely manages biometric and payment data, for example, while the Neural Engine accelerates AI functions without draining the battery.

They're perfect examples of what a highly focused coprocessor can achieve when given the right set of tasks, versus using the main processor for everything.