Apple Vision Pro Perception Analysis
--
The Apple Vision Pro is equipped with multiple sensors that give it strong depth perception and let it blend virtual content seamlessly with the real world in three-dimensional space. Its visual sensors include RGB cameras, infrared cameras, a dToF LiDAR, a structured-light camera, and fisheye infrared cameras.
The exterior of the Apple Vision Pro features:
- 2 forward-facing RGB cameras for scene capture and VST (Video See-Through) passthrough.
- 4 fisheye infrared cameras, facing sideways and forward, for 6DoF (six degrees of freedom) tracking.
- 2 downward-facing infrared cameras for tracking the torso and hand gestures below the headset.
- 2 infrared illuminators that flood the torso, legs, knees, hands, and the surrounding tracking volume with infrared light, helping the infrared and fisheye infrared cameras capture movement in those areas.
- 1 dToF (direct time-of-flight) LiDAR, similar to the one in the rear camera of iPhone Pro models, supporting 3D capture, spatial reconstruction, depth perception, and positioning.
- 1 structured-light camera, also known as a TrueDepth camera, similar to the front-facing structured-light Face ID system on iPhones. It supports face scanning for FaceTime and precise hand-gesture tracking in the forward area (how apps consume this tracking data is sketched below).
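As a concrete illustration of what this sensor suite feeds, here is a minimal sketch of a visionOS app consuming tracking data through ARKit's data providers. ARKitSession, HandTrackingProvider, and SceneReconstructionProvider are real visionOS APIs; the mapping of each provider to the physical sensors above is our inference, and the surrounding scaffolding is assumed rather than Apple reference code.

```swift
import ARKit

// Minimal sketch: consuming Vision Pro tracking data in a visionOS app.
// The provider-to-sensor mapping in the comments is inferred from the
// hardware list above, not documented by Apple.
@MainActor
func startTracking() async {
    let session = ARKitSession()
    let hands = HandTrackingProvider()        // hand joints (downward IR cameras, presumably)
    let scene = SceneReconstructionProvider() // world mesh (dToF LiDAR depth, presumably)

    do {
        // Hand tracking prompts for user authorization; the app must
        // declare NSHandsTrackingUsageDescription in its Info.plist.
        try await session.run([hands, scene])
    } catch {
        print("ARKit session failed to start: \(error)")
        return
    }

    // Stream hand-anchor updates, e.g. to read fingertip poses for
    // gesture recognition.
    for await update in hands.anchorUpdates {
        let anchor = update.anchor
        guard anchor.isTracked,
              let tip = anchor.handSkeleton?.joint(.indexFingerTip) else { continue }
        // Fingertip pose relative to the hand anchor's origin.
        let pose = tip.anchorFromJointTransform
        _ = pose
    }
}
```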
Inside the Apple Vision Pro are 4 infrared cameras and a ring of LEDs. It is speculated that these form a structured-light system whose light-field information is used for eye tracking and eye-expression analysis.
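Whatever the internal scheme, visionOS keeps raw eye-tracking data private to the system: third-party apps never see where the user is looking, and gaze surfaces only as resolved input events, such as a tap on the entity the user is looking at when they pinch. A minimal sketch of this look-and-pinch pattern follows; the RealityKit and SwiftUI types are real, while the sphere content is purely illustrative.

```swift
import SwiftUI
import RealityKit

// Sketch of the look-and-pinch input pattern on visionOS. Raw gaze data
// is never delivered to the app; the system resolves where the user is
// looking and reports a spatial tap on that entity when they pinch.
struct GazeTapView: View {
    var body: some View {
        RealityView { content in
            let sphere = ModelEntity(mesh: .generateSphere(radius: 0.1))
            // Required so the system can hit-test gaze and pinch
            // input against the entity.
            sphere.components.set(InputTargetComponent())
            sphere.components.set(CollisionComponent(shapes: [.generateSphere(radius: 0.1)]))
            // Subtle system highlight while the user's gaze rests on it.
            sphere.components.set(HoverEffectComponent())
            content.add(sphere)
        }
        .gesture(
            SpatialTapGesture()
                .targetedToAnyEntity()
                .onEnded { value in
                    // Fires when the user looks at the sphere and pinches.
                    print("Tapped: \(value.entity)")
                }
        )
    }
}
```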
Apple's sensor mix sets the Apple Vision Pro apart from other VR manufacturers: it incorporates LiDAR and structured-light sensors, eliminates handheld controllers, and relies on gesture tracking for interaction. Combined with Apple's polished UI design, this has produced outstanding performance compared with similar hardware devices, and early users have praised most of its features.