Headset screen recording: Capturing immersive AR/MR experiences
When developing, testing, or providing user support for augmented reality (AR) or mixed reality (MR) headsets, screen recording is a crucial tool for reproducing and analyzing issues. However, unlike mobile devices, headset recording involves specialized technologies such as passthrough and foveated rendering. The recorded footage often does not fully match the complete stereoscopic view seen directly by the user's eyes. Therefore, when interpreting the recorded content, it is essential to be mindful of the differences between the recording and the actual human-eye experience.
Why is there a visual difference in headset recordings?
Pay special attention: the recorded screen of a headset device is not exactly what you see behind the lenses. If you notice anomalies in the recording, describe them alongside your actual in-headset experience to avoid misleading developers.
Here are the main differences:
- Resolution and clarity differences
- Recording: Typically a single-eye (e.g., left eye) 1080p or lower-resolution video stream.
- Human eye: Depending on the application, modern headsets can achieve single-eye resolutions of 2K or even higher.
- Impact: Blurriness in text or visuals that you see in the recording may not exist in the headset; conversely, minor edge aliasing you notice in the headset may be completely invisible in the recording.
- Field-of-view (FOV) differences
- Recording: The captured footage usually has a roughly square aspect ratio, and content at the edges is often cropped, especially on wide-FOV devices.
- Human eye: You see a binocular image with a noticeably larger field of view.
- Impact: If your issue occurs at the edge of the field of view, the recording may not accurately reflect it.
- Impact of foveated rendering
- Recording: The full rendered frame is captured, including the low-resolution peripheral areas.
- Human eye: The area at the center of your gaze is rendered at high resolution, while the periphery outside your gaze is rendered at lower resolution, which the eye barely notices.
- Impact: You may perceive the image as very clear, but upon reviewing the recording, you might notice some areas are blurry. This is usually normal and not a bug. Therefore, when recording, always focus on the part you want to showcase to ensure that area remains clear.
- Frame rate and refresh rate impact
- Recording: The recorded video typically runs at 30 FPS or 60 FPS.
- Human eye: The display you actually experience may refresh at 90 Hz or 120 Hz.
- Impact: Slight frame drops may be imperceptible in the recording, yet to the eye they cause obvious "stuttering" or "dizziness." If the recording looks smooth but the actual experience did not, this is usually a performance issue, so be sure to note it after recording. A quick way to verify the recording's actual resolution and frame rate is sketched below.
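Before attaching a recording to a report, it can help to confirm what the file itself actually contains. The following is a minimal sketch in Python using OpenCV, assuming the recording has already been copied off the device as an ordinary video file; the file name is a placeholder.

```python
# check_recording.py - inspect a headset screen recording's actual
# resolution, frame rate, and duration before filing a bug report.
# Assumes the recording has been copied to the local machine as an
# ordinary video file; the path below is a placeholder.
import cv2

def describe_recording(path: str) -> None:
    cap = cv2.VideoCapture(path)
    if not cap.isOpened():
        raise FileNotFoundError(f"Could not open video: {path}")
    width = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
    height = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
    fps = cap.get(cv2.CAP_PROP_FPS)
    frame_count = int(cap.get(cv2.CAP_PROP_FRAME_COUNT))
    cap.release()

    duration = frame_count / fps if fps else 0.0
    print(f"Resolution : {width} x {height}")
    print(f"Frame rate : {fps:.2f} FPS")
    print(f"Duration   : {duration:.1f} s ({frame_count} frames)")
    # A 1080p / 30 FPS file cannot show artifacts that only appear at the
    # headset's native per-eye resolution or 90/120 Hz refresh rate, so
    # record those device specs separately in the report.

if __name__ == "__main__":
    describe_recording("screenrecord_sample.mp4")  # placeholder path
```

Quoting these numbers in the report makes it clear to developers how much fidelity the recording can and cannot carry.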
Special notes for OST head-mounted displays
Special attention: If you are using an OST (Optical See-Through) head-mounted display, such as Rokid, Xreal, or other AR glasses, the screen recording feature cannot capture the real-world content visible to the human eye.
What is an OST device?
OST devices project light directly into your eyes through semi-transparent optical lenses, with virtual content overlaid in the field of view via micro-projectors or waveguides, allowing you to see both the real world and superimposed virtual images simultaneously. The external-world camera is only used for spatial positioning and not for display. Due to this mechanism, the device itself cannot "record" the real-world scenes as seen by the human eye.
Limitations of OST device screen recording
- Cannot record real scenes: The screen recording function of OST devices only captures the virtual content layer (GUI, 3D models, UI pointers, etc.).
- Background is black: When you replay this video on a computer or mobile device, you will see virtual objects on a pure black background, completely losing the context of the real environment.
- Occlusion relationships are lost: If virtual objects should be occluded by real objects (such as your hand or a table), they will appear to float on the black screen in the recording, creating an incorrect layering effect.
Visual errors in virtual content overlay
Even under ideal conditions, OST headsets inevitably exhibit some degree of visual misalignment between the displayed virtual content and the real world. These errors are not always caused by software bugs or tracking failures but often stem from fundamental differences between the device's physical properties and the human visual system. Some errors cannot be completely eliminated with current technology.
When submitting feedback, pay attention to the following scenarios:
- Optical alignment errors:
- Phenomenon: When you move your head, virtual objects appear to "float" or have slight positional deviations from real objects.
- Cause: You view the real world directly through the optical lenses, while the rendering coordinate system for virtual content is typically based on head tracking and environmental understanding systems (e.g., SLAM). Because there is a physical offset between the optical display path and the cameras/IMUs used for spatial positioning, and because sub-pixel-level joint calibration is difficult to achieve, virtual-real alignment is particularly sensitive in peripheral vision and near-field scenarios (a rough worked example of this distance sensitivity follows this list).
- Eye calibration limitations:
- Phenomenon: Virtual objects appear ghosted, have blurred edges, or feel incorrectly projected in space.
- Cause: OST devices heavily rely on precise interpupillary distance (IPD) and eye position calibration. If the device is not accurately calibrated for your eyes or the headset shifts during wear, the light projected by the optical engine cannot properly enter your pupils. Even with calibration, the process is based on limited sampling points, making it difficult to precisely match each user's eye geometry, corneal curvature, and visual perception habits. Minor calibration deviations can cause systematic shifts in the depth or lateral position of virtual objects.
- Individual visual differences:
- Phenomenon: Different users may subjectively perceive the position of virtual content differently.
- Cause: Variations in users' vision (e.g., nearsightedness, astigmatism) and dynamic focusing abilities affect their subjective judgment of virtual content overlay effects. OST devices may lack real-time eye tracking and personalized optical correction capabilities, preventing dynamic compensation for these differences.
- Environmental interference:
- Phenomenon: In bright environments, virtual content becomes faint, hard to see, or even invisible; in dark environments, virtual content may appear overexposed or produce ghosting.
- Cause: OST devices optically overlay virtual light from micro-projectors or waveguides onto ambient light from the real world. A very bright environment can wash out the comparatively faint virtual image, while a very dark environment can make the display appear overexposed or produce ghosting. These optical artifacts are inherent to OST devices.
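To make the near-field sensitivity mentioned under optical alignment errors concrete, here is a rough back-of-the-envelope sketch in Python. The simple parallax model and the 5 mm residual offset are illustrative assumptions, not the specifications of any particular device.

```python
# Rough illustration of why virtual-real misalignment grows at near field.
# If a residual, uncalibrated offset remains between the eye's optical path
# and the tracking origin used for rendering, the apparent angular error
# scales roughly with offset / distance.
# All numbers are illustrative assumptions, not device specifications.
import math

def angular_error_deg(residual_offset_m: float, distance_m: float) -> float:
    """Approximate apparent angular misalignment for a given residual
    calibration offset and object distance (simple parallax model)."""
    return math.degrees(math.atan2(residual_offset_m, distance_m))

residual_offset = 0.005  # assume 5 mm of uncorrected offset
for distance in (0.3, 0.5, 1.0, 3.0):  # object distances in metres
    error = angular_error_deg(residual_offset, distance)
    print(f"object at {distance:>3.1f} m -> ~{error:.2f} deg apparent shift")

# Roughly 0.95 deg at 0.3 m but only about 0.10 deg at 3 m, which is why
# misalignment is far more noticeable for near-field content.
```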
Note
Such errors are inherent characteristics of OST AR glasses technology, not functional failures. During troubleshooting, distinguish between "fixable software/tracking issues" and "experience boundaries limited by hardware and physiology." If user feedback involves minor misalignment, edge distortion, or inconsistent depth perception, first confirm whether the device is operating within its specified conditions (e.g., working distance, lighting range, calibration status).
How to handle feedback for OST devices?
If you encounter visual issues while using an OST device and have ruled out the device limitations described above, you can submit a screen recording as feedback. However, a screen recording alone is usually insufficient:
- Include photos: Use your phone to take a photo of the actual view you see through the lenses (an "eye-view photo"). This shows the relative position of virtual objects against the real environment; a simple way to combine it with the recording is sketched after this list.
- Describe the environment: Provide a detailed description of your surroundings (lighting conditions, background colors, presence of dynamic elements like pedestrians, etc.), as OST display effects are highly dependent on ambient light.
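To help reviewers relate the black-background recording to the real scene, a recorded frame can be roughly overlaid on the eye-view photo. This is a minimal sketch using Pillow and NumPy that treats near-black pixels of the recorded frame as transparent; the file names are placeholders, and because the recording and the photo come from different viewpoints the result is only an approximation, not a faithful reconstruction of what the user saw.

```python
# Roughly overlay a frame from an OST screen recording (virtual content on a
# black background) onto an eye-view photo taken through the lenses.
# File names are placeholders; alignment will only be approximate because the
# two images are captured from different viewpoints.
from PIL import Image
import numpy as np

def composite(virtual_frame_path: str, eye_view_photo_path: str,
              out_path: str, black_threshold: int = 16) -> None:
    virtual = Image.open(virtual_frame_path).convert("RGB")
    photo = Image.open(eye_view_photo_path).convert("RGB")
    virtual = virtual.resize(photo.size)

    v = np.asarray(virtual)
    p = np.asarray(photo)
    # Treat near-black pixels of the recording as transparent background.
    mask = (v.max(axis=2) > black_threshold)[..., None]
    combined = np.where(mask, v, p).astype(np.uint8)
    Image.fromarray(combined).save(out_path)

composite("virtual_frame.png", "eye_view_photo.jpg", "composited_preview.png")
```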
Common screen recording methods for head-mounted displays
Note
During screen recording, the device may reduce rendering frame rate or resolution, affecting the judgment of smoothness and tracking stability. It is recommended to record as soon as possible after reproducing the issue to avoid prolonged operation affecting the results.
Apple Vision Pro
- Look up toward the top of your view until you see the Control Center dot, then focus on it and tap.
- Open Control Center and tap the "Record my view" button to start recording.
- To stop recording, open Control Center again and tap "Record my view" once more.
- Your view recording is saved in the Photos app.
Pico 4 Ultra Enterprise
- Open Control Center > Settings > General > Casting, Recording & Screenshots, and under "Recording & Screenshots" select "Space" as the recording mode.
- Press the home button on the controller briefly, then select the "record" button from the on-screen menu and pull the trigger to start recording.
- The system will display a countdown prompt. Immediately return to the interface of the application to be recorded, and the recording will automatically begin after the countdown ends.
- Briefly press the home button on the controller again and select the "record" button once more to stop recording.
- Your recorded files are saved by default in the device's "internal storage/DCIM/screenrecord" directory.
Rokid AR Studio
- Go to the desktop, locate the bottom status bar, and click the "Quick Settings" area next to the user avatar.
- In the pop-up window, click the "Record" button, then immediately return to the interface of the application to be recorded. The system will automatically start recording.
- Click the X button on the Station Pro to return to the desktop. Click the "Record" button in the "Quick Settings" area again to stop recording.
- Your recording files are saved by default in the device's "Internal Shared Storage/ScreenRecorder" directory.
XREAL Air2 Ultra
- Swipe down on the Beam Pro screen to bring up the glasses' control center, then tap the "Record" button at the top.
- The system will display a countdown prompt. Immediately return to the interface of the application you want to record.
- Tap the floating red "Stop" button at the top of the screen to end the recording.
- Your recording files are saved by default in the device's "Internal Shared Storage/Movies/Record" directory.
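The Pico, Rokid, and XREAL workflows above all save recordings under Android-style internal shared storage, so the files can usually be copied to a computer over adb for review and attachment to reports. The sketch below in Python is illustrative only: it assumes developer mode and USB debugging are available and enabled on the device (which varies by vendor), that adb is installed and on PATH, and that internal shared storage is exposed as /sdcard; the exact directories may differ between devices and firmware versions.

```python
# Copy headset screen recordings from an Android-based device over adb.
# Assumes developer mode / USB debugging is enabled and adb is on PATH.
# Remote paths mirror the directories listed above, mapped to /sdcard;
# they may differ between devices and firmware versions.
import subprocess

RECORDING_DIRS = {
    "pico":  "/sdcard/DCIM/screenrecord",
    "rokid": "/sdcard/ScreenRecorder",
    "xreal": "/sdcard/Movies/Record",
}

def pull_recordings(device: str, destination: str = ".") -> None:
    remote_dir = RECORDING_DIRS[device]
    # List the files first so the report can reference exact file names.
    listing = subprocess.run(["adb", "shell", "ls", remote_dir],
                             capture_output=True, text=True, check=True)
    print(f"Files in {remote_dir}:\n{listing.stdout}")
    # Pull the whole directory to the local machine.
    subprocess.run(["adb", "pull", remote_dir, destination], check=True)

if __name__ == "__main__":
    pull_recordings("pico")
```

If adb access is not available, the same directories can usually be reached over a plain USB file transfer (MTP) connection, where the device supports it.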
Best practice recommendations
When using headset screen recording to report issues, focus on the following points:
- Clearly label the device model, recording method, and environmental factors when submitting.
- If the issue involves depth, occlusion, or tracking jitter, supplement with a text description of what the human eye actually observes, as screen recording may not accurately represent it.
- For critical scenarios, consider using an external camera to capture the user's real-world operations while wearing the headset to provide more comprehensive context.
Headset screen recording is convenient, but it is always an approximation of the actual experience. The most reliable diagnosis still requires combining logs, sensor data, and subjective user feedback. For log collection methods, refer to: