Headset screen recording: documenting immersive AR/MR experiences
When developing, testing, or supporting users of augmented reality (AR) or mixed reality (MR) headsets, screen recording is crucial for reproducing and analyzing issues. Unlike mobile devices, however, headsets involve specialized display technologies such as passthrough and foveated rendering, so recorded footage rarely replicates the full stereoscopic view the user's eyes actually see. Interpreting recordings therefore requires attention to the discrepancies between the footage and the user's actual visual experience.
Why do visual differences exist in headset recordings?
Keep in mind that headset recordings don't perfectly match what you see through the lenses. If anomalies appear in a recording, always describe your actual perception as well, to avoid misleading developers.
Key differences include:
- Resolution and clarity differences
  - Recording: typically a monocular (e.g., left-eye) 1080p or lower-resolution video stream.
  - Human eye: modern headsets can deliver 2K or higher per-eye resolution, depending on the application.
  - Impact: blurry text or visuals in recordings might not exist in-headset; conversely, subtle edge aliasing visible in-headset may be invisible in recordings.
- Field of view (FOV) differences
  - Recording: footage is usually square-shaped, and content near the edges often gets cropped, especially on wide-FOV devices.
  - Human eye: binocular vision provides a larger field of view.
  - Impact: issues occurring in peripheral vision may not be accurately reflected in recordings.
- Foveated rendering impact
  - Recording: captures all content across the full screen, including low-resolution peripheral areas.
  - Human eye: sees high resolution only at the gaze center, with lower resolution in peripheral regions.
  - Impact: while visuals feel sharp during use, recordings may show peripheral blurriness. This is often normal, not a bug. During recording, keep your gaze on the area you want to demonstrate so it stays sharp.
- Frame rate and refresh rate impact
  - Recording: typically captured at 30 fps or 60 fps.
  - Human eye: the display's actual refresh rate may reach 90 Hz or 120 Hz.
  - Impact: subtle frame drops may be imperceptible in recordings yet cause noticeable "stutter" or "dizziness" during use. If the footage looks smooth but the actual experience isn't, this usually indicates a performance issue; always mention it when submitting the recording.
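The resolution gap above can be made concrete as average pixels per degree (PPD): the same scene looks softer in whichever stream carries fewer pixels over the same angular area. The sketch below uses illustrative assumptions (the pixel counts and the 105° FOV are not any specific device's specification):

```python
# Compare average angular resolution (pixels per degree, PPD) of a
# recorded stream vs. a per-eye headset display. All figures are
# illustrative assumptions, not specifications of any particular device.

def pixels_per_degree(horizontal_pixels: int, horizontal_fov_deg: float) -> float:
    """Average pixels per degree across the horizontal field of view."""
    return horizontal_pixels / horizontal_fov_deg

# Hypothetical 1080p monocular recording vs. a 2160-pixel-wide per-eye
# panel, both assumed to span a 105-degree horizontal FOV.
recording_ppd = pixels_per_degree(1920, 105.0)
display_ppd = pixels_per_degree(2160, 105.0)

print(f"recording: {recording_ppd:.1f} PPD")       # 18.3
print(f"per-eye display: {display_ppd:.1f} PPD")   # 20.6
```

Under these assumptions the recording carries fewer pixels per degree than the display, which is why text that reads cleanly in-headset can look soft in the footage.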
Special considerations for OST headsets
Critical note: if using OST (optical see-through) headsets like Rokid or Xreal AR glasses, recordings cannot capture the real-world view seen by human eyes.
What are OST devices?
OST devices project light into your eyes through semi-transparent optical lenses, overlaying virtual content via micro-projectors or waveguides, so you see the real environment and virtual imagery simultaneously. The external cameras are used solely for spatial tracking, not for display. Because of this mechanism, such devices fundamentally cannot "record" the real-world scene the user sees.
Limitations of OST device recordings
- Cannot capture real scenes: recordings only capture the virtual content layer (GUI, 3D models, UI pointers, etc.).
- Black background: playback on computers or phones shows virtual objects floating on pure black, losing all real-world context.
- Lost occlusion relationships: if virtual objects should be occluded by real objects (e.g., hands, tables), they appear suspended in the recording, creating false layering.
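When a reviewer still needs rough real-world context, the black-background virtual layer can be keyed out and overlaid on a separately shot external clip. This is only an approximation under assumptions: FFmpeg is installed, file names are placeholders, the two clips are time-aligned, and occlusion in the result remains wrong (virtual content always draws on top of the real scene). A sketch using FFmpeg's `colorkey` filter:

```python
# Sketch: overlay an OST headset's virtual-layer recording (virtual
# content on pure black) onto an external camera clip for rough context.
# Assumes FFmpeg is on PATH, the clips are time-aligned, and file names
# are placeholders. Occlusion in the composite is still wrong: virtual
# content always renders on top of the real scene.
import subprocess

def key_filter(similarity: float = 0.3, blend: float = 0.1) -> str:
    """Build a filter graph: key out black in input 1, overlay on input 0."""
    return (
        f"[1:v]colorkey=black:{similarity}:{blend}[fg];"
        f"[0:v][fg]overlay=0:0"
    )

def composite(external_clip: str, virtual_clip: str, output: str) -> None:
    """Render the keyed virtual layer over the external clip."""
    subprocess.run(
        ["ffmpeg", "-y", "-i", external_clip, "-i", virtual_clip,
         "-filter_complex", key_filter(), "-an", output],
        check=True,
    )
```

Treat the result as a review aid only; it should never be presented as what the user actually saw through the lenses.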
Visual misalignment issues in OST content overlay
Even under ideal conditions, OST headsets exhibit unavoidable minor misalignments between virtual content and the real world. These errors aren’t always caused by software bugs or tracking failures—they often stem from fundamental disparities between device physics and human vision, some irreparable with current technology.
When reporting issues, distinguish between:
- Optical alignment errors
  - Symptom: virtual objects "float" or show slight positional shifts during head movement.
  - Cause: the physical offset between the optical display path and the SLAM-based tracking system makes sub-pixel calibration challenging, especially in peripheral vision or near-field scenarios.
- Human eye calibration limits
  - Symptom: ghosting, blurry edges, or incorrect spatial projection of virtual objects.
  - Cause: OST devices require precise IPD and eye-position calibration. Minor calibration errors, whether from device shift or biological variation in corneal curvature, cause systematic depth or lateral offsets.
- Individual visual differences
  - Symptom: users subjectively disagree about where a virtual object is positioned.
  - Cause: vision differences (e.g., nearsightedness, astigmatism) and dynamic focusing ability affect perception, and most OST devices lack the real-time eye tracking needed for personalized compensation.
- Environmental interference
  - Symptom: virtual content washes out in bright environments or appears overexposed in darkness.
  - Cause: OST displays add projected light on top of environmental light, so bright settings drown out virtual projections while dark settings exaggerate them, causing artifacts.
Note
These limitations are inherent to OST technology, not malfunctions. When troubleshooting, differentiate "fixable software/tracking issues" from "hardware/physiological constraints." For minor misalignments or depth inconsistencies, first verify if usage conditions (e.g., working distance, lighting, calibration) match specifications.
How should you report OST device issues?
If experiencing visual issues with OST devices, after ruling out inherent limitations:
- Supplement with photos: capture "through-the-lens" photos showing virtual objects relative to real environments.
- Describe environment: detail lighting conditions, background colors, and dynamics (e.g., moving people), as OST performance heavily depends on ambient light.
Common headset recording methods
Note
Recording may reduce rendering frame rates or resolution, affecting assessments of fluidity and tracking stability. Reproduce the issue as soon as possible after starting to record, and keep recordings short.
Apple Vision Pro
- Look upward until the control center dot appears. Focus on it and tap.
- Focus and tap control center > "Record my view" to start.
- To stop: reopen control center and tap "Record my view" again.
- Recordings save to the Photos app.
PICO 4 Ultra Enterprise
- Go to control center > settings > general > casting, recording & screenshots. Select "space" as recording mode.
- Short-press the home button. Select "Record" from the menu and pull the trigger to start.
- Exit to your target app interface during the countdown. Recording begins automatically.
- Short-press home again, reselect "Record" to stop.
- Files save to internal storage/DCIM/ScreenRecord.
Rokid AR Studio
- Go to desktop. Tap the "quick settings" area near your profile icon in the status bar.
- Tap "Record," then immediately return to your target app. Recording starts automatically.
- Press the X button to return to desktop. Tap "Record" in quick settings again to stop.
- Files save to internal shared storage/ScreenRecorder.
XREAL Air 2 Ultra
- Swipe down on Beam Pro to open control center. Tap "Record" at the top.
- Return to your target app during the countdown.
- Tap the floating red "Stop" button to end recording.
- Files save to internal shared storage/Movies/Record.
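PICO, Rokid, and XREAL devices run Android, so when USB debugging is available, the saved files can usually be pulled with `adb` and sanity-checked with `ffprobe` before submission. This is a sketch under assumptions: the `/sdcard/...` paths below mirror the storage locations listed above but should be verified on the device, and both `adb` and FFmpeg must be on `PATH`:

```python
# Sketch: pull saved headset recordings over adb and check their actual
# capture frame rate with ffprobe. Assumes USB debugging is enabled and
# adb/ffprobe are on PATH; the /sdcard paths mirror the storage
# locations listed above but should be verified on the device.
import subprocess

RECORDING_DIRS = {
    "pico": "/sdcard/DCIM/ScreenRecord",
    "rokid": "/sdcard/ScreenRecorder",
    "xreal": "/sdcard/Movies/Record",
}

def pull_recordings(device: str, dest: str = ".") -> None:
    """Copy the device's screen-recording folder to the local machine."""
    subprocess.run(["adb", "pull", RECORDING_DIRS[device], dest], check=True)

def parse_frame_rate(rate: str) -> float:
    """Convert ffprobe's rational frame rate (e.g. "30000/1001") to fps."""
    num, den = rate.split("/")
    return int(num) / int(den)

def recording_fps(path: str) -> float:
    """Query a pulled recording's average frame rate via ffprobe."""
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=avg_frame_rate",
         "-of", "default=noprint_wrappers=1:nokey=1", path],
        capture_output=True, text=True, check=True,
    )
    return parse_frame_rate(out.stdout.strip())
```

If `recording_fps` reports 30 or 60 fps while the headset renders at 90 Hz or above, remember that frame drops felt in-headset may not be visible in the footage.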
Best practices
When submitting headset recordings:
- Specify device model, recording method, and environmental factors.
- For depth, occlusion, or tracking jitter issues, add textual descriptions of actual observations—recordings may not capture these accurately.
- For critical scenarios, supplement with external videos capturing user interactions while wearing the headset.
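To pair a headset capture with the external video mentioned above, the two clips can be stitched side by side for review. A sketch using FFmpeg's `hstack` filter (FFmpeg is assumed to be installed, and file names are placeholders; both inputs are scaled to the same height, which `hstack` requires):

```python
# Sketch: place a headset screen recording and an external camera clip
# side by side for review using FFmpeg's hstack filter. Assumes FFmpeg
# is on PATH; file names are placeholders.
import subprocess

def hstack_filter(height: int = 720) -> str:
    """Scale both inputs to the same height (required by hstack), then stack."""
    return (
        f"[0:v]scale=-2:{height}[a];"
        f"[1:v]scale=-2:{height}[b];"
        "[a][b]hstack=inputs=2"
    )

def side_by_side(headset_clip: str, external_clip: str, output: str) -> None:
    """Render the two clips next to each other into a single video."""
    subprocess.run(
        ["ffmpeg", "-y", "-i", headset_clip, "-i", external_clip,
         "-filter_complex", hstack_filter(), "-an", output],
        check=True,
    )
```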
Headset recordings are convenient approximations, not perfect replicas. Reliable diagnostics require combining logs, sensor data, and user feedback. For log collection methods, see: