Running verification (bring-up) for head-mounted display extension
The most critical and challenging task to make EasyAR work on a device is ensuring the correctness of input data. When running EasyAR for the first time on a new device, over 90% of issues are caused by incorrect data.
Whenever possible, it is recommended to verify data correctness directly with the device and its own interfaces, using test methods, before introducing EasyAR. This article shares empirical approaches that use EasyAR features for data verification. The process can help you understand external input frames, but since EasyAR itself introduces errors, using this coupled system to verify data correctness is not the optimal choice.
Before you begin
- Complete the development described in Enable headset support for EasyAR.
- Read Quickstart to learn how to use the EasyAR Sense Unity Plugin.
- Learn how to use headset samples.
- Understand Android project configuration.
Running basic functionality examples
When verifying EasyAR on a device for the first time, be sure to run these features in sequence. Do not rush to test Mega: its fault tolerance may mask issues during short runs or in a single real-world scene.
Observe the displayed session information to ensure no unexpected issues occur. Verify that the frame count is continuously increasing.
Run Image, i.e., the Image Tracking feature, and compare its performance with that on mobile devices (an iPhone is recommended as the benchmark). Focus on tracking status and target overlay display. Without motion fusion enabled, image tracking will exhibit noticeable latency; this is expected. Ensure the motion process is correct and that alignment occurs when the device stops.
Run Dense, i.e., the Dense Spatial Mapping feature, and compare its performance with that on mobile devices (an iPhone is recommended as the benchmark). Focus on mesh position, generation speed, and quality. If input data frame rates are low, mesh generation slows down, but quality remains largely unaffected.
This feature may not run on some Android devices, and mesh quality varies by hardware.
Important
The input extension used by the headset extension package is implemented via a custom camera.
When using trial products (such as a personal edition license, trial XR license, or trial Mega service) on custom cameras or headsets, EasyAR Sense will stop responding 100 seconds after each startup (for Mega users, the duration can be adjusted via EasyAR Business upon approval). The paid version of EasyAR Sense and the paid EasyAR Mega service do not have this restriction.
If Image and Dense perform consistently with or better than on mobile devices, most EasyAR features will function correctly on the device. You may proceed to test Mega.
Solving abnormal situations during operation: problem decomposition
If you cannot reproduce the same results as on a mobile phone, the following detailed problem decomposition process can be used to find the root cause. It is recommended to keep an eye on the system log output at all times.
Step 0: Understand the headset's own system errors
Remember the motion tracking and display requirements described in Preparing devices for AR/MR?
Important
Motion tracking/VIO errors will always affect the stability of EasyAR algorithms in different ways.
Important
Display system errors may cause virtual and real objects to fail to align perfectly.
In scenarios with significant errors, virtual objects may appear hovering above or below real objects and then seem to drift continuously. This phenomenon can be observed on Pico 4E, even when not using EasyAR and only enabling its own VST.
Step 1: View session running status
The session status displayed in UI messages requires the following functions or data to be normal:
- Availability of ExternalFrameSource
- Virtual camera of ExternalFrameSource
If the session status information display is not visible, try modifying the option to Log and then read the session status and the name of the frame source being used in the system log.
Try deleting all other frame sources under the ARSession node and see if anything changes.
Step 2: Confirm the camera frame count received by EasyAR
Essential functions or data that must work:
- The pathway for camera frame data from ExternalFrameSource at the Unity code level (excluding data correctness and the pathway to the native layer)
This count should increase over time; otherwise, a warning will be displayed after a few seconds.
If this value is found not to be increasing, it should be addressed first.
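This check can be sketched as a small watchdog, independent of EasyAR. The sampling interval, round count, and the simulated counter below are illustrative assumptions, not part of any EasyAR API:

```python
import time

def watch_frame_count(read_count, interval=1.0, rounds=5):
    """Poll a frame counter and report whether it keeps increasing.

    read_count: callable returning the current received-frame count.
    Returns a list of per-interval deltas; any zero delta means no
    frames arrived during that interval and should be investigated first.
    """
    deltas = []
    last = read_count()
    for _ in range(rounds):
        time.sleep(interval)
        current = read_count()
        deltas.append(current - last)
        last = current
    return deltas

# Simulated counter gaining 30 frames per poll (a hypothetical 30 FPS source).
state = {"n": 0}
def fake_counter():
    state["n"] += 30
    return state["n"]

deltas = watch_frame_count(fake_counter, interval=0.0, rounds=3)
print(deltas)
print(all(d > 0 for d in deltas))  # every delta should be positive
```

In a real bring-up, `read_count` would read whatever counter your frame source exposes; the point is only that the value must grow steadily over time.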
Step 3: Record EIF on the device and replay it in the Unity Editor
Functions or data that must work properly:
- The input pathway of camera frame data to the native layer via ExternalFrameSource (excluding data correctness)
- Raw camera image data within camera frame data
- Timestamp within camera frame data (excluding time points and data synchronization)
Click EIF to start recording, click again to stop.
Tip
Recording must be properly stopped to obtain an EIF file that supports random access.
When replaying EIF data in the Unity Editor, it is best to use a clean EasyAR scene or the EasyAR samples to avoid incorrect configurations in the scene.
You can observe the replay of camera frame data in the Unity Editor. The image data is not byte-identical due to lossy encoding throughout the process.
EasyAR uses distortion parameters in calculations but does not undistort images for display. Therefore, when replaying EIF files in Unity, you will observe images that have not been undistorted; this is expected.
Tip
Adjust the Unity Game window aspect ratio to match the input; otherwise, the data will be cropped during display.
If data plays too fast or too slow, check the timestamp input.
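Timestamps can be sanity-checked offline before blaming the replay. This sketch (independent of EasyAR; it assumes timestamps in seconds) verifies monotonicity and derives the implied frame rate, which should roughly match the camera's nominal rate. Wrong units (milliseconds or nanoseconds passed where seconds are expected) show up as an absurd implied FPS:

```python
def timestamp_report(timestamps):
    """Check monotonicity and estimate the frame rate from timestamps (seconds)."""
    deltas = [b - a for a, b in zip(timestamps, timestamps[1:])]
    monotonic = all(d > 0 for d in deltas)
    fps = (len(deltas) / (timestamps[-1] - timestamps[0])) if deltas else 0.0
    return monotonic, fps

# A hypothetical, well-behaved 30 FPS recording of 90 frames.
ts = [i / 30.0 for i in range(90)]
monotonic, fps = timestamp_report(ts)
print(monotonic, round(fps, 1))
```

Non-monotonic timestamps or an implied FPS far from the camera specification are strong hints that the timestamp field in camera frame data is wrong.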
Note
Many tasks can be accomplished using EIF. You can run image tracking and dense spatial mapping with EIF in the Unity Editor. Note that display effects may differ when running on actual devices.
Step 4: Run image tracking using EIF
Functions or data that must work properly:
- Raw camera image data in camera frame data
- Intrinsics in camera frame data (data accuracy cannot be fully guaranteed, as the algorithm tolerates some errors)
To run the image tracking sample ImageTracking_Targets using EIF in the Unity Editor, you need to record an EIF where the image can be tracked.
Note
Image tracking requires the tracking target to occupy a certain proportion of the entire image. If the image cannot be tracked, try moving your head closer to the image.
If tracking consistently fails or virtual objects appear far from the target position in the image, it is highly likely that there is an issue with intrinsics.
If the image data has distortion, you may observe that virtual objects do not perfectly overlay the tracking target on the image, which is expected. This phenomenon becomes more noticeable when the tracking target is at the edge of the image.
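Intrinsics can be sanity-checked independently of EasyAR using the pinhole model: the focal lengths, principal point, and image size imply a field of view that should roughly match the camera's specification, and a point on the optical axis must project to the principal point. The numbers below are illustrative, not from any real device:

```python
import math

def pinhole_fov(fx, fy, width, height):
    """Horizontal/vertical field of view (degrees) implied by pinhole intrinsics."""
    hfov = 2 * math.degrees(math.atan(width / (2 * fx)))
    vfov = 2 * math.degrees(math.atan(height / (2 * fy)))
    return hfov, vfov

def project(point, fx, fy, cx, cy):
    """Project a 3D point in the camera frame (z forward, in front of camera) to pixels."""
    x, y, z = point
    return fx * x / z + cx, fy * y / z + cy

# Hypothetical 640x480 camera with fx = fy = 500 pixels.
hfov, vfov = pinhole_fov(500.0, 500.0, 640, 480)
print(round(hfov, 1), round(vfov, 1))

# A point on the optical axis lands exactly on the principal point;
# large systematic offsets in practice suggest bad intrinsics.
u, v = project((0.0, 0.0, 2.0), 500.0, 500.0, 320.0, 240.0)
print(u, v)
```

If the FOV implied by the intrinsics you feed in differs wildly from the physical camera's FOV, image tracking will place virtual objects far from the target, matching the failure mode described above.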
Step 5: Run image tracking on the device
Functions or data that must work properly:
- The device's own display system
- Raw camera image data in camera frame data
- Intrinsics in camera frame data (data correctness not fully guaranteed, as the algorithm tolerates some errors)
- Extrinsics in camera frame data
- Coordinate consistency of device pose between camera frame data and rendering frame data
- Time difference of device pose between camera frame data and rendering frame data
Note
Image tracking requires the tracking target to occupy a certain proportion of the entire image. If unable to track the image, try moving your head closer to the image.
Image tracking requires the horizontal side length of the image to match the physical size of the real-world object. In the sample, the image to be tracked spans the full long edge of a horizontally placed sheet of A4 paper. Therefore, do not track images displayed on a computer screen unless you use a ruler to scale the image's horizontal side to the size of A4 paper.
If image tracking is perfect when using EIF but behaves differently on the device, this must be resolved before proceeding to other tests. Solving problems in later steps is much more difficult.
If a virtual object appears suspended far away from the real object, even when the user remains still, it is very likely that either intrinsics or extrinsics are incorrect, or the device pose in camera frame data and rendering frame data are not in the same coordinate system, or the display system introduced this error.
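A common source of the "not in the same coordinate system" problem is mixed handedness, e.g. a right-handed pose fed unchanged into Unity's left-handed, y-up convention. The sketch below illustrates one typical conversion (flipping the z axis); which axis actually differs depends on your device SDK's conventions, so treat this as an illustration only:

```python
def flip_z_pose(position, quaternion):
    """Convert a pose between two conventions that differ by the z-axis direction.

    position: (x, y, z); quaternion: (x, y, z, w).
    Mirroring the z axis negates the z position component, and for the
    rotation q' = M q M^-1 it negates the x and y imaginary parts.
    """
    px, py, pz = position
    qx, qy, qz, qw = quaternion
    return (px, py, -pz), (-qx, -qy, qz, qw)

# Identity orientation one meter in front of the origin:
pos, rot = flip_z_pose((0.0, 0.0, 1.0), (0.0, 0.0, 0.0, 1.0))
print(pos, rot)
```

A quick way to spot this class of bug on a device: a point known to be in front of the camera ends up behind it, or rotations run in the opposite direction to head movement.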
If the virtual object moves continuously when moving the head and appears to lag, there is a high probability that the device pose is not stable. This often occurs in the following situations (other issues cannot be ruled out):
- The data timestamps of device pose and raw camera image data are not synchronized.
- The same pose is used in both camera frame data and rendering frame data.
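When pose and image timestamps are out of sync, a common remedy is to interpolate the pose history to the image timestamp instead of reusing the latest pose. Below is a minimal sketch under simplifying assumptions (position only, linear interpolation; a real implementation would also interpolate rotation, e.g. with slerp):

```python
from bisect import bisect_left

def pose_at(history, t):
    """Linearly interpolate a (timestamp, position) history at time t.

    history: list of (timestamp, (x, y, z)) sorted by timestamp.
    Reusing the nearest pose instead of interpolating typically shows
    up as latency or jitter on virtual objects when the head moves.
    """
    times = [ts for ts, _ in history]
    i = bisect_left(times, t)
    if i == 0:
        return history[0][1]        # clamp before the first sample
    if i == len(history):
        return history[-1][1]       # clamp after the last sample
    (t0, p0), (t1, p1) = history[i - 1], history[i]
    a = (t - t0) / (t1 - t0)
    return tuple(x0 + a * (x1 - x0) for x0, x1 in zip(p0, p1))

# Hypothetical pose samples at 0 ms and 10 ms; image captured at 4 ms.
hist = [(0.000, (0.0, 1.6, 0.0)), (0.010, (0.1, 1.6, 0.0))]
print(pose_at(hist, 0.004))
```

The values are invented for illustration; the design point is that the pose attached to camera frame data should correspond to the image's capture time, not to whenever the frame happened to be assembled.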
Step 6: Run dense spatial mapping using EIF and on the device
Functions or data that must work properly:
- The device's own display system
- Raw camera image data in camera frame data
- Intrinsics in camera frame data (data accuracy is not fully guaranteed, as the algorithm tolerates some errors)
- Extrinsics in camera frame data
- Device pose in camera frame data
If mesh generation is very slow and/or ground reconstruction appears bumpy, it is highly likely that there is an issue with the device pose. It is also possible that the pose's coordinate system is incorrect or the pose timestamps are misaligned.
Tip
If the input data has a low frame rate, mesh generation speed will also decrease, but the quality will not degrade significantly. This is expected behavior.
It is generally not very easy to discern precise mesh locations, so display system inaccuracies may not be observable when using dense spatial mapping.
Running the Mega sample
Read the following to learn how to use Mega in Unity. If you have not activated the Mega service yet, contact EasyAR Business for trial eligibility.
Then run Mega on the device and compare the results with those on mobile devices (an iPhone is recommended as the benchmark). Focus on:
- Whether objects are displayed in the correct positions
- Whether objects beyond 10 m are displayed with correct positions and sizes
- Whether objects outside the center of view are displayed with correct positions and sizes
- Whether objects maintain correct positions and sizes when turning the head
Resolving runtime anomalies
Essential functions or data that must operate normally:
- The device's own display system
- All data in camera frame data and rendering frame data
After validating image tracking and dense spatial mapping features, EasyAR Mega should theoretically be supported. If performance on the headset is significantly worse than on mobile devices, pay attention to the following:
- Focus on pose data and timestamps in camera frame data and rendering frame data
- Monitor the motion tracking/VIO system output; the panda under XR Origin serves as a good reference
Additionally, special attention must be paid to the device's own display system, particularly regarding object rendering at long distances, outside the central field of view, and during head rotation. These scenarios are often overlooked during device self-testing, but issues typically stem from the device's display system itself. You need to:
- Explain these problems and potential impacts to EasyAR
- Provide developers with reasonable performance expectations
Important
Users pay significant attention to these display issues, and many devices genuinely cannot provide perfect rendering in large spatial environments. EasyAR cannot resolve display issues inherent to the device hardware. These require iterative improvements by device manufacturers. Meanwhile, users need to understand these limitations.
Next steps
Related topics
Examples that can run on mobile phones:
- Image tracking example ImageTracking_Targets, which demonstrates the expected performance of image tracking functionality
- Dense spatial mapping example SpatialMap_Dense_BallGame, which demonstrates the expected performance of dense spatial-mapping functionality
- Motion fusion example ImageTracking_MotionFusion, which demonstrates the expected performance of motion fusion functionality
- Mega example MegaBlock_Basic, which demonstrates the expected performance of Mega functionality