Running verification (bring-up) for head-mounted display extensions
To make EasyAR work on a device, the most important and also the most challenging part is ensuring the correctness of input data. When running EasyAR on a new device for the first time, over 90% of the issues are caused by incorrect data.
If possible, it is recommended to verify the correctness of the data directly with device-level test methods, using only the device and its interfaces and without EasyAR involved. This article introduces some empirical methods for verifying data using EasyAR's own features, which can help in understanding external input frame data. However, since EasyAR itself also has errors, such a coupled system is not the best tool for verifying data correctness.
Before you begin
- Complete the steps in Make the headset support EasyAR development.
- Read the Quickstart to learn how to use the EasyAR Sense Unity Plugin.
- Learn how to Use headset samples.
- Understand the Android project configuration.
Run basic feature examples
When verifying EasyAR on a device for the first time, run these features in sequence. In particular, do not rush to run Mega: Mega has some fault tolerance, so issues are difficult to detect during short runs or in a single real-world scene.
Observe the displayed session information to ensure no unexpected issues occur, and confirm the frame count is continuously increasing.
Run Image, the image tracking feature, and compare the results with those on a mobile device (an iPhone is recommended as the reference). Pay attention to the tracking status and the target overlay display. When motion fusion is not enabled, image tracking will have a noticeable delay, which is expected. Ensure the motion is correct and that the position aligns when the device stops.
Run Dense, the dense spatial mapping feature, and compare the results with those on a mobile device (an iPhone is recommended as the reference). Focus on the mesh position, generation speed, and quality. If the input data frame rate is low, mesh generation will be slower, but the quality will not degrade significantly.
This feature may not work on some Android devices, and mesh quality will vary depending on the device.
Important
The input extension used by the headset extension package is a custom camera implementation.
When using trial products (such as a personal edition license, trial XR license, or trial Mega service) with custom cameras or headsets, EasyAR Sense will stop responding 100 seconds after each startup (Mega users can have this duration adjusted after approval by EasyAR business). There is no such limitation when using the paid version of EasyAR Sense and the paid EasyAR Mega service.
If Image and Dense perform the same or better than on a mobile device, most EasyAR features should work properly on the device, and you can proceed to test Mega.
Troubleshooting runtime exceptions: problem breakdown
If you cannot reproduce the same results as on the mobile phone, the following detailed problem breakdown process can help you find the root cause. It is recommended to keep an eye on the system log output throughout.
Step zero: Understand the headset's own system errors
Remember the motion tracking and display requirements described in Preparing devices for AR/MR?
Important
Motion tracking/VIO errors will always affect the stability of EasyAR algorithms in different ways.
Important
Display system errors may cause virtual objects and real objects to fail to align perfectly.
In some cases with significant errors, virtual objects may appear to float above or below real objects and then (seemingly) drift continuously. This phenomenon can be observed on the Pico 4E, even when EasyAR is not used and only its own VST is enabled.
Step one: Check session running status
The session status displayed in UI messages requires the following functions or data to be normal:
- ExternalFrameSource Availability
- ExternalFrameSource Virtual Camera
If the session status information is not visible, try changing the option to Log and then read the session status and the name of the frame source being used in the system log.
You can try deleting all other frame sources under the ARSession node and see if there are any changes.
Step two: Confirm the camera frame count received by EasyAR
Essential functions or data:
- The camera frame data pathway from ExternalFrameSource in the Unity code layer (excluding data correctness and the pathway to the native layer)
This data should increase over time; otherwise, a warning message will be displayed after a few seconds.
If this value is found not to increase, it should be addressed first.
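A minimal Unity-side sketch of such a check is shown below. It assumes you can call a method from wherever your custom frame source hands a camera frame to EasyAR; the class and method names are illustrative and not part of the EasyAR API.

```csharp
using UnityEngine;

// Hypothetical monitor: attach it next to your custom frame source and call
// NotifyFrame() from wherever your camera callback hands a frame onward.
public class FrameArrivalMonitor : MonoBehaviour
{
    public float warnAfterSeconds = 3f;

    private long frameCount;
    private float lastFrameTime;

    public void NotifyFrame()
    {
        frameCount++;
        lastFrameTime = Time.realtimeSinceStartup;
    }

    private void Update()
    {
        // If frames were arriving and then stopped, warn so the stall is visible in the log.
        if (frameCount > 0 && Time.realtimeSinceStartup - lastFrameTime > warnAfterSeconds)
        {
            Debug.LogWarning($"No camera frame received for {warnAfterSeconds}s (total so far: {frameCount}). Check the frame pathway in the Unity layer.");
        }
    }
}
```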
Step three: Record EIF on the device and then replay it in the Unity editor
The following functionalities or data must work properly:
- The path of camera frame data input from ExternalFrameSource to the native layer (excluding data correctness)
- The raw camera image data in camera frame data
- The timestamp in camera frame data (excluding time points and data synchronization)
Click EIF to start recording, and click again to stop.
Tip
The recording must be stopped properly to obtain an EIF file that can be randomly indexed.
When running EIF data in the Unity editor, it is best to use a clean EasyAR scene or an EasyAR sample to avoid incorrect configurations in the scene.
You can see the playback of camera frame data in the Unity editor. The image data is not byte-equal, as there is lossy encoding and decoding in the entire process.
EasyAR uses distortion parameters in its calculations but does not undistort the image for display. Therefore, if distorted images are input, you will observe data without undistortion when replaying the EIF file in Unity, which is expected.
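For reference, the sketch below shows a common radial/tangential (Brown-Conrady) distortion model; it illustrates why distorted input displaces points more the farther they are from the image center. The parameter layout EasyAR actually expects is not reproduced here and may differ, so treat this purely as background.

```csharp
using UnityEngine;

// Illustrative only: Brown-Conrady distortion applied to a normalized image point.
public static class DistortionSketch
{
    // (x, y) are normalized camera coordinates (X/Z, Y/Z before applying intrinsics).
    public static Vector2 Distort(Vector2 p, float k1, float k2, float p1, float p2)
    {
        float r2 = p.x * p.x + p.y * p.y;
        float radial = 1f + k1 * r2 + k2 * r2 * r2;
        float xd = p.x * radial + 2f * p1 * p.x * p.y + p2 * (r2 + 2f * p.x * p.x);
        float yd = p.y * radial + p1 * (r2 + 2f * p.y * p.y) + 2f * p2 * p.x * p.y;
        // The displacement grows with r2, so points near the image edge shift the most.
        return new Vector2(xd, yd);
    }
}
```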
Tip
Adjust the aspect ratio of the Unity Game window to match the input, otherwise the data will be cropped during display.
If the data plays too fast or too slow, check the timestamp input.
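A small sanity check along the following lines can catch the two most common timestamp problems: wrong units and non-monotonic values. The thresholds, and the assumption that the source reports nanoseconds, are illustrative.

```csharp
using UnityEngine;

// Hypothetical timestamp sanity check. Many device camera APIs report timestamps in
// nanoseconds; a wrong unit makes EIF playback run absurdly fast or slow.
public class TimestampCheck
{
    private double lastSeconds = double.NaN;

    public void Check(long timestampNanoseconds)
    {
        double seconds = timestampNanoseconds * 1e-9; // unit conversion is a frequent source of error

        if (!double.IsNaN(lastSeconds))
        {
            double delta = seconds - lastSeconds;
            if (delta <= 0)
                Debug.LogWarning($"Timestamp is not strictly increasing (delta = {delta:F6}s)");
            else if (delta < 0.005 || delta > 0.2)
                Debug.LogWarning($"Suspicious frame interval {delta * 1000:F1}ms; expected roughly 30-60 FPS");
        }
        lastSeconds = seconds;
    }
}
```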
Note
You can do many things with EIF, such as running image tracking and dense spatial mapping in the Unity editor using EIF. Note that the display effect may differ when running on the device.
Step four: Run image tracking using EIF
Required functions or data:
- The raw camera image data in camera frame data
- The intrinsics in camera frame data (data accuracy cannot be fully guaranteed, as the algorithm has a tolerance for errors)
To run the image tracking example ImageTracking_Targets using EIF in the Unity editor, you need to record an EIF where the image can be tracked.
Note
Image tracking requires the tracking target to occupy a certain proportion of the entire image. If the image cannot be tracked, try moving the head closer to the image.
If tracking consistently fails or the virtual object appears far from the target in the image, it is highly likely that there is an issue with the intrinsics.
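If intrinsics are suspected, a rough pinhole-model sanity check like the sketch below can catch gross errors: a focal length inconsistent with the camera's field of view, or a principal point far from the image center. The structure EasyAR expects for intrinsics is not reproduced here; this check works on plain fx/fy/cx/cy values, and the tolerances are guesses.

```csharp
using UnityEngine;

// Rough intrinsics sanity check assuming a simple pinhole model.
public static class IntrinsicsSanity
{
    public static void Check(float fx, float fy, float cx, float cy,
                             int width, int height, float horizontalFovDegrees)
    {
        // For a pinhole camera: fx = (width / 2) / tan(hfov / 2).
        float expectedFx = (width * 0.5f) / Mathf.Tan(horizontalFovDegrees * Mathf.Deg2Rad * 0.5f);
        if (Mathf.Abs(fx - expectedFx) / expectedFx > 0.1f)
            Debug.LogWarning($"fx = {fx}, but ~{expectedFx:F0} expected from a {horizontalFovDegrees} degree HFOV at width {width}");

        if (Mathf.Abs(fx - fy) / fx > 0.05f)
            Debug.LogWarning($"fx ({fx}) and fy ({fy}) differ a lot; square pixels are the usual case");

        if (Mathf.Abs(cx - width * 0.5f) > width * 0.1f || Mathf.Abs(cy - height * 0.5f) > height * 0.1f)
            Debug.LogWarning($"Principal point ({cx}, {cy}) is far from the image center ({width * 0.5f}, {height * 0.5f})");
    }
}
```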
If the image data is distorted, you may observe that the virtual object does not perfectly cover the tracking target on the image, which is expected. This phenomenon becomes more noticeable when the tracking target is at the edge of the image.
Step five: Run image tracking on the device
Required functions or data:
- The device's own display system
- The raw camera image data in camera frame data
- The intrinsics in camera frame data (data accuracy cannot be fully guaranteed, as the algorithm has a tolerance for errors)
- The extrinsics in camera frame data
- Coordinate consistency of the device pose between camera frame data and rendering frame data
- Time difference of the device pose between camera frame data and rendering frame data
Note
Image tracking requires the tracking target to occupy a certain proportion of the entire image. If the image cannot be tracked, try moving the head closer to the image.
Image tracking requires the horizontal edge length of the image to match the real-world size of the object. In the example, you need to track an image with a horizontal edge length that fully spans the long edge of an A4 paper placed horizontally. Therefore, do not track images displayed on a computer screen unless you use a ruler and adjust the horizontal edge length of the image to the size of an A4 paper.
If image tracking works perfectly with EIF but not on the device, it needs to be resolved before proceeding with other tests. It will be much more difficult to fix the issue in later steps.
If the virtual object appears to hover far away from the real object even when you are not moving, it is likely that the intrinsics or extrinsics are incorrect, that the device pose in camera frame data and rendering frame data is not in the same coordinate system, or that the display system introduced this error.
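One practical way to narrow this down is to compare, while the headset is held still, the device pose delivered with camera frames against the pose actually used for rendering (for example the camera transform under XR Origin). The sketch below assumes you can obtain the camera-frame pose from your own frame source; the names and thresholds are illustrative.

```csharp
using UnityEngine;

// Hypothetical consistency check: with the headset held still, the device pose
// attached to a camera frame and the rendering pose should describe the same thing
// in the same coordinate system.
public class PoseConsistencyCheck : MonoBehaviour
{
    public Transform renderingCamera; // e.g. the Main Camera under XR Origin

    public void Compare(Pose cameraFramePose)
    {
        var renderPose = new Pose(renderingCamera.position, renderingCamera.rotation);
        float positionGap = Vector3.Distance(cameraFramePose.position, renderPose.position);
        float angleGap = Quaternion.Angle(cameraFramePose.rotation, renderPose.rotation);

        // With the device stationary, large gaps hint at a wrong coordinate system
        // (e.g. left- vs right-handed, or camera vs head origin) or a missing extrinsic.
        if (positionGap > 0.05f || angleGap > 5f)
            Debug.LogWarning($"Camera-frame pose and rendering pose disagree: {positionGap:F3} m, {angleGap:F1} degrees");
    }
}
```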
If the virtual object moves continuously when moving the head and appears to have latency, there is a high probability that the device pose is not healthy. This often occurs in several scenarios (other issues cannot be ruled out):
- The data time of the device pose and the raw camera image data is not synchronized
- The same pose is used in camera frame data and rendering frame data (see the pose-history sketch below)
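If the same pose is being reused, one common approach is to keep a short history of timestamped head poses and look up the pose closest to each camera frame's capture time. The sketch below shows a nearest-neighbor lookup; interpolating between the two neighboring poses is usually better, but the structure is the same. All names and the buffer size are illustrative.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Record the head pose against a timestamp every rendering frame, then look up the
// pose closest to each camera frame's capture timestamp.
public class PoseHistory
{
    private readonly List<(double time, Pose pose)> history = new List<(double time, Pose pose)>();
    private const int MaxEntries = 120;

    public void Record(double timeSeconds, Pose pose)
    {
        history.Add((timeSeconds, pose));
        if (history.Count > MaxEntries)
            history.RemoveAt(0);
    }

    // Returns the recorded pose with the timestamp closest to the camera frame time.
    public Pose GetClosest(double cameraFrameTimeSeconds)
    {
        if (history.Count == 0)
            return Pose.identity;

        var best = history[0];
        foreach (var entry in history)
        {
            if (System.Math.Abs(entry.time - cameraFrameTimeSeconds) <
                System.Math.Abs(best.time - cameraFrameTimeSeconds))
                best = entry;
        }
        return best.pose;
    }
}
```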
Step six: Run dense spatial mapping using EIF and on the device
Required functions or data:
- The device's own display system
- The raw camera image data in camera frame data
- The intrinsics in camera frame data (data accuracy cannot be fully guaranteed, as the algorithm has a tolerance for errors)
- The extrinsics in camera frame data
- The device pose in camera frame data
If the mesh generation is very slow and/or the ground reconstruction is uneven, it is highly likely that the device pose is problematic. It is also possible that the coordinate system of the pose is incorrect or the timing of the pose is off.
Tip
If the input data frame rate is low, the mesh generation speed will also slow down, but the quality will not significantly degrade. This is expected behavior.
It is usually not very easy to discern the exact mesh position, so display system errors may not be noticeable when using dense spatial mapping.
Run Mega example
Read the following to learn how to use Mega in Unity. If you haven't activated the Mega service yet, you need to contact EasyAR Business to obtain a trial qualification.
- EasyAR Mega introduction
- Is my localization library ready to use?
- Quick start with EasyAR Mega Unity sample
Then run Mega on the device and compare the results with those on a mobile phone (an iPhone is recommended as the reference). Pay attention to:
- Whether the object display position is correct
- Whether the position and size of distant objects (10 m and beyond) are correct
- Whether the position and size of objects outside the center of view are correct
- Whether the position and size of objects are correct when turning the head
Resolve abnormal situations during operation
Essential functions or data that must work properly:
- The device's own display system
- All data in camera frame data and rendering frame data
After completing the verification of image tracking and dense spatial mapping features, EasyAR Mega should theoretically be supported. If the performance on the headset is significantly worse than on a mobile phone, pay attention to the following:
- Focus on the pose data and timestamps in camera frame data and rendering frame data
- Monitor the output of the motion tracking/VIO system; the panda under XR Origin is a good reference (see the sketch below)
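As a simple probe of motion tracking/VIO health, you can watch the tracked camera pose under XR Origin (the same pose the panda visualizes) and log sudden jumps so they can be correlated with Mega behavior at the same moment. The threshold in the sketch below is a rough guess.

```csharp
using UnityEngine;

// Simple VIO/motion-tracking health probe: flag sudden pose jumps, which usually
// indicate tracking loss or relocation.
public class VioJumpDetector : MonoBehaviour
{
    public Transform trackedCamera; // the Main Camera under XR Origin

    private Vector3 lastPosition;
    private bool hasLast;

    private void LateUpdate()
    {
        if (hasLast)
        {
            float step = Vector3.Distance(trackedCamera.position, lastPosition);
            // A head cannot plausibly move this far in a single frame.
            if (step > 0.3f)
                Debug.LogWarning($"Pose jumped {step:F2} m in one frame at t={Time.realtimeSinceStartup:F2}s");
        }
        lastPosition = trackedCamera.position;
        hasLast = true;
    }
}
```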
Additionally, pay special attention to the device's own display system, especially how objects are displayed at a distance, outside the center of the view, and while the head is rotating. Such scenarios are often overlooked during the device's own testing, yet the issues are usually caused by the device's own display system. You need to explain these problems and their potential impact to EasyAR, and provide developers with reasonable performance expectations.
Important
Users will pay close attention to these display issues, and many devices indeed cannot provide perfect display effects in large-space scenarios. EasyAR cannot solve the display problems inherent to the device itself—this requires iteration and improvement by the device manufacturer. Meanwhile, users also need to understand these issues.
Next steps
Related topics
Examples that can run on mobile devices:
- Image tracking example ImageTracking_Targets, which helps understand the expected performance of the image tracking feature
- Dense spatial mapping example SpatialMap_Dense_BallGame, which helps understand the expected performance of the dense spatial mapping feature
- Motion fusion example ImageTracking_MotionFusion, which helps understand the expected performance of the motion fusion feature
- Mega example MegaBlock_Basic, which helps understand the expected performance of the Mega feature