Concept and Workflow of MegaTracker
This document introduces the basic concepts of MegaTracker and its relationship with WeChat's native AR system VisionKit and the rendering framework xr-frame.
Before you begin
From the Mega Introduction, learn:
- The basic principles of Mega positioning and tracking.
- What is a Mega Block.
- Expected results after integrating Mega.
What is a plane AR tracker
The plane AR tracker in xr-frame is essentially a wrapper around VisionKit's 6DoF plane-tracking capability.
After enabling isARCamera in the xr-frame camera component, the camera's 3D transformation is synchronized with the AR system (VisionKit) every frame.
xr-frame provides 3D rendering capabilities, while VisionKit provides motion tracking capabilities in the real-world coordinate system.
The plane AR tracker cannot be used with other AR trackers.
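For reference, the underlying capability can also be reached directly through VisionKit's VKSession API. With isARCamera enabled, xr-frame performs these steps internally, so the sketch below is for illustration only; option values such as the plane mode or the requested frame size may differ depending on the base library version.

```typescript
// Illustration of the VisionKit 6DoF plane capability that the plane AR
// tracker wraps. With isARCamera enabled, xr-frame drives this internally;
// you normally never call VKSession yourself in an xr-frame project.
declare const wx: any; // WeChat Mini Program global (typed via miniprogram-api-typings in real projects)

const width = 480;  // requested VKFrame size (illustrative values)
const height = 640;

const session = wx.createVKSession({
  version: 'v2',                 // v2 provides 6DoF motion tracking
  track: { plane: { mode: 3 } }, // detect horizontal + vertical planes
});

session.start((status: number) => {
  if (status !== 0) return;      // non-zero status: session failed to start

  const onFrame = () => {
    const frame = session.getVKFrame(width, height);
    if (frame) {
      // Per-frame 6DoF camera pose and projection that xr-frame
      // synchronizes onto its camera node every frame.
      const viewMatrix = frame.camera.viewMatrix;
      const projectionMatrix = frame.camera.getProjectionMatrix(0.1, 1000);
      // ...hand off to rendering (done internally by xr-frame)
    }
    session.requestAnimationFrame(onFrame);
  };
  session.requestAnimationFrame(onFrame);
});
```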
What is MegaTracker
MegaTracker is the core algorithmic component that connects WeChat's AR system (VisionKit) with Mega's spatial computing service, providing cloud positioning functionality.
- Input: the camera pose in the VisionKit coordinate system (i.e., 6DoF data) computed by VisionKit for each frame, plus the camera image of the frame used for Mega positioning.
- Output: the camera pose in the coordinate system of the currently positioned and tracked Mega Block.
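Conceptually, this contract can be written down as the following hypothetical TypeScript types. The names are illustrative only and do not correspond to the Mega plugin's actual API.

```typescript
// Hypothetical types illustrating MegaTracker's per-frame input/output.
// These names are for explanation only, not the plugin's real interface.

/** Per-frame input taken from VisionKit (via the plane AR tracker). */
interface MegaTrackerInput {
  /** 4x4 camera pose in the VisionKit coordinate system (6DoF data). */
  vkCameraPose: Float32Array;
  /** Camera image of the frame used for cloud positioning. */
  cameraImage: ArrayBuffer;
  /** Image dimensions in pixels. */
  width: number;
  height: number;
}

/** Per-frame output once a Mega Block has been positioned. */
interface MegaTrackerOutput {
  /** Identifier of the Mega Block currently being tracked. */
  blockId: string;
  /** 4x4 camera pose expressed in that Mega Block's coordinate system. */
  cameraPoseInBlock: Float32Array;
}
```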
How MegaTracker works with xr-frame
```mermaid
flowchart BT
  subgraph native ["Using xr-frame Only"]
    direction BT
    PlaneARTracker_1[PlaneARTracker] -->|MotionData & Image| XRFrame_1[xr-frame]
  end
  subgraph mega ["Using Mega Plugin"]
    direction BT
    PlaneARTracker_2[PlaneARTracker] -->|MotionData & Image| MegaTracker
    MegaTracker -->|CameraTransform| XRFrame_2[xr-frame]
  end
```
- In WeChat's native data flow, the plane AR tracker result updates the xr-frame camera component directly every frame.
- In the data flow provided by the Mega Mini Program plugin, the camera pose in the VisionKit coordinate system (i.e., 6DoF data) and the image data of the positioning frame are fed into MegaTracker. After cloud positioning and local computation, MegaTracker outputs the camera pose under the currently positioned and tracked Mega Block, which ultimately updates the LocalTransform of the camera under the Mega Block node in the xr-frame scene. At this point MegaTracker takes over control of the camera, and xr-frame no longer updates it from the AR tracker.
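The Mega-plugin flow described above can be sketched roughly as follows. All identifiers here (getVKFrameData, megaTracker, blockCameraNode) are hypothetical placeholders for the steps in the diagram, not the plugin's real API.

```typescript
// Hypothetical sketch of the per-frame Mega data flow described above.
// None of these identifiers are the plugin's real API; they only name the steps.

type Mat4 = Float32Array;

declare function getVKFrameData(): {   // placeholder: VisionKit pose + image
  vkCameraPose: Mat4;                  // 6DoF pose in the VisionKit coordinate system
  cameraImage: ArrayBuffer;            // positioning-frame image
  width: number;
  height: number;
};

declare const megaTracker: {           // placeholder for the Mega plugin tracker
  update(input: ReturnType<typeof getVKFrameData>): { cameraPoseInBlock: Mat4 } | null;
};

declare const blockCameraNode: {       // camera node under the Mega Block node
  localTransform: Mat4;
};

function onFrameTick(): void {
  // 1. Take VisionKit's 6DoF camera pose and the positioning-frame image.
  const input = getVKFrameData();

  // 2. MegaTracker combines cloud positioning with local computation and
  //    returns the camera pose under the currently tracked Mega Block.
  const result = megaTracker.update(input);
  if (!result) return; // not positioned yet: xr-frame keeps using the AR tracker

  // 3. Update the camera's LocalTransform under the Mega Block node; from
  //    here on MegaTracker, not the plane AR tracker, drives the camera.
  blockCameraNode.localTransform = result.cameraPoseInBlock;
}
```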
MegaTracker depends heavily on the 6DoF motion data provided by the plane tracker, so it cannot start working until the plane tracker has completed initialization and established a stable tracking state. In addition, the stability of AR tracking is limited by environmental features: in extreme scenarios such as large textureless areas (e.g., white walls) or prolonged camera occlusion, if WeChat's underlying plane tracking drifts or is lost, MegaTracker also enters a failed state because it has lost its reliable input source.
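In practice this dependency means Mega positioning is gated on the plane tracker's state, along the lines of the sketch below. The state values and callbacks shown are assumptions made for illustration, not real xr-frame or Mega plugin APIs.

```typescript
// Hypothetical tracker-state gating; the state names and callbacks are
// illustrative, not the real xr-frame / Mega plugin API.

type PlaneTrackingState = 'initializing' | 'tracking' | 'lost';

declare function onPlaneTrackingStateChange(cb: (state: PlaneTrackingState) => void): void;
declare const megaTracker: { start(): void; pause(): void };

onPlaneTrackingStateChange((state) => {
  if (state === 'tracking') {
    // Plane tracking is stable: MegaTracker now has reliable 6DoF input.
    megaTracker.start();
  } else {
    // Initialization not finished, or tracking drifted / was lost
    // (e.g., textureless white wall, prolonged occlusion): MegaTracker
    // loses its input source and must stop as well.
    megaTracker.pause();
  }
});
```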