Concepts and workflow of MegaTracker
This document introduces the basic concepts of MegaTracker and its relationship with WeChat's native AR system VisionKit and the rendering framework xr-frame.
Before you begin
Read Introduction to Mega to learn:
- Basic principles of Mega positioning and tracking.
- What is a Mega Block.
- Expected results after integrating Mega.
What is the Plane AR Tracker
The Plane AR Tracker in xr-frame is essentially a wrapper for VisionKit's 6DoF-plane capability.
After enabling isARCamera in xr-frame's camera component, the camera's 3D transformation is synchronized with the AR system (VisionKit) every frame.
xr-frame provides 3D rendering capabilities, while VisionKit provides motion tracking capabilities in the real-world spatial coordinate system.
The Plane AR Tracker cannot be used together with other AR trackers.
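The per-frame synchronization can be pictured with a minimal TypeScript sketch. The `Pose6DoF`, `ARCamera`, and `onFrame` names here are illustrative placeholders, not the real xr-frame or VisionKit API:

```typescript
// Illustrative sketch only: these types and names are hypothetical,
// not the actual xr-frame / VisionKit interfaces.

interface Pose6DoF {
  position: [number, number, number];         // metres, VisionKit world space
  rotation: [number, number, number, number]; // quaternion (x, y, z, w)
}

interface ARCamera {
  setTransform(pose: Pose6DoF): void;
}

// With the AR camera enabled, each frame the Plane AR Tracker's 6DoF
// result is written straight into the camera's transform.
function onFrame(tracker: { getPose(): Pose6DoF }, camera: ARCamera): void {
  camera.setTransform(tracker.getPose());
}
```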
What is MegaTracker
MegaTracker is the core algorithmic component connecting WeChat's AR system (VisionKit) with Mega's spatial computing service, providing cloud localization functionality.
- Input: the per-frame camera pose in VisionKit's coordinate system (i.e., 6DoF data) computed by VisionKit, plus the camera image captured at the frame when Mega localization is performed.
- Output: the camera pose expressed in the coordinate system of the currently localized and tracked Mega Block.
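These inputs and outputs can be modelled roughly as follows. The TypeScript types are hypothetical, for illustration only; the plugin's real data structures may differ:

```typescript
// Hypothetical types sketching MegaTracker's contract; not the plugin's real API.

interface Pose6DoF {
  position: [number, number, number];
  rotation: [number, number, number, number]; // quaternion (x, y, z, w)
}

interface MegaTrackerInput {
  visionKitPose: Pose6DoF;        // per-frame camera pose in VisionKit's coordinate system
  localizationImage?: Uint8Array; // camera frame bytes, present only on localization frames
}

interface MegaTrackerOutput {
  blockId: string;       // the Mega Block currently localized and tracked
  poseInBlock: Pose6DoF; // camera pose expressed in that Block's coordinate system
}
```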
How MegaTracker works with xr-frame
```mermaid
flowchart BT
    subgraph "Using xr-frame Only"
        direction BT
        PlaneARTracker_1[PlaneARTracker] -->|MotionData & Image| XRFrame_1[xr-frame]
    end
    subgraph "Using Mega Plugin"
        direction BT
        PlaneARTracker_2[PlaneARTracker] -->|MotionData & Image| MegaTracker
        MegaTracker -->|CameraTransform| XRFrame_2[xr-frame]
    end
```
- In WeChat's native data flow, xr-frame's camera component is updated directly every frame with the Plane AR Tracker's results.
- In the Mega Mini Program's data flow, the camera pose in VisionKit's coordinate system (i.e., 6DoF data) and the image data from localization frames are fed to MegaTracker. After cloud localization and local computation, MegaTracker outputs the camera pose in the currently localized and tracked Mega Block, which ultimately updates the LocalTransform of the camera under the Mega Block node in the xr-frame scene. From that point on, MegaTracker takes control of the camera, and xr-frame no longer updates it from the AR tracker.
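A simplified, translation-only sketch of the local computation step may help. This is hypothetical: the actual plugin also handles rotation and performs its math natively, but the idea of anchoring ongoing VisionKit motion to a cloud localization result is the same:

```typescript
// Simplified, translation-only illustration (hypothetical; rotation omitted).

type Vec3 = [number, number, number];

const sub = (a: Vec3, b: Vec3): Vec3 => [a[0] - b[0], a[1] - b[1], a[2] - b[2]];
const add = (a: Vec3, b: Vec3): Vec3 => [a[0] + b[0], a[1] + b[1], a[2] + b[2]];

// At the localization frame, the cloud returns the camera position in the
// Mega Block; we also record the VisionKit position at that same frame.
function makeBlockPoseEstimator(cloudPosInBlock: Vec3, vkPosAtLocalization: Vec3) {
  // Between localizations, each new VisionKit pose is mapped into the Block
  // by applying the motion accumulated since the localization frame.
  return (currentVkPos: Vec3): Vec3 =>
    add(cloudPosInBlock, sub(currentVkPos, vkPosAtLocalization));
}
```

In this picture, each new cloud localization would simply rebuild the estimator with a fresh anchor pair.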
MegaTracker's operation depends heavily on the 6DoF motion data provided by the plane tracker, so MegaTracker cannot start working until the plane tracker completes initialization and establishes a stable tracking state. AR tracking stability is also limited by environmental features: in extreme scenarios such as large featureless areas (e.g., white walls) or prolonged camera occlusion, if WeChat's underlying plane tracking drifts or fails, MegaTracker loses its reliable input source and enters a failed state at the same time.
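The dependency described above amounts to a small state gate, sketched here with hypothetical state names (the plugin's real states and events are not specified in this document):

```typescript
// Hypothetical sketch: MegaTracker can only run while plane tracking is
// stable, and fails together with it. State names are illustrative.

type PlaneTrackingState = "initializing" | "tracking" | "lost";
type MegaTrackerState = "waiting" | "running" | "failed";

function megaStateFor(plane: PlaneTrackingState): MegaTrackerState {
  switch (plane) {
    case "initializing": return "waiting"; // cannot start before the plane tracker is ready
    case "tracking":     return "running"; // reliable 6DoF input is available
    case "lost":         return "failed";  // drift/occlusion removes MegaTracker's input
  }
}
```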