EasyAR dense spatial mapping
EasyAR dense spatial mapping uses the device's camera data to perform 3D reconstruction of the surrounding environment, generating dense point cloud maps and mesh maps. With dense spatial mapping, virtual objects can be integrated more convincingly into the real environment, enabling AR effects such as correct occlusion between real and virtual objects and physical collision of virtual objects against real surfaces.
Introduction to dense spatial mapping
EasyAR dense spatial mapping builds on motion tracking: it analyzes the device's camera data to establish dense point cloud information about the environment, then incrementally fuses the points into a mesh that serves as a geometric representation of the surroundings. This mesh can be used to achieve effects such as correct occlusion between virtual objects and the real environment, enhancing the realism of AR experiences.
EasyAR dense spatial mapping relies on a stable motion tracking system to provide the camera's six-degree-of-freedom (6DoF) position and orientation, which can come from the EasyAR motion tracking module or from ARKit/ARCore.
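A typical native setup connects a motion-tracking camera source to the dense mapper and polls for mesh updates every frame. The C++ sketch below illustrates that flow under stated assumptions: the header paths, the `DenseSpatialMap`, `MotionTrackerCameraDevice`, and `SceneMesh` names, and their method signatures follow the EasyAR Sense C++ interface as the author understands it and may differ between SDK versions, so treat it as an outline rather than the exact API.

```cpp
// Illustrative sketch only. Header names and all EasyAR class/method names below
// are assumptions based on the EasyAR Sense C++ interface; verify them against
// the SDK headers for your version.
#include <cstdio>
#include "easyar/engine.hpp"                     // assumed header name
#include "easyar/densespatialmap.hpp"            // assumed header name
#include "easyar/motiontrackercameradevice.hpp"  // assumed header name

int main()
{
    using namespace easyar;

    // EasyAR Sense requires a license key before any component is used.
    Engine::initialize("your-license-key");  // placeholder key

    // Dense spatial mapping needs device support for both reconstruction and motion tracking.
    if (!DenseSpatialMap::isAvailable() || !MotionTrackerCameraDevice::isAvailable()) {
        std::printf("dense spatial mapping or motion tracking is not available on this device\n");
        return 1;
    }

    // Motion tracking supplies the 6DoF camera pose that dense reconstruction depends on.
    auto camera = MotionTrackerCameraDevice::create();
    auto mapper = DenseSpatialMap::create();

    // Feed camera frames (image + pose) into the dense mapper.
    camera->inputFrameSource()->connect(mapper->inputFrameSink());

    camera->start();
    mapper->start();

    // In a real application this runs once per rendered frame; a fixed loop stands in here.
    for (int frame = 0; frame < 300; ++frame) {
        // Request an incremental mesh update; true would rebuild the full mesh.
        if (mapper->updateSceneMesh(false)) {
            auto mesh = mapper->getMesh();
            std::printf("mesh now has %d vertices and %d indices\n",
                        mesh->getNumOfVertexAll(), mesh->getNumOfIndexAll());
            // Upload the vertex/normal/index buffers to the renderer or a physics
            // engine here to get occlusion and collision against the real scene.
        }
    }

    mapper->stop();
    camera->stop();
    mapper->close();
    camera->close();
    return 0;
}
```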

Best practices
To achieve better reconstruction results, it is recommended that users follow these guidelines:
- Move the device laterally as much as possible; avoid rotating it in place.
- Cover as many scanning angles as possible.
- Avoid reconstructing large areas of solid color or reflective objects.
- Avoid rapid movements or blocking the camera.