3D Space Content Display

When using AR, it is often necessary to display virtual objects. Simple test examples can get by with basic geometric shapes, but consumer-facing development typically requires high-precision 3D models and animations, and may need to trigger interactions or events when a model node is clicked.

Model

Most popular 3D models today are composed of triangle meshes. To make a model appear realistic, each triangle is assigned a material. A material usually implements a lighting model and consists of textures plus that lighting model's parameters.

  • Textures determine the base color of each point on a triangle; the textures for all triangles are packed into one or more diffuse maps. In advanced usage, textures can also encode the normal direction or other per-point parameters.
  • The lighting model defines how the object interacts with light; PBR (physically based rendering) is a common example. Lighting models are typically implemented in shaders. Parameters of a PBR lighting model include color, metalness, roughness, etc.
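To make the roles of these parameters concrete, here is a deliberately simplified shading sketch in the spirit of a metallic-roughness material. It is not a real PBR implementation (engines use a Cook-Torrance-style BRDF in shaders); the function name and the Blinn-Phong specular stand-in are illustrative assumptions.

```python
import math

def shade(base_color, metalness, roughness, normal, light_dir, view_dir):
    """Very simplified metallic-roughness shading (a sketch, not real PBR).

    base_color : (r, g, b) sampled from the diffuse/albedo texture
    metalness  : 0 = dielectric, 1 = metal
    roughness  : 0 = mirror-like, 1 = fully rough
    All direction vectors are assumed to be normalized (x, y, z) tuples.
    """
    n_dot_l = max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
    # Diffuse term: metals have (almost) no diffuse reflection.
    diffuse = tuple(c * n_dot_l * (1.0 - metalness) for c in base_color)
    # Specular term: Blinn-Phong as a stand-in; roughness widens the highlight.
    half = tuple(l + v for l, v in zip(light_dir, view_dir))
    norm = math.sqrt(sum(h * h for h in half)) or 1.0
    half = tuple(h / norm for h in half)
    n_dot_h = max(0.0, sum(n * h for n, h in zip(normal, half)))
    shininess = 2.0 / max(roughness * roughness, 1e-4)
    spec = n_dot_h ** shininess
    # Metals tint their specular with the base color; dielectrics reflect ~4%.
    spec_color = tuple(metalness * c + (1.0 - metalness) * 0.04 for c in base_color)
    return tuple(min(1.0, d + spec * s) for d, s in zip(diffuse, spec_color))
```

The sketch shows why the same mesh can look like plastic or metal: only the material parameters change, not the geometry.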

In application development, 3D models are usually not loaded directly through OpenGL / Metal / Vulkan / Direct3D but through a 3D engine. 3D engines require specific model formats, such as the long-standing and relatively human-readable obj/mtl format, or the currently popular glTF format.
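As a sketch of what such a format looks like, the JSON part of a glTF 2.0 file describes a hierarchy of named nodes, some of which reference meshes (vertex data itself lives in separate binary buffers, omitted here). The tiny in-memory document and the `list_nodes` helper below are illustrative, not part of any engine's API:

```python
import json

# A tiny in-memory glTF 2.0 document (the JSON part of a .gltf file).
# Real files also reference binary buffers holding the vertex data.
gltf_text = json.dumps({
    "asset": {"version": "2.0"},
    "scenes": [{"nodes": [0]}],
    "nodes": [
        {"name": "root", "children": [1]},
        {"name": "head", "mesh": 0},
    ],
    "meshes": [{"name": "headMesh", "primitives": [{"attributes": {"POSITION": 0}}]}],
})

def list_nodes(gltf):
    """Walk the scene's node hierarchy; report (depth, name, mesh index)."""
    doc = json.loads(gltf)
    nodes = doc.get("nodes", [])
    out = []
    def visit(index, depth):
        node = nodes[index]
        out.append((depth, node.get("name", f"node{index}"), node.get("mesh")))
        for child in node.get("children", []):
            visit(child, depth + 1)
    for scene in doc.get("scenes", []):
        for root in scene.get("nodes", []):
            visit(root, 0)
    return out
```

A 3D engine performs essentially this traversal when importing a model, which is also why node names survive the import and can later be used for interaction.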

Animation

To animate models, skeletal animation is used. The "bones" of the skeleton are model nodes whose geometry moves together as a rigid body.

Displaying an animation requires continuously updating the position and orientation (the transformation matrix) of model nodes at runtime. Most 3D engines provide animation functionality, allowing animations to be edited in animation software and exported in a format the engine supports. The aforementioned glTF format also includes animation capabilities.
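At its core, that runtime update means sampling keyframe tracks. Here is a minimal sketch of linearly interpolating a node's translation track, the way glTF's default `LINEAR` interpolation and most engines behave; the function name is hypothetical:

```python
from bisect import bisect_right

def sample_translation(times, values, t):
    """Linearly interpolate a translation keyframe track at time t.

    times  : sorted keyframe timestamps, e.g. [0.0, 1.0, 2.0]
    values : matching (x, y, z) translations, one per timestamp
    Clamps to the first/last keyframe outside the track, as engines
    typically do by default.
    """
    if t <= times[0]:
        return values[0]
    if t >= times[-1]:
        return values[-1]
    i = bisect_right(times, t) - 1                  # keyframe at or before t
    f = (t - times[i]) / (times[i + 1] - times[i])  # blend factor in [0, 1]
    a, b = values[i], values[i + 1]
    return tuple(x + (y - x) * f for x, y in zip(a, b))
```

Rotations are handled analogously but with quaternion interpolation (slerp) rather than per-component lerp, to avoid distorted in-between orientations.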

Interaction

When a user clicks a model node, events and interactions may need to be triggered. The bones of a skeletal animation are typically named, so collision detection or raycasting can return the name of the clicked bone. Event handling can then be implemented in scripts or application code.
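The raycasting step can be sketched with the standard Möller-Trumbore ray/triangle test: cast a ray from the camera through the click point, test it against each named node's triangles, and return the name of the closest hit. The helper names `ray_hits_triangle` and `pick_node` are illustrative; engines expose this as a physics/raycast API.

```python
def ray_hits_triangle(origin, direction, v0, v1, v2, eps=1e-8):
    """Möller-Trumbore ray/triangle test; returns hit distance t, or None."""
    def sub(a, b): return tuple(x - y for x, y in zip(a, b))
    def dot(a, b): return sum(x * y for x, y in zip(a, b))
    def cross(a, b):
        return (a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0])
    e1, e2 = sub(v1, v0), sub(v2, v0)
    h = cross(direction, e2)
    a = dot(e1, h)
    if abs(a) < eps:                 # ray parallel to the triangle plane
        return None
    f = 1.0 / a
    s = sub(origin, v0)
    u = f * dot(s, h)
    if u < 0.0 or u > 1.0:
        return None
    q = cross(s, e1)
    v = f * dot(direction, q)
    if v < 0.0 or u + v > 1.0:       # hit point outside the triangle
        return None
    t = f * dot(e2, q)
    return t if t > eps else None    # only hits in front of the origin

def pick_node(origin, direction, nodes):
    """nodes: {name: [(v0, v1, v2), ...]}. Returns the closest hit node's name."""
    best_name, best_t = None, float("inf")
    for name, triangles in nodes.items():
        for tri in triangles:
            t = ray_hits_triangle(origin, direction, *tri)
            if t is not None and t < best_t:
                best_name, best_t = name, t
    return best_name
```

The returned name is what the application's event handler dispatches on, e.g. playing a different animation when "head" rather than "arm" is clicked.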

Note

If you lack experience with 3D engines, it is highly recommended to consider using Unity for your application development. The EasyAR Sense Unity Plugin provides strong support for Unity, whereas other 3D engines may have limited support and fewer available resources.