3D content rendering

AR applications usually need to display virtual objects. Simple test examples can use basic geometric shapes, but consumer-facing applications generally require high-precision 3D models and animations, and may trigger interactions when model nodes are clicked.

Model

Most 3D models are composed of triangle meshes. To make a model appear realistic, each triangle must be assigned a material. A material usually implements a lighting model and is defined by textures and lighting model parameters.

  • A texture determines the base color of each point within a triangle. The textures for all triangles are stored in one or more diffuse texture maps. In advanced usage, textures can also encode normal directions or other per-point parameters.
  • The lighting model defines how an object interacts with light. A common example is PBR (physically based rendering). Lighting models are typically implemented in shaders. Parameters of a PBR lighting model include base color, metalness, roughness, etc.
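As a rough illustration of how the metalness parameter shapes a PBR material's response, the sketch below (a simplified split into diffuse and specular terms, not any engine's actual shader) shows the common metallic-roughness convention: metals get no diffuse reflection, while dielectrics reflect about 4% of light at normal incidence. All function names here are made up for the example.

```python
def lerp(a, b, t):
    """Linear interpolation between scalars a and b."""
    return a + (b - a) * t

def pbr_base_terms(base_color, metalness):
    """Return (diffuse_albedo, f0) following the metallic-roughness convention.

    Metals reflect specularly with their base color and have no diffuse
    term; dielectrics keep their diffuse color and reflect ~4% (0.04)
    at normal incidence, which is where the constant below comes from.
    """
    diffuse = tuple(c * (1.0 - metalness) for c in base_color)
    f0 = tuple(lerp(0.04, c, metalness) for c in base_color)
    return diffuse, f0

def lambert_diffuse(diffuse_albedo, n_dot_l):
    """Simple Lambertian diffuse term: albedo * max(N.L, 0)."""
    n_dot_l = max(n_dot_l, 0.0)
    return tuple(c * n_dot_l for c in diffuse_albedo)
```

In a real engine this computation runs per pixel in a fragment shader, with roughness additionally controlling how sharp the specular highlight is.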

In application development, 3D models are not usually loaded directly through OpenGL / Metal / Vulkan / Direct3D. Instead, a 3D engine handles loading. 3D engines require specific 3D model formats, such as the long-established, human-readable obj/mtl format, or the currently popular glTF format.
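To give a feel for the glTF format, the toy snippet below walks the JSON structure of a minimal (hand-written, not exported) glTF 2.0 document: the top level lists nodes, meshes, and materials, and entries reference each other by index. A real loader would also resolve the binary buffers holding vertex data.

```python
import json

# A minimal glTF 2.0 document, written inline for illustration.
# Nodes reference meshes by index; mesh primitives reference materials by index.
gltf_text = json.dumps({
    "asset": {"version": "2.0"},
    "nodes": [{"name": "root", "mesh": 0}],
    "meshes": [{"name": "cube", "primitives": [{"material": 0}]}],
    "materials": [{
        "name": "metal",
        "pbrMetallicRoughness": {"metallicFactor": 1.0, "roughnessFactor": 0.3},
    }],
})

doc = json.loads(gltf_text)
for node in doc["nodes"]:
    mesh = doc["meshes"][node["mesh"]]
    material = doc["materials"][mesh["primitives"][0]["material"]]
    print(node["name"], "->", mesh["name"], "->", material["name"])
```

Note that glTF's material definition uses exactly the PBR metallic-roughness parameters described above, which is one reason engines have adopted it so widely.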

Animation

To animate models, skeletal animation is used. A skeleton is a hierarchy of bones, where each bone drives a group of model vertices that move together as a rigid body.

Displaying animation requires continuously updating the position and orientation (transformation matrix) of model nodes during runtime. Most 3D engines provide animation functionality. Simply edit the animation in animation software and export it in a format supported by the 3D engine. The aforementioned glTF format also includes animation capabilities.
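The runtime update described above usually amounts to sampling keyframe tracks each frame. The sketch below (a hypothetical helper, not any engine's API) linearly interpolates a translation track at a given time, the way a glTF animation sampler with LINEAR interpolation behaves; rotations would use quaternion slerp instead.

```python
import bisect

def sample_translation(times, values, t):
    """Sample a translation keyframe track at time t.

    times  : sorted list of keyframe timestamps (seconds)
    values : list of (x, y, z) translations, one per keyframe
    Clamps outside the track's range, linearly interpolates inside it.
    """
    if t <= times[0]:
        return values[0]
    if t >= times[-1]:
        return values[-1]
    i = bisect.bisect_right(times, t)      # first keyframe after t
    t0, t1 = times[i - 1], times[i]
    w = (t - t0) / (t1 - t0)               # blend weight in [0, 1]
    a, b = values[i - 1], values[i]
    return tuple(x + (y - x) * w for x, y in zip(a, b))
```

Each frame, the engine samples every animated node's tracks like this, rebuilds the node's transformation matrix, and propagates it down the skeleton hierarchy.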

Interaction

When a user clicks a model node, the application sometimes needs to trigger an event or interaction. Bones in skeletal animations are typically named, so collision detection or ray casting can determine which bone was hit and return its name. Event handling can then be implemented in scripts or application code.
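The ray casting step can be sketched as follows: cast a ray from the camera through the clicked screen point, intersect it with the triangles belonging to each named bone, and return the nearest hit. The intersection routine below is the standard Möller-Trumbore algorithm; `pick_bone` and its input layout are hypothetical names for this example, as engines expose this through their own physics or raycast APIs.

```python
def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def ray_triangle(origin, direction, v0, v1, v2, eps=1e-9):
    """Moller-Trumbore ray/triangle intersection.

    Returns the hit distance t along the ray, or None on a miss.
    """
    e1, e2 = sub(v1, v0), sub(v2, v0)
    h = cross(direction, e2)
    a = dot(e1, h)
    if abs(a) < eps:                 # ray parallel to triangle plane
        return None
    f = 1.0 / a
    s = sub(origin, v0)
    u = f * dot(s, h)
    if u < 0.0 or u > 1.0:           # outside barycentric bounds
        return None
    q = cross(s, e1)
    v = f * dot(direction, q)
    if v < 0.0 or u + v > 1.0:
        return None
    t = f * dot(e2, q)
    return t if t > eps else None    # hit must be in front of the origin

def pick_bone(origin, direction, bone_triangles):
    """Return the name of the nearest bone hit by the ray, or None.

    bone_triangles maps bone name -> list of (v0, v1, v2) triangles.
    """
    best_t, best_name = float("inf"), None
    for name, tris in bone_triangles.items():
        for v0, v1, v2 in tris:
            t = ray_triangle(origin, direction, v0, v1, v2)
            if t is not None and t < best_t:
                best_t, best_name = t, name
    return best_name
```

The returned bone name can then be dispatched to whatever event-handling mechanism the application uses.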

Note

If you lack experience with 3D engines, it is strongly recommended to consider using Unity for your application development. The EasyAR Sense Unity Plugin offers good support for Unity. Using other 3D engines may result in limited support and fewer available resources.