Play transparent video on xr-frame wechat miniprogram

Before you begin

  • Prepare the transparent video to be played: Upload the transparent video to a file server and obtain the URL for loading in xr-frame.

  • Annotate the playback position in advance: Because transparent video playback works by replacing a texture in the scene, the location where the video will play must be annotated beforehand. You should know how to create and upload annotations using the Unity editor.
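Since the video is fetched at runtime, it is worth checking the URL before handing it to the asset loader: WeChat miniprograms can only download media from whitelisted HTTPS domains. The sketch below is a hypothetical helper, not part of any EasyAR or xr-frame API; the .mp4 extension check is an assumption you should adjust to match your server's actual container format.

```typescript
// Hypothetical pre-flight check for the video URL.
// Assumption: the server delivers .mp4 files over HTTPS, as WeChat
// miniprograms require for downloadable media.
function isLoadableVideoUrl(url: string): boolean {
  let parsed: URL;
  try {
    parsed = new URL(url);
  } catch {
    return false; // not a valid absolute URL
  }
  // Miniprogram network requests must go to whitelisted HTTPS domains.
  return parsed.protocol === "https:" && parsed.pathname.endsWith(".mp4");
}
```

Remember that the domain must also be added to the miniprogram's server domain whitelist in the WeChat developer console, which a runtime check like this cannot verify.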

What is transparent video

Transparent video is a technical solution designed to achieve a transparent background effect in video encoding formats that do not natively support alpha channels (such as H.264/AVC, H.265/HEVC).

This solution works by separating the color information (RGB) and transparency information (Alpha) of the video frames, then spatially combining them into a single frame according to a specific layout, resulting in a regular video file with a grayscale mask. On the playback side, the two components are sampled and composited in real-time through the graphics rendering pipeline to restore the dynamic image with a transparent background.

Based on the arrangement of the color and Alpha regions, there are two main formats:

  1. Side-by-Side (SBS)
    Side-by-Side arranges the RGB color frame and the Alpha mask frame horizontally side by side. Typically, the left side contains the color region, and the right side contains the corresponding grayscale Alpha region.

  2. Top-by-Bottom (TBB)
    Top-by-Bottom stacks the RGB color frame and the Alpha mask frame vertically. Typically, the upper half contains the color region, and the lower half contains the corresponding grayscale Alpha region.
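The sampling step described above can be sketched as a UV remapping: for each output pixel, the shader samples the color region for RGB and the mask region for alpha. The TypeScript below is a CPU-side illustration of that mapping, not the actual easyar-video-tsbs/ttbb shader code; it assumes color-left/alpha-right for SBS and color-top/alpha-bottom for TBB (with v = 0 at the top of the frame), per the typical layouts described above.

```typescript
type Layout = "sbs" | "ttbb"; // side-by-side, top-by-bottom

interface SampleCoords {
  colorU: number; colorV: number; // where to sample RGB in the combined frame
  alphaU: number; alphaV: number; // where to sample the grayscale alpha mask
}

// For an output UV (u, v in [0, 1]) on the rendered surface, return the UVs
// of the color and alpha regions inside the combined video frame.
function mapToSource(u: number, v: number, layout: Layout): SampleCoords {
  if (layout === "sbs") {
    // Left half: color; right half: alpha mask.
    return { colorU: u * 0.5, colorV: v, alphaU: 0.5 + u * 0.5, alphaV: v };
  }
  // Top half: color; bottom half: alpha mask.
  return { colorU: u, colorV: v * 0.5, alphaU: u, alphaV: 0.5 + v * 0.5 };
}
```

In the real pipeline the fragment shader performs this remap per pixel, takes RGB from the color sample, and uses the grayscale value of the mask sample as the output alpha.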

Annotate positions to play transparent video on xr-frame miniprogram

First, load the video resource of type video-texture.

async loadAsset() {
    const videoTexture = {
        assetId: "fireball",
        type: "video-texture",
        // Video resource URL
        src: "url/video-resource.mp4",
        options: {
            autoPlay: true,
            loop: true,
        }
    };
    try {
        // Load video-texture type resource
        await scene.assets.loadAsset(videoTexture);
    } catch (err) {
        console.error(`Failed to load video texture: ${err.message}`);
    }
}

In the EMA load callback, use scene.createElement(xrFrameSystem.XRMesh, {}) to create a simple geometric object, assign the easyar-video-tsbs material, and set the u_baseColorMap uniform to video-<assetId>, where <assetId> is the assetId used when loading the video-texture.

Note

The loading, registration, deregistration, and unloading of easyar-video-tsbs and easyar-video-ttbb materials are controlled by the AR Session.

handleEmaResult(ema: easyar.ema.v0_5.Ema) {
    const blockHolder: easyar.BlockHolder = session.blockHolder;
    ema.blocks.forEach(emaBlock => {
        const blockInfo: easyar.BlockInfo = {
            id: emaBlock.id
        };
        // If the Block node does not exist, create a Block node
        blockHolder.holdBlock(blockInfo, easyarPlugin.toXRFrame(emaBlock.transform));
    });
    ema.annotations.forEach(annotation => {
        if (annotation.type !== mega.EmaV05AnnotationType.Node) {
            return;
        }
        const nodeAnnotation = annotation as easyar.ema.v0_5.Node;
        const xrNode: xrfs.XRNode = easyarPlugin.createXRNodeFromNodeAnnotation(nodeAnnotation, blockHolder);
        const assetInfo = AnnotationMetaData[nodeAnnotation.id as keyof typeof AnnotationMetaData];
        let model: xrfs.Element;

        if (assetInfo) {
            // GLTF part
        } else {
            // Use built-in Mesh to create geometry for rendering
            model = scene.createElement(xrFrameSystem.XRMesh, {
                geometry: "cube",
                material: "easyar-video-tsbs",
                uniforms: "u_baseColorMap:video-fireball",
            });
            xrNode.addChild(model);
        }
    });
}
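The hard-coded "u_baseColorMap:video-fireball" string above follows the video-<assetId> naming pattern from the previous step. A minimal sketch of building that uniform string dynamically, assuming the assetId matches the one passed to loadAsset:

```typescript
// Builds the uniform string that binds a loaded video-texture asset to the
// transparent video material. The "video-" prefix follows the convention
// shown above; assetId must match the assetId used in loadAsset.
function videoUniform(assetId: string): string {
  return `u_baseColorMap:video-${assetId}`;
}

// e.g. videoUniform("fireball") for the asset loaded earlier
```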
Note

When using video-texture, the console may show a "wx.createVideoDecoder with type: 'wemedia' is deprecated" warning. This warning can be safely ignored; the official WeChat team has confirmed that it does not affect usage.

Next steps