Play transparent videos with xr-frame in a WeChat Mini Program
Before you begin
Prepare the transparent video to be played: upload it to a file server and obtain a URL that xr-frame can load.
Because transparent-video playback works by replacing textures in the scene, the positions where transparent videos will play must be annotated in advance.
You need to be able to create and upload annotations using the Unity editor.
What is transparent video
Transparent video is a technical solution designed to achieve transparent background effects in video encoding formats that do not natively support alpha channels (e.g., H.264/AVC, H.265/HEVC).
This solution works by separating the color information (RGB) and transparency information (Alpha) of the video frame, then stitching them together into a single frame according to a specific spatial layout, resulting in a regular video file with a black-and-white mask. On the playback side, the two components are sampled and composited in real-time through the graphics rendering pipeline to restore the dynamic image with a transparent background.
Depending on how the color and alpha regions are stitched, there are two main formats:
Side-by-side (SBS)
Side-by-side is a format where the RGB color frame and the alpha mask frame are stitched horizontally. By convention, the left side is the color region, and the right side is the corresponding grayscale alpha region.
Top-by-bottom (TBB)
Top-by-bottom is a format where the RGB color frame and the alpha mask frame are stitched vertically. By convention, the upper part is the color region, and the lower part is the corresponding grayscale alpha region.
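Conceptually, the player samples each output pixel twice: once in the color region and once in the mask region, then uses the mask's luminance as the alpha value. The following TypeScript sketch (purely illustrative; these helpers are not part of the xr-frame API) shows how a mesh UV maps into the two regions for each layout:

```typescript
// Hypothetical helpers illustrating how a stitched transparent-video frame
// is sampled. Not part of xr-frame; names and types are assumptions.
type Layout = "side-by-side" | "top-by-bottom";

interface SamplePair {
  color: [number, number]; // where to sample RGB in the frame
  alpha: [number, number]; // where to sample the grayscale alpha mask
}

// Map a mesh UV (u, v in [0, 1]) to the color and alpha sampling positions.
function stitchedUv(u: number, v: number, layout: Layout): SamplePair {
  if (layout === "side-by-side") {
    // Left half carries color, right half carries the alpha mask.
    return { color: [u * 0.5, v], alpha: [0.5 + u * 0.5, v] };
  }
  // Top half carries color, bottom half carries the alpha mask
  // (assuming v = 0 at the top of the frame).
  return { color: [u, v * 0.5], alpha: [u, 0.5 + v * 0.5] };
}

// Composite: RGB from the color sample, alpha from the mask's luminance
// (white = fully opaque, black = fully transparent).
function composite(
  rgb: [number, number, number],
  maskLuminance: number
): [number, number, number, number] {
  return [rgb[0], rgb[1], rgb[2], maskLuminance];
}
```

In a real player this sampling and compositing happens per fragment in the GPU shader, which is why materials such as easyar-video-tsbs and easyar-video-ttbb exist as separate variants for each layout.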
Annotate position to play transparent video on xr-frame mini program
First, load a video resource of type video-texture:
async loadAsset() {
  const videoTexture = {
    assetId: "fireball",
    type: "video-texture",
    // Video resource URL
    src: "url/video-resource.mp4",
    options: {
      autoPlay: true,
      loop: true,
    },
  };
  try {
    // Load the video-texture type resource
    await scene.assets.loadAsset(videoTexture);
  } catch (err) {
    console.error(`Failed to load video texture: ${err.message}`);
  }
}
In the EMA load callback, use scene.createElement(xrFrameSystem.XRMesh, {}) to create a simple geometry, assign the easyar-video-tsbs material, and set the uniform to u_baseColorMap:video-{$assetId}.
Note
The loading, registration, deregistration, and unloading of easyar-video-tsbs and easyar-video-ttbb materials are controlled by the AR Session.
handleEmaResult(ema: easyar.ema.v0_5.Ema) {
  const blockHolder: easyar.BlockHolder = session.blockHolder;
  ema.blocks.forEach(emaBlock => {
    const blockInfo: easyar.BlockInfo = {
      id: emaBlock.id,
    };
    // If the Block node does not exist, create it
    blockHolder.holdBlock(blockInfo, easyarPlugin.toXRFrame(emaBlock.transform));
  });
  ema.annotations.forEach(annotation => {
    if (annotation.type !== mega.EmaV05AnnotationType.Node) {
      return;
    }
    const nodeAnnotation = annotation as easyar.ema.v0_5.Node;
    const xrNode: xrfs.XRNode = easyarPlugin.createXRNodeFromNodeAnnotation(nodeAnnotation, blockHolder);
    const assetInfo = AnnotationMetaData[nodeAnnotation.id as keyof typeof AnnotationMetaData];
    let model: xrfs.Element;
    if (assetInfo) {
      // GLTF part
    } else {
      // Use a built-in mesh to create geometry for rendering
      model = scene.createElement(xrFrameSystem.XRMesh, {
        geometry: "cube",
        material: "easyar-video-tsbs",
        uniforms: "u_baseColorMap:video-fireball",
      });
      xrNode.addChild(model);
    }
  });
}
Note
When using video-texture, the console may show the warning wx.createVideoDecoder with type: 'wemedia' is deprecated. This has been confirmed with the WeChat official team: the warning does not affect usage and can be safely ignored.