EasyAR Unity headset extension package
This document introduces the concept, capabilities, and boundaries of the EasyAR Unity headset extension package, as well as the background knowledge required to create a headset extension package.
Before you begin
- Read EasyAR headset support to learn about the types of headsets supported by EasyAR and the EasyAR features that can run on headsets.
- Read EasyAR headset support in Unity to understand the overall architecture of EasyAR headset support in Unity.
What is the EasyAR Unity headset extension package
The EasyAR Unity headset extension package is a Unity package containing code and samples that help you use EasyAR Sense features on your headset device. Through this extension package, you can integrate most EasyAR Sense functionality (such as image tracking and dense spatial mapping) into your device, thereby leveraging the powerful AR capabilities provided by EasyAR.
Using the EasyAR Unity headset extension package is one of the ways to support EasyAR on headsets. The diagram below illustrates the overall architecture of EasyAR in Unity and the position of the headset extension package within it.
```mermaid
block-beta
columns 4
block:groupApp:4
block:groupAppWrapper
space
App1["EasyAR + Device A<br>App"]
space
App2["EasyAR<br>App"]
space
App3["EasyAR + Device B<br>App"]
end
end
block:groupSensePluginExtension
columns 1
SensePluginExtension["EasyAR Sense Unity Plugin<br>Extension for Device A"]
space
end
block:groupSensePlugin
columns 1
SensePlugin["EasyAR Sense Unity Plugin"]
space
end
block:groupXRI
columns 1
XRI["XR Interaction Toolkit"]
space
end
block:groupARF
columns 1
ARF["AR Foundation"]
space
end
block:groupDeviceAUnity
columns 1
DeviceAUnity["Device A<br>Unity SDK"]
space
end
block:groupSense
columns 1
Sense["EasyAR Sense"]
block:groupSenseWrapper
MDeviceB["Device B<br>CameraDevice"]
Others["..."]
end
end
block:groupXRSubsystem:2
columns 1
XRSubsystem["XR Subsystems"]
XRSDK["Unity XR SDK"]
end
block:groupSystem:4
columns 1
System["Native Library"]
block:groupSystemWrapper
space
DeviceA["Device A<br> Library "]
space
space
DeviceB["Device B<br> Library "]
space
end
end
SensePluginExtension --> App1
SensePlugin --> App1
SensePlugin --> App2
SensePlugin --> App3
ARF --> App3
XRI --> App1
XRI --> App3
groupSense --> SensePlugin
groupDeviceAUnity --> SensePluginExtension
SensePlugin --> SensePluginExtension
DeviceA --> groupDeviceAUnity
DeviceA --> XRSDK
XRSubsystem --> ARF
XRSubsystem --> XRI
DeviceB --> MDeviceB
DeviceB --> XRSDK
style groupApp fill:none,stroke:none,stroke-width:0px
style groupAppWrapper fill:none,stroke:none,stroke-width:0px
style groupSensePlugin fill:none,stroke:none,stroke-width:0px
style groupARF fill:none,stroke:none,stroke-width:0px
style groupXRI fill:none,stroke:none,stroke-width:0px
style DeviceAUnity fill:none,stroke:none,stroke-width:0px
style Sense fill:none,stroke:none,stroke-width:0px
style groupSenseWrapper fill:none,stroke:none,stroke-width:0px
style XRSubsystem fill:none,stroke:none,stroke-width:0px
style System fill:none,stroke:none,stroke-width:0px
style groupSystemWrapper fill:none,stroke:none,stroke-width:0px
style groupSensePluginExtension fill:none,stroke:none,stroke-width:0px
classDef EasyAR fill:#6e6ce6,stroke:#333,color:#fff
class SensePluginExtension EasyAR
```
The diagram lists two typical ways to support headsets: integrating with the device SDK through a Unity headset extension package (Device A), and integrating the device directly within EasyAR Sense as a built-in camera device (Device B).
Can I create my own headset extension package?
Currently, the AR/VR/MR/XR industry has not yet formed a unified interface solution. Although OpenXR is a good candidate, the evolution of specifications and industry implementation still takes time. Therefore, it is generally not easy for off-the-shelf devices to run EasyAR directly, and there is a high probability of missing data interfaces. As the industry develops, some emerging devices may have good interface support. For example, in 2024, Apple opened up relevant interfaces for Vision Pro, which are sufficient to support EasyAR operation, though some professional knowledge is still required to use them.
If you cannot make this judgment yourself, it is recommended to contact the hardware manufacturer or the EasyAR business team to obtain support for the corresponding device.
If you are a hardware manufacturer and want to support EasyAR functionality on your device, you can refer to the following documentation to create a headset extension package, enabling most of EasyAR's features to run on your device. While this document provides data and interface specifications, it does not limit all implementation details. Any implementation method or interface definition can be discussed, and you are welcome to contact us through business channels for communication.
The hardware covered in this document must have motion tracking or SLAM capabilities. EasyAR's functionality relies on good device tracking performance. It is generally not recommended to use EasyAR's features to optimize device tracking, as this can create a circular dependency, theoretically amplifying errors and causing the overall system to become unstable. If the device itself does not have motion tracking capabilities, the support solution is not covered by this document. If needed, please contact us through business channels for further discussion.
Capability boundaries of the headset extension package
The goal of the headset extension package is to enable most of EasyAR Sense's features to run on your device. To achieve this goal, you need to understand the capability boundaries of the headset extension package.
Contents of the headset extension package
The extensions you will implement are:
- A set of code components that use the custom camera feature to grab data from your device API and send it into EasyAR Sense (a minimal sketch of this data path follows this list).
- In Unity, the headset extension will use the external frame source and a set of EasyAR Sense data streams defined by the EasyAR Sense Unity Plugin to simplify custom camera development.
- In Unity, the headset extension is a Unity package containing runtime scripts, editor scripts, and sample extensions, which you or EasyAR can distribute to downstream users.
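As an illustration of the data path described above, here is a minimal sketch of a component that polls a device SDK each frame and feeds the result into an EasyAR input frame sink. `DeviceCameraApi`, `CameraSample`, and all field names on them are hypothetical placeholders for whatever your device SDK provides, and the `easyar.*` calls follow the general shape of the EasyAR Sense API; check the exact types and overloads against the plugin version you target.

```csharp
// Minimal sketch only, not a drop-in component.
using easyar;
using UnityEngine;

// Placeholder for the data your device SDK must supply per camera frame (names are illustrative).
public struct CameraSample
{
    public byte[] PixelData;                         // raw image bytes
    public int Width, Height;                        // image size in pixels
    public float FocalLengthX, FocalLengthY;         // intrinsics in pixels
    public float PrincipalPointX, PrincipalPointY;
    public double Timestamp;                         // seconds, same clock as the pose below
    public Matrix44F CameraToWorld;                  // RGB camera pose at capture time
}

public class DeviceFrameFeeder : MonoBehaviour
{
    // Obtained from the plugin's custom camera / external frame source mechanism;
    // how you acquire it depends on the plugin version you use.
    public InputFrameSink Sink { get; set; }

    void Update()
    {
        // Hypothetical device SDK call: poll the latest camera image, intrinsics, and pose.
        if (Sink == null || !DeviceCameraApi.TryAcquireLatestFrame(out CameraSample sample))
        {
            return;
        }

        // EasyAR wrapper objects are IDisposable, hence the using blocks.
        using (var buffer = Buffer.wrapByteArray(sample.PixelData))
        using (var image = new Image(buffer, PixelFormat.YUV_NV12, sample.Width, sample.Height))
        using (var cameraParameters = new CameraParameters(
            new Vec2I(sample.Width, sample.Height),
            new Vec2F(sample.FocalLengthX, sample.FocalLengthY),
            new Vec2F(sample.PrincipalPointX, sample.PrincipalPointY),
            CameraDeviceType.Back,
            0))
        {
            // Assumed creator overload carrying image + intrinsics + timestamp + pose + tracking status.
            using (var frame = InputFrame.create(
                image, cameraParameters, sample.Timestamp,
                sample.CameraToWorld, MotionTrackingStatus.Tracking))
            {
                Sink.handle(frame);
            }
        }
    }
}
```

In a real extension this logic typically lives inside a class derived from the plugin's frame source types so that the AR session manages its lifecycle; the sketch keeps only the data flow.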
Tip
If you do not want to expose integration details to external systems, you can contact EasyAR for discussion. Integrating directly within EasyAR Sense through its C interface is feasible and has been done before.
When implementing the extension, you may:
- Modify the interface design and internal implementation of your SDK.
- Discuss and confirm the data acquisition and usage plan with your team.
- Spend a lot of time verifying data correctness instead of writing code.
After completing the extension, you will see:
- Most EasyAR Sense features can be used on your device, and these features will utilize your device's motion tracking capabilities.
- EasyAR cloud services supported within EasyAR Sense can be used on your device.
- Only EasyAR XR licenses can be used. Personal, professional, and classic licenses cannot be used on your device.
- All EasyAR license restrictions when using custom cameras apply to your device in the same way.
What the headset extension does not include
This extension cannot be used without EasyAR Sense:
- The headset extension does not run independently. As a dependency, EasyAR Sense is also required. In Unity, the EasyAR Sense Unity Plugin must be used.
- It does not directly call EasyAR cloud service APIs (such as the EasyAR Mega localization service). These calls are handled internally by EasyAR Sense.
- In Unity, it does not directly call interface methods for AR functionalities (such as image tracking). These are handled internally by the EasyAR Sense Unity Plugin.
- In Unity, it does not modify the transform of objects or tracking targets in the scene. These are handled internally by the EasyAR Sense Unity Plugin.
This extension cannot be used without your device SDK:
- In Unity, neither the headset extension nor the EasyAR Sense Unity Plugin modifies the transform of the camera in the scene. This must be done within your device SDK or its dependency path (a small diagnostic for checking this is sketched below).
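The following is a small scene diagnostic, not part of any EasyAR or device API, that you might use while wiring things up to confirm this division of responsibility: if the head camera's transform never changes while the headset is being moved, the device SDK (or its dependency path) is not yet driving the camera, and that is where the fix belongs rather than in the extension. The component name and threshold are illustrative.

```csharp
using UnityEngine;

// Warns if the camera transform that the device SDK should drive appears to stay still.
public class CameraDrivenByDeviceSdkCheck : MonoBehaviour
{
    [SerializeField] Transform headCamera;   // the camera your device SDK is supposed to drive
    Vector3 lastPosition;
    Quaternion lastRotation;
    float stillTime;

    void Start()
    {
        if (headCamera == null && Camera.main != null) { headCamera = Camera.main.transform; }
        if (headCamera != null) { lastPosition = headCamera.position; lastRotation = headCamera.rotation; }
    }

    void Update()
    {
        if (headCamera == null) { return; }

        bool moved = (headCamera.position - lastPosition).sqrMagnitude > 1e-8f
                  || Quaternion.Angle(headCamera.rotation, lastRotation) > 0.01f;
        stillTime = moved ? 0f : stillTime + Time.deltaTime;
        lastPosition = headCamera.position;
        lastRotation = headCamera.rotation;

        // If the pose never changes while the user is moving, check the device SDK integration.
        if (stillTime > 5f)
        {
            Debug.LogWarning("Head camera transform has not changed for 5 s; check that the device SDK is driving it.");
            stillTime = 0f;
        }
    }
}
```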
Some EasyAR functionalities remain unavailable through the headset extension:
- Surface tracking functionality will not be available.
- EasyAR's own motion tracking will not be available.
- Plane detection (part of EasyAR motion tracking) will not be available.
How can I use Mega on my device?
Running Mega on a device is a concern for many users. In Unity, the Mega service is a functional module built on top of many foundational features of EasyAR Sense. Therefore, as long as your device fully supports EasyAR Sense, Mega will also be supported.
Generally, it is not recommended to verify a device's support for Mega by running the Mega sample on it right away. Mega comprehensively utilizes all input data and is relatively tolerant of errors in that data, so the sample may still run with mismatched data interfaces or poor data quality and simply perform badly, making it difficult to identify the root cause of issues and complicating later debugging.
Important
The Mega service has certain requirements for the device's motion-tracking capability. If the device's motion-tracking capability is poor, Mega's performance will also be affected. In large-scale AR scenarios, special attention should be paid to performance differences between indoor and outdoor environments.
Important
Mega typically serves large-scale scenes, so special attention should be paid to how distant objects are displayed and how objects behave when the user turns their head or moves. If the device's display system has significant errors, users may perceive virtual objects as misaligned with real-world objects even if Mega itself runs normally.
Required background knowledge and team configuration
Creating a headset extension package is not a simple task and requires you and your team to work deeply in multiple areas. Generally, completing a headset extension requires the involvement of Unity developers as well as team members outside of Unity development. Due to the lack of standards, modifying only the 3D engine is usually insufficient to complete a headset extension. It is recommended to involve underlying development engineers such as system engineers and SDK engineers from day one.
Building AR/VR devices requires domain knowledge. Similarly, running and validating EasyAR Sense on the device will require you or your team to be experts in the following areas:
- The physical structure and rendering system of your device
- Camera system geometry
- SDK development
- General Android debug skills, such as adb (Mainland China, International)
If you are working on Unity, you will also need to know the following:
- Unity development basics and package usage
- Unity package development
- C# language basics, including IDisposable
Additionally, some knowledge in the following areas will help you better understand the system, especially how to send the correct data to EasyAR:
- Android development (Mainland China, International)
- Geometric vision, especially image matching and 3D reconstruction
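Much of the verification effort around camera system geometry comes down to checking that the intrinsics and poses your SDK reports are consistent with the images before any EasyAR code is involved. The snippet below is a plain pinhole-model sanity check, not EasyAR API; it assumes the common OpenCV-style camera convention (x right, y down, z forward, focal length and principal point in pixels), and the numbers in the usage comment are placeholders.

```csharp
using UnityEngine;

public static class PinholeCheck
{
    // Projects a point given in the camera frame (x right, y down, z forward, meters)
    // into pixel coordinates using focal lengths (fx, fy) and principal point (cx, cy).
    public static Vector2 Project(Vector3 pointInCamera, float fx, float fy, float cx, float cy)
    {
        float u = fx * pointInCamera.x / pointInCamera.z + cx;
        float v = fy * pointInCamera.y / pointInCamera.z + cy;
        return new Vector2(u, v);
    }
}

// Example: a point 2 m straight ahead of the camera should project very close to the
// principal point, here assumed to be the image center of a 1280x720 frame.
// var uv = PinholeCheck.Project(new Vector3(0f, 0f, 2f), 1450f, 1450f, 640f, 360f);
// // uv is approximately (640, 360)
```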
Next steps
In the following articles, you will learn the complete process of creating a headset extension package:
- Make the headset support EasyAR introduces how to use the template to create a new headset extension package and complete basic input extension development
- Run verification (bring-up) introduces how to verify the correctness of the input extension on the device
- Publish the extension package introduces how to package and distribute the headset extension package to downstream users