Table of Contents

Namespace easyar

Classes

Accelerometer

Accelerometer calls the system-provided accelerometer and outputs AccelerometerResult. When the device is no longer needed, you can call close to close it; it should not be used after close. It is not recommended to open it multiple times simultaneously, as this may keep it from working properly or reduce its accuracy.

AccelerometerResult

Accelerometer readings. The positive direction of the x-axis points from the center of the device to the right of the screen. The positive direction of the y-axis points from the center of the device to the top of the screen. The positive direction of the z-axis points vertically outward from the center of the device perpendicular to the screen. The units of x, y, z are m/s^2. The timestamp is in seconds.

AccelerometerResultSink

Accelerometer Result Input Port. Used to expose an input port of a component. All members of this class are thread-safe.

AccelerometerResultSource

The output port for acceleration sensor results. Used to expose the output port of a component. All members of this class are thread-safe.

ARCoreCameraDevice

ARCoreCameraDevice implements an ARCore-based camera device, outputting InputFrame (including image, camera parameters, timestamp, 6DOF position information, and tracking status). Before use, you need to load libarcore_sdk_c.so via java.lang.System.loadLibrary. After creation, you can call start/stop to start and stop collecting video stream data. When the device is no longer needed, you can call close to close it; do not continue to use it after close. ARCoreCameraDevice outputs InputFrame through inputFrameSource, and inputFrameSource should be connected to InputFrameSink for use. bufferCapacity indicates the capacity of the InputFrame buffer. If more InputFrame than this number are output from this device and not released, the device will no longer output new InputFrame until the previous InputFrame are released; this may cause problems such as a frozen screen. Note: The current implementation of ARCore (v1.13.0) has a memory leak when creating and destroying the session. Creating and destroying it multiple times will lead to continuous growth of memory usage that is not released after destruction.
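
A minimal lifecycle sketch in Java, based on the members described above (loadLibrary, setBufferCapacity, inputFrameSource, start/stop/close); the connect call and the exact signatures are assumptions to verify against the cn.easyar binding.

import cn.easyar.*;

public class ArCoreCameraSketch {
    public static void run(InputFrameSink sink) {
        System.loadLibrary("arcore_sdk_c");        // ARCore's native library must be loaded first

        ARCoreCameraDevice device = new ARCoreCameraDevice();
        device.setBufferCapacity(8);               // >= total camera buffers used by all components
        device.inputFrameSource().connect(sink);   // wire the output port to a component's input port

        device.start();
        // ... collect video stream data while the session runs ...
        device.stop();
        device.close();                            // do not use the device after close
    }
}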

ARCoreDeviceListDownloader

ARCoreDeviceListDownloader is used to download and update the calibration parameters of the device list used in ARCoreCameraDevice.

ARKitCameraDevice

ARKitCameraDevice implements an ARKit-based camera device, outputting InputFrame (including image, camera parameters, timestamp, 6DOF position information, and tracking state). After creation, you can call start/stop to start and stop collecting video stream data. When the device is no longer needed, you can call close to close it. It should not be used further after close. ARKitCameraDevice outputs InputFrame through inputFrameSource, and inputFrameSource should be connected to InputFrameSink for use. bufferCapacity indicates the capacity of InputFrame buffer. If there are more InputFrame output from this device than this number and not released, the device will no longer output new InputFrame until the previous InputFrame is released. This may cause problems such as screen freezing.

AttitudeSensor

AttitudeSensor calls the system-provided attitude sensor and outputs AttitudeSensorResult. When the device is no longer needed, you can call close to shut it down. After close, it should not be used anymore. It is not recommended to open it multiple times simultaneously, as it may become unusable or the precision may decrease.

AttitudeSensorResult

Attitude sensor readings. The positive direction of the x-axis points from the center of the device to the right part of the screen. The positive direction of the y-axis points from the center of the device to the upper part of the screen. The positive direction of the z-axis points vertically outward from the center of the device, perpendicular to the screen. The device's orientation is expressed as a rotation by an angle θ around the axis (x, y, z), giving the unit quaternion (v0, v1, v2, v3) = (cos(θ/2), x·sin(θ/2), y·sin(θ/2), z·sin(θ/2)) (see the sketch after this list). The reference coordinate system is defined as a set of oriented orthonormal bases, where

Z points to the sky and is perpendicular to the ground.

(X, Y, Z) form a right-handed system.

timestamp is in seconds.
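
The quaternion convention above can be checked numerically with a small sketch (plain Java, no EasyAR API involved):

public class AttitudeQuaternionSketch {
    // (v0, v1, v2, v3) = (cos(θ/2), x·sin(θ/2), y·sin(θ/2), z·sin(θ/2))
    // for a rotation by angle thetaRad around the unit axis (x, y, z).
    public static float[] axisAngleToQuaternion(float x, float y, float z, float thetaRad) {
        float half = thetaRad / 2.0f;
        float s = (float) Math.sin(half);
        return new float[] { (float) Math.cos(half), x * s, y * s, z * s };
    }
}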

AttitudeSensorResultSink

Attitude Sensor Result Input Port. Used to expose an input port of a component. All members of this class are thread-safe.

AttitudeSensorResultSource

Attitude sensor result output port. Used to expose an output port of a component. All members of this class are thread-safe.

BlockInfo

The model obtained from dense reconstruction is represented by a triangular mesh, called a mesh. Since the mesh is updated frequently, to ensure efficiency the mesh of the entire reconstructed model is split into many mesh blocks. A mesh block is a cube with a side length of approximately 1 meter, containing elements such as vertices and indices. BlockInfo describes the content of one mesh block. (x, y, z) is the index of the mesh block; multiplying (x, y, z) by the physical size of a mesh block gives the origin of this mesh block in the world coordinate system. The parts to be displayed can be pre-filtered based on the world position of the mesh block to save rendering time.
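
As an illustration of the index-to-world mapping described above, a minimal sketch; reading the index as public fields x, y, z is an assumption about the binding, and the per-block physical size (blockSizeInMeters, an illustrative name) must be obtained from the reconstruction data.

import cn.easyar.*;

public class BlockOriginSketch {
    // World-space origin of a mesh block = block index * physical block size (about 1 m).
    public static float[] blockOriginInWorld(BlockInfo block, float blockSizeInMeters) {
        return new float[] {
            block.x * blockSizeInMeters,
            block.y * blockSizeInMeters,
            block.z * blockSizeInMeters };
    }
}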

BlockPriorResult

Prior map information. blockIds are the map IDs. mode is the mode in which the prior information is used by the system, and can be 'soft' or 'hard'; refer to BlockPriorMode.

Buffer

Buffer stores a raw byte array and can be used to access image data. In the Java API, you can get the buffer from Image and then copy its data to a Java byte array. In all versions of EasyAR Sense, you can access image data; see Image for an example.

BufferDictionary

A mapping from a file path to Buffer. Used to represent multiple files placed in memory.

BufferPool

BufferPool implements a memory pool that can be used for functions like custom camera access that need to repeatedly allocate memory of the same size, reducing memory allocation time.
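
A minimal sketch of the memory-pool idea; the constructor arguments (block size, capacity) and a tryAcquire call returning null when the pool is exhausted are assumptions to verify against the cn.easyar binding.

import cn.easyar.*;

public class BufferPoolSketch {
    static final int FRAME_BYTES = 1280 * 960 * 3 / 2;        // e.g. one YUV420 frame (illustrative)
    static final BufferPool pool = new BufferPool(FRAME_BYTES, 8);

    public static Buffer acquireFrameBuffer() {
        Buffer b = pool.tryAcquire();   // assumed to return null when all blocks are in use
        return b;                       // caller fills it and calls dispose() when done
    }
}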

CalibrationDownloader

CalibrationDownloader is used to download and update the calibration parameters used in MotionTracker. After downloading, MotionTracker needs to be recreated for it to take effect.

CallbackScheduler

Callback scheduler. There are two subclasses, DelayedCallbackScheduler and ImmediateCallbackScheduler. DelayedCallbackScheduler delays callbacks until they are invoked manually and can be used in a single-threaded environment (such as various UI environments). ImmediateCallbackScheduler executes callbacks immediately and can be used in a multi-threaded environment (such as servers or background services).
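
A sketch of how the two schedulers are typically used; the runOne drain call and the getDefault accessor are assumptions based on common usage and should be verified against the cn.easyar binding.

import cn.easyar.*;

public class SchedulerSketch {
    // Single-threaded (UI) case: callbacks are queued and drained manually.
    static final DelayedCallbackScheduler uiScheduler = new DelayedCallbackScheduler();

    // Call once per frame (e.g. from the render loop) to run queued callbacks.
    public static void drainCallbacks() {
        while (uiScheduler.runOne()) {   // assumed: runs one queued callback, false when empty
            // keep draining
        }
    }

    // Multi-threaded case (servers, background services): callbacks run immediately.
    static final ImmediateCallbackScheduler backgroundScheduler =
        ImmediateCallbackScheduler.getDefault();
}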

CameraDevice

CameraDevice implements a camera device that outputs InputFrame (including image, camera parameters, and timestamp). Available on Windows, Mac, Android, and iOS. After opening, you can call start/stop to start and stop data acquisition. start/stop will not affect previously set camera parameters. When the device is no longer needed, you can call close to close it. Should not be used after close. CameraDevice outputs InputFrame through inputFrameSource. The inputFrameSource should be connected to InputFrameSink for use. bufferCapacity indicates the buffer capacity of InputFrame. If more than this number of InputFrame are output from the device and not released, the device will stop outputting new InputFrame until the previous InputFrame are released. This may cause issues like frame freeze. When using on Android, you need to add the android.permission.CAMERA permission declaration in AndroidManifest.xml. When using on iOS, you need to add the NSCameraUsageDescription permission declaration in Info.plist.
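
A minimal open/start/stop/close sketch based on the description above; openWithPreferredType, setSize, and the Vec2I constructor follow the official samples but should be treated as assumptions and verified against the cn.easyar binding.

import cn.easyar.*;

public class CameraDeviceSketch {
    public static void run(InputFrameSink sink) {
        CameraDevice camera = new CameraDevice();
        camera.setBufferCapacity(8);                  // >= total camera buffers used by all components
        if (!camera.openWithPreferredType(CameraDeviceType.Back)) {
            return;                                   // no usable camera
        }
        camera.setSize(new Vec2I(1280, 960));         // camera parameters survive start/stop
        camera.inputFrameSource().connect(sink);

        camera.start();
        // ... acquire frames ...
        camera.stop();
        camera.close();                               // do not use after close
    }
}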

CameraDeviceSelector

Used to select the Camera API (camera1 or camera2) on Android. camera1 has better compatibility but lacks some necessary information, such as timestamp. camera2 has compatibility issues on some devices. Different options will choose camera1 or camera2 according to the use case.

CameraParameters

Camera parameters, including image size, focal length, principal point, camera type, and the rotation angle of the camera relative to the natural orientation of the device.

CloudLocalizer

CloudLocalizer implements cloud localization functionality.

CloudLocalizerBlockInstance

The instance of the block located by CloudLocalizer.

CloudLocalizerResult
CloudRecognizationResult
CloudRecognizer

CloudRecognizer implements cloud recognition functionality. Cloud recognition functionality requires creating a cloud recognition image library on the cloud before use; please refer to the EasyAR CRS documentation. When the component is no longer needed, you can call close to shut it down. It should not be used after close. Before using CloudRecognizer, you need to set up and prepare an ImageTracker. Any returned target should be manually loaded into ImageTracker using loadTarget before being tracked. After loading, the recognition and tracking of the target will be the same as the use of local targets. After a target is recognized, you can obtain it from the callback, and then you should use the target uid to distinguish different targets. The target runtimeID is dynamically generated and is not suitable for unique identification of targets in cloud recognition scenarios.

DelayedCallbackScheduler

Delayed callback scheduler. Used to postpone the callback to be called when manually invoked, and can be used in single-threaded environments (such as various UI environments). All members of this class are thread-safe.

DenseSpatialMap

DenseSpatialMap is used for accurate 3D dense reconstruction of the environment. The reconstructed model is represented by a triangular mesh, called mesh. DenseSpatialMap occupies 1 camera buffer.

DeviceAuxiliaryInfo
Engine
EventDumpRecorder

Event Dump Recorder. Used to save some key diagnostic information to EED files. All members of this class are thread-safe.

FeedbackFrame

Feedback frame. Contains an input frame and a historical output frame for ImageTracker and other feedback synchronization processing components.

FeedbackFrameFork

Feedback Frame Splitter. Used to transmit a feedback frame in parallel to multiple components. All members of this class are thread-safe.

FeedbackFrameSink

Feedback frame input port. Used to expose an input port of a component. All members of this class are thread-safe.

FeedbackFrameSource

Feedback Frame Output Port. Used to expose the output port of a component. All members of this class are thread-safe.

FrameFilterResult

FrameFilterResult is the base class of all the results of components using synchronous algorithms.

Gyroscope

The Gyroscope calls the system-provided gyroscope and outputs GyroscopeResult. When the device is no longer needed, you can call close to shut it down. Do not continue to use it after close. Do not open it multiple times at the same time; it may become unusable or the accuracy may decrease.

GyroscopeResult

Gyroscope readings. The positive direction of the x-axis points from the center of the device to the right part of the screen. The positive direction of the y-axis points from the center of the device to the upper part of the screen. The positive direction of the z-axis points vertically outward from the center of the device, perpendicular to the screen. x, y, z represent the angular velocity of rotation around the corresponding axes, in radians per second. Looking at the device from the positive direction of an axis, the counterclockwise direction is the positive direction of rotation. The timestamp unit is seconds.

GyroscopeResultSink

Gyroscope result input port. Used to expose an input port of a component. All members of this class are thread-safe.

GyroscopeResultSource

Gyroscope result output port. Used to expose an output port of a component. All members of this class are thread-safe.

Image

Image stores image data and is used to represent an image in memory. Image provides access to raw data as a byte array and also provides interfaces to access information such as width/height. In all versions of EasyAR Sense, you can access the image data. In iOS, you can access it as follows:

#import <easyar/buffer.oc.h>
#import <easyar/image.oc.h>
#include <stdlib.h>   // calloc, free
#include <string.h>   // memcpy

easyar_OutputFrame * outputFrame = [outputFrameBuffer peek];
if (outputFrame != nil) {
    easyar_Image * i = [[outputFrame inputFrame] image];
    easyar_Buffer * b = [i buffer];
    char * bytes = calloc([b size], 1);
    memcpy(bytes, [b data], [b size]);
    // use bytes here
    free(bytes);
}

In Android, you can access it as follows:

import cn.easyar.*;

OutputFrame outputFrame = outputFrameBuffer.peek();
if (outputFrame != null) {
    InputFrame inputFrame = outputFrame.inputFrame();
    Image i = inputFrame.image();
    Buffer b = i.buffer();
    byte[] bytes = new byte[b.size()];
    b.copyToByteArray(0, bytes, 0, bytes.length);
    // use bytes here
    b.dispose();
    i.dispose();
    inputFrame.dispose();
    outputFrame.dispose();
}

ImageHelper

Image helper class.

ImageTarget

ImageTarget represents an image target (a planar image) that can be tracked by ImageTracker. The values inside ImageTarget must first be filled in through methods such as create... before they can be read. After it has been successfully loaded into ImageTracker via loadTarget, it can be detected and tracked by ImageTracker.

ImageTargetParameters

ImageTargetParameters represents the parameters required to create the ImageTarget.

ImageTracker

ImageTracker implements the detection and tracking of planar image targets. ImageTracker occupies (1 + SimultaneousNum) camera buffers. The camera's setBufferCapacity should be set to a value not less than the total number of camera buffers occupied by all components. After creation, you can call start/stop to start and stop operation; start/stop are very lightweight calls. When the component is no longer needed, you can call close to close it; it should not be used after close. ImageTracker receives FeedbackFrame input through feedbackFrameSink; a FeedbackFrameSource should be connected to feedbackFrameSink for use. Before a Target can be tracked by ImageTracker, it must be loaded via loadTarget/unloadTarget. You can pass a callback to these interfaces to obtain the load/unload result.
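
A sketch of loading a target with a completion callback; the (target, success) callback shape follows the official samples, and using uid() as the stable identifier matches the CloudRecognizer note above, but the callback's functional interface name in the binding is an assumption.

import cn.easyar.*;

public class ImageTrackerSketch {
    public static void load(ImageTracker tracker, ImageTarget target,
                            CallbackScheduler scheduler) {
        // Targets must be loaded before they can be detected and tracked.
        tracker.loadTarget(target, scheduler, (loadedTarget, success) -> {
            if (success) {
                String uid = loadedTarget.uid();   // distinguish targets by uid, not runtimeID
            }
        });
        tracker.start();   // lightweight; call stop() and close() when done
    }
}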

ImageTrackerConfig

ImageTracker creation configuration.

ImageTrackerResult

The result of ImageTracker.

ImmediateCallbackScheduler

Immediate Callback Scheduler. Used to execute callbacks immediately, applicable in multi-threaded environments (such as servers or background services). All members of this class are thread-safe.

InertialCameraDevice

InertialCameraDevice implements a camera device based on motion inertia, outputting InputFrame with CameraTransformType of FiveDofRotXZ (including image, camera parameters, timestamp, pose transformation matrix, and tracking status). After creation, start/stop can be called to start and stop collecting video stream data. When the device is no longer needed, call close to close it; it should not be used after close. InertialCameraDevice outputs InputFrame through inputFrameSource, and inputFrameSource should be connected to InputFrameSink for use. bufferCapacity indicates the capacity of the InputFrame buffer. If more InputFrame than this number are output from the device and not released, the device will no longer output new InputFrame until the previous InputFrame are released. This may cause problems such as screen freezing.

InputFrame

Input frame. Contains image, camera parameters, timestamp, camera transformation relative to the world coordinate system, and tracking status. Among them, camera parameters, timestamp, camera transformation relative to the world coordinate system, and tracking status are all optional, but specific algorithm components may have specific requirements for the input.

InputFrameFork

Input frame splitter. Used to transmit an input frame in parallel to multiple components. All members of this class are thread-safe.

InputFramePlayer

Input Frame Player. There is an input frame output port to extract the input frame from the EIF file. All members of this class are thread-safe.

InputFrameRecorder

Input frame recorder. There is an input frame input port and an input frame output port for saving the passing input frames into EIF files. All members of this class are thread-safe.

InputFrameSink

Input Frame input port. Used to expose an input port of a component. All members of this class are thread-safe.

InputFrameSource

Input frame output port. Used to expose the output port of a component. All members of this class are thread-safe.

InputFrameThrottler

InputFrameThrottler. It has an input frame input port and an input frame output port, and is used to prevent new input frames from entering an algorithm component before that component has finished processing the previous frame. InputFrameThrottler occupies 1 camera buffer. The camera's setBufferCapacity should be set to a value not less than the total number of camera buffers occupied by all components. All members of this class are thread-safe. Note that its signalInput should not be connected or disconnected while data is flowing in, otherwise it may get stuck in a state where it cannot output. (It is recommended to complete the data flow connections before the camera starts.)

InputFrameToFeedbackFrameAdapter

InputFrameToFeedbackFrameAdapter. It has an input frame input port, a historical output frame input port, and a feedback frame output port, and is used to combine an input frame with a historical output frame into a feedback frame and pass it to algorithm components that require feedback frame input, such as ImageTracker. Each time an input frame arrives, it is combined with the most recently input historical output frame to form a feedback frame; if no historical output frame has been input, the historical output frame in the feedback frame is empty. InputFrameToFeedbackFrameAdapter occupies 1 camera buffer. The camera's setBufferCapacity should be set to a value not less than the total number of camera buffers occupied by all components. All members of this class are thread-safe.
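
To make the adapter's role concrete, below is a wiring sketch of a typical frame pipeline (camera → throttler → fork → adapter → ImageTracker → join → OutputFrameBuffer). The create() factories and port accessor names (input, output, sideInput, signalInput, signalOutput, feedbackFrameSink, outputFrameSource) follow the official samples; treat them as assumptions to verify against your binding, and connect everything before the camera starts.

import cn.easyar.*;

public class PipelineSketch {
    public static OutputFrameBuffer assemble(CameraDevice camera, ImageTracker tracker) {
        InputFrameThrottler throttler = InputFrameThrottler.create();
        InputFrameFork inputFork = InputFrameFork.create(2);
        InputFrameToOutputFrameAdapter i2o = InputFrameToOutputFrameAdapter.create();
        InputFrameToFeedbackFrameAdapter i2f = InputFrameToFeedbackFrameAdapter.create();
        OutputFrameFork outputFork = OutputFrameFork.create(2);
        OutputFrameJoin join = OutputFrameJoin.create(2);
        OutputFrameBuffer buffer = OutputFrameBuffer.create();

        camera.inputFrameSource().connect(throttler.input());
        throttler.output().connect(inputFork.input());
        inputFork.output(0).connect(i2o.input());                // raw frames for rendering
        inputFork.output(1).connect(i2f.input());                // frames for the tracker
        i2f.output().connect(tracker.feedbackFrameSink());
        tracker.outputFrameSource().connect(outputFork.input());
        outputFork.output(0).connect(join.input(1));             // tracking results
        outputFork.output(1).connect(i2f.sideInput());           // historical output frames
        i2o.output().connect(join.input(0));                     // wrapped raw frames
        join.output().connect(buffer.input());
        buffer.signalOutput().connect(throttler.signalInput());  // back-pressure signal

        return buffer;   // poll with peek() in the render loop
    }
}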

InputFrameToOutputFrameAdapter

Input frame to output frame adapter. It has an input frame input port and an output frame output port, and is used to wrap input frames into output frames so that rendering can be performed directly without going through algorithm components. All members of this class are thread-safe.

JniUtility

JNI utility class. Used to wrap Java arrays and ByteBuffer in Unity. Does not support iOS platform.

LocationResult

Location reading. latitude and longitude are in degrees; altitude is in meters. horizontalAccuracy is the horizontal accuracy in meters; verticalAccuracy is the vertical accuracy in meters.

LocationResultSink

Location Result Input Port. Used to expose an input port of a component. All members of this class are thread-safe.

LocationResultSource

Output port for position results. Used to expose an output port of a component. All members of this class are thread-safe.

Log

Log class. Used to output logs or set a custom log output function.

Magnetometer

Magnetometer calls the system-provided magnetometer and outputs MagnetometerResult. When the device is no longer needed, you can call close to shut it down. After close, it should not be used again. It is not recommended to open it multiple times at the same time, as it may become unusable or the accuracy may decrease.

MagnetometerResult

Magnetometer readings. The positive direction of the x-axis points from the center of the device to the right part of the screen. The positive direction of the y-axis points from the center of the device to the upper part of the screen. The positive direction of the z-axis points vertically outward from the center of the device, perpendicular to the screen. The units of x, y, z are µT (microtesla). timestamp is in seconds.

MagnetometerResultSink

Magnetometer result input port. Used to expose an input port of a component. All members of this class are thread-safe.

MagnetometerResultSource

The output port for magnetometer results. Used to expose the output port of a component. All members of this class are thread-safe.

Matrix33F

3x3 square matrix. Data is arranged in row-major order.

Matrix44F

4x4 square matrix. Data is arranged in row-major order.
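
Because the data is row-major, the element at row r, column c sits at index r*4 + c of the flat storage; a small sketch (exposing the storage as a public float[] named data is an assumption about the binding):

import cn.easyar.*;

public class MatrixIndexSketch {
    // Row-major access into a 4x4 matrix: element (r, c) = data[r * 4 + c].
    public static float elementAt(Matrix44F m, int r, int c) {
        return m.data[r * 4 + c];
    }
}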

MegaLandmarkFilter

MegaLandmarkFilter implements the VPS cloud localization filtering function.

MegaLandmarkFilterResult
MegaTracker

Provides the cloud positioning function. MegaTracker occupies 1 camera buffer.

MegaTrackerBlockInstance

The instance of the block located by MegaTracker.

MegaTrackerLocalizationResponse

Response to MegaTracker positioning request.

MegaTrackerResult

The output of MegaTracker will be updated at the frequency of OutputFrame.

MotionInputData

Motion input data. Includes timestamp, transformation relative to the world coordinate system, and tracking status.

MotionTrackerCameraDevice

MotionTrackerCameraDevice implements a real-scale 6DOF motion tracking camera device, outputting InputFrame (including image, camera parameters, timestamp, 6DOF position information, and tracking state). After creation, start/stop can be called to start and stop the data stream. When the device is no longer needed, call close to shut it down. Should not be used after close. MotionTrackerCameraDevice outputs InputFrame through inputFrameSource; inputFrameSource should be connected to InputFrameSink for use.

ObjectTarget

ObjectTarget represents a 3D object target, which can be tracked by ObjectTracker. The size of ObjectTarget is determined by the obj file. You can modify the size by changing scale, which defaults to 1. After ObjectTarget is successfully loaded into ObjectTracker through loadTarget, it can be detected and tracked by ObjectTracker.

ObjectTargetParameters

ObjectTargetParameters represents the parameters required to create ObjectTarget.

ObjectTracker

ObjectTracker implements the detection and tracking of 3D object targets. ObjectTracker occupies (1 + SimultaneousNum) camera buffers. The camera's setBufferCapacity should be set to a value not less than the total number of camera buffers occupied by all components. After creation, you can call start/stop to begin and stop operation; start/stop are very lightweight calls. When the component is no longer needed, you can call close to close it; it should not be used after close. ObjectTracker receives FeedbackFrame input through feedbackFrameSink, and a FeedbackFrameSource should be connected to feedbackFrameSink for use. Before a Target can be tracked by ObjectTracker, it must be loaded via loadTarget/unloadTarget. You can obtain the load/unload results by passing in a callback.

ObjectTrackerResult

The result of ObjectTracker.

OutputFrame

Output frame. Contains the output results of the input frame and synchronization processing components.

OutputFrameBuffer

OutputFrameBuffer. It has an output frame input port and an output frame acquisition function, used to convert output frame acquisition from asynchronous delivery to synchronous polling, which is suitable for frame-by-frame rendering. OutputFrameBuffer occupies 1 camera buffer. The camera's setBufferCapacity should be set to a value not less than the total number of camera buffers occupied by all components. All members of this class are thread-safe.

OutputFrameFork

Output frame splitter. Used to transmit an output frame in parallel to multiple components. All members of this class are thread-safe.

OutputFrameJoin

Output Frame Merger. Used to merge the output frames of multiple components into a single output frame. All members of this class are thread-safe. Note that the connection and disconnection of its multiple inputs should not be performed while data is flowing in, otherwise it may get stuck in a state where it cannot output. (It is recommended to complete the data flow connection before the Camera starts.)

OutputFrameSink

Output frame input port. Used to expose an input port of a component. All members of this class are thread-safe.

OutputFrameSource

Output Frame Output Port. Used to expose an output port of a component. All members of this class are thread-safe.

PlaneData
PoseUtility
ProximityLocationResult

Proximity location reading. x, y, z are in meters; the origin is the map block origin, and y is up. accuracy is in meters. timestamp and validTime are in seconds. is2d indicates whether y is unused.

ProximityLocationResultSink

Proximity location result input port. Used to expose an input port of a component. All members of this class are thread-safe.

ProximityLocationResultSource

Proximity location result output port. Used to expose an output port of a component. All members of this class are thread-safe.

Recorder

The Recorder implements screen recording functionality for the current rendering environment. Currently, the Recorder only works on Android (4.3 or newer) and iOS in an OpenGL ES 3.0 environment. Due to its dependency on OpenGL ES, all functions of this class (except requestPermissions, including the destructor) must be called in a single thread that contains an OpenGL ES context. Unity only: if Unity uses the Multi-threaded Rendering feature, the script thread is separated from the rendering thread and updateFrame cannot be called on the rendering thread; therefore, if you need to use the screen recording feature, you should disable Multi-threaded Rendering. When using Android, you need to add the android.permission.RECORD_AUDIO permission declaration in AndroidManifest.xml. When using iOS, you need to add the NSMicrophoneUsageDescription permission declaration in Info.plist.

RecorderConfiguration

RecorderConfiguration is the startup configuration for Recorder.

SceneMesh
SignalSink

Signal input port. Used to expose an input port of a component. All members of this class are thread-safe.

SignalSource

Signal Output Port. Used to expose the output port of a component. All members of this class are thread-safe.

SparseSpatialMap

Provides the main functions of the SparseSpatialMap system: map generation and storage, and map loading and localization. Environmental information such as point clouds and planes can be obtained, and hit testing can be performed. SparseSpatialMap occupies 2 camera buffers. The camera's setBufferCapacity should be set to a value not less than the total number of camera buffers occupied by all components.

SparseSpatialMapConfig

Used to configure the localization strategy in sparse mapping.

SparseSpatialMapManager

SparseSpatialMapManager is the management class for the sharing functionality of SparseSpatialMap.

SparseSpatialMapResult

The output of the sparse mapping and localization system (SparseSpatialMap), updated at the frequency of OutputFrame.

Storage
SurfaceTracker

SurfaceTracker implements tracking of environmental surfaces. SurfaceTracker occupies 1 camera buffer. The camera's setBufferCapacity should be set to a value not less than the total number of camera buffers occupied by all components. After creation, start/stop can be called to start and stop operation; start/stop are very lightweight calls. When the component is no longer needed, close can be called to close it; it should not be used after close. SurfaceTracker receives InputFrame input through inputFrameSink, and an InputFrameSource should be connected to inputFrameSink for use.

SurfaceTrackerResult

Result of SurfaceTracker.

Target

Target is the base class for all targets in EasyAR that can be tracked by ImageTracker or other algorithms.

TargetInstance

TargetInstance is a target tracked by a tracker. TargetInstance includes the original Target being tracked, as well as the current state and pose of this Target.

TargetTrackerResult

TargetTrackerResult is the base class of ImageTrackerResult and ObjectTrackerResult.

TextureId

TextureId encapsulates a texture object of the graphics API. For OpenGL/OpenGL ES, use getInt and fromInt; for Direct3D, use getPointer and fromPointer.
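
A minimal sketch of wrapping an OpenGL (ES) texture object name, assuming fromInt is a static factory and getInt returns the wrapped name, as the description above suggests:

import cn.easyar.*;

public class TextureIdSketch {
    // Wrap a GL texture object name for APIs that take a TextureId, and read it back.
    public static TextureId wrapGlTexture(int glTextureName) {
        return TextureId.fromInt(glTextureName);
    }

    public static int glName(TextureId id) {
        return id.getInt();
    }
}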

ThreeDofCameraDevice

ThreeDofCameraDevice implements a three-DOF camera device, outputting InputFrame with CameraTransformType of ThreeDofRotOnly (including image, camera parameters, timestamp, pose transformation matrix, and tracking status). After creation, start/stop can be called to start and stop collecting video stream data. When the device is no longer needed, it can be closed by calling close; it should not be used after close. ThreeDofCameraDevice outputs InputFrame through inputFrameSource, which should be connected to InputFrameSink for use. bufferCapacity indicates the buffer capacity of InputFrame. If more than this number of InputFrame are output from the device and not released, the device will no longer output new InputFrame until the previous InputFrame are released. This may cause issues such as screen freezing.

Vec2F

2D float vector.

Vec2I

2D int vector.

Vec3D

3D double vector.

Vec3F

3D float vector.

Vec4F

4D float vector.

Vec4I

4D int vector.

VideoInputFramePlayer

Input frame player. There is an input frame output port for extracting input frames from EIF MKV files. All members of this class are thread-safe.

VideoInputFrameRecorder

Input Frame Recorder. There is an input frame input port and an input frame output port, which are used to save the passed input frames into EIF MKV files. All members of this class are thread-safe.

VideoPlayer

VideoPlayer is a video player class. EasyAR supports normal videos, transparent videos, and stream playback. Video content is rendered to the texture passed in setRenderTexture. This class only supports OpenGL ES 3.0 textures. Because it depends on OpenGL ES, all functions of this class (including the destructor) must be called in a single thread containing an OpenGL ES context. The current version requires both width and height to be multiples of 16.

Supported video file formats:

Windows: Media Foundation compatible formats; installing additional decoders can support more formats, please refer to Supported Media Formats in Media Foundation. DirectShow is not supported.

Mac: Not supported.

Android: Formats supported by the system; please refer to Supported media formats.

iOS: Formats supported by the system; currently there is no valid reference document.

VisionOSARKitCameraDevice

VisionOSARKitCameraDevice implements a camera device based on VisionOS ARKit, outputting InputFrame (including image, camera parameters, timestamp, 6DOF position information, and tracking status). After creation, you can call start/stop to start and stop collecting video stream data. When the device is no longer needed, you can call close to shut it down. It should not be used after close. VisionOSARKitCameraDevice outputs InputFrame through inputFrameSource, and inputFrameSource should be connected to InputFrameSink for use. bufferCapacity indicates the capacity of InputFrame buffer. If more than this number of InputFrame are output from the device and not released, the device will no longer output new InputFrame until the previous InputFrame are released. This may cause issues such as screen freezing.

XREALCameraDevice

XREALCameraDevice implements a camera device based on the XREAL Enterprise Native SDK Plugin, outputting InputFrame (the image cannot be obtained; camera parameters, timestamp, 6DOF position information, and tracking status are included). After creation, start/stop can be called to start and stop collecting video stream data. Make sure start is called only after isDeviceSupported returns true. When the device is no longer needed, close can be called to close it; do not continue to use it after close. XREALCameraDevice outputs InputFrame through inputFrameSource, which should be connected to InputFrameSink for use. bufferCapacity indicates the capacity of the InputFrame buffer. If more InputFrame than this number are output from this device and not released, the device will no longer output new InputFrame until the previous InputFrame are released. This may cause problems such as a frozen screen.

Enums

AndroidCameraApiType
ARCoreCameraDeviceFocusMode
ARCoreDeviceListDownloadStatus
ARKitCameraDeviceFocusMode
BlockPriorMode

Prior map information working mode.

CalibrationDownloadStatus
CameraDeviceFocusMode
CameraDevicePreference
CameraDevicePresetProfile
CameraDeviceType

Camera device type.

CameraModelType

Camera model type.

CameraState
CameraTransformType

Camera transformation type.

CloudLocalizerStatus

Cloud positioning status.

CloudRecognizationStatus
EngineOperatingSystem
ImageTrackerMode
InertialCameraDeviceFocusMode
LocalizationMode
LogLevel
MegaApiType

MEGA API Type

MegaLandmarkFilterStatus
MegaTrackerLocalizationStatus
MotionTrackerCameraDeviceFocusMode
MotionTrackerCameraDeviceFPS
MotionTrackerCameraDeviceQualityLevel
MotionTrackerCameraDeviceResolution
MotionTrackerCameraDeviceTrackingMode
MotionTrackingStatus

Describes the quality of the device's motion tracking.

PermissionStatus
PixelFormat

PixelFormat indicates the image pixel format. All formats have pixel orientation of left to right, top to bottom.

PlaneType
RecordProfile
RecordStatus
RecordVideoOrientation
RecordVideoSize
RecordZoomMode
StorageType

StorageType indicates the storage location of images, json files, videos, or other files. StorageType specifies the root directory for file storage, and you can use relative paths relative to this root directory in all related interfaces.

TargetStatus
ThreeDofCameraDeviceFocusMode
ValidationState
VideoInputFrameRecorderCompletionReason
VideoInputFrameRecorderVideoCodec
VideoStatus
VideoType