Raw sensor feeds for both the point cloud and the RGB image stream are available through the SDK. Intrinsic parameters are also available for the RGB stream.

Our sensor APIs provide time-aligned poses from the given camera frame (RGB or depth) to the world frame. That is, the APIs report the position and rotation of the given sensor at the moment the data was captured.
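As a sketch of how such a pose can be used, the snippet below moves a point from the sensor frame into the world frame in Unity. The helper name and the use of Vector3/Quaternion for the pose are assumptions for illustration; the actual out-parameter types are shown in the API snippets later on this page.

```csharp
using UnityEngine;

public static class PoseUtil
{
    // Hedged sketch: transforms a point captured in the sensor's frame into
    // the world frame, given the sensor pose reported alongside the data.
    // The parameter types here are an assumption; consult the SDK samples
    // for the exact representation of the translation and rotation.
    public static Vector3 SensorToWorld(Vector3 pointInSensorFrame,
                                        Vector3 sensorTranslation,
                                        Quaternion sensorRotation)
    {
        // Rotate into the world orientation, then offset by the sensor position.
        return sensorRotation * pointInSensorFrame + sensorTranslation;
    }
}
```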

Note: These APIs are direct interactions with our core system APIs; they may crash if used incorrectly!

Point Cloud

The point cloud can be accessed directly via the system interop method meta_get_point_cloud(). For performance reasons, it is recommended to allocate the required memory only once, at initialization, and reuse it as needed.


An example of this API’s usage is available in MetaPointCloudExample.cs.

MetaCoreInterop.MetaPointCloud _metaPointCloud = new MetaCoreInterop.MetaPointCloud();
MetaCoreInterop.meta_get_point_cloud(ref _metaPointCloud, _translation, _rotation);
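Following the allocate-once recommendation, a minimal sketch of a per-frame reader might look like this. The field types for the translation and rotation are assumptions made for illustration; MetaPointCloudExample.cs shows the actual declarations.

```csharp
using UnityEngine;
using Meta.Interop;

public class PointCloudReader : MonoBehaviour
{
    // Allocated once at initialization and reused every frame,
    // per the performance recommendation above.
    MetaCoreInterop.MetaPointCloud _metaPointCloud =
        new MetaCoreInterop.MetaPointCloud();

    // Assumed types for the pose out-parameters; see
    // MetaPointCloudExample.cs for the real declarations.
    Vector3 _translation;
    Quaternion _rotation;

    void Update()
    {
        // Refills the preallocated struct and reports the depth sensor's
        // pose at the time the cloud was captured.
        MetaCoreInterop.meta_get_point_cloud(ref _metaPointCloud,
                                             _translation, _rotation);
    }
}
```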

Use the MetaPointCloud prefab and check Render Point Cloud to see a simple demo of this API.

Warning: This prefab is an example and not meant for production use! It requires a powerful computer to render all of the points in the point cloud.


RGB Camera

The RGB stream can likewise be accessed via the system interop method meta_get_rgb_frame(). The camera intrinsics can be accessed via meta_get_rgb_intrinsics().


See CameraApi.cs for an in-depth example.

MetaCoreInterop.meta_get_rgb_frame(RawPixelBuffer, _translation, _new_rotation);
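A minimal per-frame sketch of this call is shown below. The buffer type and size here are assumptions for illustration (the real size should be derived from the camera intrinsics); CameraApi.cs shows the actual buffer setup.

```csharp
using System;
using System.Runtime.InteropServices;
using UnityEngine;
using Meta.Interop;

public class RgbFrameReader : MonoBehaviour
{
    // Assumed raw-buffer representation and size; in practice, size the
    // buffer from the intrinsics (width * height * channels) as done in
    // CameraApi.cs.
    IntPtr RawPixelBuffer;

    Vector3 _translation;
    Quaternion _new_rotation;

    void Start()
    {
        // Allocate once and reuse each frame, mirroring the point cloud
        // recommendation above.
        RawPixelBuffer = Marshal.AllocHGlobal(1280 * 720 * 3);
    }

    void Update()
    {
        // Fills the preallocated buffer and reports the RGB camera's pose
        // at the time the frame was captured.
        MetaCoreInterop.meta_get_rgb_frame(RawPixelBuffer,
                                           _translation, _new_rotation);
    }

    void OnDestroy()
    {
        Marshal.FreeHGlobal(RawPixelBuffer);
    }
}
```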



using UnityEngine;
using Meta.Interop;

public class IntrinsicsDemo : MonoBehaviour
{
    MetaCoreInterop.MetaPolyCameraParams camera_params =
        new MetaCoreInterop.MetaPolyCameraParams();

    void Update()
    {
        bool has_intrinsics = MetaCoreInterop.meta_get_rgb_intrinsics(ref camera_params);
        if (has_intrinsics)
        {
            // camera_params now holds the current RGB camera intrinsics.
        }
    }
}