SDK application reference

You can start using the camera quickly by running the executables in the SDK development package under lib/win/hostapp/x64, or the executables built into the sample/build/bin/Release directory.

In addition, Percipio provides Percipio Viewer, an image viewing application built on the SDK that lets users quickly view depth maps, color images, infrared images, and point clouds, and adjust camera exposure parameters, laser brightness, and other settings in real time. The Percipio Viewer installation package can be downloaded from https://en.percipio.xyz/. Refer to the Percipio Viewer user guide for instructions on how to use it.

Sample program instructions

The sample programs in sample_v2 build on sample_v1 by introducing a more user-friendly camera control interface and by making the OpenCV dependency optional, so users can include it only when needed.

Sample_v1

  • ListDevices: This program demonstrates how to list all the depth cameras that are connected to the host system.

  • DeviceStorage: This sample program demonstrates how to read from and write to the camera’s custom block storage area (64KB) and the ISP block storage area (64KB).

  • DumpAllFeatures: This sample program demonstrates how to list the components and properties supported by the depth camera, along with the read and write operations available for each property.

  • ForceDeviceIP: This sample program demonstrates how to manually set the IP address of the depth camera.

  • LoopDetect: This sample program demonstrates how to address data connection anomalies caused by unstable environmental factors.

  • NetStatistic: This sample calculates the packet loss rate of the network connection for the depth camera.

  • SimpleView_FetchFrame: This sample demonstrates how to continuously capture and output image data when the depth camera is in free mode.

  • SimpleView_Callback: This sample program demonstrates how to capture image data in free mode and render it using OpenCV in a separate processing thread to prevent blocking.

  • SimpleView_FetchHisto: This sample program demonstrates how to retrieve the image brightness distribution histogram.

  • SimpleView_MultiDevice: This sample program demonstrates how to use multiple depth cameras to continuously capture and output image data.

  • SimpleView_Point3D: This sample program demonstrates how to acquire 3D point cloud data.

  • SimpleView_Registration: This sample program demonstrates how to acquire the depth camera’s intrinsic and extrinsic parameters, the depth image, and the color image, and how to register the depth and color images.

  • SimpleView_TriggerDelay: This sample program demonstrates how to control the camera’s trigger delay.

  • SimpleView_TriggerMode0: This sample program demonstrates how to set the depth camera to trigger mode 0, allowing it to continuously capture and output images at the highest frame rate.

  • SimpleView_TriggerMode1: This sample program demonstrates how to set the depth camera to trigger mode 1, allowing it to acquire and output images upon receiving a trigger signal.

  • SimpleView_TriggerMode_M2S1: This sample program demonstrates how to configure the master device (camera) to trigger mode 2, multiple slave devices to trigger mode 1, and implement cascade triggering for multiple depth cameras.

  • SimpleView_TriggerMode_M3S1: This sample program demonstrates how to configure the master device (camera) to trigger mode 3, multiple slave devices to trigger mode 1, and implement cascade triggering of depth cameras based on the set frame rate.

  • SimpleView_SaveLoadConfig: This sample program demonstrates how to import local JSON files into the camera or export the camera settings as local JSON files.

  • SimpleView_XYZ48: This sample program demonstrates how to parse and display depth maps in XYZ format.

  • SimpleView_Point3D_XYZ48: This sample program demonstrates how to acquire 3D point cloud data represented in XYZ48 format.

Sample_v2

  • ListDevices_v2: This program demonstrates how to list all depth cameras that are connected to the host computer.

  • DepthStream_v2: This example program demonstrates how to acquire the depth image from the ToF (Time-of-Flight) camera.

  • TofDepthStream_v2: This sample program demonstrates how to perform distortion correction on a TOF depth image.

  • ExposureTimeSetting_v2: This sample program demonstrates how to set the exposure time of the camera’s color image.

  • ForceDeviceIP_v2: This sample program demonstrates how to set the IP address of the depth camera.

  • GetCalibData_v2: This sample program demonstrates how to get the raw calibration parameters of the depth camera.

  • NetStatistic_v2: This sample program demonstrates how to calculate the network packet loss rate of the depth camera.

  • OfflineReconnection_v2: This sample program demonstrates how to automatically reconnect the camera after a disconnection.

  • OpenWithInterface_v2: This sample program demonstrates how to access the camera through a specified network interface.

  • OpenWithIP_v2: This sample program demonstrates how to access the camera through a specified IP address.

  • PointCloud_v2: This sample program acquires and saves 3D point cloud data in PLY format.

  • Registration_v2: This sample program demonstrates how to register the depth and color images.

  • ResolutionSetting_v2: This sample program demonstrates how to set the image resolution through user interaction or by directly specifying the image mode.

  • SaveLoadConfig_v2: This sample program demonstrates how to save camera parameters to custom_block.bin (the camera’s internal storage) and how to export the camera parameters from it to a local file.

  • SoftTrigger_v2: This sample program demonstrates how to acquire and output images upon receiving a trigger signal.

  • StreamAsync_v2: This sample program demonstrates how to configure the asynchronous output of image data streaming from the camera.

Image Acquisition Process

The configuration and image acquisition process for the depth camera is illustrated below. The C++ SDK sample program SimpleView_FetchFrame is used to walk through the image acquisition process in detail.

Image acquisition flowchart

API Initialization

TYInitLib initializes the device objects and data structures.

// Load the library
LOGD("Init lib");
ASSERT_OK( TYInitLib() );

// Retrieve SDK version information
TY_VERSION_INFO ver;
ASSERT_OK( TYLibVersion(&ver) );
LOGD("     - lib version: %d.%d.%d", ver.major, ver.minor, ver.patch);

Open Device

  1. Get Device List

    When first retrieving device information, use selectDevice() to query the number of connected devices and obtain a list of all connected devices.

    std::vector<TY_DEVICE_BASE_INFO> selected;
    ASSERT_OK( selectDevice(TY_INTERFACE_ALL, ID, IP, 1, selected) );
    ASSERT(selected.size() > 0);
    TY_DEVICE_BASE_INFO& selectedDev = selected[0];
    
  2. Open Interface

    ASSERT_OK( TYOpenInterface(selectedDev.iface.id, &hIface) );
    
  3. Open Device

    ASSERT_OK( TYOpenDevice(hIface, selectedDev.id, &hDevice) );
    

Configure Components

  1. Query Component Status

    // Retrieve supported component information
    TY_COMPONENT_ID allComps;
    ASSERT_OK( TYGetComponentIDs(hDevice, &allComps) );
    
  2. Configure Components and Set Properties

    After the device is opened, only the virtual component TY_COMPONENT_DEVICE is enabled by default.

    // Enable RGB Component + Configure RGB Component Properties
     if (allComps & TY_COMPONENT_RGB_CAM && color) {
         LOGD("Has RGB camera, open RGB cam");
         ASSERT_OK( TYEnableComponents(hDevice, TY_COMPONENT_RGB_CAM) );
         // Create an ISP handle to convert the raw (Bayer format) color image to an RGB image
         ASSERT_OK(TYISPCreate(&hColorIspHandle));
         // The initialization code can be modified in common.hpp
         // NOTE: the RGB image format & size must be set before initializing the ISP
         ASSERT_OK(ColorIspInitSetting(hColorIspHandle, hDevice));
         // (Optional) helper functions can be called here to list the color ISP features supported by the device
     }
    
     // Enable Left IR Component
     if (allComps & TY_COMPONENT_IR_CAM_LEFT && ir) {
         LOGD("Has IR left camera, open IR left cam");
         ASSERT_OK(TYEnableComponents(hDevice, TY_COMPONENT_IR_CAM_LEFT));
     }
    
     // Enable Right IR Component
     if (allComps & TY_COMPONENT_IR_CAM_RIGHT && ir) {
         LOGD("Has IR right camera, open IR right cam");
         ASSERT_OK(TYEnableComponents(hDevice, TY_COMPONENT_IR_CAM_RIGHT));
     }
    
     // Enable Depth Component + Configure Depth Component Properties
     LOGD("Configure components, open depth cam");
     DepthViewer depthViewer("Depth");
     if (allComps & TY_COMPONENT_DEPTH_CAM && depth) {
    
         /// Configure Depth Component Properties(Depth Image Resolution)
         TY_IMAGE_MODE image_mode;
         ASSERT_OK(get_default_image_mode(hDevice, TY_COMPONENT_DEPTH_CAM, image_mode));
         LOGD("Select Depth Image Mode: %dx%d", TYImageWidth(image_mode), TYImageHeight(image_mode));
         ASSERT_OK(TYSetEnum(hDevice, TY_COMPONENT_DEPTH_CAM, TY_ENUM_IMAGE_MODE, image_mode));
    
         /// Enable Depth Component
         ASSERT_OK(TYEnableComponents(hDevice, TY_COMPONENT_DEPTH_CAM));
    
         /// Configure Depth Component Properties (Scale Unit)
         // The depth map pixel format is uint16_t; the default unit is 1 mm
         // The actual depth (mm) = PixelValue * ScaleUnit (see the worked example after this code block)
         float scale_unit = 1.;
         TYGetFloat(hDevice, TY_COMPONENT_DEPTH_CAM, TY_FLOAT_SCALE_UNIT, &scale_unit);
         depthViewer.depth_scale_unit = scale_unit;
     }
    
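To make the scale unit concrete, the following is a minimal sketch (not part of the original sample; the helper name depth_to_mm is hypothetical) of converting a raw depth pixel value to millimeters using the scale unit queried above.

// Hypothetical helper, shown only to illustrate the formula above:
// actual depth (mm) = PixelValue * ScaleUnit
static inline float depth_to_mm(uint16_t pixel_value, float scale_unit)
{
    return pixel_value * scale_unit;   // e.g. pixel value 4000 with scale_unit 0.25 -> 1000 mm
}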

Frame Buffer Management

Note

Before managing frame buffers, ensure that the required components are enabled using the TYEnableComponents() interface and that the correct image format and resolution are set using the TYSetEnum() interface. The size of the frame buffer depends on these settings; configuring them afterwards may result in insufficient frame buffer space.

// Query frame buffer size
LOGD("Prepare image buffer");
uint32_t frameSize;
ASSERT_OK( TYGetFrameBufferSize(hDevice, &frameSize) );
LOGD("     - Get size of framebuffer, %d", frameSize);

// Allocate frame buffers
LOGD("     - Allocate & enqueue buffers");
char* frameBuffer[2];
frameBuffer[0] = new char[frameSize];
frameBuffer[1] = new char[frameSize];

// Enqueue frame buffers
LOGD("     - Enqueue buffer (%p, %d)", frameBuffer[0], frameSize);
ASSERT_OK( TYEnqueueBuffer(hDevice, frameBuffer[0], frameSize) );
LOGD("     - Enqueue buffer (%p, %d)", frameBuffer[1], frameSize);
ASSERT_OK( TYEnqueueBuffer(hDevice, frameBuffer[1], frameSize) );

Register Callback Functions

TYRegisterEventCallback

Register an event callback function. When an exception occurs, the system calls the function registered via TYRegisterEventCallback. The following example demonstrates a callback function that handles the device-offline event so the application can reconnect.

static bool offline = false;
void eventCallback(TY_EVENT_INFO *event_info, void *userdata)
{
   if (event_info->eventId == TY_EVENT_DEVICE_OFFLINE) {
       LOGD("=== Event Callback: Device Offline!");
       // Note:
       // Please set TY_BOOL_KEEP_ALIVE_ONOFF feature to false if you need to debug with breakpoint!
       offline = true;
   }
}

int main(int argc, char* argv[])
{
  LOGD("Register event callback");
  ASSERT_OK(TYRegisterEventCallback(hDevice, eventCallback, NULL));
  bool exit_main = false;
  while(!exit_main && !offline) {
      //Fetch and process frame data
  }
  if (offline) {
     //Release resources
     TYStopCapture(hDevice);
     TYCloseDevice(hDevice);
     // You can try to re-open and start the device to capture images again,
     // or just close the interface and exit (see the sketch after this code block)
  }
  return 0;
}
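
As a minimal sketch of the reconnection path mentioned in the comments above (assuming the handles, component setup, and buffer enqueueing from the earlier sections), the device can be re-opened and streaming restarted after an offline event:

if (offline) {
    TYStopCapture(hDevice);
    TYCloseDevice(hDevice);

    // Try to re-open the same device on the same interface.
    if (TYOpenDevice(hIface, selectedDev.id, &hDevice) == TY_STATUS_OK) {
        // Re-enable the required components, set the image modes,
        // and re-enqueue the frame buffers as shown earlier, then:
        ASSERT_OK( TYStartCapture(hDevice) );
        offline = false;
    }
}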

Configure Operating Modes

Configure the depth camera’s operating mode based on actual requirements. For the steps to configure other operating modes, refer to Work mode settings. A sketch of switching to trigger mode 1 is shown after the code below.

// Check for Operating Mode Properties
bool hasTrigger;
ASSERT_OK(TYHasFeature(hDevice, TY_COMPONENT_DEVICE, TY_STRUCT_TRIGGER_PARAM_EX, &hasTrigger));

if (hasTrigger) {

    // Set Depth Camera to a Specific Operating Mode 0
    LOGD("Disable trigger mode");
    TY_TRIGGER_PARAM_EX trigger;
    trigger.mode = TY_TRIGGER_MODE_OFF;
    ASSERT_OK(TYSetStruct(hDevice, TY_COMPONENT_DEVICE, TY_STRUCT_TRIGGER_PARAM_EX, &trigger, sizeof(trigger)));
}
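
For reference, here is a minimal sketch of switching the camera to trigger mode 1 (slave mode) instead. It assumes the same TY_STRUCT_TRIGGER_PARAM_EX feature, the TY_TRIGGER_MODE_SLAVE enum value, and that a software trigger can be fired with TYSendSoftTrigger():

if (hasTrigger) {
    // Set the camera to trigger mode 1: capture one frame per trigger signal
    TY_TRIGGER_PARAM_EX trigger;
    trigger.mode = TY_TRIGGER_MODE_SLAVE;
    ASSERT_OK(TYSetStruct(hDevice, TY_COMPONENT_DEVICE, TY_STRUCT_TRIGGER_PARAM_EX, &trigger, sizeof(trigger)));

    // After TYStartCapture(), fire a software trigger before each TYFetchFrame()
    // (a hardware trigger signal works the same way):
    // ASSERT_OK(TYSendSoftTrigger(hDevice));
}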

Start Image Capture

LOGD("Start capture");
ASSERT_OK( TYStartCapture(hDevice) );

Fetch Frame Data

LOGD("While loop to fetch frame");
bool exit_main = false;
TY_FRAME_DATA frame;
int index = 0;
while(!exit_main) {
    int err = TYFetchFrame(hDevice, &frame, -1);
    if( err == TY_STATUS_OK ) {
        LOGD("Get frame %d", ++index);

        int fps = get_fps();
        if (fps > 0){
            LOGI("fps: %d", fps);
        }

        cv::Mat depth, irl, irr, color;
        parseFrame(frame, &depth, &irl, &irr, &color, hColorIspHandle);
        if(!depth.empty()){
            depthViewer.show(depth);
        }
        if(!irl.empty()){ cv::imshow("LeftIR", irl); }
        if(!irr.empty()){ cv::imshow("RightIR", irr); }
        if(!color.empty()){ cv::imshow("Color", color); }

        int key = cv::waitKey(1);
        switch(key & 0xff) {
        case 0xff:
            break;
        case 'q':
            exit_main = true;
            break;
        default:
            LOGD("Unmapped key %d", key);
        }

        TYISPUpdateDevice(hColorIspHandle);
        LOGD("Re-enqueue buffer(%p, %d)"
            , frame.userBuffer, frame.bufferSize);
        ASSERT_OK( TYEnqueueBuffer(hDevice, frame.userBuffer, frame.bufferSize) );
    }
}
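
The loop above blocks indefinitely (timeout -1). As a small variation, assuming the same TYFetchFrame() API and the TY_STATUS_TIMEOUT status code, the fetch can use a finite timeout so the loop stays responsive when no frame arrives:

int err = TYFetchFrame(hDevice, &frame, 2000);   // wait at most 2000 ms
if (err == TY_STATUS_TIMEOUT) {
    LOGD("Fetch frame timeout, retrying");
} else if (err != TY_STATUS_OK) {
    LOGD("Fetch frame error: %d", err);
}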

Stop Capture

ASSERT_OK( TYStopCapture(hDevice) );

Close Device

// Close Device
ASSERT_OK( TYCloseDevice(hDevice));

// Release Interface Handle
ASSERT_OK( TYCloseInterface(hIface) );
ASSERT_OK(TYISPRelease(&hColorIspHandle));

Release API

// Unload Library
ASSERT_OK( TYDeinitLib() );

// Free Allocated Memory Resources
delete [] frameBuffer[0];
delete [] frameBuffer[1];

Application example: Setting the IP address of the network depth camera

This section introduces how to use the compiled ForceDeviceIP example to set the camera’s IP address and provides common examples of IP address configurations.

Note

Description of IP Address Types:

  • Temporary IP Address: A manually configured IP address temporarily assigned to the device.

  • Static IP Address: A manually configured IP address permanently assigned to the device.

  • Dynamic IP Address: An IP address automatically assigned by the DHCP (Dynamic Host Configuration Protocol) server in the network.

IP Address Command Description

After the command is executed, the IP address of the network depth camera is changed to the address specified by the command and takes effect immediately; after the camera is powered off and restarted, the original configuration is restored.

  • Command: ForceDeviceIP.exe -force <MAC> <newIP> <newNetmask> <newGateway>

  • Sample code: ForceDeviceIP.exe -force 68:f7:56:36:90:a3 192.168.1.160 255.255.255.0 192.168.1.1

  • <MAC> can be obtained from the device label, the format is xx:xx:xx:xx:xx:xx.

  • <newIP> is the specified IP address.

  • <newNetmask> and <newGateway> are set according to newIP.

Example of Setting IP Address

Application Scenario 1

Set a static Class C IP address (192.168.5.12) for the Percipio network camera.

Steps on Windows 10:

  1. Check the Current IP Address Configuration.

    Press WIN+R, type “cmd”, then enter “ipconfig” and press Enter.

    Check the IP Address

  2. Check whether the computer’s IP address is within the target subnet. If it is not (e.g., the target subnet is 192.168.5.XX, but the current IP address is in the 192.168.6.XX subnet), you need to modify the computer’s IP address.

    Open the Control Panel on your computer, navigate to “Network and Internet” > “Network and Sharing Center” > “Change adapter settings” > “Ethernet” > “Internet Protocol Version 4 (TCP/IPv4)”. In the Internet Protocol Version 4 (TCP/IPv4) Properties dialog box that appears, select “Use the following IP address” and configure the IP address, subnet mask, and gateway.

    Modify Computer IP Address

  3. Open the lib\win\hostapp\x64 folder in the SDK. Open Windows PowerShell in this directory and execute the following command:

    ForceDeviceIP.exe -static 06:29:39:05:DA:D1 192.168.5.12 255.255.255.0 192.168.5.1
    

    06:29:39:05:DA:D1 is the MAC address of the camera; 192.168.5.12 is the newly assigned IP address; 255.255.255.0 is the subnet mask corresponding to the new IP address, and 192.168.5.1 is the default gateway corresponding to the new IP address.

Application Scenario 2

Setting a Dynamic IP Address for a Percipio Network Camera.

Steps on Windows 10:

  1. Open the lib\win\hostapp\x64 folder in the SDK.

  2. Open Windows PowerShell in the directory and run the following command:

    ForceDeviceIP.exe -dynamic 06:29:39:05:DA:D1
    

    06:29:39:05:DA:D1 is the MAC address of the camera.