A project of mine needed the Azure Kinect DK depth camera. Since it ships with a body-tracking SDK, I decided to try that out as well.

Device setup documentation: https://docs.microsoft.com/zh-cn/azure/Kinect-dk/set-up-azure-kinect-dk

1. Connecting the Device

The hardware is connected as follows:

1. Plug the power connector into the power jack on the back of the device. Connect the USB power adapter to the other end of the cable, then plug the adapter into a wall outlet.

2. Connect one end of the USB data cable to the device and the other end to a USB 3.0 port on your PC. (I hit a problem here: the driver could not recognize the device at all. Following advice found online, I plugged it into the USB port closest to the motherboard, and only then did it connect successfully.)

3. Verify that the power indicator LED (next to the USB cable) is solid white.

4. System requirements: the official documentation lists Windows 10 or Ubuntu 18.04.

Some online reports say Ubuntu 16.04 also works, but I simply used Windows 10.

2. Downloading the SDK

SDK download: https://docs.microsoft.com/zh-cn/azure/Kinect-dk/sensor-sdk-download

Other versions can also be found on GitHub:

Source: https://github.com/microsoft/Azure-Kinect-Sensor-SDK

SDK versions: https://github.com/microsoft/Azure-Kinect-Sensor-SDK/blob/develop/docs/usage.md

3. Verifying the Device Connection

Launch the Azure Kinect Viewer by double-clicking C:\Program Files\Azure Kinect SDK vX.Y.Z\tools\k4aviewer.exe, where X.Y.Z is the installed SDK version.

The viewer searches for the camera automatically. If it is not found, the driver installation failed; again, connect the device to the USB port closest to the motherboard, and make sure the port is USB 3.0. After choosing the camera configuration options, click Start to view the streams.

Next, go to C:\Program Files\Azure Kinect Body Tracking SDK\tools\, which contains the executable k4abt_simple_3d_viewer. Run it from PowerShell with: ./k4abt_simple_3d_viewer.exe CPU. The trailing CPU argument makes it run on the CPU; omit it if you are using CUDA.

4. Using the SDK

Camera API walkthrough: https://docs.microsoft.com/zh-cn/azure/Kinect-dk/build-first-app

Body tracking API walkthrough: https://docs.microsoft.com/zh-cn/azure/Kinect-dk/build-first-body-app

Install OpenCV 3.

Install Visual Studio 2019 and create a new C++ Windows console project. Then copy the following SDK files and related libraries into the project:

  1. Copy k4a.h, k4a.hpp, k4a_export.h, k4atypes.h, and k4aversion.h from Azure Kinect SDK v1.3.0\sdk\include\ into the project's include folder.
  2. Copy k4abt.h, k4abt.hpp, k4abttypes.h, and k4abtversion.h from Azure Kinect Body Tracking SDK\sdk\include into the project's include folder.
  3. Copy core.hpp, opencv.hpp, and opencv_modules.hpp from opencv\build\include\opencv2 into the project's include folder.
  4. Copy the opencv\build\include\opencv2 folder itself directly into the project folder.
  5. Copy depthengine_2_0.dll and k4a.dll from Azure Kinect SDK v1.3.0\tools into the project folder.
  6. Copy cublas64_100.dll, cudart64_100.dll, cudnn64_7.dll, and dnn_model_2_0.onnx from Azure Kinect Body Tracking SDK\tools into the project folder.

Now you can write your own main function.

The enum in k4abttypes.h selects whether the tracker runs on the GPU or the CPU:

typedef enum
{
    K4ABT_TRACKER_PROCESSING_MODE_GPU = 0, /**< SDK will use GPU mode to run the tracker */
    K4ABT_TRACKER_PROCESSING_MODE_CPU,     /**< SDK will use CPU only mode to run the tracker */
} k4abt_tracker_processing_mode_t;

My main program displays the joint coordinates plus a few relationships between key points (an estimated body height and the upper-body inclination angle).
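Before the full listing, the overall flow can be sketched as:

```text
open device (verify exactly one is connected)
start cameras: NFOV 2x2-binned depth + 720p BGRA color, synchronized, 30 fps
get sensor calibration -> create body tracker from it
loop forever:
    capture <- k4a_device_get_capture
    wrap the color/depth buffers as cv::Mat for display
    enqueue capture into the tracker, then release the capture
    body_frame <- k4abt_tracker_pop_result
    for each detected body:
        get skeleton; project joints 3D -> 2D; draw circles and lines
        print coordinates; estimate height and upper-body angle
    release body_frame; show the images
shutdown/destroy tracker, stop cameras, close device
```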

#include <stdio.h>
#include <stdlib.h>
#include <math.h>
#include <iostream>
// OpenCV
#include "include/opencv.hpp"
// Kinect DK
#include "include/k4a.h"
#include "include/k4a.hpp"
#include "include/k4abt.h"

#define VERIFY(result, error)                                                                            \
    if ((result) != K4A_RESULT_SUCCEEDED)                                                                \
    {                                                                                                    \
        printf("%s \n - (File: %s, Function: %s, Line: %d)\n", error, __FILE__, __FUNCTION__, __LINE__); \
        exit(1);                                                                                         \
    }

int main()
{
    float UPPER_BODY;  // upper-body length (neck to pelvis), mm
    float body_angle;  // upper-body inclination angle, degrees

    const uint32_t device_count = k4a_device_get_installed_count();
    if (1 == device_count)
    {
        std::cout << "Found " << device_count << " connected device." << std::endl;
    }
    else
    {
        std::cout << "Error: expected exactly one K4A device, found " << device_count << "." << std::endl;
        return 1;
    }

    // Open the device (once -- the original listing opened it twice)
    k4a_device_t device = NULL;
    VERIFY(k4a_device_open(0, &device), "Open K4A Device failed!");
    std::cout << "Done: open device." << std::endl;

    // Start the cameras. Make sure the depth camera is enabled.
    k4a_device_configuration_t deviceConfig = K4A_DEVICE_CONFIG_INIT_DISABLE_ALL;
    deviceConfig.depth_mode = K4A_DEPTH_MODE_NFOV_2X2BINNED;
    deviceConfig.color_resolution = K4A_COLOR_RESOLUTION_720P;
    deviceConfig.camera_fps = K4A_FRAMES_PER_SECOND_30;
    deviceConfig.color_format = K4A_IMAGE_FORMAT_COLOR_BGRA32;
    deviceConfig.synchronized_images_only = true; // ensures depth and color images are both available in the capture
    VERIFY(k4a_device_start_cameras(device, &deviceConfig), "Start K4A cameras failed!");
    std::cout << "Done: start camera." << std::endl;

    // Query the sensor calibration
    k4a_calibration_t sensor_calibration;
    VERIFY(k4a_device_get_calibration(device, deviceConfig.depth_mode, deviceConfig.color_resolution, &sensor_calibration),
           "Get depth camera calibration failed!");

    // Create the body tracker
    k4abt_tracker_t tracker = NULL;
    k4abt_tracker_configuration_t tracker_config = K4ABT_TRACKER_CONFIG_DEFAULT;
    VERIFY(k4abt_tracker_create(&sensor_calibration, tracker_config, &tracker), "Body tracker initialization failed!");

    cv::Mat cv_rgbImage_with_alpha;
    cv::Mat cv_rgbImage_no_alpha;
    cv::Mat cv_depth;
    cv::Mat cv_depth_8U;
    int frame_count = 0;

    while (true)
    {
        k4a_capture_t sensor_capture;
        k4a_wait_result_t get_capture_result = k4a_device_get_capture(device, &sensor_capture, K4A_WAIT_INFINITE);
        if (get_capture_result == K4A_WAIT_RESULT_TIMEOUT)
        {
            // It should never hit timeout when K4A_WAIT_INFINITE is set.
            printf("Error! Get depth frame time out!\n");
            break;
        }
        else if (get_capture_result != K4A_WAIT_RESULT_SUCCEEDED)
        {
            printf("Get depth capture returned error: %d\n", get_capture_result);
            break;
        }
        frame_count++;

        // Get the RGB and depth images (only valid on a successful capture;
        // the original fetched them before checking the result)
        k4a_image_t rgbImage = k4a_capture_get_color_image(sensor_capture);
        k4a_image_t depthImage = k4a_capture_get_depth_image(sensor_capture);

        // RGB: wrap the BGRA buffer and drop the alpha channel
        cv_rgbImage_with_alpha = cv::Mat(k4a_image_get_height_pixels(rgbImage), k4a_image_get_width_pixels(rgbImage),
                                         CV_8UC4, k4a_image_get_buffer(rgbImage));
        cv::cvtColor(cv_rgbImage_with_alpha, cv_rgbImage_no_alpha, cv::COLOR_BGRA2BGR);

        // Depth: 16-bit millimetres, converted to 8-bit for display
        cv_depth = cv::Mat(k4a_image_get_height_pixels(depthImage), k4a_image_get_width_pixels(depthImage),
                           CV_16U, k4a_image_get_buffer(depthImage), k4a_image_get_stride_bytes(depthImage));
        cv_depth.convertTo(cv_depth_8U, CV_8U, 1);

        // Run pose estimation
        k4a_wait_result_t queue_capture_result = k4abt_tracker_enqueue_capture(tracker, sensor_capture, K4A_WAIT_INFINITE);
        k4a_capture_release(sensor_capture); // release the sensor capture once you finish using it
        if (queue_capture_result == K4A_WAIT_RESULT_TIMEOUT)
        {
            // It should never hit timeout when K4A_WAIT_INFINITE is set.
            printf("Error! Add capture to tracker process queue timeout!\n");
            break;
        }
        else if (queue_capture_result == K4A_WAIT_RESULT_FAILED)
        {
            printf("Error! Add capture to tracker process queue failed!\n");
            break;
        }

        k4abt_frame_t body_frame = NULL;
        k4a_wait_result_t pop_frame_result = k4abt_tracker_pop_result(tracker, &body_frame, K4A_WAIT_INFINITE);
        if (pop_frame_result == K4A_WAIT_RESULT_SUCCEEDED)
        {
            // Successfully popped the body tracking result. Start processing.
            size_t num_bodies = k4abt_frame_get_num_bodies(body_frame);
            for (size_t i = 0; i < num_bodies; i++)
            {
                k4abt_skeleton_t skeleton;
                k4abt_frame_get_body_skeleton(body_frame, i, &skeleton);

                k4a_float2_t P_HEAD_2D, P_NECK_2D, P_CHEST_2D, P_HIP_2D;
                k4a_float2_t P_CLAVICLE_RIGHT_2D, P_CLAVICLE_LEFT_2D;
                k4a_float2_t P_HIP_RIGHT_2D, P_HIP_LEFT_2D;
                k4a_float2_t P_KNEE_RIGHT_2D, P_KNEE_LEFT_2D;
                int result;

                // Head (nose joint): project 3D -> 2D and draw on the color image
                k4abt_joint_t P_HEAD = skeleton.joints[K4ABT_JOINT_NOSE];
                k4a_calibration_3d_to_2d(&sensor_calibration, &P_HEAD.position, K4A_CALIBRATION_TYPE_DEPTH,
                                         K4A_CALIBRATION_TYPE_COLOR, &P_HEAD_2D, &result);
                cv::circle(cv_rgbImage_no_alpha, cv::Point(P_HEAD_2D.xy.x, P_HEAD_2D.xy.y), 3, cv::Scalar(0, 255, 255));

                // Neck (the original accidentally projected this into P_CHEST_2D)
                k4abt_joint_t P_NECK = skeleton.joints[K4ABT_JOINT_NECK];
                k4a_calibration_3d_to_2d(&sensor_calibration, &P_NECK.position, K4A_CALIBRATION_TYPE_DEPTH,
                                         K4A_CALIBRATION_TYPE_COLOR, &P_NECK_2D, &result);
                cv::circle(cv_rgbImage_no_alpha, cv::Point(P_NECK_2D.xy.x, P_NECK_2D.xy.y), 3, cv::Scalar(0, 255, 255));

                // Chest
                k4abt_joint_t P_CHEST = skeleton.joints[K4ABT_JOINT_SPINE_CHEST];
                k4a_calibration_3d_to_2d(&sensor_calibration, &P_CHEST.position, K4A_CALIBRATION_TYPE_DEPTH,
                                         K4A_CALIBRATION_TYPE_COLOR, &P_CHEST_2D, &result);
                cv::circle(cv_rgbImage_no_alpha, cv::Point(P_CHEST_2D.xy.x, P_CHEST_2D.xy.y), 3, cv::Scalar(0, 255, 255));

                // Pelvis
                k4abt_joint_t P_HIP = skeleton.joints[K4ABT_JOINT_PELVIS];
                k4a_calibration_3d_to_2d(&sensor_calibration, &P_HIP.position, K4A_CALIBRATION_TYPE_DEPTH,
                                         K4A_CALIBRATION_TYPE_COLOR, &P_HIP_2D, &result);
                cv::circle(cv_rgbImage_no_alpha, cv::Point(P_HIP_2D.xy.x, P_HIP_2D.xy.y), 3, cv::Scalar(0, 255, 255));

                // Right clavicle (needed for the upper-body angle)
                k4abt_joint_t P_CLAVICLE_RIGHT = skeleton.joints[K4ABT_JOINT_CLAVICLE_RIGHT];
                k4a_calibration_3d_to_2d(&sensor_calibration, &P_CLAVICLE_RIGHT.position, K4A_CALIBRATION_TYPE_DEPTH,
                                         K4A_CALIBRATION_TYPE_COLOR, &P_CLAVICLE_RIGHT_2D, &result);
                cv::circle(cv_rgbImage_no_alpha, cv::Point(P_CLAVICLE_RIGHT_2D.xy.x, P_CLAVICLE_RIGHT_2D.xy.y), 3, cv::Scalar(0, 255, 255));

                // Right hip: draw it and the right-clavicle-to-right-hip line
                k4abt_joint_t P_HIP_RIGHT = skeleton.joints[K4ABT_JOINT_HIP_RIGHT];
                k4a_calibration_3d_to_2d(&sensor_calibration, &P_HIP_RIGHT.position, K4A_CALIBRATION_TYPE_DEPTH,
                                         K4A_CALIBRATION_TYPE_COLOR, &P_HIP_RIGHT_2D, &result);
                cv::circle(cv_rgbImage_no_alpha, cv::Point(P_HIP_RIGHT_2D.xy.x, P_HIP_RIGHT_2D.xy.y), 3, cv::Scalar(0, 255, 255));
                cv::line(cv_rgbImage_no_alpha, cv::Point(P_CLAVICLE_RIGHT_2D.xy.x, P_CLAVICLE_RIGHT_2D.xy.y),
                         cv::Point(P_HIP_RIGHT_2D.xy.x, P_HIP_RIGHT_2D.xy.y), cv::Scalar(0, 0, 255), 2);

                // Right knee: draw it plus the clavicle-to-knee and hip-to-knee lines
                k4abt_joint_t P_KNEE_RIGHT = skeleton.joints[K4ABT_JOINT_KNEE_RIGHT];
                k4a_calibration_3d_to_2d(&sensor_calibration, &P_KNEE_RIGHT.position, K4A_CALIBRATION_TYPE_DEPTH,
                                         K4A_CALIBRATION_TYPE_COLOR, &P_KNEE_RIGHT_2D, &result);
                cv::circle(cv_rgbImage_no_alpha, cv::Point(P_KNEE_RIGHT_2D.xy.x, P_KNEE_RIGHT_2D.xy.y), 3, cv::Scalar(0, 255, 255));
                cv::line(cv_rgbImage_no_alpha, cv::Point(P_CLAVICLE_RIGHT_2D.xy.x, P_CLAVICLE_RIGHT_2D.xy.y),
                         cv::Point(P_KNEE_RIGHT_2D.xy.x, P_KNEE_RIGHT_2D.xy.y), cv::Scalar(0, 0, 255), 2);
                cv::line(cv_rgbImage_no_alpha, cv::Point(P_HIP_RIGHT_2D.xy.x, P_HIP_RIGHT_2D.xy.y),
                         cv::Point(P_KNEE_RIGHT_2D.xy.x, P_KNEE_RIGHT_2D.xy.y), cv::Scalar(0, 0, 255), 2);

                // Left clavicle
                k4abt_joint_t P_CLAVICLE_LEFT = skeleton.joints[K4ABT_JOINT_CLAVICLE_LEFT];
                k4a_calibration_3d_to_2d(&sensor_calibration, &P_CLAVICLE_LEFT.position, K4A_CALIBRATION_TYPE_DEPTH,
                                         K4A_CALIBRATION_TYPE_COLOR, &P_CLAVICLE_LEFT_2D, &result);
                cv::circle(cv_rgbImage_no_alpha, cv::Point(P_CLAVICLE_LEFT_2D.xy.x, P_CLAVICLE_LEFT_2D.xy.y), 3, cv::Scalar(0, 255, 255));

                // Left hip: draw it and the left-clavicle-to-left-hip line
                k4abt_joint_t P_HIP_LEFT = skeleton.joints[K4ABT_JOINT_HIP_LEFT];
                k4a_calibration_3d_to_2d(&sensor_calibration, &P_HIP_LEFT.position, K4A_CALIBRATION_TYPE_DEPTH,
                                         K4A_CALIBRATION_TYPE_COLOR, &P_HIP_LEFT_2D, &result);
                cv::circle(cv_rgbImage_no_alpha, cv::Point(P_HIP_LEFT_2D.xy.x, P_HIP_LEFT_2D.xy.y), 3, cv::Scalar(0, 255, 255));
                cv::line(cv_rgbImage_no_alpha, cv::Point(P_HIP_LEFT_2D.xy.x, P_HIP_LEFT_2D.xy.y),
                         cv::Point(P_CLAVICLE_LEFT_2D.xy.x, P_CLAVICLE_LEFT_2D.xy.y), cv::Scalar(0, 0, 255), 2);

                // Left knee: draw it plus the clavicle-to-knee and hip-to-knee lines
                k4abt_joint_t P_KNEE_LEFT = skeleton.joints[K4ABT_JOINT_KNEE_LEFT];
                k4a_calibration_3d_to_2d(&sensor_calibration, &P_KNEE_LEFT.position, K4A_CALIBRATION_TYPE_DEPTH,
                                         K4A_CALIBRATION_TYPE_COLOR, &P_KNEE_LEFT_2D, &result);
                cv::circle(cv_rgbImage_no_alpha, cv::Point(P_KNEE_LEFT_2D.xy.x, P_KNEE_LEFT_2D.xy.y), 3, cv::Scalar(0, 255, 255));
                cv::line(cv_rgbImage_no_alpha, cv::Point(P_CLAVICLE_LEFT_2D.xy.x, P_CLAVICLE_LEFT_2D.xy.y),
                         cv::Point(P_KNEE_LEFT_2D.xy.x, P_KNEE_LEFT_2D.xy.y), cv::Scalar(0, 0, 255), 2);
                cv::line(cv_rgbImage_no_alpha, cv::Point(P_HIP_LEFT_2D.xy.x, P_HIP_LEFT_2D.xy.y),
                         cv::Point(P_KNEE_LEFT_2D.xy.x, P_KNEE_LEFT_2D.xy.y), cv::Scalar(0, 0, 255), 2);

                // Print the key joint coordinates (position.v is a float[3], in mm)
                std::cout << "Head:   ";
                for (size_t k = 0; k < 3; k++) std::cout << P_HEAD.position.v[k] << " ";
                std::cout << "\nNeck:   ";
                for (size_t k = 0; k < 3; k++) std::cout << P_NECK.position.v[k] << " ";
                std::cout << "\nChest:  ";
                for (size_t k = 0; k < 3; k++) std::cout << P_CHEST.position.v[k] << " ";
                std::cout << "\nPelvis: ";
                for (size_t k = 0; k < 3; k++) std::cout << P_HIP.position.v[k] << " ";
                std::cout << std::endl;

                // Euclidean distance between two 3D joint positions
                auto dist = [](const k4a_float3_t &a, const k4a_float3_t &b) {
                    return sqrtf(powf(a.xyz.x - b.xyz.x, 2) + powf(a.xyz.y - b.xyz.y, 2) + powf(a.xyz.z - b.xyz.z, 2));
                };

                // Estimate body height from the neck-to-pelvis length
                UPPER_BODY = dist(P_NECK.position, P_HIP.position);
                float HEIGHT = UPPER_BODY * 1770 / 518; // empirical scale: 518 mm upper body ~ 1770 mm height
                std::cout << "Estimated height: " << HEIGHT << std::endl;

                // Estimate the upper-body inclination: the angle at the hip, from the
                // law of cosines over the clavicle-hip-knee triangle on each side
                float len_clav_hip_r  = dist(P_CLAVICLE_RIGHT.position, P_HIP_RIGHT.position);
                float len_clav_hip_l  = dist(P_CLAVICLE_LEFT.position,  P_HIP_LEFT.position);
                float len_hip_knee_r  = dist(P_HIP_RIGHT.position,      P_KNEE_RIGHT.position);
                float len_hip_knee_l  = dist(P_HIP_LEFT.position,       P_KNEE_LEFT.position);
                float len_clav_knee_r = dist(P_CLAVICLE_RIGHT.position, P_KNEE_RIGHT.position);
                float len_clav_knee_l = dist(P_CLAVICLE_LEFT.position,  P_KNEE_LEFT.position);
                float body_angle_right = acos((len_clav_hip_r * len_clav_hip_r + len_hip_knee_r * len_hip_knee_r -
                                               len_clav_knee_r * len_clav_knee_r) /
                                              (2 * len_clav_hip_r * len_hip_knee_r)) * 180.0 / 3.1415926;
                float body_angle_left = acos((len_clav_hip_l * len_clav_hip_l + len_hip_knee_l * len_hip_knee_l -
                                              len_clav_knee_l * len_clav_knee_l) /
                                             (2 * len_clav_hip_l * len_hip_knee_l)) * 180.0 / 3.1415926;
                body_angle = (body_angle_right + body_angle_left) / 2;
                std::cout << "Estimated upper-body inclination: " << body_angle << std::endl;

                uint32_t id = k4abt_frame_get_body_id(body_frame, i);
                (void)id; // available if you need to track identities across frames
            }
            printf("%zu bodies are detected!\n", num_bodies);
            k4abt_frame_release(body_frame); // release the body frame once you finish using it
        }
        else if (pop_frame_result == K4A_WAIT_RESULT_TIMEOUT)
        {
            // It should never hit timeout when K4A_WAIT_INFINITE is set.
            printf("Error! Pop body frame result timeout!\n");
            break;
        }
        else
        {
            printf("Pop body frame result failed!\n");
            break;
        }

        // Show the images
        cv::imshow("color", cv_rgbImage_no_alpha);
        cv::imshow("depth", cv_depth_8U);
        cv::waitKey(1); // the original used waitKey(0), which blocks on every frame
        k4a_image_release(rgbImage);
        k4a_image_release(depthImage); // the original leaked the depth image
    }

    printf("Finished body tracking processing!\n");
    k4abt_tracker_shutdown(tracker);
    k4abt_tracker_destroy(tracker);
    k4a_device_stop_cameras(device);
    k4a_device_close(device);
    return 0;
}
