The step-by-step learning process of the past month is documented in my earlier posts. Now I have finally reached the last step.

Without further ado, here is the code. If anything is unclear, read my earlier posts first; they walk through the whole process step by step. If it is still unclear, ask in the comments.

#ifndef _DYNAMIC_RTSP_SERVER_HH
#define _DYNAMIC_RTSP_SERVER_HH

#ifndef _RTSP_SERVER_SUPPORTING_HTTP_STREAMING_HH
#include <RTSPServerSupportingHTTPStreaming.hh>
#endif

class DynamicRTSPServer: public RTSPServerSupportingHTTPStreaming {
public:
  static DynamicRTSPServer* createNew(UsageEnvironment& env, Port ourPort,
                                      UserAuthenticationDatabase* authDatabase,
                                      unsigned reclamationTestSeconds = 65);

protected:
  DynamicRTSPServer(UsageEnvironment& env, int ourSocket, Port ourPort,
                    UserAuthenticationDatabase* authDatabase, unsigned reclamationTestSeconds);
  // called only by createNew();
  virtual ~DynamicRTSPServer();

protected: // redefined virtual functions
  virtual ServerMediaSession*
  lookupServerMediaSession(char const* streamName, Boolean isFirstLookupInSession);
};

#endif

DynamicRTSPServer.hh

#include "DynamicRTSPServer.hh"
#include "H264LiveVideoServerMediaSubssion.hh"
#include <liveMedia.hh>
#include <string.h>DynamicRTSPServer* DynamicRTSPServer::createNew(UsageEnvironment& env, Port ourPort,UserAuthenticationDatabase* authDatabase,unsigned reclamationTestSeconds)
{int ourSocket = setUpOurSocket(env, ourPort);if (ourSocket == -1) return NULL;return new DynamicRTSPServer(env, ourSocket, ourPort, authDatabase, reclamationTestSeconds);
}DynamicRTSPServer::DynamicRTSPServer(UsageEnvironment& env, int ourSocket, Port ourPort,UserAuthenticationDatabase* authDatabase, unsigned reclamationTestSeconds): RTSPServerSupportingHTTPStreaming(env, ourSocket, ourPort, authDatabase, reclamationTestSeconds) {}DynamicRTSPServer::~DynamicRTSPServer() {}static ServerMediaSession* createNewSMS(UsageEnvironment& env, char const* fileName/*, FILE* fid*/); // forward
ServerMediaSession* DynamicRTSPServer::lookupServerMediaSession(char const* streamName, Boolean isFirstLookupInSession)
{// Next, check whether we already have a "ServerMediaSession" for this file:ServerMediaSession* sms = RTSPServer::lookupServerMediaSession(streamName);Boolean smsExists = sms != NULL;// Handle the four possibilities for "fileExists" and "smsExists":if (smsExists && isFirstLookupInSession){ // Remove the existing "ServerMediaSession" and create a new one, in case the underlying// file has changed in some way:
      removeServerMediaSession(sms); sms = NULL;} if (sms == NULL) {sms = createNewSMS(envir(), streamName/*, fid*/); addServerMediaSession(sms);}return sms;
}static ServerMediaSession* createNewSMS(UsageEnvironment& env, char const* fileName/*, FILE* fid*/)
{// Use the file name extension to determine the type of "ServerMediaSession":char const* extension = strrchr(fileName, '.');if (extension == NULL) return NULL;ServerMediaSession* sms = NULL;Boolean const reuseSource = False;if (strcmp(extension, ".264") == 0) {// Assumed to be a H.264 Video Elementary Stream file:char const* descStr = "H.264 Video, streamed by the LIVE555 Media Server"; sms = ServerMediaSession::createNew(env, fileName, fileName, descStr);OutPacketBuffer::maxSize = 100000; // allow for some possibly large H.264 framessms->addSubsession(H264LiveVideoServerMediaSubssion::createNew(env, fileName, reuseSource));}return sms;
}

DynamicRTSPServer.cpp

#include <BasicUsageEnvironment.hh>
#include "DynamicRTSPServer.hh"
#include "H264FramedLiveSource.hh"
#include <opencv/highgui.h>

// "version"
#ifndef _MEDIA_SERVER_VERSION_HH
#define _MEDIA_SERVER_VERSION_HH
#define MEDIA_SERVER_VERSION_STRING "0.85"
#endif

Cameras Camera;

int main(int argc, char** argv) {
  // Begin by setting up our usage environment:
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  UserAuthenticationDatabase* authDB = NULL;
#ifdef ACCESS_CONTROL
  // To implement client access control to the RTSP server, do the following:
  authDB = new UserAuthenticationDatabase;
  authDB->addUserRecord("username1", "password1"); // replace these with real strings
  // Repeat the above with each <username>, <password> that you wish to allow
  // access to the server.
#endif

  // Create the RTSP server.  Try first with the default port number (554),
  // and then with the alternative port number (8554):
  RTSPServer* rtspServer;
  portNumBits rtspServerPortNum = 554;
  Camera.Init();
  rtspServer = DynamicRTSPServer::createNew(*env, rtspServerPortNum, authDB);
  if (rtspServer == NULL) {
    rtspServerPortNum = 8554;
    rtspServer = DynamicRTSPServer::createNew(*env, rtspServerPortNum, authDB);
  }
  if (rtspServer == NULL) {
    *env << "Failed to create RTSP server: " << env->getResultMsg() << "\n";
    exit(1);
  }

  *env << "LIVE555 Media Server\n";
  *env << "\tversion " << MEDIA_SERVER_VERSION_STRING
       << " (LIVE555 Streaming Media library version "
       << LIVEMEDIA_LIBRARY_VERSION_STRING << ").\n";

  char* urlPrefix = rtspServer->rtspURLPrefix();
  *env << "Play streams from this server using the URL\n\t"
       << urlPrefix << "<filename>\nwhere <filename> is a file present in the current directory.\n";
  *env << "Each file's type is inferred from its name suffix:\n";
  *env << "\t\".264\" => a H.264 Video Elementary Stream file\n";

  // Also, attempt to create a HTTP server for RTSP-over-HTTP tunneling.
  // Try first with the default HTTP port (80), and then with the alternative HTTP
  // port numbers (8000 and 8080).
  if (rtspServer->setUpTunnelingOverHTTP(80) || rtspServer->setUpTunnelingOverHTTP(8000) || rtspServer->setUpTunnelingOverHTTP(8080)) {
    *env << "(We use port " << rtspServer->httpServerPortNum() << " for optional RTSP-over-HTTP tunneling, or for HTTP live streaming (for indexed Transport Stream files only).)\n";
  } else {
    *env << "(RTSP-over-HTTP tunneling is not available.)\n";
  }

  env->taskScheduler().doEventLoop(); // does not return

  Camera.Destory();
  return 0; // only to prevent compiler warning
}

live555MediaServer.cpp

/*
*  H264LiveVideoServerMediaSubssion.hh
*/
#ifndef _H264_LIVE_VIDEO_SERVER_MEDIA_SUBSESSION_HH
#define _H264_LIVE_VIDEO_SERVER_MEDIA_SUBSESSION_HH
#include <liveMedia/H264VideoFileServerMediaSubsession.hh>
#include <UsageEnvironment/UsageEnvironment.hh>

class H264LiveVideoServerMediaSubssion : public H264VideoFileServerMediaSubsession {
public:
  static H264LiveVideoServerMediaSubssion*
  createNew(UsageEnvironment& env, char const* fileName, Boolean reuseFirstSource);

protected:
  H264LiveVideoServerMediaSubssion(UsageEnvironment& env, char const* fileName, Boolean reuseFirstSource);
  ~H264LiveVideoServerMediaSubssion();

protected:
  FramedSource* createNewStreamSource(unsigned clientSessionId, unsigned& estBitrate);

public:
  char fFileName[100];
};

#endif

H264LiveVideoServerMediaSubssion.hh

/*
*  H264LiveVideoServerMediaSubssion.cpp
*/#include "H264LiveVideoServerMediaSubssion.hh"
#include "H264FramedLiveSource.hh"
#include <H264VideoStreamFramer.hh>
#include <string.h>H264LiveVideoServerMediaSubssion* H264LiveVideoServerMediaSubssion::createNew (UsageEnvironment& env, char const* fileName, Boolean reuseFirstSource)
{return new H264LiveVideoServerMediaSubssion(env, fileName, reuseFirstSource);
}H264LiveVideoServerMediaSubssion::H264LiveVideoServerMediaSubssion(UsageEnvironment& env, char const* fileName, Boolean reuseFirstSource)
: H264VideoFileServerMediaSubsession(env, fileName, reuseFirstSource)
{strcpy(fFileName, fileName);
}H264LiveVideoServerMediaSubssion::~H264LiveVideoServerMediaSubssion()
{
}FramedSource* H264LiveVideoServerMediaSubssion::createNewStreamSource(unsigned clientSessionId, unsigned& estBitrate)
{estBitrate = 1000; // kbps
H264FramedLiveSource* liveSource = H264FramedLiveSource::createNew(envir(), fFileName);if (liveSource == NULL){return NULL;}return H264VideoStreamFramer::createNew(envir(), liveSource);
}

H264LiveVideoServerMediaSubssion.cpp

/*
* H264FramedLiveSource.hh
*/
#ifndef _H264FRAMEDLIVESOURCE_HH
#define _H264FRAMEDLIVESOURCE_HH

#include <FramedSource.hh>
#include <UsageEnvironment.hh>
#include <opencv/highgui.h>

extern "C"
{
#include "encoder.h"
}

class H264FramedLiveSource : public FramedSource
{
public:
  static H264FramedLiveSource* createNew(UsageEnvironment& env, char const* fileName,
                                         unsigned preferredFrameSize = 0, unsigned playTimePerFrame = 0);
  x264_nal_t* my_nal;

protected:
  H264FramedLiveSource(UsageEnvironment& env, char const* fileName,
                       unsigned preferredFrameSize, unsigned playTimePerFrame); // called only by createNew()
  ~H264FramedLiveSource();

private:
  // redefined virtual functions:
  virtual void doGetNextFrame();
  int TransportData(unsigned char* to, unsigned maxSize);
  //static int nalIndex;

protected:
  FILE* fp;
};

class Cameras
{
public:
  void Init();
  void GetNextFrame();
  void Destory();

public:
  CvCapture* cap;
  my_x264_encoder* encoder;
  int n_nal;
  x264_picture_t pic_out;
  IplImage* img;
  unsigned char* RGB1;
};

#endif

H264FramedLiveSource.hh

/*
*  H264FramedLiveSource.cpp
*/#include "H264FramedLiveSource.hh"
#include <stdio.h>
#include <stdint.h>
#include <unistd.h>
#include <fcntl.h>
#include <stdlib.h>
#include <string.h>
extern class Cameras Camera; //in mainRTSPServer.cpp#define WIDTH 320
#define HEIGHT 240
#define widthStep 960
#define ENCODER_TUNE   "zerolatency"
#define ENCODER_PROFILE  "baseline"
#define ENCODER_PRESET "veryfast"
#define ENCODER_COLORSPACE X264_CSP_I420
#define CLEAR(x) (memset((&x),0,sizeof(x)))void Convert(unsigned char *RGB, unsigned char *YUV, unsigned int width, unsigned int height);H264FramedLiveSource::H264FramedLiveSource(UsageEnvironment& env, char const* fileName, unsigned preferredFrameSize, unsigned playTimePerFrame) : FramedSource(env)
{//fp = fopen(fileName, "rb");
}H264FramedLiveSource* H264FramedLiveSource::createNew(UsageEnvironment& env, char const* fileName, unsigned preferredFrameSize /*= 0*/, unsigned playTimePerFrame /*= 0*/)
{H264FramedLiveSource* newSource = new H264FramedLiveSource(env, fileName, preferredFrameSize, playTimePerFrame);return newSource;
}H264FramedLiveSource::~H264FramedLiveSource()
{//fclose(fp);
}void H264FramedLiveSource::doGetNextFrame()
{fFrameSize = 0;//不知道为什么,多几帧一起发送效果会好一点点,也许是心理作怪for(int i = 0; i < 2; i++){Camera.GetNextFrame();for (my_nal = Camera.encoder->nal; my_nal < Camera.encoder->nal + Camera.n_nal; ++my_nal){memmove((unsigned char*)fTo + fFrameSize, my_nal->p_payload, my_nal->i_payload);fFrameSize += my_nal->i_payload;}}nextTask() = envir().taskScheduler().scheduleDelayedTask(0,(TaskFunc*)FramedSource::afterGetting, this);//表示延迟0秒后再执行 afterGetting 函数return;
}void Cameras::Init()
{int ret;//打开第一个摄像头cap = cvCreateCameraCapture(0);if (!cap){fprintf(stderr, "Can not open camera1.\n");exit(-1);}cvSetCaptureProperty(cap, CV_CAP_PROP_FRAME_WIDTH, WIDTH);cvSetCaptureProperty(cap, CV_CAP_PROP_FRAME_HEIGHT, HEIGHT);encoder = (my_x264_encoder *)malloc(sizeof(my_x264_encoder));if (!encoder){printf("cannot malloc my_x264_encoder !\n");exit(EXIT_FAILURE);}CLEAR(*encoder);strcpy(encoder->parameter_preset, ENCODER_PRESET);strcpy(encoder->parameter_tune, ENCODER_TUNE);encoder->x264_parameter = (x264_param_t *)malloc(sizeof(x264_param_t));if (!encoder->x264_parameter){printf("malloc x264_parameter error!\n");exit(EXIT_FAILURE);}/*初始化编码器*/CLEAR(*(encoder->x264_parameter));x264_param_default(encoder->x264_parameter);if ((ret = x264_param_default_preset(encoder->x264_parameter, encoder->parameter_preset, encoder->parameter_tune))<0){printf("x264_param_default_preset error!\n");exit(EXIT_FAILURE);}/*cpuFlags 去空缓冲区继续使用不死锁保证*/encoder->x264_parameter->i_threads = X264_SYNC_LOOKAHEAD_AUTO;/*视频选项*/encoder->x264_parameter->i_width = WIDTH;//要编码的图像的宽度encoder->x264_parameter->i_height = HEIGHT;//要编码的图像的高度encoder->x264_parameter->i_frame_total = 0;//要编码的总帧数,不知道用0encoder->x264_parameter->i_keyint_max = 25;/*流参数*/encoder->x264_parameter->i_bframe = 5;encoder->x264_parameter->b_open_gop = 0;encoder->x264_parameter->i_bframe_pyramid = 0;encoder->x264_parameter->i_bframe_adaptive = X264_B_ADAPT_TRELLIS;/*log参数,不需要打印编码信息时直接注释掉*/
//    encoder->x264_parameter->i_log_level = X264_LOG_DEBUG;
encoder->x264_parameter->i_fps_num = 25;//码率分子encoder->x264_parameter->i_fps_den = 1;//码率分母
encoder->x264_parameter->b_intra_refresh = 1;encoder->x264_parameter->b_annexb = 1;/////
strcpy(encoder->parameter_profile, ENCODER_PROFILE);if ((ret = x264_param_apply_profile(encoder->x264_parameter, encoder->parameter_profile))<0){printf("x264_param_apply_profile error!\n");exit(EXIT_FAILURE);}/*打开编码器*/encoder->x264_encoder = x264_encoder_open(encoder->x264_parameter);encoder->colorspace = ENCODER_COLORSPACE;/*初始化pic*/encoder->yuv420p_picture = (x264_picture_t *)malloc(sizeof(x264_picture_t));if (!encoder->yuv420p_picture){printf("malloc encoder->yuv420p_picture error!\n");exit(EXIT_FAILURE);}if ((ret = x264_picture_alloc(encoder->yuv420p_picture, encoder->colorspace, WIDTH, HEIGHT))<0){printf("ret=%d\n", ret);printf("x264_picture_alloc error!\n");exit(EXIT_FAILURE);}encoder->yuv420p_picture->img.i_csp = encoder->colorspace;encoder->yuv420p_picture->img.i_plane = 3;encoder->yuv420p_picture->i_type = X264_TYPE_AUTO;/*申请YUV buffer*/encoder->yuv = (uint8_t *)malloc(WIDTH*HEIGHT * 3 / 2);if (!encoder->yuv){printf("malloc yuv error!\n");exit(EXIT_FAILURE);}CLEAR(*(encoder->yuv));encoder->yuv420p_picture->img.plane[0] = encoder->yuv;encoder->yuv420p_picture->img.plane[1] = encoder->yuv + WIDTH*HEIGHT;encoder->yuv420p_picture->img.plane[2] = encoder->yuv + WIDTH*HEIGHT + WIDTH*HEIGHT / 4;n_nal = 0;encoder->nal = (x264_nal_t *)calloc(2, sizeof(x264_nal_t));if (!encoder->nal){printf("malloc x264_nal_t error!\n");exit(EXIT_FAILURE);}CLEAR(*(encoder->nal));RGB1 = (unsigned char *)malloc(HEIGHT * WIDTH * 3);}
void Cameras::GetNextFrame()
{img = cvQueryFrame(cap);for (int i = 0; i< HEIGHT; i++){for (int j = 0; j< WIDTH; j++)            {RGB1[(i*WIDTH + j) * 3] = img->imageData[i * widthStep + j * 3 + 2];;RGB1[(i*WIDTH + j) * 3 + 1] = img->imageData[i * widthStep + j * 3 + 1];                RGB1[(i*WIDTH + j) * 3 + 2] = img->imageData[i * widthStep + j * 3];}}Convert(RGB1, encoder->yuv, WIDTH, HEIGHT);encoder->yuv420p_picture->i_pts++;
//printf("!!!!!\n");if ( x264_encoder_encode(encoder->x264_encoder, &encoder->nal, &n_nal, encoder->yuv420p_picture, &pic_out) < 0){printf("x264_encoder_encode error!\n");exit(EXIT_FAILURE);}
//printf("@@@@@@\n");/*for (my_nal = encoder->nal; my_nal < encoder->nal + n_nal; ++my_nal){write(fd_write, my_nal->p_payload, my_nal->i_payload);}*/
}
void Cameras::Destory()
{free(RGB1);cvReleaseCapture(&cap);free(encoder->yuv);free(encoder->yuv420p_picture);free(encoder->x264_parameter);x264_encoder_close(encoder->x264_encoder);free(encoder);
}

H264FramedLiveSource.cpp

#include <x264.h>

typedef struct my_x264_encoder {
  x264_param_t*    x264_parameter;
  char             parameter_preset[20];
  char             parameter_tune[20];
  char             parameter_profile[20];
  x264_t*          x264_encoder;
  x264_picture_t*  yuv420p_picture;
  long             colorspace;
  unsigned char*   yuv;
  x264_nal_t*      nal;
} my_x264_encoder;

encoder.h

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <iostream>
// Conversion matrix (RGB -> YUV)
#define MY(a,b,c) (( a*  0.2989  + b*  0.5866  + c*  0.1145))
#define MU(a,b,c) (( a*(-0.1688) + b*(-0.3312) + c*  0.5000 + 128))
#define MV(a,b,c) (( a*  0.5000  + b*(-0.4184) + c*(-0.0816) + 128))
//#define MY(a,b,c) (( a*  0.257 + b*  0.504  + c*  0.098+16))
//#define MU(a,b,c) (( a*( -0.148) + b*(- 0.291) + c* 0.439 + 128))
//#define MV(a,b,c) (( a*  0.439  + b*(- 0.368) + c*( - 0.071) + 128))

// Clamp the results to the 0..255 range
#define DY(a,b,c) (MY(a,b,c) > 255 ? 255 : (MY(a,b,c) < 0 ? 0 : MY(a,b,c)))
#define DU(a,b,c) (MU(a,b,c) > 255 ? 255 : (MU(a,b,c) < 0 ? 0 : MU(a,b,c)))
#define DV(a,b,c) (MV(a,b,c) > 255 ? 255 : (MV(a,b,c) < 0 ? 0 : MV(a,b,c)))
#define CLIP(a) ((a) > 255 ? 255 : ((a) < 0 ? 0 : (a)))

// RGB to YUV (I420): full-resolution Y plane, U and V subsampled 2x2
void Convert(unsigned char *RGB, unsigned char *YUV, unsigned int width, unsigned int height)
{
  unsigned int i, x, y, j;
  unsigned char *Y = NULL;
  unsigned char *U = NULL;
  unsigned char *V = NULL;

  Y = YUV;
  U = YUV + width*height;
  V = U + ((width*height) >> 2);

  for (y = 0; y < height; y++)
    for (x = 0; x < width; x++) {
      j = y*width + x;
      i = j * 3;
      Y[j] = (unsigned char)(DY(RGB[i], RGB[i + 1], RGB[i + 2]));

      if (x % 2 == 1 && y % 2 == 1) {
        j = (width >> 1) * (y >> 1) + (x >> 1);
        // The i computed above is still valid; average the 2x2 block of pixels:
        U[j] = (unsigned char)((DU(RGB[i], RGB[i + 1], RGB[i + 2]) +
                                DU(RGB[i - 3], RGB[i - 2], RGB[i - 1]) +
                                DU(RGB[i - width * 3], RGB[i + 1 - width * 3], RGB[i + 2 - width * 3]) +
                                DU(RGB[i - 3 - width * 3], RGB[i - 2 - width * 3], RGB[i - 1 - width * 3])) / 4);
        V[j] = (unsigned char)((DV(RGB[i], RGB[i + 1], RGB[i + 2]) +
                                DV(RGB[i - 3], RGB[i - 2], RGB[i - 1]) +
                                DV(RGB[i - width * 3], RGB[i + 1 - width * 3], RGB[i + 2 - width * 3]) +
                                DV(RGB[i - 3 - width * 3], RGB[i - 2 - width * 3], RGB[i - 1 - width * 3])) / 4);
      }
    }
}

RGB2YUV.cpp

That is all of the code. Some of it is modified live555 source, some is adapted from other people's code, and there may still be a few unused variables or redundant steps; just ignore them, they do not affect compilation.

g++ -c *.cpp -I /usr/local/include/groupsock  -I /usr/local/include/UsageEnvironment -I /usr/local/include/liveMedia -I /usr/local/include/BasicUsageEnvironment -I .

g++  *.o /usr/local/lib/libliveMedia.a /usr/local/lib/libgroupsock.a /usr/local/lib/libBasicUsageEnvironment.a /usr/local/lib/libUsageEnvironment.a /usr/local/lib/libx264.so /usr/local/lib/libopencv_highgui.so /usr/local/lib/libopencv_videoio.so /usr/lib/x86_64-linux-gnu/libx264.so.142 -ldl  -lm -lpthread -ldl -g

These are my build commands. Because I accidentally installed two versions of libx264, both libraries are linked.

Switch to root (binding the default RTSP port 554 requires privileges; otherwise the server falls back to port 8554) and run ./a.out.

Besides VLC, the MX Player phone app works even better; I always watch the stream on my phone: connect to the same Wi-Fi and enter the stream address the server prints at startup, in the form rtsp://&lt;server-ip&gt;:&lt;port&gt;/&lt;name&gt;.264 (the name only has to end in .264, because createNewSMS() picks the session type from that suffix).

The frame rate configured in the code is 25 fps; I cannot tell exactly what the actual rate is, but playback is quite smooth. The resolution is 320x240 (you can adjust it yourself), and the latency is around one second or less.
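The source above pushes encoded NAL units out as fast as the camera delivers them and leaves the timing fields of FramedSource untouched. If you want to experiment with smoother pacing, live555's FramedSource also exposes fPresentationTime, fDurationInMicroseconds, fMaxSize and fNumTruncatedBytes. Below is only a sketch of an alternative doGetNextFrame() body, assuming the same Cameras object and members as above and a fixed 25 fps target; it is not part of the original code.

// Sketch: an alternative H264FramedLiveSource::doGetNextFrame() that stamps each
// delivery and respects fMaxSize (assumes the classes above; needs <sys/time.h>).
void H264FramedLiveSource::doGetNextFrame()
{
  fFrameSize = 0;
  Camera.GetNextFrame();
  for (my_nal = Camera.encoder->nal; my_nal < Camera.encoder->nal + Camera.n_nal; ++my_nal) {
    if (fFrameSize + my_nal->i_payload > fMaxSize) {
      // Never overrun the buffer live555 gave us; report the truncation instead.
      fNumTruncatedBytes += my_nal->i_payload;
      continue;
    }
    memmove((unsigned char*)fTo + fFrameSize, my_nal->p_payload, my_nal->i_payload);
    fFrameSize += my_nal->i_payload;
  }

  gettimeofday(&fPresentationTime, NULL);   // wall-clock timestamp for this delivery
  fDurationInMicroseconds = 1000000 / 25;   // assumed 25 fps target, matching i_fps_num

  // Hand the data back to the framer, as in the original code:
  nextTask() = envir().taskScheduler().scheduleDelayedTask(
      0, (TaskFunc*)FramedSource::afterGetting, this);
}

Whether this changes the observed latency also depends on how much buffering the player adds on its own side.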

I have read many papers on video surveillance and live streaming; the authors either do not set up a server at all, or simply transmit JPEG images, or only reach 6-7 fps, none of which really meets the requirements.

Thanks for reading! If you have any suggestions or questions, please raise them.

Once testing shows no problems, the remaining step is to port the code to your development board; do that according to your own board, the process is not very complicated.

Reposted from: https://www.cnblogs.com/chaingank/p/4702554.html
