1. Linux environment
Download from the official site: http://www.live555.com/liveMedia/public/
live555 version: "2018.12.14"
Reference: http://www.live555.com/liveMedia/faq.html — read this FAQ carefully.
2. Build
Configure for the target platform and generate the corresponding Makefile.
2.1 ARM platform:
Set the cross-compile toolchain:
cp config.armlinux config.arm
vi config.arm
CROSS_COMPILE?= arm-buildroot-linux-uclibcgnueabi-
Generate the Makefile: ./genMakefiles arm
2.2 Linux 64-bit platform (x86-64):
./genMakefiles linux-64bit
2.3 Linux 32-bit platform (x86):
./genMakefiles linux
make

The build produces mediaServer/live555MediaServer.

3. Testing
3.1 The build generates live555MediaServer under mediaServer/. Run it with a stream file:
live555MediaServer test.264
Play the stream with a client such as openRTSP or VLC at rtsp://<server-ip>/test.264 (the server tries port 554 first, then falls back to 8554).
If the following appears:
Correct this by increasing "OutPacketBuffer::maxSize" to at least 186818, before creating this 'RTPSink'. (Current value is 100000.)
modify OutPacketBuffer::maxSize in ServerMediaSession* createNewSMS() in DynamicRTSPServer.cpp:

if (strcmp(extension, ".264") == 0) {
    // Assumed to be a H.264 Video Elementary Stream file:
    NEW_SMS("H.264 Video");
    OutPacketBuffer::maxSize = 300000; // was 100000; allow for some possibly large H.264 frames
    sms->addSubsession(H264VideoFileServerMediaSubsession::createNew(env, fileName, reuseSource));
}

createNewSMS() is called when a client first requests the stream (at RTSP setup time).
3.2 testProgs
The testProgs directory contains various test programs; the purpose and usage of each is described in detail on the official site. These test programs essentially all use files as the input source. Below are two ways to use a live stream as the input source instead,
based mainly on modifying testH264VideoStreamer and testOnDemandRTSPServer.

4. Using a live video stream as the input source instead of reading a file

The simplest approach: push the live stream into a FIFO pipe (or stdin) and pass the pipe's name to the server as the file name. That approach is not covered in detail here, beyond the short sketch below.
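A minimal sketch of the FIFO approach, assuming an Annex-B H.264 encoder whose output is fetched through a hypothetical getEncodedFrame() helper; create the pipe in live555MediaServer's working directory, start the server, then run this feeder:

#include <sys/stat.h>
#include <fcntl.h>
#include <unistd.h>

extern int getEncodedFrame(unsigned char* buf, int maxLen); // hypothetical capture/encode API

int main() {
    mkfifo("test.264", 0666);            // the "file" the server will stream
    int fd = open("test.264", O_WRONLY); // blocks until the server opens it for reading
    for (;;) {
        unsigned char buf[256 * 1024];
        int len = getEncodedFrame(buf, sizeof buf); // one encoded frame (hypothetical)
        if (len <= 0) break;
        write(fd, buf, len);             // feed the raw elementary stream into the FIFO
    }
    close(fd);
    return 0;
}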

4.1 Method 1: based on testH264VideoStreamer

Following the template in "liveMedia/DeviceSource.cpp", define an H264LiveVideoSource class (derived, like DeviceSource itself, from FramedSource) and fill in its members.

void play() {
    // Open the input file as a 'byte-stream file source':
    ByteStreamFileSource* fileSource = ByteStreamFileSource::createNew(*env, inputFileName);
}

Here, H264LiveVideoSource replaces ByteStreamFileSource.

The H264LiveVideoSource class itself is given in full further below.
Modify main() in testH264VideoStreamer.cpp:

ServerMediaSession* sms = ServerMediaSession::createNew(*env, "testStream", NULL,
    "Session streamed by \"testH264VideoStreamer\"", True /*SSM*/);
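For context, the surrounding multicast setup in main() stays as in the stock test program; an abridged sketch (from the 2018-era testH264VideoStreamer, so details may vary slightly between versions):

// The SSM multicast groupsock and RTP sink are created as usual; only the
// session name above and the source opened in play() change:
struct in_addr destinationAddress;
destinationAddress.s_addr = chooseRandomIPv4SSMAddress(*env);
const Port rtpPort(18888);
const Port rtcpPort(18889); // an RTCPInstance is also created on this port in the full program
Groupsock rtpGroupsock(*env, destinationAddress, rtpPort, 255 /*ttl*/);
rtpGroupsock.multicastSendOnly(); // we're a SSM source
videoSink = H264VideoRTPSink::createNew(*env, &rtpGroupsock, 96);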

Modify the play() function as follows (the #if 1 branch swaps in the live source):

void play() {
    // Open the input source. With #if 1 the live video source replaces the
    // original 'byte-stream file source':
#if 1
    H264LiveVideoSource* fileSource = new H264LiveVideoSource(*env);
    if (fileSource == NULL) {
        *env << "Unable to open file \"" << inputFileName
             << "\" as a byte-stream file source\n";
        exit(1);
    }
#else
    ByteStreamFileSource* fileSource = ByteStreamFileSource::createNew(*env, inputFileName);
    if (fileSource == NULL) {
        *env << "Unable to open file \"" << inputFileName
             << "\" as a byte-stream file source\n";
        exit(1);
    }
#endif
    FramedSource* videoES = fileSource;

    // Create a framer for the Video Elementary Stream:
    videoSource = H264VideoStreamFramer::createNew(*env, videoES);

    // Finally, start playing:
    *env << "Beginning to read from file...\n";
    videoSink->startPlaying(*videoSource, afterPlaying, videoSink);
}

4.2 Method 2: based on testOnDemandRTSPServer
1) Set the variable "reuseFirstSource" to "True".
2) Modeled on the class H264VideoFileServerMediaSubsession, create a new class H264LiveVideoServerMediaSubsession implementing the two pure virtual functions createNewStreamSource() and createNewRTPSink().
In createNewStreamSource(), use the H264LiveVideoSource described above in place of ByteStreamFileSource.

H264VideoRTPSink inheritance chain:
H264VideoRTPSink -> H264or5VideoRTPSink -> VideoRTPSink -> MultiFramedRTPSink -> RTPSink -> MediaSink -> Medium
H264VideoRTPSource inheritance chain:
H264VideoRTPSource -> MultiFramedRTPSource -> RTPSource -> FramedSource -> MediaSource -> Medium
H264VideoStreamFramer inheritance chain:
H264VideoStreamFramer -> H264or5VideoStreamFramer -> MPEGVideoStreamFramer -> FramedFilter -> FramedSource -> MediaSource -> Medium

The concrete implementation is listed below. One subtlety worth knowing up front: getAuxSDPLine() cannot return the SPS/PPS-derived "a=fmtp:" SDP line until the framer has actually parsed those NAL units from the live stream, so the implementation plays into a dummy sink and polls until the line becomes available.

H264LiveVideoServerMediaSubsession.hh

#ifndef _H264_LIVE_VIDEO_SERVER_MEDIA_SUBSESSION_HH
#define _H264_LIVE_VIDEO_SERVER_MEDIA_SUBSESSION_HH

#include "OnDemandServerMediaSubsession.hh"
#include "liveMedia.hh"
#include "UsageEnvironment.hh"
#include "GroupsockHelper.hh"

class H264LiveVideoServerMediaSubsession: public OnDemandServerMediaSubsession
{
public:
    H264LiveVideoServerMediaSubsession(UsageEnvironment& env, Boolean reuseFirstSource);
    ~H264LiveVideoServerMediaSubsession();
    static H264LiveVideoServerMediaSubsession* createNew(UsageEnvironment& env, Boolean reuseFirstSource);

public: // redefined virtual functions
    // "estBitrate" is the stream's estimated bitrate, in kbps:
    virtual FramedSource* createNewStreamSource(unsigned clientSessionId, unsigned& estBitrate);
    virtual RTPSink* createNewRTPSink(Groupsock* rtpGroupsock,
                                      unsigned char rtpPayloadTypeIfDynamic,
                                      FramedSource* inputSource);
    virtual char const* getAuxSDPLine(RTPSink* rtpSink, FramedSource* inputSource);

    static void afterPlayingDummy(void* ptr);
    static void chkForAuxSDPLine(void* ptr);
    void chkForAuxSDPLine1();

private:
    FramedSource* m_pSource;
    char* m_pSDPLine;
    RTPSink* m_pDummyRTPSink;
    char m_done;
};

#endif

H264LiveVideoServerMediaSubsession.cpp

#include "H264LiveVideoServerMediaSubsession.hh"

#define FRAME_PER_SEC 25 // assumed source frame rate (matches the 25 fps used elsewhere)

H264LiveVideoServerMediaSubsession::H264LiveVideoServerMediaSubsession(UsageEnvironment& env, Boolean reuseFirstSource)
    : OnDemandServerMediaSubsession(env, reuseFirstSource)
{
    m_pSource = NULL;
    m_pSDPLine = NULL;
    m_pDummyRTPSink = NULL;
    m_done = 0;
}

H264LiveVideoServerMediaSubsession::~H264LiveVideoServerMediaSubsession()
{
    if (m_pSDPLine) {
        free(m_pSDPLine);
    }
}

H264LiveVideoServerMediaSubsession* H264LiveVideoServerMediaSubsession::createNew(UsageEnvironment& env, Boolean reuseFirstSource)
{
    return new H264LiveVideoServerMediaSubsession(env, reuseFirstSource);
}

FramedSource* H264LiveVideoServerMediaSubsession::createNewStreamSource(unsigned clientSessionId, unsigned& estBitrate)
{
    estBitrate = 500; // kbps, estimate
    // Wrap the live source in a framer that parses NAL units:
    return H264VideoStreamFramer::createNew(envir(), new H264LiveVideoSource(envir()));
}

RTPSink* H264LiveVideoServerMediaSubsession::createNewRTPSink(Groupsock* rtpGroupsock, unsigned char rtpPayloadTypeIfDynamic, FramedSource* inputSource)
{
    return H264VideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic);
}

char const* H264LiveVideoServerMediaSubsession::getAuxSDPLine(RTPSink* rtpSink, FramedSource* inputSource)
{
    if (m_pSDPLine) {
        return m_pSDPLine;
    }

    m_pDummyRTPSink = rtpSink;
    if (NULL == m_pDummyRTPSink) return NULL;

    // Play into the dummy sink until the SPS/PPS have been parsed, at which
    // point auxSDPLine() becomes non-NULL:
    m_pDummyRTPSink->startPlaying(*inputSource, 0, 0);
    m_done = 0;
    chkForAuxSDPLine(this);
    envir().taskScheduler().doEventLoop(&m_done); // returns once m_done != 0

    char const* dasl = m_pDummyRTPSink->auxSDPLine();
    if (dasl) m_pSDPLine = strdup(dasl);
    m_pDummyRTPSink->stopPlaying();
    return m_pSDPLine;
}

void H264LiveVideoServerMediaSubsession::afterPlayingDummy(void* ptr)
{
    H264LiveVideoServerMediaSubsession* This = (H264LiveVideoServerMediaSubsession*)ptr;
    This->m_done = ~0;
}

void H264LiveVideoServerMediaSubsession::chkForAuxSDPLine(void* ptr)
{
    H264LiveVideoServerMediaSubsession* This = (H264LiveVideoServerMediaSubsession*)ptr;
    This->chkForAuxSDPLine1();
}

void H264LiveVideoServerMediaSubsession::chkForAuxSDPLine1()
{
    if (m_pDummyRTPSink->auxSDPLine()) {
        m_done = ~0; // any non-zero value stops doEventLoop()
    } else {
        // Not yet available; check again after roughly one frame interval:
        double delay = 1000.0 / FRAME_PER_SEC; // ms
        int to_delay = delay * 1000;           // us
        nextTask() = envir().taskScheduler().scheduleDelayedTask(to_delay, chkForAuxSDPLine, this);
    }
}

Modify testOnDemandRTSPServer.cpp by adding the following block to main():

// A H.264 live video stream:
{
    OutPacketBuffer::maxSize = 300000;
    char const* streamName = "h264LiveVideo";
    char const* inputFileName = "test";
    ServerMediaSession* sms = ServerMediaSession::createNew(*env, streamName, streamName,
                                                            descriptionString, True);
    UsageEnvironment& envr = rtspServer->envir();
    envr << "\n\"" << sms << "\"\n";
    if (NULL == sms) printf("sms is null\n");
    sms->addSubsession(H264LiveVideoServerMediaSubsession::createNew(*env, True));
    rtspServer->addServerMediaSession(sms);
    announceStream(rtspServer, sms, streamName, inputFileName);
}
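With this in place, clients can request the stream at rtsp://<server-ip>:8554/h264LiveVideo (testOnDemandRTSPServer creates its RTSPServer on port 8554).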

H264LiveVideoSource.hh

#ifndef _H264_LIVE_VIDEO_SOURCE_HH
#define _H264_LIVE_VIDEO_SOURCE_HH

#ifndef _FRAMED_SOURCE_HH
#include "FramedSource.hh"
#endif
#include "DeviceSource.hh"

class H264LiveVideoSource: public FramedSource {
public:
    H264LiveVideoSource(UsageEnvironment& env);
    virtual ~H264LiveVideoSource();

private:
    // redefined virtual functions:
    virtual void doGetNextFrame();
    //virtual void doStopGettingFrames();
    virtual unsigned maxFrameSize() const; // must match FramedSource's signature to override it

    static void getNextFrame(void* ptr);
    void GetFrameData();

private:
    void* m_pToken;
    char* m_pFrameBuffer;
    char* fTruncatedBytes;
    int fTruncatedBytesNum;
};

#endif

H264LiveVideoSource.cpp

#include "H264LiveVideoSource.hh"
//#include "InputFile.hh"
#include "GroupsockHelper.hh"#define FRAME_BUF_SIZE  (1024*1024)
#define FMAX (300000)
H264LiveVideoSource::H264LiveVideoSource(UsageEnvironment& env):FramedSource(env),
m_pToken(0),
m_pFrameBuffer(0)fTruncatedBytesNum(0),fTruncatedBytes(0)
{m_pFrameBuffer = new char[FRAME_BUF_SIZE];fTruncatedBytes = new char[FRAME_BUF_SIZE];if(m_pFrameBuffer == NULL || fTruncatedBytes== NULL ){printf("[MEDIA SERVER] error malloc data buffer failed\n");return;}memset(m_pFrameBuffer,0,FRAME_BUF_SIZE);//fMaxSize =  FMAX;printf("[H264LiveVideoSource] fMaxSize:%d\n",fMaxSize);
}H264LiveVideoSource::~H264LiveVideoSource()
{envir().taskScheduler().unscheduleDelayedTask(m_pToken);if(m_pFrameBuffer){delete[] m_pFrameBuffer;m_pFrameBuffer = NULL;}if(fTruncatedBytes){delete[] fTruncatedBytes;fTruncatedBytes = NULL;}
}
int H264LiveVideoSource::maxFrameSize()
{return FRAME_BUF_SIZE;
}
void H264LiveVideoSource::doGetNextFrame()
{int uSecsToDelay = 40000; // 40 msm_pToken  = envir().taskScheduler().scheduleDelayedTask(uSecsToDelay,(TaskFunc*)getNextFrame, this);//printf("m_pToken =%p \n" ,m_pToken);
}
void H264LiveVideoSource::getNextFrame(void* ptr)
{
    H264LiveVideoSource* p = (H264LiveVideoSource*)ptr;
    if (NULL == p) {
        printf("null pointer\n");
        return;
    }
    p->GetFrameData();
}

typedef struct
{
    unsigned long long timeTick;   // time (ms)
    unsigned int dataLen;          // payload length
    unsigned char dataType;        // data type (DataType_E)
    unsigned char rsv[3];
    unsigned long long timeStamp;  // encode timestamp (us)
    unsigned char iFrame;          // whether this is a key frame
    unsigned char frameRate;      // frame rate
    int encodeType;                // encode type (VideoEncodeType_E)
    unsigned short width;          // video width
    unsigned short height;         // video height
    unsigned char rsv1[8];
    unsigned char data[0];
} IFVFrameHeader_S;

void H264LiveVideoSource::GetFrameData()
{
#if 1
    // Replace ShareBufGetOneFrame() with whatever function actually fetches one
    // encoded frame in your system:
    static int iframetype = 0; // becomes 1 once the first key frame has been seen
    int read = ShareBufGetOneFrame(g_BufHandle[0], FRAME_BUF_SIZE, (char*)m_pFrameBuffer);
    if (read == 0) {
        fFrameSize = 0;
        // No frame available yet; poll again instead of stalling delivery:
        m_pToken = envir().taskScheduler().scheduleDelayedTask(40000, (TaskFunc*)getNextFrame, this);
        return;
    }

    IFVFrameHeader_S* pFrameHead = reinterpret_cast<IFVFrameHeader_S*>(m_pFrameBuffer);

    // Optionally wait for the first key frame before delivering anything:
    if (iframetype == 0) {
        if (1 == pFrameHead->iFrame) {
            iframetype = 1;
        }
    }

    int framelen = pFrameHead->dataLen;
    if (framelen > (int)fMaxSize) {
        // The frame does not fit in the sink's buffer: deliver what fits now and
        // stash the remainder for the next call.
        fTruncatedBytesNum = framelen - (int)fMaxSize;
        framelen = fMaxSize;
        memcpy(fTo, pFrameHead->data, framelen);
        memmove(fTruncatedBytes, pFrameHead->data + framelen, fTruncatedBytesNum);
        fFrameSize = framelen;
    } else {
        if (fTruncatedBytesNum > 0) {
            // Prepend the bytes left over from the previous oversized frame:
            memmove(fTo, fTruncatedBytes, fTruncatedBytesNum);
            memmove(fTo + fTruncatedBytesNum, pFrameHead->data, framelen);
            fFrameSize = fTruncatedBytesNum + framelen;
            fTruncatedBytesNum = 0;
        } else {
            memcpy(fTo, pFrameHead->data, framelen);
            fFrameSize = framelen;
        }
    }

    fDurationInMicroseconds = 1000000 / 25; // assume 25 fps
    gettimeofday(&fPresentationTime, NULL);
    FramedSource::afterGetting(this);
#else
    // Test variant: read from a plain file / FIFO instead of a live buffer.
#define FIFO_NAME "./test.264"
    static int fd = -1;
    if (fd == -1) {
        fd = open(FIFO_NAME, O_RDONLY);
    }
    if (fd == -1) {
        printf("open file %s fail\n", FIFO_NAME);
        return;
    }

    int len = 0;
    if ((len = read(fd, fTo, fMaxSize)) > 0) {
        fFrameSize = len;
    } else {
        fFrameSize = 0;
        ::close(fd);
        fd = -1;
    }
    fDurationInMicroseconds = 1000000 / 25;
    gettimeofday(&fPresentationTime, NULL);
    FramedSource::afterGetting(this);
#endif
}
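The 40 ms polling above adds up to a frame of extra latency. A lower-latency alternative, sketched here following the event-trigger pattern in liveMedia/DeviceSource.cpp (eventTriggerId and deliverFrame0 being additions you would make to H264LiveVideoSource), is to let the capture side signal frame arrival:

// In the constructor (once): bind a trigger to the static delivery callback.
EventTriggerId eventTriggerId = envir().taskScheduler().createEventTrigger(deliverFrame0);

// From the capture thread/callback, whenever a new encoded frame is ready:
envir().taskScheduler().triggerEvent(eventTriggerId, this);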

Common live555 modification points:

1. Maximum size of one input frame
StreamParser.cpp:
#define BANK_SIZE 1500000 // the larger your frames, the larger this value must be

2. Maximum RTP buffer sizes
(1) On the source side, MultiFramedRTPSource.cpp:
BufferedPacket::BufferedPacket()
The upper limit of the input buffer, i.e. the maximum size of a BufferedPacket:
#define MAX_PACKET_SIZE 65536
(2) On the sink side, MultiFramedRTPSink.cpp:
#define RTP_PAYLOAD_MAX_SIZE 1456 // (1500-14-20-8)/4*4; ethernet=14, IP=20, UDP=8, a multiple of 4 bytes
MediaSink.cpp, static variable:
OutPacketBuffer::maxSize = 600000; // allow for some possibly large H.265 frames (the default in this version is 100000)
This is best made an integer multiple of RTP_PAYLOAD_MAX_SIZE.
If the value is too small, the server keeps printing: Correct this by increasing "OutPacketBuffer::maxSize" to at least ...
The limit can also be raised from application code, as sketched below.
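A minimal sketch of setting the limit from your own main(), on the assumption that it runs before any RTPSink is created (which is what the error message requires):

#include "liveMedia.hh" // declares OutPacketBuffer (via MediaSink.hh)

int main(int argc, char** argv) {
    // Must happen before the first RTPSink is constructed:
    OutPacketBuffer::maxSize = 600000;
    // ... create TaskScheduler, UsageEnvironment, RTSPServer and sessions as usual ...
    return 0;
}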

3. Failure to obtain the IP address
RTSPServer::rtspURLPrefix() obtains the server address via ourIPAddress(envir()).

In GroupsockHelper.cpp, ourIPAddress() can be patched to read the address of a known interface (here "eth0") directly when the discovered address is invalid:

ourIPAddress() {
    ...
    if (badAddressForUs(from)) {
        struct ifreq req;
        int ret = 0;
        char szIpBuf[32];
        int sock = socket(AF_INET, SOCK_DGRAM, 0);
        if (-1 != sock) {
            memset(&req, 0, sizeof(req));
            strncpy(req.ifr_name, "eth0", sizeof(req.ifr_name));
            ret = ioctl(sock, SIOCGIFADDR, &req);
            if (-1 == ret) {
                close(sock);
            } else {
                memset(&szIpBuf, 0, sizeof(szIpBuf));
                strcpy(szIpBuf, inet_ntoa(((struct sockaddr_in*)&req.ifr_addr)->sin_addr));
                close(sock);
                fromAddr.sin_addr.s_addr = our_inet_addr(szIpBuf);
                from = fromAddr.sin_addr.s_addr;
            }
        } else {
            char tmp[100];
            sprintf(tmp, "This computer has an invalid IP address: %s", AddressString(from).val());
            env.setResultMsg(tmp);
            from = 0;
        }
    }
    ...
}

4. Memory leak
In RTCPInstance::processIncomingReport(), "reason" is allocated without freeing any previous allocation. Add the release just before the allocation:

if (NULL != reason)
{
    delete[] reason;
    reason = NULL;
}
reason = new char[reasonLength + 1];
5. Filling in DeltaTfiDivisor when SEI data is absent
H264or5VideoStreamParser::H264or5VideoStreamParser()
{
    // According to the H.264 and H.265 specs, if SEI data is not filled in, then
    // frame_field_info_present_flag is zero, so DeltaTfiDivisor must be set to
    // 2.0 for H.264 and 1.0 for H.265:
    if (fHNumber == 264) {
        DeltaTfiDivisor = 2.0;
    } else {
        DeltaTfiDivisor = 1.0;
    }
}
6. Long-running RTSP pulls
If, after pulling a stream for a long time, the error "Hit limit when reading incoming packet over TCP" appears, consider raising the value of maxRTCPPacketSize in RTCP.cpp:
static unsigned const maxRTCPPacketSize = 1456;

7. Latency that grows the longer the stream plays
In MultiFramedRTPSink::sendPacketIfNecessary() (MultiFramedRTPSink.cpp), the next send is scheduled after a per-frame delay of uSecondsToGo. Setting uSecondsToGo to 0 sends immediately, so delay cannot accumulate; see the sketch below.
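A sketch of the tail of sendPacketIfNecessary() with the change applied (abridged from live555; surrounding code may differ slightly by version):

// MultiFramedRTPSink::sendPacketIfNecessary(), at the end:
int secsDiff = fNextSendTime.tv_sec - timeNow.tv_sec;
int64_t uSecondsToGo = secsDiff * 1000000 + (fNextSendTime.tv_usec - timeNow.tv_usec);
if (uSecondsToGo < 0 || secsDiff < 0) uSecondsToGo = 0;
uSecondsToGo = 0; // the change: never pace the next packet, send as soon as possible
nextTask() = envir().taskScheduler().scheduleDelayedTask(uSecondsToGo, (TaskFunc*)sendNext, this);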

8. Trimming the source tree

Only these directories need to be kept; the others can be deleted.
The liveMedia directory contains files for many formats; those you do not need can also be deleted, but then the corresponding createNew() calls must be removed from
MediaSubsession::createSourceObjects(), otherwise compilation fails. See the sketch below.
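For example (an illustrative sketch; the exact branches depend on which codec files you removed), in liveMedia/MediaSubsession.cpp:

// In MediaSubsession::createSourceObjects(), delete the branch for each codec
// whose source files you removed, e.g. if the AC3 sources were deleted:
} else if (strcmp(fCodecName, "AC3") == 0 || strcmp(fCodecName, "EAC3") == 0) {
    fReadSource = fRTPSource = AC3AudioRTPSource::createNew(env(), fRTPSocket,
                                                            fRTPPayloadFormat,
                                                            fRTPTimestampFrequency);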
