It's the May 1st holiday. Yesterday I worked through the PLAY part of the LIVE555 source code; rather than keep the notes to myself, I'm sharing them here.

Call stack

BasicTaskScheduler0::doEventLoop()
{
  // Repeatedly loop, handling readable sockets and timed events:
  while (1)
  {
    SingleStep();
  }
}
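
For context, the event loop above is reached from a setup like the one in LIVE555's testOnDemandRTSPServer. A minimal sketch of that setup (the port number, stream name, and "test.264" file are placeholders, not taken from the article):

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"

int main() {
  // Scheduler + environment: everything that follows runs as events inside doEventLoop().
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  // An RTSP server listening on port 8554 (port number is arbitrary here).
  RTSPServer* rtspServer = RTSPServer::createNew(*env, 8554);
  if (rtspServer == NULL) {
    *env << "Failed to create RTSP server: " << env->getResultMsg() << "\n";
    return 1;
  }

  // One on-demand H.264 session; "test.264" is a placeholder Annex-B file name.
  ServerMediaSession* sms = ServerMediaSession::createNew(*env, "testStream", "testStream",
                                                          "session streamed by LIVE555");
  sms->addSubsession(H264VideoFileServerMediaSubsession::createNew(*env, "test.264", False));
  rtspServer->addServerMediaSession(sms);

  // Never returns: SETUP/PLAY handling and RTP pacing all happen as events here.
  env->taskScheduler().doEventLoop();
  return 0;
}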

void BasicTaskScheduler::SingleStep()
{
  // handler->handlerProc = RTSPServer::RTSPClientSession::incomingRequestHandler
  (*handler->handlerProc)(handler->clientData, resultConditionSet);
}
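
As a rough picture of what SingleStep does before it reaches the line above: it select()s on the registered sockets and then invokes the handler registered for each readable one. The following is a simplified, self-contained sketch of that dispatch, not the actual BasicTaskScheduler code (the Handler map and singleStepSketch name are invented for illustration):

#include <sys/select.h>
#include <map>

typedef void HandlerProc(void* clientData, int mask);  // analogous to TaskScheduler::BackgroundHandlerProc

struct Handler { HandlerProc* proc; void* clientData; };

// One pass of a select()-based dispatcher: wait for readable sockets,
// then call the handler registered for each one (cf. handler->handlerProc above).
void singleStepSketch(std::map<int, Handler>& handlers, struct timeval timeout) {
  fd_set readSet;
  FD_ZERO(&readSet);
  int maxFd = -1;
  for (std::map<int, Handler>::iterator it = handlers.begin(); it != handlers.end(); ++it) {
    FD_SET(it->first, &readSet);
    if (it->first > maxFd) maxFd = it->first;
  }
  if (select(maxFd + 1, &readSet, NULL, NULL, &timeout) <= 0) return;
  for (std::map<int, Handler>::iterator it = handlers.begin(); it != handlers.end(); ++it) {
    if (FD_ISSET(it->first, &readSet)) {
      (*it->second.proc)(it->second.clientData, /*mask=*/1);  // e.g. incomingRequestHandler
    }
  }
  // A real scheduler would also fire whichever delayed task is now due.
}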

void RTSPServer::RTSPClientSession::incomingRequestHandler1()
{
  handleRequestBytes(bytesRead);
}

void RTSPServer::RTSPClientSession::handleRequestBytes(int newBytesRead)
{
  handleCmd_withinSession("PLAY", urlPreSuffix, urlSuffix, cseq,
                          (char const*)fRequestBuffer);
}

RTSPServer::RTSPClientSession::handleCmd_PLAY(ServerMediaSubsession* subsession,
                                              char const* cseq,
                                              char const* fullRequestStr)
{
  fStreamStates[i].subsession->startStream(fOurSessionId,
                                           fStreamStates[i].streamToken,
                                           (TaskFunc*)noteClientLiveness, this,
                                           rtpSeqNum, rtpTimestamp,
                                           handleAlternativeRequestByte, this);
}

void OnDemandServerMediaSubsession::startStream(unsigned clientSessionId,
        void* streamToken,
        TaskFunc* rtcpRRHandler, void* rtcpRRHandlerClientData,
        unsigned short& rtpSeqNum,
        unsigned& rtpTimestamp,
        ServerRequestAlternativeByteHandler* serverRequestAlternativeByteHandler,
        void* serverRequestAlternativeByteHandlerClientData)
{
  // At the PLAY stage startStream is called. It calls StreamState::startPlaying, which
  // in turn calls fRTPSink->startPlaying (defined in MediaSink::startPlaying). That sets
  // MediaSink's fSource to the fMediaSource created earlier and then runs continuePlaying().
  streamState->startPlaying(destinations,
                            rtcpRRHandler, rtcpRRHandlerClientData,
                            serverRequestAlternativeByteHandler,
                            serverRequestAlternativeByteHandlerClientData);
}

StreamState::startPlaying(Destinations* dests,
        TaskFunc* rtcpRRHandler,
        void* rtcpRRHandlerClientData,
        ServerRequestAlternativeByteHandler* serverRequestAlternativeByteHandler,
        void* serverRequestAlternativeByteHandlerClientData)
{
  fRTPSink->startPlaying(*fMediaSource,
                         afterPlayingStreamState,
                         this);
}

MediaSink::startPlaying(MediaSource& source,
                        afterPlayingFunc* afterFunc,
                        void* afterClientData)
{
  fSource = (FramedSource*)&source;
  fAfterFunc = afterFunc;
  fAfterClientData = afterClientData;
  return continuePlaying();
}

H264VideoRTPSink::continuePlaying()
{
  // Inside H264VideoRTPSink::continuePlaying(), fSource is replaced by
  // H264VideoRTPSink::fOurFragmenter (an H264FUAFragmenter), while the original fSource
  // becomes fOurFragmenter's fInputSource. Execution then continues with
  // MultiFramedRTPSink::continuePlaying() ... MultiFramedRTPSink::buildAndSendPacket(True)
  // ... MultiFramedRTPSink::packFrame().
  if (fOurFragmenter == NULL) {
    fOurFragmenter = new H264FUAFragmenter(envir(),
                                           fSource,
                                           OutPacketBuffer::maxSize,
                                           ourMaxPacketSize() - 12);
    fSource = fOurFragmenter;
  }
  return MultiFramedRTPSink::continuePlaying();
}
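
What H264FUAFragmenter implements for oversized NAL units is the FU-A packetization rule of RFC 6184: each fragment carries an FU indicator (F and NRI bits copied from the original NAL header, type 28) and an FU header (start/end flags plus the original NAL type). A standalone sketch of that rule, independent of the LIVE555 class:

#include <vector>
#include <cstdint>
#include <cstddef>

// Split one NAL unit (without start code) into FU-A payloads of at most maxPayload bytes.
// Assumes maxPayload > 2 so that each fragment has room for its two prefix bytes.
std::vector<std::vector<uint8_t> > fragmentFUA(const uint8_t* nal, size_t nalSize, size_t maxPayload) {
  std::vector<std::vector<uint8_t> > packets;
  if (nalSize <= maxPayload) {                       // small enough: one single NAL unit packet
    packets.push_back(std::vector<uint8_t>(nal, nal + nalSize));
    return packets;
  }
  uint8_t nalHdr = nal[0];
  uint8_t fuIndicator = (nalHdr & 0xE0) | 28;        // keep F + NRI bits, type = 28 (FU-A)
  const uint8_t* pos = nal + 1;                      // the original NAL header byte is not repeated
  size_t remaining = nalSize - 1;
  bool first = true;
  while (remaining > 0) {
    size_t chunk = remaining < maxPayload - 2 ? remaining : maxPayload - 2;
    std::vector<uint8_t> pkt;
    pkt.push_back(fuIndicator);
    uint8_t fuHeader = nalHdr & 0x1F;                // original NAL type
    if (first) fuHeader |= 0x80;                     // S bit on the first fragment
    if (chunk == remaining) fuHeader |= 0x40;        // E bit on the last fragment
    pkt.push_back(fuHeader);
    pkt.insert(pkt.end(), pos, pos + chunk);
    packets.push_back(pkt);
    pos += chunk; remaining -= chunk; first = false;
  }
  return packets;
}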

H264FUAFragmenter::H264FUAFragmenter(UsageEnvironment& env,
                                     FramedSource* inputSource,
                                     unsigned inputBufferMax,
                                     unsigned maxOutputPacketSize)
  : FramedFilter(env, inputSource)
{
}

MultiFramedRTPSink::continuePlaying()
{
  buildAndSendPacket(True);
}

MultiFramedRTPSink::buildAndSendPacket(Boolean isFirstPacket)
{
  // Build the RTP header
  // Pack as many (complete) frames into the packet as possible:
  packFrame();
}
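
The RTP header built at the start of buildAndSendPacket is the fixed 12-byte header of RFC 3550, which is also why fOurFragmenter above is constructed with ourMaxPacketSize() - 12. A standalone sketch of writing that header (the field values passed in are placeholders):

#include <cstdint>

// Write a 12-byte RTP fixed header (RFC 3550) into buf:
// version = 2, no padding/extension/CSRC; the marker bit is set on the last packet of a frame.
void writeRtpHeader(uint8_t* buf, bool marker, uint8_t payloadType,
                    uint16_t seqNum, uint32_t timestamp, uint32_t ssrc) {
  buf[0] = 0x80;                                        // V=2, P=0, X=0, CC=0
  buf[1] = (marker ? 0x80 : 0x00) | (payloadType & 0x7F);
  buf[2] = seqNum >> 8;      buf[3] = seqNum & 0xFF;    // sequence number (network byte order)
  buf[4] = timestamp >> 24;  buf[5] = (timestamp >> 16) & 0xFF;
  buf[6] = (timestamp >> 8) & 0xFF;  buf[7] = timestamp & 0xFF;
  buf[8] = ssrc >> 24;       buf[9] = (ssrc >> 16) & 0xFF;
  buf[10] = (ssrc >> 8) & 0xFF;      buf[11] = ssrc & 0xFF;
}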

H264FUAFragmenter inherits from FramedFilter : FramedSource. In fOurFragmenter's constructor above, FramedFilter::fInputSource was already set to fSource. H264FUAFragmenter::doGetNextFrame() will set the fTo of FramedFilter::fInputSource, i.e.

memmove(fTo, &fInputBuffer[1], fNumValidDataBytes - 1);

&fInputBuffer[1] => the start code can presumably go in together with it, so fInputBuffer is where the start code + NAL header + NAL payload are stored.

MultiFramedRTPSink::packFrame()
{
  // If the previous frame was too big, the leftover part left in the buffer is added first;
  // otherwise fSource's fTo pointer is set to point at fOutBuf->curPtr():
  fSource->getNextFrame(fOutBuf->curPtr(),
                        fOutBuf->totalBytesAvailable(),
                        afterGettingFrame,
                        this,
                        ourHandleClosure,
                        this);
}
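
The two comments above refer to OutPacketBuffer's overflow handling: a frame that runs past the current packet's limit leaves its tail in the buffer, and that leftover becomes the first data of the next packet. The following toy class illustrates just that idea; it is not the real OutPacketBuffer API, and all names are invented:

#include <cstring>
#include <cstddef>

// Toy version of the idea: the buffer is larger than one packet, so a frame that
// runs past the packet limit is kept as "overflow" and starts the next packet.
class ToyPacketBuffer {
public:
  ToyPacketBuffer(size_t packetLimit, size_t capacity)
    : fLimit(packetLimit), fCapacity(capacity),
      fBuf(new unsigned char[capacity]), fCur(0) {}
  ~ToyPacketBuffer() { delete[] fBuf; }

  unsigned char* curPtr() { return fBuf + fCur; }             // next frame is written here
  size_t totalBytesAvailable() const { return fCapacity - fCur; }
  void appendFrame(size_t frameSize) { fCur += frameSize; }   // after a frame was delivered
  bool packetIsFull() const { return fCur >= fLimit; }

  // Finish the current packet: bytes beyond the packet limit are moved to the
  // front of the buffer so that the next packet begins with the leftover data.
  void finishPacket() {
    size_t overflow = (fCur > fLimit) ? fCur - fLimit : 0;
    if (overflow > 0) memmove(fBuf, fBuf + fCur - overflow, overflow);
    fCur = overflow;
  }

private:
  size_t fLimit, fCapacity;
  unsigned char* fBuf;
  size_t fCur;
};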

FramedSource::getNextFrame(unsigned char* to,
                           unsigned maxSize,
                           afterGettingFunc* afterGettingFunc,
                           void* afterGettingClientData,
                           onCloseFunc* onCloseFunc,
                           void* onCloseClientData)
{
  // Store the parameters:
  fTo = to;
  fMaxSize = maxSize;
  fNumTruncatedBytes = 0; // by default; could be changed by doGetNextFrame()
  fDurationInMicroseconds = 0; // by default; could be changed by doGetNextFrame()
  fAfterGettingFunc = afterGettingFunc;
  fAfterGettingClientData = afterGettingClientData;
  fOnCloseFunc = onCloseFunc;
  fOnCloseClientData = onCloseClientData;
  fIsCurrentlyAwaitingData = True;
  // Here fSource::fAfterGettingFunc = MultiFramedRTPSink::afterGettingFrame
  // and fSource::fOnCloseFunc = MultiFramedRTPSink::ourHandleClosure
  doGetNextFrame();
}
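
The fields set here (fTo, fMaxSize, fAfterGettingFunc, ...) define the contract that every source has to honour: write at most fMaxSize bytes into fTo, fill in fFrameSize, fNumTruncatedBytes and fPresentationTime, then call FramedSource::afterGetting(). A minimal custom source following that contract, modeled on the DeviceSource pattern that ships with LIVE555 (fetchEncodedFrame is a hypothetical producer, not part of the library):

#include "FramedSource.hh"
#include <sys/time.h>

// Hypothetical producer: copies the next encoded frame into dst (at most maxSize bytes),
// returns the frame's full size, or 0 when the stream has ended.
extern unsigned fetchEncodedFrame(unsigned char* dst, unsigned maxSize);

class MyFrameSource : public FramedSource {
public:
  static MyFrameSource* createNew(UsageEnvironment& env) { return new MyFrameSource(env); }

protected:
  MyFrameSource(UsageEnvironment& env) : FramedSource(env) {}

private:
  // Called (indirectly) by MultiFramedRTPSink::packFrame() via getNextFrame().
  virtual void doGetNextFrame() {
    unsigned fullSize = fetchEncodedFrame(fTo, fMaxSize);  // deliver straight into the sink's buffer
    if (fullSize == 0) { handleClosure(this); return; }    // no more data: tell downstream we're done

    if (fullSize > fMaxSize) {                             // honour the sink's buffer limit
      fFrameSize = fMaxSize;
      fNumTruncatedBytes = fullSize - fMaxSize;
    } else {
      fFrameSize = fullSize;
      fNumTruncatedBytes = 0;
    }
    gettimeofday(&fPresentationTime, NULL);
    fDurationInMicroseconds = 0;

    // Hand control back to the caller's fAfterGettingFunc
    // (in this walkthrough: MultiFramedRTPSink::afterGettingFrame).
    FramedSource::afterGetting(this);
  }
};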

H264FUAFragmenter::doGetNextFrame()
{
  // If there is no NAL unit in the buffer (i.e. fNumValidDataBytes == 1), read a new one:
  {
    FramedFilter::fInputSource->getNextFrame(&fInputBuffer[1],
                                             fInputBufferSize - 1,
                                             afterGettingFrame,
                                             this,
                                             FramedSource::handleClosure,
                                             this);
    // fInputSource::fAfterGettingFunc = H264FUAFragmenter::afterGettingFrame
  }
  // Else:
  {
    ......
    // Complete delivery to the client:
    FramedSource::afterGetting(this);
  }
}
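
The same shape, pull from fInputSource when nothing is buffered, otherwise deliver and call afterGetting, is the general FramedFilter pattern. Below is a stripped-down pass-through filter that only shows the round trip of the callbacks; it performs no fragmentation and is not the real H264FUAFragmenter logic:

#include "FramedFilter.hh"

// Pass-through filter: demonstrates the getNextFrame()/afterGettingFrame()
// hand-off that H264FUAFragmenter uses, without any actual fragmentation.
class PassThroughFilter : public FramedFilter {
public:
  PassThroughFilter(UsageEnvironment& env, FramedSource* inputSource)
    : FramedFilter(env, inputSource) {}

private:
  virtual void doGetNextFrame() {
    // Ask the upstream source to deliver straight into the sink's buffer (fTo).
    fInputSource->getNextFrame(fTo, fMaxSize,
                               afterGettingFrame, this,
                               FramedSource::handleClosure, this);
  }

  static void afterGettingFrame(void* clientData, unsigned frameSize,
                                unsigned numTruncatedBytes,
                                struct timeval presentationTime,
                                unsigned durationInMicroseconds) {
    PassThroughFilter* filter = (PassThroughFilter*)clientData;
    // Copy the bookkeeping fields, then complete delivery to our own caller,
    // just as H264FUAFragmenter does when a NAL unit fits into one packet.
    filter->fFrameSize = frameSize;
    filter->fNumTruncatedBytes = numTruncatedBytes;
    filter->fPresentationTime = presentationTime;
    filter->fDurationInMicroseconds = durationInMicroseconds;
    FramedSource::afterGetting(filter);
  }
};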

FramedSource::afterGetting(FramedSource* source) {
  if (source->fAfterGettingFunc != NULL) {
    (*(source->fAfterGettingFunc))(source->fAfterGettingClientData,
                                   source->fFrameSize, source->fNumTruncatedBytes,
                                   source->fPresentationTime,
                                   source->fDurationInMicroseconds);
  }
}

MultiFramedRTPSink::afterGettingFrame(void* clientData,
                                      unsigned numBytesRead, unsigned numTruncatedBytes,
                                      struct timeval presentationTime,
                                      unsigned durationInMicroseconds)
{
  MultiFramedRTPSink* sink = (MultiFramedRTPSink*)clientData;
  sink->afterGettingFrame1(numBytesRead, numTruncatedBytes,
                           presentationTime, durationInMicroseconds);
}

MultiFramedRTPSink::afterGettingFrame1(unsigned frameSize,
                                       unsigned numTruncatedBytes,
                                       struct timeval presentationTime,
                                       unsigned durationInMicroseconds)
{
  if (numFrameBytesToUse == 0) {
    // Send our packet now, because we have filled it up:
    sendPacketIfNecessary();
  }
  else
  {
    // There's room for more frames; try getting another:
    packFrame();
  }
}

MultiFramedRTPSink::sendPacketIfNecessary()
{
  // Delay this amount of time:
  nextTask() = envir().taskScheduler()
    .scheduleDelayedTask(uSecondsToGo, (TaskFunc*)sendNext, this);
}

// The following is called after each delay between packet sends:
MultiFramedRTPSink::sendNext(void* firstArg)
{
  MultiFramedRTPSink* sink = (MultiFramedRTPSink*)firstArg;
  sink->buildAndSendPacket(False);
}
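
sendPacketIfNecessary and sendNext form a self-rescheduling pair driven by the scheduler's delay queue. Application code can use the same mechanism through scheduleDelayedTask; a small sketch of a task that re-arms itself once per second (the tick callback and the one-second interval are examples, not part of LIVE555):

#include "BasicUsageEnvironment.hh"

static void tick(void* clientData) {
  UsageEnvironment* env = (UsageEnvironment*)clientData;
  *env << "tick\n";
  // Re-arm ourselves, just as sendNext() leads back to sendPacketIfNecessary().
  env->taskScheduler().scheduleDelayedTask(1000000 /*us*/, tick, env);
}

int main() {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);
  env->taskScheduler().scheduleDelayedTask(1000000, tick, env);
  env->taskScheduler().doEventLoop();  // tick() now fires once per second
  return 0;
}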
