WebRTC iOS hardware video encoding flow, and the CVPixelBufferRef passed through it

Default camera capture in WebRTC is done by RTCCameraVideoCapturer:

src/sdk/objc/components/capturer/RTCCameraVideoCapturer.m

- (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection {
  ...
  RTCCVPixelBuffer *rtcPixelBuffer =
      [[RTCCVPixelBuffer alloc] initWithPixelBuffer:pixelBuffer];
  int64_t timeStampNs =
      CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer)) *
      kNanosecondsPerSecond;
  RTCVideoFrame *videoFrame = [[RTCVideoFrame alloc] initWithBuffer:rtcPixelBuffer
                                                           rotation:_rotation
                                                        timeStampNs:timeStampNs];
  [self.delegate capturer:self didCaptureVideoFrame:videoFrame];
}

First, a look at RTCCVPixelBuffer and RTCVideoFrame.
RTCCVPixelBuffer simply wraps and retains a CVPixelBufferRef, together with adaptation/crop metadata:

src/sdk/objc/components/video_frame_buffer/RTCCVPixelBuffer.mm

- (instancetype)initWithPixelBuffer:(CVPixelBufferRef)pixelBuffer {
  return [self initWithPixelBuffer:pixelBuffer
                      adaptedWidth:CVPixelBufferGetWidth(pixelBuffer)
                     adaptedHeight:CVPixelBufferGetHeight(pixelBuffer)
                         cropWidth:CVPixelBufferGetWidth(pixelBuffer)
                        cropHeight:CVPixelBufferGetHeight(pixelBuffer)
                             cropX:0
                             cropY:0];
}

- (instancetype)initWithPixelBuffer:(CVPixelBufferRef)pixelBuffer
                       adaptedWidth:(int)adaptedWidth
                      adaptedHeight:(int)adaptedHeight
                          cropWidth:(int)cropWidth
                         cropHeight:(int)cropHeight
                              cropX:(int)cropX
                              cropY:(int)cropY {
  if (self = [super init]) {
    _width = adaptedWidth;
    _height = adaptedHeight;
    _pixelBuffer = pixelBuffer;
    _bufferWidth = CVPixelBufferGetWidth(_pixelBuffer);
    _bufferHeight = CVPixelBufferGetHeight(_pixelBuffer);
    _cropWidth = cropWidth;
    _cropHeight = cropHeight;
    // Can only crop at even pixels.
    _cropX = cropX & ~1;
    _cropY = cropY & ~1;
    CVBufferRetain(_pixelBuffer);
  }
  return self;
}
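The `cropX & ~1` trick rounds the crop origin down to an even coordinate, which is needed because chroma in 4:2:0 formats is subsampled 2x2, so a crop cannot start on an odd pixel. A quick illustration (plain Python, not WebRTC code):

```python
def align_even(v: int) -> int:
    """Clear the lowest bit, rounding down to the nearest even number,
    mirroring `_cropX = cropX & ~1` in RTCCVPixelBuffer."""
    return v & ~1

for crop in (0, 1, 2, 7, 640, 641):
    print(crop, "->", align_even(crop))
# 1 -> 0, 7 -> 6, 641 -> 640; even values pass through unchanged.
```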

RTCVideoFrame, in turn, simply stores the id<RTCVideoFrameBuffer> along with rotation and timestamp:

src/sdk/objc/base/RTCVideoFrame.mm

- (instancetype)initWithBuffer:(id<RTCVideoFrameBuffer>)buffer
                      rotation:(RTCVideoRotation)rotation
                   timeStampNs:(int64_t)timeStampNs {
  if (self = [super init]) {
    _buffer = buffer;
    _rotation = rotation;
    _timeStampNs = timeStampNs;
  }
  return self;
}

Back to didCaptureVideoFrame:, which the delegate, RTCVideoSource, handles:

src/sdk/objc/api/peerconnection/RTCVideoSource.mm

- (void)capturer:(RTCVideoCapturer *)capturer didCaptureVideoFrame:(RTCVideoFrame *)frame {
  getObjCVideoSource(_nativeVideoSource)->OnCapturedFrame(frame);
}

ObjCVideoTrackSource:

src/sdk/objc/native/src/objc_video_track_source.mm

void ObjCVideoTrackSource::OnCapturedFrame(RTCVideoFrame *frame) {
  const int64_t timestamp_us = frame.timeStampNs / rtc::kNumNanosecsPerMicrosec;
  const int64_t translated_timestamp_us =
      timestamp_aligner_.TranslateTimestamp(timestamp_us, rtc::TimeMicros());

  int adapted_width;
  int adapted_height;
  int crop_width;
  int crop_height;
  int crop_x;
  int crop_y;
  if (!AdaptFrame(frame.width,
                  frame.height,
                  timestamp_us,
                  &adapted_width,
                  &adapted_height,
                  &crop_width,
                  &crop_height,
                  &crop_x,
                  &crop_y)) {
    return;
  }

  rtc::scoped_refptr<VideoFrameBuffer> buffer;
  if (adapted_width == frame.width && adapted_height == frame.height) {
    // No adaption - optimized path.
    buffer = new rtc::RefCountedObject<ObjCFrameBuffer>(frame.buffer);
  } else if ([frame.buffer isKindOfClass:[RTCCVPixelBuffer class]]) {
    // Adapted CVPixelBuffer frame.
    RTCCVPixelBuffer *rtcPixelBuffer = (RTCCVPixelBuffer *)frame.buffer;
    buffer = new rtc::RefCountedObject<ObjCFrameBuffer>([[RTCCVPixelBuffer alloc]
        initWithPixelBuffer:rtcPixelBuffer.pixelBuffer
               adaptedWidth:adapted_width
              adaptedHeight:adapted_height
                  cropWidth:crop_width
                 cropHeight:crop_height
                      cropX:crop_x + rtcPixelBuffer.cropX
                      cropY:crop_y + rtcPixelBuffer.cropY]);
  } else {
    // Adapted I420 frame.
    // TODO(magjed): Optimize this I420 path.
    rtc::scoped_refptr<I420Buffer> i420_buffer =
        I420Buffer::Create(adapted_width, adapted_height);
    buffer = new rtc::RefCountedObject<ObjCFrameBuffer>(frame.buffer);
    i420_buffer->CropAndScaleFrom(*buffer->ToI420(), crop_x, crop_y, crop_width, crop_height);
    buffer = i420_buffer;
  }

  // Applying rotation is only supported for legacy reasons and performance is
  // not critical here.
  VideoRotation rotation = static_cast<VideoRotation>(frame.rotation);
  if (apply_rotation() && rotation != kVideoRotation_0) {
    buffer = I420Buffer::Rotate(*buffer->ToI420(), rotation);
    rotation = kVideoRotation_0;
  }

  OnFrame(VideoFrame(buffer, rotation, translated_timestamp_us));
}

By default there is no cropping, so frame.buffer is wrapped directly in an ObjCFrameBuffer:

src/sdk/objc/native/src/objc_frame_buffer.mm

ObjCFrameBuffer::ObjCFrameBuffer(id<RTCVideoFrameBuffer> frame_buffer)
    : frame_buffer_(frame_buffer), width_(frame_buffer.width), height_(frame_buffer.height) {}

The call continues into OnFrame.
ObjCVideoTrackSource derives from AdaptedVideoTrackSource, and OnFrame is implemented by that parent class:

src/media/base/adaptedvideotracksource.cc

void AdaptedVideoTrackSource::OnFrame(const webrtc::VideoFrame& frame) {
  rtc::scoped_refptr<webrtc::VideoFrameBuffer> buffer(frame.video_frame_buffer());
  /* Note that this is a "best effort" approach to
     wants.rotation_applied; apply_rotation_ can change from false to
     true between the check of apply_rotation() and the call to
     broadcaster_.OnFrame(), in which case we generate a frame with
     pending rotation despite some sink with wants.rotation_applied ==
     true was just added. The VideoBroadcaster enforces
     synchronization for us in this case, by not passing the frame on
     to sinks which don't want it. */
  if (apply_rotation() && frame.rotation() != webrtc::kVideoRotation_0 &&
      buffer->type() == webrtc::VideoFrameBuffer::Type::kI420) {
    /* Apply pending rotation. */
    broadcaster_.OnFrame(webrtc::VideoFrame(
        webrtc::I420Buffer::Rotate(*buffer->GetI420(), frame.rotation()),
        webrtc::kVideoRotation_0, frame.timestamp_us()));
  } else {
    broadcaster_.OnFrame(frame);
  }
}
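I420Buffer::Rotate produces a new buffer whose planes are re-laid-out; conceptually each plane is rotated like a small row-major matrix. A sketch of a 90° clockwise rotation of a single plane (illustrative Python, not the libyuv implementation):

```python
def rotate_plane_90_cw(plane, width, height):
    """Rotate a row-major width x height plane 90 degrees clockwise.
    The result is a height x width plane (rows and columns swap)."""
    out = []
    for x in range(width):           # each source column becomes a row
        out.extend(plane[(height - 1 - y) * width + x] for y in range(height))
    return out

# 2x3 plane:          rotated 90 CW:
#   1 2 3                4 1
#   4 5 6      ->        5 2
#                        6 3
print(rotate_plane_90_cw([1, 2, 3, 4, 5, 6], 3, 2))  # -> [4, 1, 5, 2, 6, 3]
```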

broadcaster_ is a VideoBroadcaster:

src/media/base/videobroadcaster.cc

void VideoBroadcaster::OnFrame(const webrtc::VideoFrame& frame) {
  rtc::CritScope cs(&sinks_and_wants_lock_);
  for (auto& sink_pair : sink_pairs()) {
    if (sink_pair.wants.rotation_applied &&
        frame.rotation() != webrtc::kVideoRotation_0) {
      // Calls to OnFrame are not synchronized with changes to the sink wants.
      // When rotation_applied is set to true, one or a few frames may get here
      // with rotation still pending. Protect sinks that don't expect any
      // pending rotation.
      RTC_LOG(LS_VERBOSE) << "Discarding frame with unexpected rotation.";
      continue;
    }
    if (sink_pair.wants.black_frames) {
      sink_pair.sink->OnFrame(webrtc::VideoFrame(
          GetBlackFrameBuffer(frame.width(), frame.height()),
          frame.rotation(), frame.timestamp_us()));
    } else {
      sink_pair.sink->OnFrame(frame);
    }
  }
}

When videoTrack.isEnabled = false (settable through the public API), black frames are generated and sent in place of the camera frames.
Under normal conditions the captured video frame is forwarded as-is.


From sink->OnFrame to VideoStreamEncoder::OnFrame

The sink is declared as VideoSinkInterface<webrtc::VideoFrame>* sink;
VideoStreamEncoder derives from VideoStreamEncoderInterface, and sink->OnFrame actually invokes VideoStreamEncoder::OnFrame.
How the VideoStreamEncoder gets created and bound to this sink is quite a detour through the code:

src/pc/peerconnection.cc

PeerConnection::SetLocalDescription ->
  ApplyLocalDescription ->
  CreateChannels

PeerConnection::SetRemoteDescription ->
  ApplyRemoteDescription ->
  CreateChannels

Audio channel:
CreateChannels ->
  CreateVoiceChannel ->
  channel_manager()->CreateVoiceChannel ->
  media_engine_->voice().CreateMediaChannel ->
  ...

Audio is not analyzed further here; the focus is video.

Video channel:
CreateChannels ->
  CreateVideoChannel ->
  channel_manager()->CreateVideoChannel ->
  media_engine_->video().CreateMediaChannel ->
  WebRtcVideoEngine::CreateMediaChannel ->
  new WebRtcVideoChannel

At this point the WebRtcVideoChannel has been created.

ApplyLocalDescription ->
  UpdateSessionState
ApplyRemoteDescription ->
  UpdateSessionState

UpdateSessionState ->
  PushdownMediaDescription ->
  channel->SetLocalContent / channel->SetRemoteContent

channel->SetLocalContent ->
  BaseChannel::SetLocalContent_w ->
  VideoChannel::SetLocalContent_w ->
  UpdateLocalStreams_w ->
  media_channel()->AddSendStream ->
  WebRtcVideoChannel::AddSendStream ->
  new WebRtcVideoSendStream ->
  WebRtcVideoChannel::WebRtcVideoSendStream::RecreateWebRtcStream() ->
  stream_ = call_->CreateVideoSendStream (stream_ is a VideoSendStream) ->
  stream_->SetSource(this, GetDegradationPreference());

channel->SetRemoteContent ->
  BaseChannel::SetRemoteContent_w ->
  VideoChannel::SetRemoteContent_w ->
  UpdateRemoteStreams_w ->
  BaseChannel::AddRecvStream_w ->
  media_channel()->AddRecvStream ->
  WebRtcVideoChannel::AddRecvStream ->
  new WebRtcVideoReceiveStream ->
  WebRtcVideoChannel::WebRtcVideoReceiveStream::RecreateWebRtcVideoStream() ->
  stream_ = call_->CreateVideoReceiveStream (stream_ is a VideoReceiveStream) ->
  stream_->Start();

At this point both the VideoSendStream and the VideoReceiveStream exist; only the VideoSendStream is analyzed below.

In the VideoSendStream constructor:
video_stream_encoder_ = CreateVideoStreamEncoder, which returns a VideoStreamEncoder.

In stream_->SetSource(this, GetDegradationPreference()) above, this is the WebRtcVideoSendStream, so:

VideoStreamEncoder::SetSource ->
  source->AddOrUpdateSink ->
  WebRtcVideoSendStream::AddOrUpdateSink ->
  source_->AddOrUpdateSink

where source_ is set in WebRtcVideoChannel::WebRtcVideoSendStream::SetVideoSend.

Now, how WebRtcVideoChannel::WebRtcVideoSendStream::SetVideoSend itself gets called:

PeerConnection::CreateSender ->
  sender = RtpSenderProxyWithInternal<RtpSenderInternal>::Create(
      signaling_thread(), new VideoRtpSender(worker_thread(), id)) ->
  sender->SetTrack(track) ->
  VideoRtpSender::SetVideoSend() ->
  media_channel_->SetVideoSend(ssrc_, &options, track_) ->
  WebRtcVideoChannel::WebRtcVideoSendStream::SetVideoSend

Quite a detour — easy to get lost in.
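Stripped of threads and proxies, the binding boils down to forwarding sink registrations along a chain: the encoder registers itself as a sink on the send stream, which forwards the registration to the real capture source once SetVideoSend has provided one. A minimal sketch of that wiring (all class names here — VideoBroadcasterSketch, etc. — are hypothetical stand-ins, not WebRTC code):

```python
class VideoBroadcasterSketch:
    """Plays the role of the VideoBroadcaster inside the track source."""
    def __init__(self):
        self.sinks = []

    def add_or_update_sink(self, sink):
        if sink not in self.sinks:
            self.sinks.append(sink)

    def on_frame(self, frame):
        for sink in self.sinks:
            sink.on_frame(frame)


class SendStreamSketch:
    """Plays the role of WebRtcVideoSendStream: forwards sink
    registration to the capture source set via set_video_send."""
    def __init__(self):
        self.source = None
        self.sink = None

    def add_or_update_sink(self, sink):   # VideoStreamEncoder::SetSource path
        self.sink = sink
        if self.source:
            self.source.add_or_update_sink(sink)

    def set_video_send(self, source):     # VideoRtpSender::SetVideoSend path
        self.source = source
        if self.sink:
            source.add_or_update_sink(self.sink)


class EncoderSketch:
    """Plays the role of VideoStreamEncoder, a VideoSinkInterface."""
    def __init__(self):
        self.frames = []

    def on_frame(self, frame):
        self.frames.append(frame)


source = VideoBroadcasterSketch()
stream = SendStreamSketch()
encoder = EncoderSketch()
stream.add_or_update_sink(encoder)   # encoder binds to the send stream
stream.set_video_send(source)        # the track/source binds to the stream
source.on_frame("frame0")            # a captured frame reaches the encoder
print(encoder.frames)                # -> ['frame0']
```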


class VideoStreamEncoder : public VideoStreamEncoderInterface,
                           private EncodedImageCallback,
                           // Protected only to provide access to tests.
                           protected AdaptationObserverInterface {

VideoStreamEncoderInterface in turn derives from rtc::VideoSinkInterface<VideoFrame>.

VideoStreamEncoder::OnFrame:

src/video/video_stream_encoder.cc

void VideoStreamEncoder::OnFrame(const VideoFrame& video_frame) {
  RTC_DCHECK_RUNS_SERIALIZED(&incoming_frame_race_checker_);
  ...
  encoder_queue_.PostTask([this, incoming_frame, post_time_us, log_stats]() {
    RTC_DCHECK_RUN_ON(&encoder_queue_);
    encoder_stats_observer_->OnIncomingFrame(incoming_frame.width(),
                                             incoming_frame.height());
    ++captured_frame_count_;
    const int posted_frames_waiting_for_encode =
        posted_frames_waiting_for_encode_.fetch_sub(1);
    RTC_DCHECK_GT(posted_frames_waiting_for_encode, 0);
    if (posted_frames_waiting_for_encode == 1) {
      MaybeEncodeVideoFrame(incoming_frame, post_time_us);
    } else {
      // There is a newer frame in flight. Do not encode this frame.
      RTC_LOG(LS_VERBOSE)
          << "Incoming frame dropped due to that the encoder is blocked.";
      ++dropped_frame_count_;
      encoder_stats_observer_->OnFrameDropped(
          VideoStreamEncoderObserver::DropReason::kEncoderQueue);
    }
    ...
  });
}
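The posted_frames_waiting_for_encode_ counter implements a "newest frame wins" policy: the counter goes up when a frame is posted, each task decrements it on the encoder queue, and a frame is only encoded if no newer frame was posted behind it. A single-threaded sketch of that policy (illustrative only; the real code uses an atomic counter and a task queue):

```python
class EncoderQueueSketch:
    def __init__(self):
        self.waiting = 0     # stands in for posted_frames_waiting_for_encode_
        self.tasks = []      # stands in for encoder_queue_
        self.encoded = []
        self.dropped = 0

    def post_frame(self, frame):
        self.waiting += 1
        self.tasks.append(frame)

    def drain(self):
        while self.tasks:
            frame = self.tasks.pop(0)
            remaining = self.waiting   # the value fetch_sub(1) would return
            self.waiting -= 1
            if remaining == 1:
                self.encoded.append(frame)   # MaybeEncodeVideoFrame
            else:
                self.dropped += 1            # a newer frame is in flight


q = EncoderQueueSketch()
q.post_frame("f1")
q.post_frame("f2")   # posted before f1's task ran
q.drain()
print(q.encoded, q.dropped)   # -> ['f2'] 1  (f1 is dropped, f2 encoded)
```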

MaybeEncodeVideoFrame, called from the posted task:

void VideoStreamEncoder::MaybeEncodeVideoFrame(const VideoFrame& video_frame,
                                               int64_t time_when_posted_us) {
  ...
  EncodeVideoFrame(video_frame, time_when_posted_us);
}

On to EncodeVideoFrame:

void VideoStreamEncoder::EncodeVideoFrame(const VideoFrame& video_frame,
                                          int64_t time_when_posted_us) {
  ...
  video_sender_.AddVideoFrame(out_frame, nullptr, encoder_info_);
}

src/modules/video_coding/video_sender.cc

int32_t VideoSender::AddVideoFrame(const VideoFrame& videoFrame,
                                   const CodecSpecificInfo* codecSpecificInfo,
                                   absl::optional<VideoEncoder::EncoderInfo> encoder_info) {
  ...
  int32_t ret =
      _encoder->Encode(converted_frame, codecSpecificInfo, next_frame_types);
  if (ret < 0) {
    RTC_LOG(LS_ERROR) << "Failed to encode frame. Error code: " << ret;
    return ret;
  }
  ...
}

Here, if a software encoder is in use and the frame is not already I420, the frame is converted to I420 first.

In

int32_t ret = _encoder->Encode(converted_frame, codecSpecificInfo, next_frame_types);

the _encoder is a VCMGenericEncoder:

src/modules/video_coding/generic_encoder.cc

int32_t VCMGenericEncoder::Encode(const VideoFrame& frame,
                                  const CodecSpecificInfo* codec_specific,
                                  const std::vector<FrameType>& frame_types) {
  ...
  return encoder_->Encode(frame, codec_specific, &frame_types);
}

Here encoder_ is an ObjCVideoEncoder:

int32_t Encode(const VideoFrame &frame,
               const CodecSpecificInfo *codec_specific_info,
               const std::vector<FrameType> *frame_types) {
  ...
  return [encoder_ encode:ToObjCVideoFrame(frame)
        codecSpecificInfo:nil
               frameTypes:rtcFrameTypes];
}

Here encoder_ is an RTCVideoEncoder, concretely RTCVideoEncoderH264:

- (NSInteger)encode:(RTCVideoFrame *)frame
    codecSpecificInfo:(nullable id<RTCCodecSpecificInfo>)codecSpecificInfo
           frameTypes:(NSArray<NSNumber *> *)frameTypes {
  ...
  OSStatus status = VTCompressionSessionEncodeFrame(_compressionSession,
                                                    pixelBuffer,
                                                    presentationTimeStamp,
                                                    kCMTimeInvalid,
                                                    frameProperties,
                                                    encodeParams.release(),
                                                    nullptr);
  ...

So in the normal camera-capture case, the CVPixelBufferRef produced by the capturer is passed through layer after layer and finally reaches VTCompressionSessionEncodeFrame untouched.

Format conversion


A pixel buffer captured straight from the camera normally needs no format conversion. When conversion is required, WebRTC performs it automatically, but only four pixel formats can be converted to I420:

kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange
kCVPixelFormatType_32BGRA
kCVPixelFormatType_32ARGB

If no conversion is needed along the way, the incoming format can be anything. For example, RTCCameraVideoCapturer captures using the first format the system reports, which on the Mac is usually none of the four above — on my iMac the default is kCVPixelFormatType_422YpCbCr8. Yet nothing on the path to the encoder ever needs to read or convert the pixel data, so the captured buffer is fed straight into the system hardware encoder.

kCVPixelFormatType_420YpCbCr8BiPlanarFullRange and kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange are both NV12; the FullRange variant simply uses the full sample range, so it preserves color slightly better.
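The difference between the two is the value range of the samples: video range places luma in [16, 235] (the BT.601 limited range), while full range uses all of [0, 255]. A quick sketch of the standard limited-to-full luma expansion (plain Python, illustrative):

```python
def limited_to_full_y(y: int) -> int:
    """Expand a video-range (limited) luma sample to full range:
    [16, 235] -> [0, 255], clamping out-of-range input."""
    return max(0, min(255, round((y - 16) * 255 / 219)))

for y in (16, 126, 235):
    print(y, "->", limited_to_full_y(y))
# 16 -> 0, 235 -> 255; mid-range values stretch slightly apart.
```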

The conversion code, RTCCVPixelBuffer's toI420:

- (id<RTCI420Buffer>)toI420 {
  const OSType pixelFormat = CVPixelBufferGetPixelFormatType(_pixelBuffer);
  CVPixelBufferLockBaseAddress(_pixelBuffer, kCVPixelBufferLock_ReadOnly);

  RTCMutableI420Buffer* i420Buffer =
      [[RTCMutableI420Buffer alloc] initWithWidth:[self width] height:[self height]];

  switch (pixelFormat) {
    case kCVPixelFormatType_420YpCbCr8BiPlanarFullRange:
    case kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange: {
      const uint8_t* srcY =
          static_cast<uint8_t*>(CVPixelBufferGetBaseAddressOfPlane(_pixelBuffer, 0));
      const int srcYStride = CVPixelBufferGetBytesPerRowOfPlane(_pixelBuffer, 0);
      const uint8_t* srcUV =
          static_cast<uint8_t*>(CVPixelBufferGetBaseAddressOfPlane(_pixelBuffer, 1));
      const int srcUVStride = CVPixelBufferGetBytesPerRowOfPlane(_pixelBuffer, 1);

      // Crop just by modifying pointers.
      srcY += srcYStride * _cropY + _cropX;
      srcUV += srcUVStride * (_cropY / 2) + _cropX;

      // TODO(magjed): Use a frame buffer pool.
      webrtc::NV12ToI420Scaler nv12ToI420Scaler;
      nv12ToI420Scaler.NV12ToI420Scale(srcY,
                                       srcYStride,
                                       srcUV,
                                       srcUVStride,
                                       _cropWidth,
                                       _cropHeight,
                                       i420Buffer.mutableDataY,
                                       i420Buffer.strideY,
                                       i420Buffer.mutableDataU,
                                       i420Buffer.strideU,
                                       i420Buffer.mutableDataV,
                                       i420Buffer.strideV,
                                       i420Buffer.width,
                                       i420Buffer.height);
      break;
    }
    case kCVPixelFormatType_32BGRA:
    case kCVPixelFormatType_32ARGB: {
      CVPixelBufferRef scaledPixelBuffer = NULL;
      CVPixelBufferRef sourcePixelBuffer = NULL;
      if ([self requiresCropping] ||
          [self requiresScalingToWidth:i420Buffer.width height:i420Buffer.height]) {
        CVPixelBufferCreate(
            NULL, i420Buffer.width, i420Buffer.height, pixelFormat, NULL, &scaledPixelBuffer);
        [self cropAndScaleTo:scaledPixelBuffer withTempBuffer:NULL];

        CVPixelBufferLockBaseAddress(scaledPixelBuffer, kCVPixelBufferLock_ReadOnly);
        sourcePixelBuffer = scaledPixelBuffer;
      } else {
        sourcePixelBuffer = _pixelBuffer;
      }
      const uint8_t* src =
          static_cast<uint8_t*>(CVPixelBufferGetBaseAddress(sourcePixelBuffer));
      const size_t bytesPerRow = CVPixelBufferGetBytesPerRow(sourcePixelBuffer);

      if (pixelFormat == kCVPixelFormatType_32BGRA) {
        // Corresponds to libyuv::FOURCC_ARGB
        libyuv::ARGBToI420(src,
                           bytesPerRow,
                           i420Buffer.mutableDataY,
                           i420Buffer.strideY,
                           i420Buffer.mutableDataU,
                           i420Buffer.strideU,
                           i420Buffer.mutableDataV,
                           i420Buffer.strideV,
                           i420Buffer.width,
                           i420Buffer.height);
      } else if (pixelFormat == kCVPixelFormatType_32ARGB) {
        // Corresponds to libyuv::FOURCC_BGRA
        libyuv::BGRAToI420(src,
                           bytesPerRow,
                           i420Buffer.mutableDataY,
                           i420Buffer.strideY,
                           i420Buffer.mutableDataU,
                           i420Buffer.strideU,
                           i420Buffer.mutableDataV,
                           i420Buffer.strideV,
                           i420Buffer.width,
                           i420Buffer.height);
      }

      if (scaledPixelBuffer) {
        CVPixelBufferUnlockBaseAddress(scaledPixelBuffer, kCVPixelBufferLock_ReadOnly);
        CVBufferRelease(scaledPixelBuffer);
      }
      break;
    }
    default: { RTC_NOTREACHED() << "Unsupported pixel format."; }
  }

  CVPixelBufferUnlockBaseAddress(_pixelBuffer, kCVPixelBufferLock_ReadOnly);
  return i420Buffer;
}
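Structurally, what NV12ToI420Scale does in the NV12 branch is split the interleaved UV plane into separate U and V planes (plus optional scaling, which is omitted here). The plane split alone, for a frame already at target size (illustrative Python, not the libyuv code):

```python
def nv12_to_i420(y_plane, uv_plane):
    """Split NV12's interleaved chroma plane (U0 V0 U1 V1 ...) into the
    separate U and V planes of I420. The Y plane is copied unchanged."""
    u = uv_plane[0::2]   # even positions hold U samples
    v = uv_plane[1::2]   # odd positions hold V samples
    return list(y_plane), u, v

# A 2x2 frame: four Y samples and one interleaved UV pair.
y, u, v = nv12_to_i420([10, 20, 30, 40], [100, 200])
print(y, u, v)   # -> [10, 20, 30, 40] [100] [200]
```

This is also why the crop pointer math above uses `srcUV += srcUVStride * (_cropY / 2) + _cropX`: the chroma plane has half the rows of the luma plane, but each chroma row holds width bytes because U and V are interleaved.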

CopyVideoFrameToNV12PixelBuffer converts an RTCI420Buffer back into an NV12 CVPixelBufferRef, using libyuv::I420ToNV12 internally:

bool CopyVideoFrameToNV12PixelBuffer(id<RTCI420Buffer> frameBuffer, CVPixelBufferRef pixelBuffer) {
  RTC_DCHECK(pixelBuffer);
  RTC_DCHECK_EQ(CVPixelBufferGetPixelFormatType(pixelBuffer), kNV12PixelFormat);
  RTC_DCHECK_EQ(CVPixelBufferGetHeightOfPlane(pixelBuffer, 0), frameBuffer.height);
  RTC_DCHECK_EQ(CVPixelBufferGetWidthOfPlane(pixelBuffer, 0), frameBuffer.width);

  CVReturn cvRet = CVPixelBufferLockBaseAddress(pixelBuffer, 0);
  if (cvRet != kCVReturnSuccess) {
    RTC_LOG(LS_ERROR) << "Failed to lock base address: " << cvRet;
    return false;
  }
  uint8_t *dstY = reinterpret_cast<uint8_t *>(CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0));
  int dstStrideY = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0);
  uint8_t *dstUV = reinterpret_cast<uint8_t *>(CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1));
  int dstStrideUV = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 1);

  // Convert I420 to NV12.
  int ret = libyuv::I420ToNV12(frameBuffer.dataY,
                               frameBuffer.strideY,
                               frameBuffer.dataU,
                               frameBuffer.strideU,
                               frameBuffer.dataV,
                               frameBuffer.strideV,
                               dstY,
                               dstStrideY,
                               dstUV,
                               dstStrideUV,
                               frameBuffer.width,
                               frameBuffer.height);
  CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
  if (ret) {
    RTC_LOG(LS_ERROR) << "Error converting I420 VideoFrame to NV12 :" << ret;
    return false;
  }
  return true;
}
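The layout transformation at the heart of I420ToNV12 is the inverse of the earlier split: the two separate chroma planes are interleaved back into NV12's single UV plane, so the two conversions round-trip exactly. A sketch of the interleave step (illustrative Python, not the libyuv code):

```python
def i420_to_nv12(y, u, v):
    """Interleave I420's separate U and V planes into NV12's single
    UV plane (U0 V0 U1 V1 ...). The Y plane is copied unchanged."""
    uv = []
    for us, vs in zip(u, v):
        uv.extend((us, vs))
    return list(y), uv

y_out, uv = i420_to_nv12([10, 20, 30, 40], [100], [200])
print(uv)                  # -> [100, 200]
print(uv[0::2], uv[1::2])  # de-interleaving recovers the original planes
```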

CVPixelBufferRef


CVPixelBufferRef supports many pixel formats; CVPixelBufferGetPixelFormatType returns the format of a given buffer.
Common formats include:

kCVPixelFormatType_32ARGB         = 0x00000020,
kCVPixelFormatType_32BGRA         = 'BGRA',
kCVPixelFormatType_420YpCbCr8PlanarFullRange    = 'f420',
kCVPixelFormatType_420YpCbCr8BiPlanarFullRange  = '420f',

Many more formats are defined in CVPixelBuffer.h.
'f420' is planar 4:2:0 (420p, i.e. I420), while '420f' is the biplanar NV12 format the system uses by default.
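These constants are FourCC codes: a four-character code packed big-endian into a 32-bit OSType ('BGRA' is simply the four ASCII bytes as an integer), while older entries like kCVPixelFormatType_32ARGB = 0x00000020 are plain numbers. A small helper to pack and decode them (plain Python, illustrative):

```python
def fourcc(code: str) -> int:
    """Pack a 4-character code into a big-endian 32-bit OSType value."""
    assert len(code) == 4
    n = 0
    for ch in code:
        n = (n << 8) | ord(ch)
    return n

def fourcc_str(n: int) -> str:
    """Decode an OSType value back into its 4-character code."""
    return bytes((n >> s) & 0xFF for s in (24, 16, 8, 0)).decode("ascii")

print(hex(fourcc("BGRA")))          # the integer behind kCVPixelFormatType_32BGRA
print(fourcc_str(fourcc("420f")))   # -> 420f (round-trips exactly)
```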
