Embedded Surveillance [v4l2 capture -> VPU encoding -> live555 streaming]

Table of Contents

  • Embedded Surveillance [v4l2 capture -> VPU encoding -> live555 streaming]
    • Introduction
    • Data Flow
    • 1. v4l2
      • 1.1 Determining the camera's output format
      • 1.2 Converting YUYV to YUV420
      • 1.3 Playing back the captured YUV420 data
    • 2. VPU hardware encoding
      • 2.1 Hardware H.264 encoding with mxc_vpu_test.out
    • 3. Streaming with live555MediaServer
    • Summary

The previous post introduced the libraries and overall architecture of the video surveillance project. This post is an optimization and supplement: capture is switched from OpenCV to v4l2, and encoding is switched from x264 to the VPU.

Introduction

Because x264 consumed a large amount of CPU on the ARM Cortex-A9 board and caused the video to stutter, I decided to capture the UVC data with v4l2 and encode it with the VPU.
The development board is an i.MX6Q, which has an on-board VPU/IPU that can take over the work of the x264 software encoder and free up CPU resources.

Data Flow

v4l2 -> yuyv -> yuv420 -> vpu -> h264
The data path was verified step by step:

  1. Capture YUYV data with v4l2, write it to a file, and play it with pYUV to verify it.
  2. Convert the YUYV data to YUV420, write it to a file, and play it with pYUV to verify it.
  3. Encode the YUV420 file to H.264 with the VPU, stream it with live555, and play it remotely with VLC to verify it.

1. v4l2

To keep the UVC (USB Video Class) capture path simple, v4l2 is used directly instead of OpenCV; the Mat (RGB) data that OpenCV captures would need an additional conversion.

1.1 Determining the camera's output format

There are two kinds of cameras: camera modules, which usually output MJPG, and UVC cameras, which output YUYV (4:2:2).
How do you tell which kind you have? Cameras attached over USB are generally UVC, while cameras embedded on the board usually output MJPG; the command-line tool v4l2-ctl can be used to check.

Below is my Logitech USB camera:

root@zjy-T440:~/workStation/crossGcc/vpu/mxc_vpu_test# v4l2-ctl -d /dev/video0 --all
Driver Info (not using libv4l2):
	Driver name   : uvcvideo
	Card type     : Integrated Camera: Integrated C
	Bus info      : usb-0000:00:14.0-8
	Driver version: 5.3.18
	Capabilities  : 0x84A00001
		Video Capture
		Metadata Capture
		Streaming
		Extended Pix Format
		Device Capabilities
	Device Caps   : 0x04200001
		Video Capture
		Streaming
		Extended Pix Format
Priority: 2
Video input : 0 (Camera 1: ok)
Format Video Capture:
	Width/Height      : 640/480
	Pixel Format      : 'YUYV'
	Field             : None
	Bytes per Line    : 1280
	Size Image        : 614400
	Colorspace        : sRGB
	Transfer Function : Default (maps to sRGB)
	YCbCr/HSV Encoding: Default (maps to ITU-R 601)
	Quantization      : Default (maps to Limited Range)
	Flags             :
Crop Capability Video Capture:
	Bounds      : Left 0, Top 0, Width 640, Height 480
	Default     : Left 0, Top 0, Width 640, Height 480
	Pixel Aspect: 1/1
Selection: crop_default, Left 0, Top 0, Width 640, Height 480
Selection: crop_bounds, Left 0, Top 0, Width 640, Height 480
Streaming Parameters Video Capture:
	Capabilities     : timeperframe
	Frames per second: 30.000 (30/1)
	Read buffers     : 0
brightness 0x00980900 (int)    : min=0 max=255 step=1 default=128 value=128
contrast 0x00980901 (int)    : min=0 max=255 step=1 default=32 value=32
saturation 0x00980902 (int)    : min=0 max=100 step=1 default=64 value=64
hue 0x00980903 (int)    : min=-180 max=180 step=1 default=0 value=0
white_balance_temperature_auto 0x0098090c (bool)   : default=1 value=1
gamma 0x00980910 (int)    : min=90 max=150 step=1 default=120 value=120
power_line_frequency 0x00980918 (menu)   : min=0 max=2 default=1 value=1
white_balance_temperature 0x0098091a (int)    : min=2800 max=6500 step=1 default=4000 value=4000 flags=inactive
sharpness 0x0098091b (int)    : min=0 max=7 step=1 default=2 value=2
backlight_compensation 0x0098091c (int)    : min=0 max=2 step=1 default=1 value=1
exposure_auto 0x009a0901 (menu)   : min=0 max=3 default=3 value=3
exposure_absolute 0x009a0902 (int)    : min=4 max=1250 step=1 default=166 value=166 flags=inactive
exposure_auto_priority 0x009a0903 (bool)   : default=0 value=1

Below is the development board's camera:

root@imx6qsabresd:/home# v4l2-ctl -d /dev/video2 --all
Driver Info (not using libv4l2):
	Driver name   : uvcvideo
	Card type     : USB Camera
	Bus info      : usb-ci_hdrc.1-1.2
	Driver version: 4.1.15
	Capabilities  : 0x84200001
		Video Capture
		Streaming
		Extended Pix Format
		Device Capabilities
	Device Caps   : 0x04200001
		Video Capture
		Streaming
		Extended Pix Format
Priority: 2
Video input : 0 (Camera 1: ok)
Format Video Capture:
	Width/Height  : 1280/720
	Pixel Format  : 'MJPG'
	Field         : None
	Bytes per Line: 0
	Size Image    : 1843200
	Colorspace    : SRGB
	Flags         :
Crop Capability Video Capture:
	Bounds      : Left 0, Top 0, Width 1280, Height 720
	Default     : Left 0, Top 0, Width 1280, Height 720
	Pixel Aspect: 1/1
Selection: crop_default, Left 0, Top 0, Width 1280, Height 720
Selection: crop_bounds, Left 0, Top 0, Width 1280, Height 720
Streaming Parameters Video Capture:
	Capabilities     : timeperframe
	Frames per second: 30.000 (30/1)
	Read buffers     : 0
brightness (int)    : min=-64 max=64 step=1 default=0 value=0
contrast (int)    : min=0 max=100 step=1 default=50 value=50
saturation (int)    : min=0 max=100 step=1 default=50 value=50
hue (int)    : min=-180 max=180 step=1 default=0 value=0
white_balance_temperature_auto (bool)   : default=1 value=1
gamma (int)    : min=100 max=500 step=1 default=300 value=300
power_line_frequency (menu)   : min=0 max=2 default=1 value=1
white_balance_temperature (int)    : min=2800 max=6500 step=10 default=4600 value=4600 flags=inactive
sharpness (int)    : min=0 max=100 step=1 default=50 value=50
backlight_compensation (int)    : min=0 max=2 step=1 default=0 value=0
exposure_auto (menu)   : min=0 max=3 default=3 value=3
exposure_absolute (int)    : min=50 max=10000 step=1 default=166 value=166 flags=inactive
exposure_auto_priority (bool)   : default=0 value=1
pan_absolute (int)    : min=-57600 max=57600 step=3600 default=0 value=0
tilt_absolute (int)    : min=-43200 max=43200 step=3600 default=0 value=0
zoom_absolute (int)    : min=0 max=3 step=1 default=0 value=0

An MJPG camera outputs JPEG images that can be displayed directly, frame after frame, to form a video. However, to convert them to H.264 you would first have to decode to YUV and then encode to H.264. For that reason a UVC camera is used here; the data it captures is YUYV.
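To lock the capture format in code, rather than only checking it with v4l2-ctl, the pixel format can be requested with the standard V4L2 VIDIOC_S_FMT ioctl. A minimal sketch follows; the device node /dev/video0 and the 640x480 YUYV request are assumptions for this board, and since the driver may adjust the values, the returned format should always be checked:

#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <sys/ioctl.h>
#include <unistd.h>
#include <linux/videodev2.h>

int main(void)
{
    /* /dev/video0 and 640x480 are assumptions; adjust for your setup. */
    int fd = open("/dev/video0", O_RDWR);
    if (fd < 0) { perror("open"); return 1; }

    struct v4l2_format fmt;
    memset(&fmt, 0, sizeof(fmt));
    fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    fmt.fmt.pix.width       = 640;
    fmt.fmt.pix.height      = 480;
    fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_YUYV;   /* packed 4:2:2, as reported by v4l2-ctl */
    fmt.fmt.pix.field       = V4L2_FIELD_NONE;     /* progressive scan */

    if (ioctl(fd, VIDIOC_S_FMT, &fmt) < 0) { perror("VIDIOC_S_FMT"); close(fd); return 1; }

    /* The driver may have adjusted width/height/sizeimage; use what it returns. */
    printf("negotiated %ux%u, sizeimage=%u\n",
           fmt.fmt.pix.width, fmt.fmt.pix.height, fmt.fmt.pix.sizeimage);

    close(fd);
    return 0;
}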

1.2 Converting YUYV to YUV420

To use VPU hardware encoding, the frames must be converted to the VPU's input format, YUV420.
The details of the YUV formats are left to the reader; the conversion code is below:

int YUV422To420(unsigned char yuv422[], unsigned char yuv420[], int width, int height)
{
    int ynum = width * height;
    int i, j, k = 0;

    /* extract the Y plane: every other byte of the packed YUYV stream */
    for (i = 0; i < ynum; i++) {
        yuv420[i] = yuv422[i * 2];
    }

    /* extract the U plane: U samples from even rows only */
    for (i = 0; i < height; i++) {
        if ((i % 2) != 0)
            continue;
        for (j = 0; j < (width / 2); j++) {
            if ((4 * j + 1) > (2 * width))
                break;
            yuv420[ynum + k * 2 * width / 4 + j] = yuv422[i * 2 * width + 4 * j + 1];
        }
        k++;
    }
    k = 0;

    /* extract the V plane: V samples from odd rows only */
    for (i = 0; i < height; i++) {
        if ((i % 2) == 0)
            continue;
        for (j = 0; j < (width / 2); j++) {
            if ((4 * j + 3) > (2 * width))
                break;
            yuv420[ynum + ynum / 4 + k * 2 * width / 4 + j] = yuv422[i * 2 * width + 4 * j + 3];
        }
        k++;
    }

    return 1;
}
Note the frame size calculations for YUYV and YUV420:
YUYV: h*w*2
YUV420: h*w*3/2
size(YUV420) = size(YUYV)*3/4
If this size difference is ignored, the picture will come out corrupted.
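As a usage note, here is a minimal sketch of how the two buffer sizes relate when calling YUV422To420 above. The 640x480 resolution and the buffer names are assumptions; in the real capture loop frame_yuyv is filled from the dequeued v4l2 buffer:

#include <stdlib.h>

int YUV422To420(unsigned char yuv422[], unsigned char yuv420[], int width, int height);

int main(void)
{
    int width = 640, height = 480;                       /* assumed capture resolution */
    size_t yuyv_size   = (size_t)width * height * 2;     /* h*w*2   = 614400 bytes */
    size_t yuv420_size = (size_t)width * height * 3 / 2; /* h*w*3/2 = 460800 bytes */

    unsigned char *frame_yuyv   = malloc(yuyv_size);     /* filled from the v4l2 dequeued buffer */
    unsigned char *frame_yuv420 = malloc(yuv420_size);   /* handed to the VPU / written to out.yuv */
    if (!frame_yuyv || !frame_yuv420)
        return 1;

    YUV422To420(frame_yuyv, frame_yuv420, width, height);

    free(frame_yuyv);
    free(frame_yuv420);
    return 0;
}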

1.3 Playing back the captured YUV420 data

When working with YUV you will inevitably need a raw-YUV player; pYUV can be installed and used on Linux.

size: the frame size captured from the camera
Color space: YUV
Subsampling: data captured directly from the UVC camera is 4:2:2; convert it to 4:2:0 yourself
Interleaved: after conversion to 4:2:0, leave this unchecked
One pair of concepts is easy to confuse here: scanning mode versus storage layout. "Interleaved" in pYUV refers to the storage layout, which is not the same thing as fmt.pix.field = V4L2_FIELD_NONE in v4l2; that setting controls the scanning mode.

2. VPU hardware encoding

NXP provides official documentation for the i.MX6 VPU: VPU_API_RM_L3.0.35_1.1.0.pdf.
Section 3.3.1.2, Encoder Operation Flow:
To encode a bitstream, the application completes the following steps:

  1. Call vpu_Init() to initialize the VPU.
  2. Open a encoder instance by using vpu_EncOpen().
  3. Before starting a picture encoder operation, get crucial parameters for encoder operations such as required frame buffer
    size by using vpu_EncGetInitialInfo().
  4. By using the returned frame buffer requirement, allocate size of frame buffers and convey this information to the VPU
    by using vpu_EncRegisterFrameBuffer().
  5. Generate high-level header syntax by using vpu_EncGiveCommand().
  6. Start picture encoder operation picture-by-picture by using vpu_EncStartOneFrame().
  7. Wait the completion of picture encoder operation interrupt event.
  8. After encoding a frame is complete, check the results of encoder operation by using vpu_EncGetOutputInfo().
  9. If there are more frames to encode, go to Step 4. Otherwise, go to the next step.
  10. Terminate the sequence operation by closing the instance using vpu_EncClose().
  11. Call vpu_UnInit() to release the system resources.
The encoder operation flow is shown in a figure in the PDF (not reproduced here); a trimmed code sketch of the same call sequence follows.
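To make the call order concrete, here is a heavily trimmed C sketch of that flow. The vpu_* function names are the ones listed in the steps above (the busy-wait uses vpu_IsBusy()/vpu_WaitForInt() as mentioned in the camera-capture flow quoted below); the struct names, field names, and parameter values are assumptions based on the VPU API headers, and real code must allocate the bitstream and frame buffers with IOGetPhyMem() and check every return code:

#include <vpu_lib.h>    /* i.MX VPU API, see VPU_API_RM_L3.0.35_1.1.0.pdf */

/* Sketch only: buffer allocation, parameter setup and error handling are omitted. */
static void encode_flow_sketch(int frames_to_encode)
{
    EncHandle handle;
    EncOpenParam encop = {0};
    EncInitialInfo initinfo = {0};
    EncParam encparam = {0};
    EncOutputInfo outinfo = {0};

    vpu_Init(NULL);                              /* step 1: initialize the VPU */

    encop.bitstreamFormat = STD_AVC;             /* encode to H.264 */
    /* encop.bitstreamBuffer, picWidth, picHeight, ... must be filled in here */
    vpu_EncOpen(&handle, &encop);                /* step 2: open an encoder instance */

    vpu_EncGetInitialInfo(handle, &initinfo);    /* step 3: query frame buffer requirements */

    /* step 4: allocate initinfo.minFrameBufferCount frame buffers (IOGetPhyMem)
     * and register them with vpu_EncRegisterFrameBuffer(handle, ...); */

    /* step 5: generate the SPS/PPS headers with vpu_EncGiveCommand(handle, ...); */

    for (int i = 0; i < frames_to_encode; i++) {
        /* fill the registered source frame buffer with one YUV420 picture here */
        vpu_EncStartOneFrame(handle, &encparam); /* step 6: start encoding one picture */
        while (vpu_IsBusy())                     /* step 7: wait for the interrupt event */
            vpu_WaitForInt(200);
        vpu_EncGetOutputInfo(handle, &outinfo);  /* step 8: read back the encoder results */
        /* the H.264 data is now available in the bitstream buffer */
    }

    vpu_EncClose(handle);                        /* step 10: close the instance */
    vpu_UnInit();                                /* step 11: release system resources */
}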

Section 4.4.2.2, Encode Stream from Camera Captured Data:
The application should complete the following steps to encode streams from camera captured data:

  1. Call vpu_Init() to initialize the VPU. If there are multi-instances supported in this application, this function only needs
    to be called once.
  2. Open a encoder instance using vpu_EncOpen(). Call IOGetPhyMem() to input encop.bitstreamBuffer for the physical
    continuous bitstream buffer before opening the instance. Call IOGetVirtMem() to get the corresponding virtual
    address of the bitstream buffer, then fill the bitstream to this address in user space. If rotation is enabled and the
    rotation angle is 90° or 270°, the picture width and height must be swapped.
  3. If rotation is enabled, give commands ENABLE_ROTATION and SET_ROTATION_ANGLE. If mirror is enabled,
    give commands ENABLE_MIRRORING and SET_MIRROR_DIRECTION.
  4. Get crucial parameters for encoder operations such as required frame buffer size, and so on using
    vpu_EncGetInitialInfo().
  5. Using the frame buffer requirement returned from vpu_DecGetInitialInfo(), allocate the proper size of the frame
    buffers and notify the VPU using vpu_EncRegisterFrameBuffer(). The requested frame buffer for the source frame in
    PATH_V4L2 to encode camera captured data is as follows:
    • Allocate the minFrameBufferCount frame buffers by calling IOGetPhyMem() and register them to the VPU for
    encoder using vpu_EncRegisterFrameBuffer().
    • Another frame buffer is needed for the source frame buffer. Call v4l_capture_setup() to open the v4l device for
    camera and request v4l buffers. In this example, three v4l buffers are allocated. Call v4l_start_capturing() to
    start camera capture. Pass the dequeued v4l buffer address by calling v4l_get_capture_data() as encoder source
    frame in each picture encoder, then no need to memory transfer for performance improvement.
  6. Generate the high-level header syntaxes using vpu_EncGiveCommand().
  7. Start picture encoder operation picture-by-picture using vpu_EncStartOneFrame(). Pass dequeued v4l buffer address
    by calling v4l_get_capture_data() as the encoder source frame before each picture encoder is started.
  8. Wait for the completion of picture decoder operation interrupt event calling vpu_WaitforInt(). Use vpu_IsBusy() to
    check if the VPU is busy. If the VPU is not busy, go to the next step; otherwise, wait again.
  9. After encoding a frame is complete, check the results of encoder operation using vpu_EncGetOutputInfo(). After the
    output information is received, call v4l_put_capture_data() to the VIDIOC_QBUF v4l buffer for the next capture
    usage.
  10. If there are more frames to encode, go to Step 7; otherwise, go to the next step.
  11. Terminate the sequence operation by closing the instance using vpu_DecClose(). Make sure
    vpu_DecGetOutputInfo() is called for each corresponding vpu_DecStartOneFrame() call before closing the instance
    although the last output information may be not useful.
  12. Free all allocated memory and v4l resource using IOFreePhyMem() and IOFreeVirtMem(). Call
    v4l_stop_capturing() to stop capture.
  13. Call vpu_UnInit() to release the system resources. If there are multi-instances supported in this application, this
    function only needs to be called once.
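The part that differs from the file-based flow is the per-frame handoff in steps 7 and 9: the dequeued v4l buffer itself is used as the encoder source frame, so no memory copy is needed. Below is a rough sketch of that inner loop; the v4l_* helper names come from the mxc_vpu_test sources as quoted above, but their prototypes here, and the way the source frame is wired up, are assumptions:

#include <linux/videodev2.h>
#include <vpu_lib.h>

/* v4l_* helpers come from the mxc_vpu_test sources; prototypes are assumed. */
extern int  v4l_get_capture_data(struct v4l2_buffer *buf);
extern void v4l_put_capture_data(struct v4l2_buffer *buf);

/* Per-frame loop: zero-copy handoff of the captured frame to the VPU encoder. */
static void capture_encode_loop(EncHandle handle, EncParam *encparam,
                                EncOutputInfo *outinfo, int frames)
{
    struct v4l2_buffer v4l2_buf;

    for (int i = 0; i < frames; i++) {
        /* step 7: dequeue a captured frame and use its buffer directly
         * as the encoder source frame (no memcpy) */
        v4l_get_capture_data(&v4l2_buf);
        /* encparam->sourceFrame = <frame buffer backed by v4l2_buf>;  (assumption) */

        vpu_EncStartOneFrame(handle, encparam);
        while (vpu_IsBusy())                 /* step 8: wait for completion */
            vpu_WaitForInt(200);
        vpu_EncGetOutputInfo(handle, outinfo);

        /* step 9: requeue (VIDIOC_QBUF) the v4l buffer for the next capture */
        v4l_put_capture_data(&v4l2_buf);
    }
}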

The document explains both VPU encoding and decoding clearly, and the material shipped with the development board includes the source code and help files for mxc_vpu_test.out.

2.1 Hardware H.264 encoding with mxc_vpu_test.out

The development board ships with the VPU test tool and its source code: mxc_vpu_test.out. It is used here first to encode the YUV420 file to H.264 and confirm that the data is correct.

root@imx6qsabresd:/# cd /unit_tests/
root@imx6qsabresd:/unit_tests# ./mxc_vpu_test.out -E "-i out.yuv -w 624 -h 416 -f 2 -o file.264 -t 0"
root@imx6qsabresd:/home# ./mxc_vpu_test.out
[INFO]
Usage: ./mxc_vpu_test.out -D "<decode options>" -E "<encode options>" -L "<loopback options>" -C <config file> -T "<transcode options>" -H display this help
decode options
  -i <input file>  Read input from file. If no input file is specified, default is network
  -o <output file>  Write output to file. If no output is specified, default is LCD
  -x <output method>  output mode V4l2(0) or IPU lib(1). 0 - V4L2 of FG device, 1 - IPU lib path, 2 - G2D (available for Android only). Other value means V4L2 with other video node: 16 - /dev/video16, 17 - /dev/video17, and so on
  -f <format>  0 - MPEG4, 1 - H.263, 2 - H.264, 3 - VC1, 4 - MPEG2, 5 - DIV3, 6 - RV, 7 - MJPG, 8 - AVS, 9 - VP8. If no format specified, default is 0 (MPEG4)
  -l <mp4Class / h264 type>  When 'f' flag is 0 (MPEG4), it is mp4 class type: 0 - MPEG4, 1 - DIVX 5.0 or higher, 2 - XVID, 5 - DIVX4.0. When 'f' flag is 2 (H.264), it is h264 type: 0 - normal H.264(AVC), 1 - MVC
  -p <port number>  UDP port number to bind. If no port number is specified, 5555 is used
  -c <count>  Number of frames to decode
  -d <deblocking>  Enable deblock - 1. enabled; default deblock is disabled (0).
  -e <dering>  Enable dering - 1. enabled; default dering is disabled (0).
  -r <rotation angle>  0, 90, 180, 270; default rotation is disabled (0)
  -m <mirror direction>  0, 1, 2, 3; default no mirroring (0)
  -u <ipu/gpu rotation>  Using IPU/GPU rotation for display - 1. IPU/GPU rotation; default is VPU rotation (0). This flag is effective when 'r' flag is specified.
  -v <vdi motion>  set IPU VDI motion algorithm l, m, h; default is m-medium.
  -w <width>  display picture width; default is source picture width.
  -h <height>  display picture height; default is source picture height
  -j <left offset>  display picture left offset; default is 0.
  -k <top offset>  display picture top offset; default is 0
  -a <frame rate>  display framerate; default is 30
  -t <chromaInterleave>  CbCr interleaved; default is interleave (1).
  -s <prescan/bs_mode>  Enable prescan in decoding on i.mx5x - 1. enabled; default is disabled. Bitstream mode in decoding on i.mx6: 0. Normal mode, 1. Rollback mode; default is enabled.
  -y <maptype>  Map type for GDI interface: 0 - Linear frame map, 1 - frame MB map, 2 - field MB map; default is 0.
encode options
  -i <input file>  Read input from file (yuv). If no input file specified, default is camera
  -x <input method>  input mode V4L2 with video node: 0 - /dev/video0, 1 - /dev/video1, and so on
  -o <output file>  Write output to file. This option will be ignored if 'n' is specified. If no output is specified, def files are created
  -n <ip address>  Send output to this IP address
  -p <port number>  UDP port number at server. If no port number is specified, 5555 is used
  -f <format>  0 - MPEG4, 1 - H.263, 2 - H.264, 7 - MJPG. If no format specified, default is 0 (MPEG4)
  -l <h264 type>  0 - normal H.264(AVC), 1 - MVC
  -c <count>  Number of frames to encode
  -r <rotation angle>  0, 90, 180, 270; default rotation is disabled (0)
  -m <mirror direction>  0, 1, 2, 3; default no mirroring (0)
  -w <width>  capture image width; default is 176.
  -h <height>  capture image height; default is 144
  -b <bitrate in kbps>  default is auto (0)
  -g <gop size>  default is 0
  -t <chromaInterleave>  CbCr interleaved; default is interleave (1).
  -q <quantization parameter>  default is 20
  -a <frame rate>  capture/encode framerate; default is 30
loopback options
  -x <input method>  input mode V4L2 with video node: 0 - /dev/video0, 1 - /dev/video1, and so on
  -f <format>  0 - MPEG4, 1 - H.263, 2 - H.264, 7 - MJPG. If no format specified, default is 0 (MPEG4)
  -w <width>  capture image width; default is 176.
  -h <height>  capture image height; default is 144
  -t <chromaInterleave>  CbCr interleaved; default is interleave (1).
  -a <frame rate>  capture/encode/display framerate; default is 30
transcode options, encoder set to h264 720p now
  -i <input file>  Read input from file. If no input file is specified, default is network
  -o <output file>  Write output to file. If no output is specified, default is LCD
  -x <output method>  V4l2(0) or IPU lib(1)
  -f <format>  0 - MPEG4, 1 - H.263, 2 - H.264, 3 - VC1, 4 - MPEG2, 5 - DIV3, 6 - RV, 7 - MJPG, 8 - AVS, 9 - VP8. If no format specified, default is 0 (MPEG4)
  -l <mp4Class / h264 type>  When 'f' flag is 0 (MPEG4), it is mp4 class type: 0 - MPEG4, 1 - DIVX 5.0 or higher, 2 - XVID, 5 - DIVX4.0. When 'f' flag is 2 (H.264), it is h264 type: 0 - normal H.264(AVC), 1 - MVC
  -p <port number>  UDP port number to bind. If no port number is specified, 5555 is used
  -c <count>  Number of frames to decode
  -d <deblocking>  Enable deblock - 1. enabled; default deblock is disabled (0).
  -e <dering>  Enable dering - 1. enabled; default dering is disabled (0).
  -r <rotation angle>  0, 90, 180, 270; default rotation is disabled (0)
  -m <mirror direction>  0, 1, 2, 3; default no mirroring (0)
  -u <ipu rotation>  Using IPU rotation for display - 1. IPU rotation; default is VPU rotation (0). This flag is effective when 'r' flag is specified.
  -v <vdi motion>  set IPU VDI motion algorithm l, m, h; default is m-medium.
  -w <width>  display picture width; default is source picture width.
  -h <height>  display picture height; default is source picture height
  -j <left offset>  display picture left offset; default is 0.
  -k <top offset>  display picture top offset; default is 0
  -a <frame rate>  display framerate; default is 30
  -t <chromaInterleave>  CbCr interleaved; default is interleave (1).
  -s <prescan/bs_mode>  Enable prescan in decoding on i.mx5x - 1. enabled; default is disabled. Bitstream mode in decoding on i.mx6: 0. Normal mode, 1. Rollback mode; default is enabled.
  -y <maptype>  Map type for GDI interface: 0 - Linear frame map, 1 - frame MB map, 2 - field MB map
  -q <quantization parameter>  default is 20
config file - Use config file for specifying options

./mxc_vpu_test.out -E "-i out.yuv -w 624 -h 416 -f 2 -o file.264 -t 0"
-E encode operation
-i input YUV file
-w -h video resolution
-f output format, 2 = H.264
-o output file name
-t whether CbCr is stored interleaved
After encoding, file.264 is produced, which can already be streamed with live555 and played.

3. Streaming with live555MediaServer

As in the previous post, the cross-compiled live555MediaServer is used here to stream the file directly for testing.

root@imx6qsabresd:/home# ./live555MediaServer
LIVE555 Media Server
	version 0.99 (LIVE555 Streaming Media library version 2020.01.28).
Play streams from this server using the URL
	rtsp://192.168.2.11/<filename>
where <filename> is a file present in the current directory.
Each file's type is inferred from its name suffix:
	".264" => a H.264 Video Elementary Stream file
	".265" => a H.265 Video Elementary Stream file
	".aac" => an AAC Audio (ADTS format) file
	".ac3" => an AC-3 Audio file
	".amr" => an AMR Audio file
	".dv" => a DV Video file
	".m4e" => a MPEG-4 Video Elementary Stream file
	".mkv" => a Matroska audio+video+(optional)subtitles file
	".mp3" => a MPEG-1 or 2 Audio file
	".mpg" => a MPEG-1 or 2 Program Stream (audio+video) file
	".ogg" or ".ogv" or ".opus" => an Ogg audio and/or video file
	".ts" => a MPEG Transport Stream file
		(a ".tsx" index file - if present - provides server 'trick play' support)
	".vob" => a VOB (MPEG-2 video with AC-3 audio) file
	".wav" => a WAV Audio file
	".webm" => a WebM audio(Vorbis)+video(VP8) file
See http://www.live555.com/mediaServer/ for additional documentation.
(We use port 80 for optional RTSP-over-HTTP tunneling, or for HTTP live streaming (for indexed Transport Stream files only).)

Once it is streaming, rtsp://192.168.2.11/file.264 can be played with VLC or any other RTSP player.

Summary

With the whole data path now verified end to end, the next step is to integrate the code.
The source code is being cleaned up; stay tuned.
