Reading the encoded (or decoded) data asynchronously greatly improves efficiency.

You can simply start a dedicated thread that keeps reading the output, as sketched below.
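A minimal sketch of such a drain thread, assuming a MediaCodec instance mCodec that has already been configured and started (the name mCodec and the 10 ms timeout are illustrative, not taken from the article below):

// Dedicated output-drain thread (sketch); `mCodec` is assumed to be a started MediaCodec.
Thread drainThread = new Thread(new Runnable() {
    @Override
    public void run() {
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        ByteBuffer[] outputBuffers = mCodec.getOutputBuffers();
        while (!Thread.interrupted()) {
            int index = mCodec.dequeueOutputBuffer(info, 10000); // timeout in microseconds
            if (index >= 0) {
                ByteBuffer buf = outputBuffers[index];
                buf.position(info.offset);
                buf.limit(info.offset + info.size);
                // copy or consume the encoded/decoded frame here ...
                mCodec.releaseOutputBuffer(index, false);
            } else if (index == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
                outputBuffers = mCodec.getOutputBuffers(); // buffer array was replaced
            }
            // MediaCodec.INFO_TRY_AGAIN_LATER simply means no output is ready yet.
        }
    }
});
drainThread.start();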

---------------------------------------------------------------------------------------------------------------

https://blog.csdn.net/u013028621/article/details/62417181

0. Overview

MediaCodec is the hardware encoding/decoding API available since Android API 16; for the English documentation refer to this link, and a Chinese translation can be found at this link. This article records how to use MediaCodec to encode and decode video. At the end, a complete example shows how to encode Camera preview data into H.264, then decode that H.264 and display it in a SurfaceView. Audio encoding and decoding is not covered.

1. Encoding video with MediaCodec

The steps for encoding video with MediaCodec are as follows:
1. Create the MediaCodec. There are two ways, by codec name or by MIME type, with the corresponding factory methods:

MediaCodec createByCodecName (String name);
MediaCodec createEncoderByType (String type);   // createDecoderByType(String type) for decoders
  • The concrete names and types that are available are listed in the documentation. Here we use the latter to create a video encoder.
mMC = MediaCodec.createEncoderByType(MIME_TYPE);

2. Configure the MediaCodec. What has to be configured here is a MediaFormat, which holds the bit rate, frame rate, key-frame interval and so on. If the bit rate is too low, the picture will show mosaic-like artifacts.

mMF = MediaFormat.createVideoFormat(MIME_TYPE, width, height);
mMF.setInteger(MediaFormat.KEY_BIT_RATE, bitrate);
mMF.setInteger(MediaFormat.KEY_FRAME_RATE, framerate);
if (mPrimeColorFormat != 0){
    mMF.setInteger(MediaFormat.KEY_COLOR_FORMAT, mPrimeColorFormat);
}
mMF.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1); // key-frame interval, in seconds
mMC.configure(mMF, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);

Here mPrimeColorFormat is the color format supported by the device, usually yuv420p or yuv420sp, while the Camera preview format is usually YV12 or NV21, so the frames have to be converted before encoding; see the sample code at the end of this article (a minimal conversion sketch also follows below). Code is the best teacher, after all.
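As a preview of that conversion, this is essentially the YV12 → I420 (yuv420p) plane swap performed by swapYV12toI420() in the sample at the end of the article: the Y plane is copied as-is and the order of the two chroma planes is swapped:

// YV12 stores Y, then V, then U; I420/yuv420p stores Y, then U, then V.
private void swapYV12toI420(byte[] yv12bytes, byte[] i420bytes, int width, int height) {
    int ySize = width * height;
    int uvSize = ySize / 4;
    System.arraycopy(yv12bytes, 0, i420bytes, 0, ySize);                    // Y plane
    System.arraycopy(yv12bytes, ySize + uvSize, i420bytes, ySize, uvSize);  // U plane (stored after V in YV12)
    System.arraycopy(yv12bytes, ySize, i420bytes, ySize + uvSize, uvSize);  // V plane
}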
3. Start the encoder and obtain the input and output buffers

mMC.start();
mInputBuffers = mMC.getInputBuffers();
mOutputBuffers = mMC.getOutputBuffers();

4. Feed input data. This breaks down into the following small steps:
1) Get the index of an available input buffer

int inputbufferindex = mMC.dequeueInputBuffer(BUFFER_TIMEOUT);

If an input buffer is available, this method returns its index, otherwise it returns -1. The parameter is a timeout in microseconds: 0 returns immediately, a negative value waits indefinitely until a buffer becomes available, and a positive value waits at most that long.
2) Copy the raw data into the buffer

ByteBuffer inputBuffer = mInputBuffers[inputbufferindex];
inputBuffer.clear();            // clear the previous contents before writing new data
inputBuffer.put(bytes, 0, len); // len is the length of the valid data passed in
mMC.queueInputBuffer(inputbufferindex, 0, len, timestamp, 0);

Once queued, a buffer can only be used again after dequeueInputBuffer returns its index once more. A small helper that combines the two steps above and handles the case where no buffer is available is sketched below.
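A minimal sketch of that helper (the method name feedInput is illustrative; mMC, mInputBuffers and BUFFER_TIMEOUT are the fields used in this section, and the logic mirrors the input() method of the Encoder class shown later):

// Returns true if the frame was queued, false if no input buffer was available yet.
private boolean feedInput(byte[] bytes, int len, long timestamp) {
    int inputbufferindex = mMC.dequeueInputBuffer(BUFFER_TIMEOUT);
    if (inputbufferindex < 0) {
        return false;                       // try again later
    }
    ByteBuffer inputBuffer = mInputBuffers[inputbufferindex];
    inputBuffer.clear();
    inputBuffer.put(bytes, 0, len);
    mMC.queueInputBuffer(inputbufferindex, 0, len, timestamp, 0);
    return true;
}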
5. Retrieve the output data. Feeding input and retrieving output are best done asynchronously, because queueing one input frame does not mean the encoder will immediately produce a corresponding output frame; several frames may go in before one comes out. Retrieving output mirrors the input steps (a complete drain loop is sketched after step 3 below):
1) Get an available output buffer

int outputbufferindex = mMC.dequeueOutputBuffer(mBI, BUFFER_TIMEOUT);

The first parameter is a BufferInfo instance and the second is the timeout; a negative value waits indefinitely (which is exactly why this must not be done on the main thread).
2) Read the output data

mOutputBuffers[outputbufferindex].get(bytes, 0, mBI.size);

3) Release the output buffer

mMC.releaseOutputBuffer(outputbufferindex, false);
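Putting the three output steps together, here is a sketch of a drain loop that also handles the informational return codes of dequeueOutputBuffer; it mirrors the output() method of the Encoder class shown later (mMC, mBI, mOutputBuffers, BUFFER_TIMEOUT and bytes are the names used in this section):

// Drain all encoded frames that are currently available.
while (true) {
    int outputbufferindex = mMC.dequeueOutputBuffer(mBI, BUFFER_TIMEOUT);
    if (outputbufferindex >= 0) {
        ByteBuffer out = mOutputBuffers[outputbufferindex];
        out.position(mBI.offset);
        out.limit(mBI.offset + mBI.size);
        out.get(bytes, 0, mBI.size);                    // copy the encoded frame into `bytes`
        mMC.releaseOutputBuffer(outputbufferindex, false);
    } else if (outputbufferindex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
        mOutputBuffers = mMC.getOutputBuffers();        // the buffer array was replaced
    } else if (outputbufferindex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
        MediaFormat newFormat = mMC.getOutputFormat();  // e.g. now contains csd-0/csd-1
    } else { // MediaCodec.INFO_TRY_AGAIN_LATER
        break;                                          // nothing more to read for now
    }
}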

2. Decoding video with MediaCodec

The decoding steps are similar to encoding; what differs is the configuration:
1. Create the decoder

mMC = MediaCodec.createDecoderByType(MIME_TYPE);

2. Configure the decoder. Here a Surface for displaying the decoded frames must be supplied, and the MediaFormat must contain the video's SPS and PPS (they are contained in the first frame produced by the encoder).

int[] width = new int[1];
int[] height = new int[1];
AvcUtils.parseSPS(sps, width, height); // parse the video width/height out of the SPS
mMF = MediaFormat.createVideoFormat(MIME_TYPE, width[0], height[0]);
mMF.setByteBuffer("csd-0", ByteBuffer.wrap(sps));
mMF.setByteBuffer("csd-1", ByteBuffer.wrap(pps));
mMF.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, width[0] * height[0]);
mMC.configure(mMF, surface, null, 0);
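As an aside (not the approach used in this article): instead of csd-0/csd-1, the SPS and PPS NAL units can also be pushed into an already started decoder as a codec-config input buffer. A hedged sketch, assuming sps_pps holds the start-code-prefixed SPS+PPS bytes, as extracted from the encoder's first output frame in the sample below:

// Alternative: feed SPS/PPS as a codec-config buffer after configure() and mMC.start().
int index = mMC.dequeueInputBuffer(-1);
if (index >= 0) {
    ByteBuffer inputBuffer = mInputBuffers[index];
    inputBuffer.clear();
    inputBuffer.put(sps_pps, 0, sps_pps.length);
    mMC.queueInputBuffer(index, 0, sps_pps.length, 0, MediaCodec.BUFFER_FLAG_CODEC_CONFIG);
}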

3. Start the decoder and obtain the input and output buffers

mMC.start();
mInputBuffers = mMC.getInputBuffers();
mOutputBuffers = mMC.getOutputBuffers();

4. Feed input data
1) Get an available input buffer

int inputbufferindex = mMC.dequeueInputBuffer(BUFFER_TIMEOUT);

The return value is the index of an available input buffer.

ByteBuffer inputBuffer = mInputBuffers[inputbufferindex];
inputBuffer.clear();

2) Then queue the data

inputBuffer.put(bytes, 0, len);
mMC.queueInputBuffer(inputbufferindex, 0, len, timestamp, 0);

5. Retrieve the output data. As with encoding, this should be done asynchronously. The steps are essentially the same as in the encoding section above, except that when releasing the output buffer the second parameter must be true, so that the decoded frame is rendered to the Surface (a minimal drain loop follows the line below).

mMC.releaseOutputBuffer(outputbufferindex, true);
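A minimal decode-side drain loop under the same assumptions (the fields are the ones used in this section); the frames themselves are rendered by the codec, so only buffer bookkeeping is needed:

// Render every decoded frame that is ready to the configured Surface.
int outputbufferindex;
while ((outputbufferindex = mMC.dequeueOutputBuffer(mBI, BUFFER_TIMEOUT)) >= 0) {
    mMC.releaseOutputBuffer(outputbufferindex, true);   // true = render to the Surface
}
// Negative return values (INFO_TRY_AGAIN_LATER, INFO_OUTPUT_BUFFERS_CHANGED,
// INFO_OUTPUT_FORMAT_CHANGED) can be handled as in the encoding section.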

3. A complete encode/decode example

Below is a MediaCodec encode/decode example. It encodes Camera preview data (YV12) into H.264, then decodes the resulting H.264 and displays it in a SurfaceView.

3.1 Layout file

The layout is very simple: two SurfaceViews display the image before and after the encode/decode round trip, two buttons start and stop the process, and a TextView shows the capture frame rate. The layout XML is not shown here (the screenshot of the resulting UI is omitted in this copy).

3.2 The Encoder class

package com.example.mediacodecpro;

import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.util.Log;

import java.io.IOException;
import java.nio.ByteBuffer;

/**
 * Created by chuibai on 2017/3/10.<br />
 */
public class Encoder {

    public static final int TRY_AGAIN_LATER = -1;
    public static final int BUFFER_OK = 0;
    public static final int BUFFER_TOO_SMALL = 1;
    public static final int OUTPUT_UPDATE = 2;

    private int format = 0;
    private final String MIME_TYPE = "video/avc";
    private MediaCodec mMC = null;
    private MediaFormat mMF;
    private ByteBuffer[] inputBuffers;
    private ByteBuffer[] outputBuffers;
    private long BUFFER_TIMEOUT = 0;
    private MediaCodec.BufferInfo mBI;

    /**
     * Initialize the encoder.
     * @throws IOException thrown if the encoder cannot be created
     */
    public void init() throws IOException {
        mMC = MediaCodec.createEncoderByType(MIME_TYPE);
        format = MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar;
        mBI = new MediaCodec.BufferInfo();
    }

    /**
     * Configure the encoder: color format, frame rate, bit rate and video size.
     * @param width video width
     * @param height video height
     * @param bitrate video bit rate
     * @param framerate video frame rate
     */
    public void configure(int width, int height, int bitrate, int framerate) {
        if (mMF == null) {
            mMF = MediaFormat.createVideoFormat(MIME_TYPE, width, height);
            mMF.setInteger(MediaFormat.KEY_BIT_RATE, bitrate);
            mMF.setInteger(MediaFormat.KEY_FRAME_RATE, framerate);
            if (format != 0) {
                mMF.setInteger(MediaFormat.KEY_COLOR_FORMAT, format);
            }
            mMF.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, -1); // key-frame interval, in seconds
        }
        mMC.configure(mMF, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
    }

    /**
     * Start the encoder and obtain the input/output buffers.
     */
    public void start() {
        mMC.start();
        inputBuffers = mMC.getInputBuffers();
        outputBuffers = mMC.getOutputBuffers();
    }

    /**
     * Feed data into the encoder; YUV420P data is expected here.
     * @param data YUV data
     * @param len data length
     * @param timestamp timestamp
     * @return
     */
    public int input(byte[] data, int len, long timestamp) {
        int index = mMC.dequeueInputBuffer(BUFFER_TIMEOUT);
        Log.e("...", "" + index);
        if (index >= 0) {
            ByteBuffer inputBuffer = inputBuffers[index];
            inputBuffer.clear();
            if (inputBuffer.capacity() < len) {
                mMC.queueInputBuffer(index, 0, 0, timestamp, 0);
                return BUFFER_TOO_SMALL;
            }
            inputBuffer.put(data, 0, len);
            mMC.queueInputBuffer(index, 0, len, timestamp, 0);
        } else {
            return index;
        }
        return BUFFER_OK;
    }

    /**
     * Fetch encoded output data.
     * @param data output data
     * @param len valid data length
     * @param ts timestamp
     * @return
     */
    public int output(/*out*/ byte[] data, /* out */ int[] len, /* out */ long[] ts) {
        int i = mMC.dequeueOutputBuffer(mBI, BUFFER_TIMEOUT);
        if (i >= 0) {
            if (mBI.size > data.length) return BUFFER_TOO_SMALL;
            outputBuffers[i].position(mBI.offset);
            outputBuffers[i].limit(mBI.offset + mBI.size);
            outputBuffers[i].get(data, 0, mBI.size);
            len[0] = mBI.size;
            ts[0] = mBI.presentationTimeUs;
            mMC.releaseOutputBuffer(i, false);
        } else if (i == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
            outputBuffers = mMC.getOutputBuffers();
            return OUTPUT_UPDATE;
        } else if (i == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
            mMF = mMC.getOutputFormat();
            return OUTPUT_UPDATE;
        } else if (i == MediaCodec.INFO_TRY_AGAIN_LATER) {
            return TRY_AGAIN_LATER;
        }
        return BUFFER_OK;
    }

    public void release() {
        mMC.stop();
        mMC.release();
        mMC = null;
        outputBuffers = null;
        inputBuffers = null;
    }

    public void flush() {
        mMC.flush();
    }
}
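A minimal usage sketch of this class (not from the original article; the parameter values match initParams() in the MainActivity below, and the per-frame calls would run on a codec thread as described earlier):

Encoder encoder = new Encoder();
try {
    encoder.init();                              // may throw IOException
    encoder.configure(352, 288, 1500000, 30);    // width, height, bit rate, frame rate
    encoder.start();
} catch (IOException e) {
    e.printStackTrace();
}
// per frame, on a codec thread:
//   encoder.input(i420Frame, frameLen, timestamp);
//   encoder.output(h264Buf, lenOut, tsOut);     // loop until it returns TRY_AGAIN_LATER
// when finished:
encoder.release();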

3.3 The Decoder class

package com.example.mediacodecpro;

import android.media.MediaCodec;
import android.media.MediaFormat;
import android.view.Surface;

import java.io.IOException;
import java.nio.ByteBuffer;

/**
 * Created by chuibai on 2017/3/10.<br />
 */
public class Decoder {

    public static final int TRY_AGAIN_LATER = -1;
    public static final int BUFFER_OK = 0;
    public static final int BUFFER_TOO_SMALL = 1;
    public static final int OUTPUT_UPDATE = 2;

    private final String MIME_TYPE = "video/avc";
    private MediaCodec mMC = null;
    private MediaFormat mMF;
    private long BUFFER_TIMEOUT = 0;
    private MediaCodec.BufferInfo mBI;
    private ByteBuffer[] mInputBuffers;
    private ByteBuffer[] mOutputBuffers;

    /**
     * Initialize the decoder.
     * @throws IOException thrown if the decoder cannot be created
     */
    public void init() throws IOException {
        mMC = MediaCodec.createDecoderByType(MIME_TYPE);
        mBI = new MediaCodec.BufferInfo();
    }

    /**
     * Configure the decoder.
     * @param sps the SPS used for configuration
     * @param pps the PPS used for configuration
     * @param surface the Surface the decoded video is rendered to
     */
    public void configure(byte[] sps, byte[] pps, Surface surface) {
        int[] width = new int[1];
        int[] height = new int[1];
        AvcUtils.parseSPS(sps, width, height); // parse the video width/height out of the SPS
        mMF = MediaFormat.createVideoFormat(MIME_TYPE, width[0], height[0]);
        mMF.setByteBuffer("csd-0", ByteBuffer.wrap(sps));
        mMF.setByteBuffer("csd-1", ByteBuffer.wrap(pps));
        mMF.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, width[0] * height[0]);
        mMC.configure(mMF, surface, null, 0);
    }

    /**
     * Start the decoder and obtain the input/output buffers.
     */
    public void start() {
        mMC.start();
        mInputBuffers = mMC.getInputBuffers();
        mOutputBuffers = mMC.getOutputBuffers();
    }

    /**
     * Feed data into the decoder.
     * @param data input data
     * @param len valid data length
     * @param timestamp timestamp
     * @return {@link #BUFFER_OK} on success, otherwise {@link #TRY_AGAIN_LATER}
     */
    public int input(byte[] data, int len, long timestamp) {
        int i = mMC.dequeueInputBuffer(BUFFER_TIMEOUT);
        if (i >= 0) {
            ByteBuffer inputBuffer = mInputBuffers[i];
            inputBuffer.clear();
            inputBuffer.put(data, 0, len);
            mMC.queueInputBuffer(i, 0, len, timestamp, 0);
        } else {
            return TRY_AGAIN_LATER;
        }
        return BUFFER_OK;
    }

    public int output(byte[] data, int[] len, long[] ts) {
        int i = mMC.dequeueOutputBuffer(mBI, BUFFER_TIMEOUT);
        if (i >= 0) {
            if (mOutputBuffers[i] != null) {
                mOutputBuffers[i].position(mBI.offset);
                mOutputBuffers[i].limit(mBI.offset + mBI.size);
                if (data != null)
                    mOutputBuffers[i].get(data, 0, mBI.size);
                len[0] = mBI.size;
                ts[0] = mBI.presentationTimeUs;
            }
            mMC.releaseOutputBuffer(i, true);
        } else {
            return TRY_AGAIN_LATER;
        }
        return BUFFER_OK;
    }

    public void flush() {
        mMC.flush();
    }

    public void release() {
        flush();
        mMC.stop();
        mMC.release();
        mMC = null;
        mInputBuffers = null;
        mOutputBuffers = null;
    }
}
3.4 MainActivity
package com.example.mediacodecpro;

import android.content.pm.ActivityInfo;
import android.graphics.ImageFormat;
import android.hardware.Camera;
import android.os.Bundle;
import android.os.Handler;
import android.os.Looper;
import android.os.Message;
import android.support.v7.app.AppCompatActivity;
import android.util.Log;
import android.view.SurfaceView;
import android.view.View;
import android.widget.Button;
import android.widget.TextView;

import java.io.IOException;
import java.io.OutputStream;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.net.Socket;
import java.nio.ByteBuffer;
import java.util.Iterator;
import java.util.LinkedList;
import java.util.Queue;

import butterknife.BindView;
import butterknife.ButterKnife;

public class MainActivity extends AppCompatActivity implements View.OnClickListener, Camera.PreviewCallback {

    @BindView(R.id.surfaceView_encode)
    SurfaceView surfaceViewEncode;
    @BindView(R.id.surfaceView_decode)
    SurfaceView surfaceViewDecode;
    @BindView(R.id.btnStart)
    Button btnStart;
    @BindView(R.id.btnStop)
    Button btnStop;
    @BindView(R.id.capture)
    TextView capture;

    private int width;
    private int height;
    private int bitrate;
    private int framerate;
    private int captureFrame;
    private Camera mCamera;
    private Queue<PreviewBufferInfo> mPreviewBuffers_clean;
    private Queue<PreviewBufferInfo> mPreviewBuffers_dirty;
    private Queue<PreviewBufferInfo> mDecodeBuffers_clean;
    private Queue<PreviewBufferInfo> mDecodeBuffers_dirty;
    private int PREVIEW_POOL_CAPACITY = 5;
    private int format;
    private int DECODE_UNI_SIZE = 1024 * 1024;
    private byte[] mAvcBuf = new byte[1024 * 1024];
    private final int MSG_ENCODE = 0;
    private final int MSG_DECODE = 1;
    private String TAG = "MainActivity";
    private long mLastTestTick = 0;
    private Object mAvcEncLock;
    private Object mDecEncLock;
    private Decoder mDecoder;
    private Handler codecHandler;
    private byte[] mRawData;
    private Encoder mEncoder;
    private CodecThread codecThread;
    private DatagramSocket socket;
    private DatagramPacket packet;
    private byte[] sps_pps;
    private byte[] mPacketBuf = new byte[1024 * 1024];

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        ButterKnife.bind(this);
        // initialize parameters
        initParams();
        // set click listeners
        btnStart.setOnClickListener(this);
        btnStop.setOnClickListener(this);
    }

    /**
     * Initialize the parameters: frame rate, color format, bit rate, video size, etc.
     */
    private void initParams() {
        width = 352;
        height = 288;
        bitrate = 1500000;
        framerate = 30;
        captureFrame = 0;
        format = ImageFormat.YV12;
        mAvcEncLock = new Object();
        mDecEncLock = new Object();
    }

    @Override
    protected void onResume() {
        if (getRequestedOrientation() != ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE) {
            setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);
        }
        super.onResume();
    }

    @Override
    public void onClick(View v) {
        switch (v.getId()) {
            case R.id.btnStart:
                mCamera = Camera.open(0);
                initQueues();
                initEncoder();
                initCodecThread();
                startPreview();
                break;
            case R.id.btnStop:
                releaseCodecThread();
                releseEncoderAndDecoder();
                releaseCamera();
                releaseQueue();
                break;
        }
    }

    /**
     * Release the buffer queues.
     */
    private void releaseQueue() {
        if (mPreviewBuffers_clean != null) {
            mPreviewBuffers_clean.clear();
            mPreviewBuffers_clean = null;
        }
        if (mPreviewBuffers_dirty != null) {
            mPreviewBuffers_dirty.clear();
            mPreviewBuffers_dirty = null;
        }
        if (mDecodeBuffers_clean != null) {
            mDecodeBuffers_clean.clear();
            mDecodeBuffers_clean = null;
        }
        if (mDecodeBuffers_dirty != null) {
            mDecodeBuffers_dirty.clear();
            mDecodeBuffers_dirty = null;
        }
    }

    /**
     * Release the camera.
     */
    private void releaseCamera() {
        if (mCamera != null) {
            mCamera.setPreviewCallbackWithBuffer(null);
            mCamera.stopPreview();
            mCamera.release();
            mCamera = null;
        }
    }

    private void releseEncoderAndDecoder() {
        if (mEncoder != null) {
            mEncoder.flush();
            mEncoder.release();
            mEncoder = null;
        }
        if (mDecoder != null) {
            mDecoder.release();
            mDecoder = null;
        }
    }

    private void releaseCodecThread() {
        codecHandler.getLooper().quit();
        codecHandler = null;
        codecThread = null;
    }

    private void initCodecThread() {
        codecThread = new CodecThread();
        codecThread.start();
    }

    /**
     * Start the camera preview.
     */
    private void startPreview() {
        Camera.Parameters parameters = mCamera.getParameters();
        parameters.setPreviewFormat(format);
        parameters.setPreviewFrameRate(framerate);
        parameters.setPreviewSize(width, height);
        mCamera.setParameters(parameters);
        try {
            mCamera.setPreviewDisplay(surfaceViewEncode.getHolder());
        } catch (IOException e) {
            e.printStackTrace();
        }
        mCamera.setPreviewCallbackWithBuffer(this);
        mCamera.startPreview();
    }

    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        /* the preview data may be null */
        if (data == null) {
            Log.e(TAG, "preview data is null");
            return;
        }
        long curTick = System.currentTimeMillis();
        if (mLastTestTick == 0) {
            mLastTestTick = curTick;
        }
        if (curTick > mLastTestTick + 1000) {
            setCaptureFPSTextView(captureFrame);
            captureFrame = 0;
            mLastTestTick = curTick;
        } else
            captureFrame++;

        synchronized (mAvcEncLock) {
            PreviewBufferInfo info = mPreviewBuffers_clean.poll();    // remove the head of queue
            info.buffer = data;
            info.size = getPreviewBufferSize(width, height, format);
            info.timestamp = System.currentTimeMillis();
            mPreviewBuffers_dirty.add(info);
            if (mDecoder == null) {
                codecHandler.sendEmptyMessage(MSG_ENCODE);
            }
        }
    }

    private void setCaptureFPSTextView(int captureFrame) {
        capture.setText("Current frame rate: " + captureFrame);
    }

    private void initEncoder() {
        mEncoder = new Encoder();
        try {
            mEncoder.init();
            mEncoder.configure(width, height, bitrate, framerate);
            mEncoder.start();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    /**
     * Initialize the buffer queues.
     */
    private void initQueues() {
        if (mPreviewBuffers_clean == null)
            mPreviewBuffers_clean = new LinkedList<>();
        if (mPreviewBuffers_dirty == null)
            mPreviewBuffers_dirty = new LinkedList<>();
        int size = getPreviewBufferSize(width, height, format);
        for (int i = 0; i < PREVIEW_POOL_CAPACITY; i++) {
            byte[] mem = new byte[size];
            mCamera.addCallbackBuffer(mem);    // ByteBuffer.array is a reference, not a copy
            PreviewBufferInfo info = new PreviewBufferInfo();
            info.buffer = null;
            info.size = 0;
            info.timestamp = 0;
            mPreviewBuffers_clean.add(info);
        }
        if (mDecodeBuffers_clean == null)
            mDecodeBuffers_clean = new LinkedList<>();
        if (mDecodeBuffers_dirty == null)
            mDecodeBuffers_dirty = new LinkedList<>();
        for (int i = 0; i < PREVIEW_POOL_CAPACITY; i++) {
            PreviewBufferInfo info = new PreviewBufferInfo();
            info.buffer = new byte[DECODE_UNI_SIZE];
            info.size = 0;
            info.timestamp = 0;
            mDecodeBuffers_clean.add(info);
        }
    }

    /**
     * Compute the size of one preview buffer.
     * @param width preview width
     * @param height preview height
     * @param format preview color format
     * @return size of one preview buffer
     */
    private int getPreviewBufferSize(int width, int height, int format) {
        int size = 0;
        switch (format) {
            case ImageFormat.YV12: {
                int yStride = (int) Math.ceil(width / 16.0) * 16;
                int uvStride = (int) Math.ceil((yStride / 2) / 16.0) * 16;
                int ySize = yStride * height;
                int uvSize = uvStride * height / 2;
                size = ySize + uvSize * 2;
            }
            break;
            case ImageFormat.NV21: {
                float bytesPerPix = (float) ImageFormat.getBitsPerPixel(format) / 8;
                size = (int) (width * height * bytesPerPix);
            }
            break;
        }
        return size;
    }

    private void swapYV12toI420(byte[] yv12bytes, byte[] i420bytes, int width, int height) {
        System.arraycopy(yv12bytes, 0, i420bytes, 0, width * height);
        System.arraycopy(yv12bytes, width * height + width * height / 4, i420bytes, width * height, width * height / 4);
        System.arraycopy(yv12bytes, width * height, i420bytes, width * height + width * height / 4, width * height / 4);
    }

    private class PreviewBufferInfo {
        public byte[] buffer;
        public int size;
        public long timestamp;
    }

    private class CodecThread extends Thread {
        @Override
        public void run() {
            Looper.prepare();
            codecHandler = new Handler() {
                @Override
                public void handleMessage(Message msg) {
                    switch (msg.what) {
                        case MSG_ENCODE:
                            int res = Encoder.BUFFER_OK;
                            synchronized (mAvcEncLock) {
                                if (mPreviewBuffers_dirty != null && mPreviewBuffers_clean != null) {
                                    Iterator<PreviewBufferInfo> ite = mPreviewBuffers_dirty.iterator();
                                    while (ite.hasNext()) {
                                        PreviewBufferInfo info = ite.next();
                                        byte[] data = info.buffer;
                                        int data_size = info.size;
                                        if (format == ImageFormat.YV12) {
                                            if (mRawData == null || mRawData.length < data_size) {
                                                mRawData = new byte[data_size];
                                            }
                                            swapYV12toI420(data, mRawData, width, height);
                                        } else {
                                            Log.e(TAG, "preview size MUST be YV12, cur is " + format);
                                            mRawData = data;
                                        }
                                        res = mEncoder.input(mRawData, data_size, info.timestamp);
                                        if (res != Encoder.BUFFER_OK) {
//                                            Log.e(TAG, "mEncoder.input, maybe wrong:" + res);
                                            break;  // if this buffer failed, the remaining buffers shouldn't go into the encoder
                                        } else {
                                            ite.remove();
                                            mPreviewBuffers_clean.add(info);
                                            if (mCamera != null) {
                                                mCamera.addCallbackBuffer(data);
                                            }
                                        }
                                    }
                                }
                            }
                            while (res == Encoder.BUFFER_OK) {
                                int[] len = new int[1];
                                long[] ts = new long[1];
                                synchronized (mAvcEncLock) {
                                    res = mEncoder.output(mAvcBuf, len, ts);
                                }
                                if (res == Encoder.BUFFER_OK) {
                                    // send the H.264 data
                                    if (sps_pps != null) {
                                        send(len[0]);
                                    }
                                    if (mDecodeBuffers_clean != null && mDecodeBuffers_dirty != null) {
                                        synchronized (mAvcEncLock) {
                                            Iterator<PreviewBufferInfo> ite = mDecodeBuffers_clean.iterator();
                                            if (ite.hasNext()) {
                                                PreviewBufferInfo bufferInfo = ite.next();
                                                if (bufferInfo.buffer.length >= len[0]) {
                                                    bufferInfo.timestamp = ts[0];
                                                    bufferInfo.size = len[0];
                                                    System.arraycopy(mAvcBuf, 0, bufferInfo.buffer, 0, len[0]);
                                                    ite.remove();
                                                    mDecodeBuffers_dirty.add(bufferInfo);
                                                } else {
                                                    Log.e(TAG, "decoder uni buffer too small, need " + len[0] + " but has " + bufferInfo.buffer.length);
                                                }
                                            }
                                        }
                                        initDecoder(len);
                                    }
                                }
                            }
                            codecHandler.sendEmptyMessageDelayed(MSG_ENCODE, 30);
                            break;
                        case MSG_DECODE:
                            synchronized (mDecEncLock) {
                                int result = Decoder.BUFFER_OK;
                                // STEP 1: handle input buffers
                                if (mDecodeBuffers_dirty != null && mDecodeBuffers_clean != null) {
                                    Iterator<PreviewBufferInfo> ite = mDecodeBuffers_dirty.iterator();
                                    while (ite.hasNext()) {
                                        PreviewBufferInfo info = ite.next();
                                        result = mDecoder.input(info.buffer, info.size, info.timestamp);
                                        if (result != Decoder.BUFFER_OK) {
                                            break;  // if this buffer failed, the remaining buffers shouldn't go into the decoder
                                        } else {
                                            ite.remove();
                                            mDecodeBuffers_clean.add(info);
                                        }
                                    }
                                }
                                int[] len = new int[1];
                                long[] ts = new long[1];
                                while (result == Decoder.BUFFER_OK) {
                                    result = mDecoder.output(null, len, ts);
                                }
                            }
                            codecHandler.sendEmptyMessageDelayed(MSG_DECODE, 30);
                            break;
                    }
                }
            };
            Looper.loop();
        }
    }

    private void send(int len) {
        try {
            if (socket == null) socket = new DatagramSocket();
            if (packet == null) {
                packet = new DatagramPacket(mPacketBuf, 0, sps_pps.length + len);
                packet.setAddress(InetAddress.getByName("192.168.43.1"));
                packet.setPort(5006);
            }
            if (mAvcBuf[4] == 0x65) {
                System.arraycopy(sps_pps, 0, mPacketBuf, 0, sps_pps.length);
                System.arraycopy(mAvcBuf, 0, mPacketBuf, sps_pps.length, len);
                len += sps_pps.length;
            } else {
                System.arraycopy(mAvcBuf, 0, mPacketBuf, 0, len);
            }
            packet.setLength(len);
            socket.send(packet);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    private void initDecoder(int[] len) {
        if (sps_pps == null) {
            sps_pps = new byte[len[0]];
            System.arraycopy(mAvcBuf, 0, sps_pps, 0, len[0]);
        }
        if (mDecoder == null) {
            mDecoder = new Decoder();
            try {
                mDecoder.init();
            } catch (IOException e) {
                e.printStackTrace();
            }
            byte[] sps_nal = null;
            int sps_len = 0;
            byte[] pps_nal = null;
            int pps_len = 0;
            ByteBuffer byteb = ByteBuffer.wrap(mAvcBuf, 0, len[0]);
            // SPS
            if (true == AvcUtils.goToPrefix(byteb)) {
                int sps_position = 0;
                int pps_position = 0;
                int nal_type = AvcUtils.getNalType(byteb);
                if (AvcUtils.NAL_TYPE_SPS == nal_type) {
                    Log.d(TAG, "OutputAvcBuffer, AVC NAL type: SPS");
                    sps_position = byteb.position() - AvcUtils.START_PREFIX_LENGTH - AvcUtils.NAL_UNIT_HEADER_LENGTH;
                    // PPS
                    if (true == AvcUtils.goToPrefix(byteb)) {
                        nal_type = AvcUtils.getNalType(byteb);
                        if (AvcUtils.NAL_TYPE_PPS == nal_type) {
                            pps_position = byteb.position() - AvcUtils.START_PREFIX_LENGTH - AvcUtils.NAL_UNIT_HEADER_LENGTH;
                            sps_len = pps_position - sps_position;
                            sps_nal = new byte[sps_len];
                            int cur_pos = byteb.position();
                            byteb.position(sps_position);
                            byteb.get(sps_nal, 0, sps_len);
                            byteb.position(cur_pos);
                            // slice
                            if (true == AvcUtils.goToPrefix(byteb)) {
                                nal_type = AvcUtils.getNalType(byteb);
                                int pps_end_position = byteb.position() - AvcUtils.START_PREFIX_LENGTH - AvcUtils.NAL_UNIT_HEADER_LENGTH;
                                pps_len = pps_end_position - pps_position;
                            } else {
                                pps_len = byteb.position() - pps_position;
                                //pps_len = byteb.limit() - pps_position + 1;
                            }
                            if (pps_len > 0) {
                                pps_nal = new byte[pps_len];
                                cur_pos = byteb.position();
                                byteb.position(pps_position);
                                byteb.get(pps_nal, 0, pps_len);
                                byteb.position(cur_pos);
                            }
                        } else {
                            //Log.d(log_tag, "OutputAvcBuffer, AVC NAL type: "+nal_type);
                            throw new UnsupportedOperationException("SPS is not followed by PPS, nal type :" + nal_type);
                        }
                    }
                } else {
                    //Log.d(log_tag, "OutputAvcBuffer, AVC NAL type: "+nal_type);
                }
                // 2. configure AVC decoder with SPS/PPS
                if (sps_nal != null && pps_nal != null) {
                    int[] width = new int[1];
                    int[] height = new int[1];
                    AvcUtils.parseSPS(sps_nal, width, height);
                    mDecoder.configure(sps_nal, pps_nal, surfaceViewDecode.getHolder().getSurface());
                    mDecoder.start();
                    if (codecHandler != null) {
                        codecHandler.sendEmptyMessage(MSG_DECODE);
                    }
                }
            }
        }
    }
}

The send() method above sends the video captured and encoded on the phone to a PC, where it can be played with ffplay. When using ffplay, remember to add the parameter -analyzeduration 200000 to reduce playback latency.
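A possible ffplay invocation, assuming the raw H.264 stream arrives on UDP port 5006 as configured in send() above (adjust the address and port to your network):

ffplay -analyzeduration 200000 -f h264 udp://0.0.0.0:5006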

4. Key points

In my experience, the main things to pay attention to when using MediaCodec are:
- Feed input and drain output asynchronously; do not block waiting for output data
- When encoding Camera preview data, use the preview callback with buffers (setPreviewCallbackWithBuffer), otherwise the frame rate drops
- Use buffer queues where appropriate; do not allocate a new byte array for every frame, to avoid frequent GC
- Do not do this work on the main thread. While learning I saw people encode and decode directly in the preview callback and even pass -1 to dequeueOutputBuffer, i.e. wait indefinitely for the codec to return
The source of the example above can be downloaded here; since I never have any points to download things myself, the download costs 1 point. If you find any problems, feel free to point them out so we can learn from each other.
