The final effect looks like this:

I. Setting rounded corners on the preview view

The rounded corners come from clipping the view to its outline; radius is an int field of the custom TextureView:

public RoundTextureView(Context context, AttributeSet attrs) {
    super(context, attrs);
    setOutlineProvider(new ViewOutlineProvider() {
        @Override
        public void getOutline(View view, Outline outline) {
            // Clip the whole view to a rounded rect using the current radius value
            Rect rect = new Rect(0, 0, view.getMeasuredWidth(), view.getMeasuredHeight());
            outline.setRoundRect(rect, radius);
        }
    });
    setClipToOutline(true);
}

Modify the corner radius and refresh whenever needed:

public void setRadius(int radius) {
    this.radius = radius;
}

public void turnRound() {
    invalidateOutline();
}

This updates the rounded corners of the view according to the radius that was set. When the view is square and the radius is half of the side length, the result is a circle.
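A minimal usage sketch, assuming the view has already been laid out (hence the post call); roundTextureView is a hypothetical reference to the RoundTextureView above:

roundTextureView.post(new Runnable() {
    @Override
    public void run() {
        // half of the shorter side turns a square view into a circle
        int radius = Math.min(roundTextureView.getWidth(), roundTextureView.getHeight()) / 2;
        roundTextureView.setRadius(radius);
        roundTextureView.turnRound();
    }
});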

II. Implementing a square preview

1. The device supports a 1:1 preview size

First, a simple but rather limited approach: set both the camera preview size and the preview view to a 1:1 aspect ratio.

Android devices generally support multiple preview sizes. Take the Samsung Tab S3 as an example.

With the Camera API, its supported preview sizes are:

2019-08-02 13:16:08.669 16407-16407/com.wsy.glcamerademo I/CameraHelper: supportedPreviewSize: 1920x1080
2019-08-02 13:16:08.669 16407-16407/com.wsy.glcamerademo I/CameraHelper: supportedPreviewSize: 1280x720
2019-08-02 13:16:08.669 16407-16407/com.wsy.glcamerademo I/CameraHelper: supportedPreviewSize: 1440x1080
2019-08-02 13:16:08.669 16407-16407/com.wsy.glcamerademo I/CameraHelper: supportedPreviewSize: 1088x1088
2019-08-02 13:16:08.670 16407-16407/com.wsy.glcamerademo I/CameraHelper: supportedPreviewSize: 1056x864
2019-08-02 13:16:08.670 16407-16407/com.wsy.glcamerademo I/CameraHelper: supportedPreviewSize: 960x720
2019-08-02 13:16:08.670 16407-16407/com.wsy.glcamerademo I/CameraHelper: supportedPreviewSize: 720x480
2019-08-02 13:16:08.670 16407-16407/com.wsy.glcamerademo I/CameraHelper: supportedPreviewSize: 640x480
2019-08-02 13:16:08.670 16407-16407/com.wsy.glcamerademo I/CameraHelper: supportedPreviewSize: 352x288
2019-08-02 13:16:08.670 16407-16407/com.wsy.glcamerademo I/CameraHelper: supportedPreviewSize: 320x240
2019-08-02 13:16:08.670 16407-16407/com.wsy.glcamerademo I/CameraHelper: supportedPreviewSize: 176x144

The only 1:1 preview size here is 1088x1088.

With the Camera2 API, the supported preview sizes (which in fact also include the picture sizes) are:

2019-08-02 13:19:24.980 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 4128x3096
2019-08-02 13:19:24.980 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 4128x2322
2019-08-02 13:19:24.980 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 3264x2448
2019-08-02 13:19:24.980 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 3264x1836
2019-08-02 13:19:24.980 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 3024x3024
2019-08-02 13:19:24.980 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 2976x2976
2019-08-02 13:19:24.980 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 2880x2160
2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 2592x1944
2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 2560x1920
2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 2560x1440
2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 2560x1080
2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 2160x2160
2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 2048x1536
2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 2048x1152
2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 1936x1936
2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 1920x1080
2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 1440x1080
2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 1280x960
2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 1280x720
2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 960x720
2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 720x480
2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 640x480
2019-08-02 13:19:24.982 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 320x240
2019-08-02 13:19:24.982 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 176x144

The 1:1 preview sizes here are 3024x3024, 2976x2976, 2160x2160 and 1936x1936.

As long as we pick a 1:1 preview size and make the preview view square, we get a square preview (a selection sketch follows).
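A minimal sketch of selecting a 1:1 size with the Camera API; camera is assumed to be an already-opened android.hardware.Camera instance, and squareSize stays null if the device offers no 1:1 size:

Camera.Size squareSize = null;
for (Camera.Size size : camera.getParameters().getSupportedPreviewSizes()) {
    if (size.width == size.height) {
        squareSize = size;   // e.g. 1088x1088 on the Tab S3
        break;
    }
}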

On top of that, setting the preview view's corner radius to half of its side length yields a circular preview.

2. The device does not support a 1:1 preview size

Drawbacks of choosing a 1:1 preview size

Limited resolution choices

As shown above, we can pick a 1:1 preview size, but the choice is very limited: only a few such sizes are offered. If the camera does not support a 1:1 preview size at all, this approach is simply not feasible.

Resource consumption

Taking the Samsung Tab S3 as an example: with the Camera2 API the square preview sizes it supports are all very large, which will consume a lot of system resources during image processing and similar operations.

Handling devices without a 1:1 preview size

Add a ViewGroup with a 1:1 aspect ratio

Place the TextureView inside that ViewGroup

Set the TextureView's margins so that only the centered square region is visible

Diagram

Sample code

// Keep the preview view's aspect ratio consistent with the preview size to avoid stretching
{
    FrameLayout.LayoutParams textureViewLayoutParams = (FrameLayout.LayoutParams) textureView.getLayoutParams();
    int newHeight = 0;
    int newWidth = textureViewLayoutParams.width;
    // Landscape
    if (displayOrientation % 180 == 0) {
        newHeight = textureViewLayoutParams.width * previewSize.height / previewSize.width;
    }
    // Portrait
    else {
        newHeight = textureViewLayoutParams.width * previewSize.width / previewSize.height;
    }

    // When the preview is not square, insert a ViewGroup to restrict the visible area
    if (newHeight != textureViewLayoutParams.height) {
        insertFrameLayout = new RoundFrameLayout(CoverByParentCameraActivity.this);
        int sideLength = Math.min(newWidth, newHeight);
        FrameLayout.LayoutParams layoutParams = new FrameLayout.LayoutParams(sideLength, sideLength);
        insertFrameLayout.setLayoutParams(layoutParams);
        FrameLayout parentView = (FrameLayout) textureView.getParent();
        parentView.removeView(textureView);
        parentView.addView(insertFrameLayout);
        insertFrameLayout.addView(textureView);
        FrameLayout.LayoutParams newTextureViewLayoutParams = new FrameLayout.LayoutParams(newWidth, newHeight);
        // Landscape: shift horizontally so that the centered square region is shown
        if (displayOrientation % 180 == 0) {
            newTextureViewLayoutParams.leftMargin = ((newHeight - newWidth) / 2);
        }
        // Portrait: shift vertically so that the centered square region is shown
        else {
            newTextureViewLayoutParams.topMargin = -(newHeight - newWidth) / 2;
        }
        textureView.setLayoutParams(newTextureViewLayoutParams);
    }
}
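Once the square insertFrameLayout is in place, it can be clipped to a circle in the same way as the RoundTextureView in section I. A short sketch, assuming RoundFrameLayout exposes the same setRadius/turnRound pair:

insertFrameLayout.setRadius(sideLength / 2);   // half the side length turns the square into a circle
insertFrameLayout.turnRound();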

III. A more customizable preview with GLSurfaceView

The approach above already gives us square and circular previews, but it only works with the native camera. What if the data source is not the native camera and we still want a circular preview? Next we look at displaying NV21 data with a GLSurfaceView, drawing the preview data entirely by ourselves.

1. How GLSurfaceView is used

Flow of rendering YUV data with OpenGL

The key part is writing the renderer (Renderer). The Renderer interface is documented as follows:

/**
 * A generic renderer interface.
 *
 * The renderer is responsible for making OpenGL calls to render a frame.
 *
 * GLSurfaceView clients typically create their own classes that implement
 * this interface, and then call {@link GLSurfaceView#setRenderer} to
 * register the renderer with the GLSurfaceView.
 *
 * Developer Guides
 * For more information about how to use OpenGL, read the OpenGL developer guide.
 *
 * Threading
 * The renderer will be called on a separate thread, so that rendering
 * performance is decoupled from the UI thread. Clients typically need to
 * communicate with the renderer from the UI thread, because that's where
 * input events are received. Clients can communicate using any of the
 * standard Java techniques for cross-thread communication, or they can
 * use the {@link GLSurfaceView#queueEvent(Runnable)} convenience method.
 *
 * EGL Context Lost
 * There are situations where the EGL rendering context will be lost. This
 * typically happens when device wakes up after going to sleep. When
 * the EGL context is lost, all OpenGL resources (such as textures) that are
 * associated with that context will be automatically deleted. In order to
 * keep rendering correctly, a renderer must recreate any lost resources
 * that it still needs. The {@link #onSurfaceCreated(GL10, EGLConfig)} method
 * is a convenient place to do this.
 *
 * @see #setRenderer(Renderer)
 */

public interface Renderer {
    /**
     * Called when the surface is created or recreated.
     *
     * Called when the rendering thread
     * starts and whenever the EGL context is lost. The EGL context will typically
     * be lost when the Android device awakes after going to sleep.
     *
     * Since this method is called at the beginning of rendering, as well as
     * every time the EGL context is lost, this method is a convenient place to put
     * code to create resources that need to be created when the rendering
     * starts, and that need to be recreated when the EGL context is lost.
     * Textures are an example of a resource that you might want to create
     * here.
     *
     * Note that when the EGL context is lost, all OpenGL resources associated
     * with that context will be automatically deleted. You do not need to call
     * the corresponding "glDelete" methods such as glDeleteTextures to
     * manually delete these lost resources.
     *
     * @param gl the GL interface. Use instanceof to
     * test if the interface supports GL11 or higher interfaces.
     * @param config the EGLConfig of the created surface. Can be used
     * to create matching pbuffers.
     */
    void onSurfaceCreated(GL10 gl, EGLConfig config);

    /**
     * Called when the surface changed size.
     *
     * Called after the surface is created and whenever
     * the OpenGL ES surface size changes.
     *
     * Typically you will set your viewport here. If your camera
     * is fixed then you could also set your projection matrix here:
     *
     * void onSurfaceChanged(GL10 gl, int width, int height) {
     *     gl.glViewport(0, 0, width, height);
     *     // for a fixed camera, set the projection too
     *     float ratio = (float) width / height;
     *     gl.glMatrixMode(GL10.GL_PROJECTION);
     *     gl.glLoadIdentity();
     *     gl.glFrustumf(-ratio, ratio, -1, 1, 1, 10);
     * }
     *
     * @param gl the GL interface. Use instanceof to
     * test if the interface supports GL11 or higher interfaces.
     * @param width
     * @param height
     */
    void onSurfaceChanged(GL10 gl, int width, int height);

    /**
     * Called to draw the current frame.
     *
     * This method is responsible for drawing the current frame.
     *
     * The implementation of this method typically looks like this:
     *
     * void onDrawFrame(GL10 gl) {
     *     gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT);
     *     //... other gl calls to render the scene ...
     * }
     *
     * @param gl the GL interface. Use instanceof to
     * test if the interface supports GL11 or higher interfaces.
     */
    void onDrawFrame(GL10 gl);
}

void onSurfaceCreated(GL10 gl, EGLConfig config)

Called when the Surface is created or recreated.

void onSurfaceChanged(GL10 gl, int width, int height)

Called when the size of the Surface changes.

void onDrawFrame(GL10 gl)

This is where the drawing is implemented. With renderMode set to RENDERMODE_CONTINUOUSLY this method is called continuously; with RENDERMODE_WHEN_DIRTY it runs only after the surface is created and after each call to requestRender. We normally pick RENDERMODE_WHEN_DIRTY to avoid redundant drawing; a wiring sketch follows.
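A minimal wiring sketch under these assumptions: the view id and the YuvRenderer class name are hypothetical, and setEGLContextClientVersion(2) is used because the shaders in this section target GLES 2.0:

GLSurfaceView glSurfaceView = findViewById(R.id.gl_surface_view);   // hypothetical layout id
glSurfaceView.setEGLContextClientVersion(2);                        // the shaders below use GLES 2.0
glSurfaceView.setRenderer(new YuvRenderer());                       // the Renderer implemented in this section
glSurfaceView.setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);   // draw only when requestRender() is called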

Usually we implement our own Renderer and set it on the GLSurfaceView; writing the Renderer is the core step of the whole flow. The following flow chart shows the initialization performed in onSurfaceCreated(GL10 gl, EGLConfig config) and the drawing performed in onDrawFrame(GL10 gl):

A Renderer for YUV data (flow chart)

2. Concrete implementation

Coordinate systems

Android View coordinate system

OpenGL world coordinate system

As the figures show, OpenGL uses a Cartesian coordinate system, unlike the Android View coordinate system.

The Android View coordinate system has its origin in the top-left corner, with x increasing to the right and y increasing downward; the OpenGL coordinate system has its origin at the center, with x increasing to the right and y increasing upward.

Writing the shaders

/**
 * Vertex shader
 */
private static String VERTEX_SHADER =
        " attribute vec4 attr_position;\n" +
        " attribute vec2 attr_tc;\n" +
        " varying vec2 tc;\n" +
        " void main() {\n" +
        "     gl_Position = attr_position;\n" +
        "     tc = attr_tc;\n" +
        " }";

/**
 * Fragment shader
 */
private static String FRAG_SHADER =
        " varying vec2 tc;\n" +
        " uniform sampler2D ySampler;\n" +
        " uniform sampler2D uSampler;\n" +
        " uniform sampler2D vSampler;\n" +
        " const mat3 convertMat = mat3( 1.0, 1.0, 1.0, -0.001, -0.3441, 1.772, 1.402, -0.7141, -0.58060);\n" +
        " void main()\n" +
        " {\n" +
        "     vec3 yuv;\n" +
        "     yuv.x = texture2D(ySampler, tc).r;\n" +
        "     yuv.y = texture2D(uSampler, tc).r - 0.5;\n" +
        "     yuv.z = texture2D(vSampler, tc).r - 0.5;\n" +
        "     gl_FragColor = vec4(convertMat * yuv, 1.0);\n" +
        " }";

Built-in variables

gl_Position

gl_Position in VERTEX_SHADER is the position being drawn. Since we are drawing in 2D, we directly pass in the OpenGL 2D coordinates of the bottom-left (-1,-1), bottom-right (1,-1), top-left (-1,1) and top-right (1,1) corners, i.e. {-1,-1,1,-1,-1,1,1,1}.

gl_FragColor

gl_FragColor in FRAG_SHADER is the color of a single fragment.

Other variables

ySampler, uSampler, vSampler

The texture samplers for the Y, U and V planes respectively.

convertMat

From the following formulas:

R = Y + 1.402 (V - 128)
G = Y - 0.34414 (U - 128) - 0.71414 (V - 128)
B = Y + 1.772 (U - 128)

we can derive a YUV-to-RGB conversion matrix:

1.0,   1.0,    1.0,
0,     -0.344, 1.77,
1.403, -0.714, 0
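As a sanity check, the same conversion written directly in Java for a single pixel (purely illustrative; the shader does the equivalent work per fragment on the GPU, with Y, U and V normalized to the 0..1 range):

static int yuvToArgb(int y, int u, int v) {
    int r = (int) (y + 1.402 * (v - 128));
    int g = (int) (y - 0.34414 * (u - 128) - 0.71414 * (v - 128));
    int b = (int) (y + 1.772 * (u - 128));
    // clamp to the valid 0..255 range
    r = Math.max(0, Math.min(255, r));
    g = Math.max(0, Math.min(255, g));
    b = Math.max(0, Math.min(255, b));
    return 0xFF000000 | (r << 16) | (g << 8) | b;
}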

Some types and functions

vec3, vec4

Three- and four-component vectors respectively.

vec4 texture2D(sampler2D sampler, vec2 coord)

Samples the sampler's texture at the given coordinate and returns the color value. For example:
texture2D(ySampler, tc).r yields the Y data,
texture2D(uSampler, tc).r yields the U data,
texture2D(vSampler, tc).r yields the V data.
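For reference, the vertex array mentioned above and the default (unrotated, unmirrored) texture-coordinate array live in the author's GLUtil class; based on the coordinates above and the comment inside createTexture further down, they presumably look roughly like this (an assumption, not the actual GLUtil source):

// vertex positions, matching the corner order described above
static final float[] SQUARE_VERTICES = {
        -1.0f, -1.0f,   // bottom-left
         1.0f, -1.0f,   // bottom-right
        -1.0f,  1.0f,   // top-left
         1.0f,  1.0f    // top-right
};
// default texture coordinates, matching the "same as the TextureView preview" example below
static final float[] COORD_VERTICES = {
        1.0f, 1.0f,
        1.0f, 0.0f,
        0.0f, 1.0f,
        0.0f, 0.0f
};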

Initialization in Java code

Create the Y, U and V ByteBuffers according to the image width and height;

Choose the texture-coordinate array according to whether the image is mirrored and its rotation angle;

public void init(boolean isMirror, int rotateDegree, int frameWidth, int frameHeight) {
    if (this.frameWidth == frameWidth
            && this.frameHeight == frameHeight
            && this.rotateDegree == rotateDegree
            && this.isMirror == isMirror) {
        return;
    }
    dataInput = false;
    this.frameWidth = frameWidth;
    this.frameHeight = frameHeight;
    this.rotateDegree = rotateDegree;
    this.isMirror = isMirror;
    yArray = new byte[this.frameWidth * this.frameHeight];
    uArray = new byte[this.frameWidth * this.frameHeight / 4];
    vArray = new byte[this.frameWidth * this.frameHeight / 4];
    int yFrameSize = this.frameHeight * this.frameWidth;
    int uvFrameSize = yFrameSize >> 2;
    yBuf = ByteBuffer.allocateDirect(yFrameSize);
    yBuf.order(ByteOrder.nativeOrder()).position(0);
    uBuf = ByteBuffer.allocateDirect(uvFrameSize);
    uBuf.order(ByteOrder.nativeOrder()).position(0);
    vBuf = ByteBuffer.allocateDirect(uvFrameSize);
    vBuf.order(ByteOrder.nativeOrder()).position(0);
    // Vertex coordinates
    squareVertices = ByteBuffer
            .allocateDirect(GLUtil.SQUARE_VERTICES.length * FLOAT_SIZE_BYTES)
            .order(ByteOrder.nativeOrder())
            .asFloatBuffer();
    squareVertices.put(GLUtil.SQUARE_VERTICES).position(0);
    // Texture coordinates, chosen according to mirroring and rotation
    if (isMirror) {
        switch (rotateDegree) {
            case 0:
                coordVertice = GLUtil.MIRROR_COORD_VERTICES;
                break;
            case 90:
                coordVertice = GLUtil.ROTATE_90_MIRROR_COORD_VERTICES;
                break;
            case 180:
                coordVertice = GLUtil.ROTATE_180_MIRROR_COORD_VERTICES;
                break;
            case 270:
                coordVertice = GLUtil.ROTATE_270_MIRROR_COORD_VERTICES;
                break;
            default:
                break;
        }
    } else {
        switch (rotateDegree) {
            case 0:
                coordVertice = GLUtil.COORD_VERTICES;
                break;
            case 90:
                coordVertice = GLUtil.ROTATE_90_COORD_VERTICES;
                break;
            case 180:
                coordVertice = GLUtil.ROTATE_180_COORD_VERTICES;
                break;
            case 270:
                coordVertice = GLUtil.ROTATE_270_COORD_VERTICES;
                break;
            default:
                break;
        }
    }
    coordVertices = ByteBuffer.allocateDirect(coordVertice.length * FLOAT_SIZE_BYTES).order(ByteOrder.nativeOrder()).asFloatBuffer();
    coordVertices.put(coordVertice).position(0);
}
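How init gets driven is not shown in the article; the following is only a sketch, assuming that init is exposed on the custom GLSurfaceView (like refreshFrameNV21 below) and that the frames fed to it are the cropped square from section II, so both dimensions equal the crop's side length:

boolean isMirror = cameraId == Camera.CameraInfo.CAMERA_FACING_FRONT;   // front cameras are usually mirrored
int squareSide = Math.min(previewSize.width, previewSize.height);       // side length of the centered square crop
roundCameraGLSurfaceView.init(isMirror, displayOrientation, squareSide, squareSide);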

Initialize the Renderer once the Surface has been created:

private void initRenderer() {
    rendererReady = false;
    createGLProgram();
    // Enable texturing
    GLES20.glEnable(GLES20.GL_TEXTURE_2D);
    // Create the Y, U and V textures (U and V are half-sized because of 4:2:0 subsampling)
    createTexture(frameWidth, frameHeight, GLES20.GL_LUMINANCE, yTexture);
    createTexture(frameWidth / 2, frameHeight / 2, GLES20.GL_LUMINANCE, uTexture);
    createTexture(frameWidth / 2, frameHeight / 2, GLES20.GL_LUMINANCE, vTexture);
    rendererReady = true;
}

createGLProgram creates the OpenGL program and binds the variables declared in the shader code:

private void createGLProgram() {
    int programHandleMain = GLUtil.createShaderProgram();
    if (programHandleMain != -1) {
        // Use the shader program
        GLES20.glUseProgram(programHandleMain);
        // Get the vertex shader variables
        int glPosition = GLES20.glGetAttribLocation(programHandleMain, "attr_position");
        int textureCoord = GLES20.glGetAttribLocation(programHandleMain, "attr_tc");
        // Get the fragment shader variables
        int ySampler = GLES20.glGetUniformLocation(programHandleMain, "ySampler");
        int uSampler = GLES20.glGetUniformLocation(programHandleMain, "uSampler");
        int vSampler = GLES20.glGetUniformLocation(programHandleMain, "vSampler");
        // Assign values to the variables
        /**
         * GLES20.GL_TEXTURE0 is bound to ySampler,
         * GLES20.GL_TEXTURE1 is bound to uSampler,
         * GLES20.GL_TEXTURE2 is bound to vSampler;
         * in other words, the second argument of glUniform1i is the texture unit index.
         */
        GLES20.glUniform1i(ySampler, 0);
        GLES20.glUniform1i(uSampler, 1);
        GLES20.glUniform1i(vSampler, 2);
        GLES20.glEnableVertexAttribArray(glPosition);
        GLES20.glEnableVertexAttribArray(textureCoord);
        // Feed the vertex shader data
        squareVertices.position(0);
        GLES20.glVertexAttribPointer(glPosition, GLUtil.COUNT_PER_SQUARE_VERTICE, GLES20.GL_FLOAT, false, 8, squareVertices);
        coordVertices.position(0);
        GLES20.glVertexAttribPointer(textureCoord, GLUtil.COUNT_PER_COORD_VERTICES, GLES20.GL_FLOAT, false, 8, coordVertices);
    }
}

createTexture creates a texture from the given width, height and format:

private void createTexture(int width, int height, int format, int[] textureId) {
    // Create the texture
    GLES20.glGenTextures(1, textureId, 0);
    // Bind the texture
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId[0]);
    /**
     * {@link GLES20#GL_TEXTURE_WRAP_S} is the horizontal (left/right) wrap mode,
     * {@link GLES20#GL_TEXTURE_WRAP_T} is the vertical (top/bottom) wrap mode.
     *
     * {@link GLES20#GL_REPEAT}: repeat
     * {@link GLES20#GL_MIRRORED_REPEAT}: mirrored repeat
     * {@link GLES20#GL_CLAMP_TO_EDGE}: clamp to the edge
     *
     * For example, with {@link GLES20#GL_REPEAT}:
     *
     * squareVertices      coordVertices
     * -1.0f, -1.0f,       1.0f, 1.0f,
     *  1.0f, -1.0f,       1.0f, 0.0f,    -> same as the TextureView preview
     * -1.0f,  1.0f,       0.0f, 1.0f,
     *  1.0f,  1.0f        0.0f, 0.0f
     *
     * squareVertices      coordVertices
     * -1.0f, -1.0f,       2.0f, 2.0f,
     *  1.0f, -1.0f,       2.0f, 0.0f,    -> compared with the TextureView preview, split into 4 identical previews (bottom-left, bottom-right, top-left, top-right)
     * -1.0f,  1.0f,       0.0f, 2.0f,
     *  1.0f,  1.0f        0.0f, 0.0f
     */
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_REPEAT);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_REPEAT);
    /**
     * {@link GLES20#GL_TEXTURE_MIN_FILTER} applies when the displayed texture is smaller than the loaded one,
     * {@link GLES20#GL_TEXTURE_MAG_FILTER} applies when the displayed texture is larger than the loaded one.
     *
     * {@link GLES20#GL_NEAREST}: use the color of the texel closest to the coordinate as the pixel color
     * {@link GLES20#GL_LINEAR}: compute the pixel color as a weighted average of the closest texels
     */
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
    GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, format, width, height, 0, format, GLES20.GL_UNSIGNED_BYTE, null);
}

Driving the drawing from Java code

When a frame arrives from the data source, crop it and pass it on:

@Override
public void onPreview(final byte[] nv21, Camera camera) {
    // Crop the specified region out of the frame
    ImageUtil.cropNV21(nv21, this.squareNV21, previewSize.width, previewSize.height, cropRect);
    // Refresh the GLSurfaceView
    roundCameraGLSurfaceView.refreshFrameNV21(this.squareNV21);
}
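As the cropNV21 documentation below notes, the destination buffer has to be allocated in advance. A sketch, assuming cropRect is the centered square computed elsewhere:

int side = cropRect.width();                      // square crop, so width == height
this.squareNV21 = new byte[side * side * 3 / 2];  // NV21 needs width * height * 3 / 2 bytes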

The NV21 cropping code:

/**
 * Crop NV21 data
 *
 * @param originNV21 the original NV21 data
 * @param cropNV21   the cropped NV21 result; its memory must be allocated in advance
 * @param width      width of the original data
 * @param height     height of the original data
 * @param left       left edge of the region to crop from the original data
 * @param top        top edge of the region to crop from the original data
 * @param right      right edge of the region to crop from the original data
 * @param bottom     bottom edge of the region to crop from the original data
 */
public static void cropNV21(byte[] originNV21, byte[] cropNV21, int width, int height, int left, int top, int right, int bottom) {
    int halfWidth = width / 2;
    int cropImageWidth = right - left;
    int cropImageHeight = bottom - top;
    // Start of the crop region's top-left corner in the original Y plane
    int originalYLineStart = top * width;
    int targetYIndex = 0;
    // Start of the crop region's top-left corner in the original UV plane
    int originalUVLineStart = width * height + top * halfWidth;
    // Start of the UV plane in the target data
    int targetUVIndex = cropImageWidth * cropImageHeight;
    for (int i = top; i < bottom; i++) {
        System.arraycopy(originNV21, originalYLineStart + left, cropNV21, targetYIndex, cropImageWidth);
        originalYLineStart += width;
        targetYIndex += cropImageWidth;
        if ((i & 1) == 0) {
            System.arraycopy(originNV21, originalUVLineStart + left, cropNV21, targetUVIndex, cropImageWidth);
            originalUVLineStart += width;
            targetUVIndex += cropImageWidth;
        }
    }
}
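A usage sketch: cropping the centered square out of a 1280x720 NV21 frame. left and top are forced to be even here, which is an assumption about what the helper expects, since the UV plane is subsampled 2x2:

int width = 1280, height = 720;
int side = Math.min(width, height);              // 720
int left = ((width - side) / 2) & ~1;            // 280, kept even for UV alignment
int top = ((height - side) / 2) & ~1;            // 0
byte[] squareNV21 = new byte[side * side * 3 / 2];
ImageUtil.cropNV21(nv21, squareNV21, width, height, left, top, left + side, top + side);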

Pass the data to the GLSurfaceView and refresh the frame:

/**
 * Refresh the frame with NV21 data
 *
 * @param data NV21 data
 */
public void refreshFrameNV21(byte[] data) {
    if (rendererReady) {
        yBuf.clear();
        uBuf.clear();
        vBuf.clear();
        putNV21(data, frameWidth, frameHeight);
        dataInput = true;
        requestRender();
    }
}

putNV21 extracts the Y, U and V components from the NV21 data:

/**
 * Extract the Y, U and V components of the NV21 data
 *
 * @param src    the NV21 frame data
 * @param width  width
 * @param height height
 */
private void putNV21(byte[] src, int width, int height) {
    int ySize = width * height;
    int frameSize = ySize * 3 / 2;
    // Copy the Y component
    System.arraycopy(src, 0, yArray, 0, ySize);
    int k = 0;
    // Extract the interleaved UV components (NV21 stores V first, then U)
    int index = ySize;
    while (index < frameSize) {
        vArray[k] = src[index++];
        uArray[k++] = src[index++];
    }
    yBuf.put(yArray).position(0);
    uBuf.put(uArray).position(0);
    vBuf.put(vArray).position(0);
}

After requestRender is called, onDrawFrame is invoked; there the three textures are filled with data and the frame is drawn:

@Override
public void onDrawFrame(GL10 gl) {
    // Activate, bind and upload data for each of the three textures
    if (dataInput) {
        // Y
        GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, yTexture[0]);
        GLES20.glTexSubImage2D(GLES20.GL_TEXTURE_2D,
                0,
                0,
                0,
                frameWidth,
                frameHeight,
                GLES20.GL_LUMINANCE,
                GLES20.GL_UNSIGNED_BYTE,
                yBuf);
        // U
        GLES20.glActiveTexture(GLES20.GL_TEXTURE1);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, uTexture[0]);
        GLES20.glTexSubImage2D(GLES20.GL_TEXTURE_2D,
                0,
                0,
                0,
                frameWidth >> 1,
                frameHeight >> 1,
                GLES20.GL_LUMINANCE,
                GLES20.GL_UNSIGNED_BYTE,
                uBuf);
        // V
        GLES20.glActiveTexture(GLES20.GL_TEXTURE2);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, vTexture[0]);
        GLES20.glTexSubImage2D(GLES20.GL_TEXTURE_2D,
                0,
                0,
                0,
                frameWidth >> 1,
                frameHeight >> 1,
                GLES20.GL_LUMINANCE,
                GLES20.GL_UNSIGNED_BYTE,
                vBuf);
        // Draw once all texture data has been bound
        GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
    }
}

That completes the drawing.

IV. Adding a border

Sometimes the requirement is not just a circular preview; we may also need to draw a border around it.

Border effect

Using the same idea, we dynamically modify the border values and redraw.

The relevant code of the custom border View is as follows:

@Override
protected void onDraw(Canvas canvas) {
    super.onDraw(canvas);
    if (paint == null) {
        paint = new Paint();
        paint.setStyle(Paint.Style.STROKE);
        paint.setAntiAlias(true);
        // A sweep gradient centered in the view produces the colored border
        SweepGradient sweepGradient = new SweepGradient(((float) getWidth() / 2), ((float) getHeight() / 2),
                new int[]{Color.GREEN, Color.CYAN, Color.BLUE, Color.CYAN, Color.GREEN}, null);
        paint.setShader(sweepGradient);
    }
    drawBorder(canvas, 6);
}

private void drawBorder(Canvas canvas, int rectThickness) {
    if (canvas == null) {
        return;
    }
    paint.setStrokeWidth(rectThickness);
    Path drawPath = new Path();
    drawPath.addRoundRect(new RectF(0, 0, getWidth(), getHeight()), radius, radius, Path.Direction.CW);
    canvas.drawPath(drawPath, paint);
}

public void turnRound() {
    invalidate();
}

public void setRadius(int radius) {
    this.radius = radius;
}

V. Complete demo code

The demo covers:

Using the Camera API or the Camera2 API and choosing the preview size closest to a square

Using the Camera API and dynamically adding a parent view to achieve a square preview

Using the Camera API to obtain preview data and displaying it with OpenGL

Finally, here is a free offline Android face recognition SDK that combines well with the techniques in this article: https://ai.arcsoft.com.cn/product/arcface.html

That is all for this article. I hope it helps with your learning, and thank you for supporting 脚本之家.
