The previous article analyzed the lifecycle design of the Cardboard SDK.

Here we look at how the distortion part is implemented.

Cardboard wraps distortion in two classes: Distortion and DistortionRenderer.
Let's start with the Distortion class:
      private static final float[] DEFAULT_COEFFICIENTS = { 250.0F, 50000.0F };

      private float[] mCoefficients;
It defines a set of default coefficients, DEFAULT_COEFFICIENTS, and a field mCoefficients that holds the coefficients actually in use.
It provides distortionFactor, distort, and distortInverse to compute the distortion factor and to apply the distortion and its inverse:
      public float distortionFactor(float radius) {
        float rSq = radius * radius;
        return 1.0F + mCoefficients[0] * rSq + mCoefficients[1] * rSq * rSq;
      }

      public float distort(float radius) {
        return radius * distortionFactor(radius);
      }

      public float distortInverse(float radius) {
        // Secant-method iteration: find r such that distort(r) ≈ radius
        float r0 = radius / 0.9F;
        float r1 = radius * 0.9F;
        float dr0 = radius - distort(r0);
        while (Math.abs(r1 - r0) > 0.0001D) {
          float dr1 = radius - distort(r1);
          float r2 = r1 - dr1 * ((r1 - r0) / (dr1 - dr0));
          r0 = r1;
          r1 = r2;
          dr0 = dr1;
        }
        return r1;
      }
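To see how these fit together, here is a small hypothetical usage snippet (not part of the SDK; it assumes Distortion's constructor installs the default coefficients into mCoefficients):

      // Hypothetical round-trip check, assuming a Distortion built with the default coefficients.
      Distortion d = new Distortion();
      float r = 0.03F;                               // sample input radius
      float distorted = d.distort(r);                // 0.03 * (1 + 250*0.03^2 + 50000*0.03^4) = 0.037965
      float recovered = d.distortInverse(distorted);
      // recovered ≈ 0.03, within the 1e-4 tolerance of the secant iteration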

CardboardDeviceParams creates this Distortion object in its constructor and exposes it via getDistortion().

By looking at the callers of getDistortion() we can see where the Distortion class is used.
One path is DistortionRenderer.onProjectionChanged -> DistortionRenderer.createDistortionMesh -> CardboardDeviceParams.getDistortion().
The other is RendererHelper, which calls updateFieldOfView() when it is constructed and whenever onDrawFrame fires, and that in turn calls CardboardDeviceParams.getDistortion().
The first path uses the Distortion object to build a DistortionMesh, and it is there that the inverse-distortion function distortInverse is called.
The second path calls Distortion.distort to compute outerAngle, innerAngle, bottomAngle and topAngle; a sketch of that computation follows below.
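To make the second path concrete, the angle computation looks roughly like the sketch below. This is an illustrative reconstruction rather than the verbatim SDK code; the *Dist variables stand for the physical distances (in meters) from the lens center to the corresponding screen edges, and the eye-to-screen distance is assumed to be the same sum used in initViewportForEye later on.

      // Illustrative sketch (not verbatim SDK code): physical offsets from the lens
      // center are converted to FOV half-angles through the distortion polynomial.
      float eyeToScreenDist = cdp.getEyeToLensDistance() + cdp.getScreenToLensDistance();
      float outerAngle  = (float) Math.toDegrees(Math.atan(distortion.distort(outerDist / eyeToScreenDist)));
      float innerAngle  = (float) Math.toDegrees(Math.atan(distortion.distort(innerDist / eyeToScreenDist)));
      float bottomAngle = (float) Math.toDegrees(Math.atan(distortion.distort(bottomDist / eyeToScreenDist)));
      float topAngle    = (float) Math.toDegrees(Math.atan(distortion.distort(topDist / eyeToScreenDist)));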

OK, now let's look at what DistortionMesh actually does.
(1) First, a quick note on dpi, i.e. dots per inch. It can be read from the DisplayMetrics class:
WindowManager windowManager = (WindowManager)context.getSystemService("window");
DisplayMetrics metrics = new DisplayMetrics();
windowManager.getDefaultDisplay().getRealMetrics(metrics);
For example, on a Nexus 6: metrics.xdpi = 494.27, metrics.ydpi = 492.606.
(2) From the dpi we can compute how many meters one pixel covers, and multiplying by the screen's pixel dimensions (here mWidth = 2560, mHeight = 1440) gives the screen width and height in meters:
     mXMetersPerPixel = (0.0254F / metrics.xdpi);
     mYMetersPerPixel = (0.0254F / metrics.ydpi);

     public float getWidthMeters() {
       return mWidth * mXMetersPerPixel;
     }

     public float getHeightMeters() {
       return mHeight * mYMetersPerPixel;
     }
On a Nexus 6:
screen.getWidthMeters()=0.13155563
screen.getHeightMeters()=0.07425001
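As a quick sanity check of these numbers: 0.0254 m/inch ÷ 494.27 px/inch ≈ 5.14e-5 m per pixel, and 5.14e-5 × 2560 ≈ 0.1316 m, matching getWidthMeters() above; likewise 0.0254 ÷ 492.606 × 1440 ≈ 0.07425 m for the height.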
(3) With the screen dimensions in hand, we can now compute the field-of-view related values:
      private EyeViewport initViewportForEye(EyeParams eye, float xOffsetM) {
        // Screen properties
        ScreenParams screen = mHmd.getScreen();
        // Cardboard device properties
        CardboardDeviceParams cdp = mHmd.getCardboard();
        // Eye-to-screen distance: 0.011 + 0.037
        float eyeToScreenDistanceM = cdp.getEyeToLensDistance() + cdp.getScreenToLensDistance();
        // Screen area visible to the eye, derived from the FOV angles
        float leftM = (float)Math.tan(Math.toRadians(eye.getFov().getLeft())) * eyeToScreenDistanceM;
        float rightM = (float)Math.tan(Math.toRadians(eye.getFov().getRight())) * eyeToScreenDistanceM;
        float bottomM = (float)Math.tan(Math.toRadians(eye.getFov().getBottom())) * eyeToScreenDistanceM;
        float topM = (float)Math.tan(Math.toRadians(eye.getFov().getTop())) * eyeToScreenDistanceM;

        EyeViewport vp = new EyeViewport();
        // Viewport offset
        vp.x = xOffsetM;
        vp.y = 0.0F;
        // Viewport width
        vp.width = (leftM + rightM);
        // Viewport height
        vp.height = (bottomM + topM);
        // Eye (lens center) position within the viewport
        vp.eyeX = (leftM + xOffsetM);
        vp.eyeY = bottomM;

        // Horizontal pixels (2560) / screen width in meters (0.13155563) = 19459.447 px/m
        float xPxPerM = screen.getWidth() / screen.getWidthMeters();
        // Vertical pixels (1440) / screen height in meters (0.07425001) = 19393.936 px/m
        float yPxPerM = screen.getHeight() / screen.getHeightMeters();
        // Finally, convert the viewport origin and size to pixel coordinates
        eye.getViewport().x = Math.round(vp.x * xPxPerM);
        eye.getViewport().y = Math.round(vp.y * xPxPerM);  // note: the original code uses xPxPerM here too; vp.y is 0, so it makes no difference
        eye.getViewport().width = Math.round(vp.width * xPxPerM);
        eye.getViewport().height = Math.round(vp.height * xPxPerM);
        return vp;
      }
The values computed above are:
the left/right/bottom/top extents of the two eyes' viewports,
leftM = 0.05015834, rightM = 0.037965, bottomM = 0.041869722, topM = 0.054545455  (left eye)
leftM = 0.037965, rightM = 0.05015834, bottomM = 0.041869722, topM = 0.054545455  (right eye)
and the eye position within each viewport, plus the viewport offset, width and height:
vp.eyeX = 0.05015834, vp.eyeY = 0.041869722, vp.x = 0.0, vp.y = 0.0, vp.width = 0.08812334, vp.height = 0.09641518  (left eye)
vp.eyeX = 0.12608834, vp.eyeY = 0.041869722, vp.x = 0.08812334, vp.y = 0.0, vp.width = 0.08812334, vp.height = 0.09641518  (right eye)
One questionable thing about these numbers: the left eye's viewport width derived from the FOV is leftM + rightM = 0.08812334, and the right eye's viewport is offset to start at the right edge of the left eye's.
But the right-eye width also comes out to 0.08812334, so the two together span 0.17624668 m, which is wider than the screen width computed earlier, screen.getWidthMeters() = 0.13155563.
In other words, this algorithm does not adjust the FOV to the screen dimensions; ideally the parameters should be clamped to the actual screen width and height.
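Working backwards from the logged values (with eyeToScreenDistanceM = 0.011 + 0.037 = 0.048 m), the left eye's extents correspond to half-angles in eye.getFov() of roughly atan(0.0502 / 0.048) ≈ 46.3° outward, 38.3° inward, 41.1° downward and 48.7° upward, so the oversized viewport is simply what those angles imply at this eye-to-screen distance.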
These values are then used to construct the DistortionMesh.
Now let's see what parameters are passed into the DistortionMesh constructor:
EyeParams eye: per-eye parameters, containing mViewport, mFov and the eye transform matrix mEyeTransform
Distortion distortion: the distortion object, used here for the inverse-distortion computation
screenWidthM: screen width in meters (from getWidthMeters())
screenHeightM: screen height in meters (from getHeightMeters())
xEyeOffsetMScreen: horizontal distance from the screen edge to the lens center (in landscape orientation)
yEyeOffsetMScreen: vertical distance from the screen edge to the lens center (in landscape, after subtracting the screen border, getBorderSizeMeters())
textureWidthM: texture width in meters (left-eye viewport width + right-eye viewport width)
textureHeightM: texture height in meters (the larger of the two eye viewport heights)
xEyeOffsetMTexture: horizontal coordinate of the eye (lens center) in texture space, in meters (vp.eyeX)
yEyeOffsetMTexture: vertical coordinate of the eye (lens center) in texture space, in meters (vp.eyeY)
viewportXMTexture: x of the viewport origin in texture space, in meters (vp.x)
viewportYMTexture: y of the viewport origin in texture space, in meters (vp.y)
viewportWidthMTexture: viewport width in meters (vp.width)
viewportHeightMTexture: viewport height in meters (vp.height)
Now let's look at the constructor itself:
      public DistortionMesh(EyeParams eye, Distortion distortion,
          float screenWidthM, float screenHeightM,
          float xEyeOffsetMScreen, float yEyeOffsetMScreen,
          float textureWidthM, float textureHeightM,
          float xEyeOffsetMTexture, float yEyeOffsetMTexture,
          float viewportXMTexture, float viewportYMTexture,
          float viewportWidthMTexture, float viewportHeightMTexture) {
        float mPerUScreen = screenWidthM;
        float mPerVScreen = screenHeightM;
        float mPerUTexture = textureWidthM;
        float mPerVTexture = textureHeightM;
        float[] vertexData = new float[8000];
        int vertexOffset = 0;
        Log.d(TAG, "screenWidthM=" + screenWidthM + "\nscreenHeightM=" + screenHeightM
            + "\nxEyeOffsetMScreen=" + xEyeOffsetMScreen + "\nyEyeOffsetMScreen=" + yEyeOffsetMScreen
            + "\ntextureWidthM=" + textureWidthM + "\ntextureHeightM=" + textureHeightM
            + "\nxEyeOffsetMTexture=" + xEyeOffsetMTexture + "\nyEyeOffsetMTexture=" + yEyeOffsetMTexture
            + "\nviewportXMTexture=" + viewportXMTexture + "\nviewportYMTexture=" + viewportYMTexture
            + "\nviewportWidthMTexture=" + viewportWidthMTexture + "\nviewportHeightMTexture=" + viewportHeightMTexture);
        for (int row = 0; row < 40; row++) {
          for (int col = 0; col < 40; col++) {
            float uTexture = col / 39.0F * (viewportWidthMTexture / textureWidthM) + viewportXMTexture / textureWidthM;
            float vTexture = row / 39.0F * (viewportHeightMTexture / textureHeightM) + viewportYMTexture / textureHeightM;
            float xTexture = uTexture * mPerUTexture;
            float yTexture = vTexture * mPerVTexture;
            float xTextureEye = xTexture - xEyeOffsetMTexture;
            float yTextureEye = yTexture - yEyeOffsetMTexture;
            float rTexture = (float)Math.sqrt(xTextureEye * xTextureEye + yTextureEye * yTextureEye);
            float textureToScreen = rTexture > 0.0F ? distortion.distortInverse(rTexture) / rTexture : 1.0F;
            float xScreen = xTextureEye * textureToScreen + xEyeOffsetMScreen;
            float yScreen = yTextureEye * textureToScreen + yEyeOffsetMScreen;
            float uScreen = xScreen / mPerUScreen;
            float vScreen = yScreen / mPerVScreen;
            float vignetteSizeMTexture = 0.002F / textureToScreen;
            float dxTexture = xTexture - DistortionRenderer.clamp(xTexture, viewportXMTexture + vignetteSizeMTexture, viewportXMTexture + viewportWidthMTexture - vignetteSizeMTexture);
            float dyTexture = yTexture - DistortionRenderer.clamp(yTexture, viewportYMTexture + vignetteSizeMTexture, viewportYMTexture + viewportHeightMTexture - vignetteSizeMTexture);
            float drTexture = (float)Math.sqrt(dxTexture * dxTexture + dyTexture * dyTexture);
            float vignette = 1.0F - DistortionRenderer.clamp(drTexture / vignetteSizeMTexture, 0.0F, 1.0F);
            vertexData[(vertexOffset + 0)] = (2.0F * uScreen - 1.0F);
            vertexData[(vertexOffset + 1)] = (2.0F * vScreen - 1.0F);
            vertexData[(vertexOffset + 2)] = vignette;
            vertexData[(vertexOffset + 3)] = uTexture;
            vertexData[(vertexOffset + 4)] = vTexture;
            vertexOffset += 5;
          }
        }
        nIndices = 3158;
        int[] indexData = new int[nIndices];
        int indexOffset = 0;
        vertexOffset = 0;
        for (int row = 0; row < 39; row++) {
          if (row > 0) {
            indexData[indexOffset] = indexData[(indexOffset - 1)];
            indexOffset++;
          }
          for (int col = 0; col < 40; col++) {
            if (col > 0) {
              if (row % 2 == 0) {
                vertexOffset++;
              } else {
                vertexOffset--;
              }
            }
            indexData[(indexOffset++)] = vertexOffset;
            indexData[(indexOffset++)] = (vertexOffset + 40);
          }
          vertexOffset += 40;
        }
        FloatBuffer vertexBuffer = ByteBuffer.allocateDirect(vertexData.length * 4).order(ByteOrder.nativeOrder()).asFloatBuffer();
        vertexBuffer.put(vertexData).position(0);
        IntBuffer indexBuffer = ByteBuffer.allocateDirect(indexData.length * 4).order(ByteOrder.nativeOrder()).asIntBuffer();
        indexBuffer.put(indexData).position(0);
        int[] bufferIds = new int[2];
        GLES20.glGenBuffers(2, bufferIds, 0);
        mArrayBufferId = bufferIds[0];
        mElementBufferId = bufferIds[1];
        GLES20.glBindBuffer(34962, mArrayBufferId);
        GLES20.glBufferData(34962, vertexData.length * 4, vertexBuffer, 35044);
        GLES20.glBindBuffer(34963, mElementBufferId);
        GLES20.glBufferData(34963, indexData.length * 4, indexBuffer, 35044);
        GLES20.glBindBuffer(34962, 0);
        GLES20.glBindBuffer(34963, 0);
      }
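Two of the magic numbers in this constructor can be checked against the 40×40 grid (my own arithmetic, not SDK code). Each vertex stores 5 floats: the screen-space position mapped to [-1, 1], the vignette factor, and the texture coordinates; the index array walks the grid as a single triangle strip, reversing direction on every other row and repeating one index between rows to stitch the strips together:

      // 40 x 40 vertices, 5 floats each:  vertexData.length = 40 * 40 * 5 = 8000
      // 39 strip rows, 2 indices per column, plus 38 repeated indices between rows:
      //   nIndices = 39 * (40 * 2) + 38 = 3120 + 38 = 3158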

After the mesh is built, we also need to create the offscreen render texture:

      private int createTexture(int width, int height) {
        int[] textureIds = new int[1];
        GLES20.glGenTextures(1, textureIds, 0);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureIds[0]);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGB, width, height, 0, GLES20.GL_RGB, GLES20.GL_UNSIGNED_SHORT_5_6_5, null);
        return textureIds[0];
      }
      private int setupRenderTextureAndRenderbuffer(int width, int height) {
        if (mTextureId != -1) {
          GLES20.glDeleteTextures(1, new int[] { mTextureId }, 0);
        }
        if (mRenderbufferId != -1) {
          GLES20.glDeleteRenderbuffers(1, new int[] { mRenderbufferId }, 0);
        }
        if (mFramebufferId != -1) {
          GLES20.glDeleteFramebuffers(1, new int[] { mFramebufferId }, 0);
        }
        mTextureId = createTexture(width, height);
        checkGlError("setupRenderTextureAndRenderbuffer: create texture");

        int[] renderbufferIds = new int[1];
        GLES20.glGenRenderbuffers(1, renderbufferIds, 0);
        GLES20.glBindRenderbuffer(GLES20.GL_RENDERBUFFER, renderbufferIds[0]);
        GLES20.glRenderbufferStorage(GLES20.GL_RENDERBUFFER, GLES20.GL_DEPTH_COMPONENT16, width, height);
        mRenderbufferId = renderbufferIds[0];
        checkGlError("setupRenderTextureAndRenderbuffer: create renderbuffer");

        int[] framebufferIds = new int[1];
        GLES20.glGenFramebuffers(1, framebufferIds, 0);
        GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, framebufferIds[0]);
        mFramebufferId = framebufferIds[0];
        GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0, GLES20.GL_TEXTURE_2D, mTextureId, 0);
        GLES20.glFramebufferRenderbuffer(GLES20.GL_FRAMEBUFFER, GLES20.GL_DEPTH_ATTACHMENT, GLES20.GL_RENDERBUFFER, renderbufferIds[0]);

        int status = GLES20.glCheckFramebufferStatus(GLES20.GL_FRAMEBUFFER);
        if (status != GLES20.GL_FRAMEBUFFER_COMPLETE) {
          throw new RuntimeException("Framebuffer is not complete: " + Integer.toHexString(status));
        }
        GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
        return framebufferIds[0];
      }
Now let's see how DistortionRenderer is driven:
(1) The DistortionRenderer object is created when RendererHelper is constructed.
(2) On every call to RendererHelper.onDrawFrame, mProjectionChanged is checked to decide whether DistortionRenderer.onProjectionChanged needs to be called (in VR mode).
Then beforeDrawFrame and afterDrawFrame are called before and after each frame is drawn; a simplified sketch of the whole sequence follows below.
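Putting the per-frame flow together, the sequence looks roughly like this (field and method names other than the DistortionRenderer calls are assumptions, since RendererHelper's code is not shown here):

      // Simplified per-frame flow inside RendererHelper.onDrawFrame (sketch, not verbatim SDK code)
      if (mProjectionChanged && mVRMode) {
        mDistortionRenderer.onProjectionChanged(mHmd, leftEyeParams, rightEyeParams, zNear, zFar);
        mProjectionChanged = false;
      }
      mDistortionRenderer.beforeDrawFrame();   // redirect rendering into the offscreen framebuffer
      drawSceneForBothEyes();                  // assumed placeholder for the app's normal per-eye rendering
      mDistortionRenderer.afterDrawFrame();    // draw the two distortion meshes to the real framebuffer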
DistortionRenderer.beforeDrawFrame()
      public void beforeDrawFrame() {
        GLES20.glGetIntegerv(GLES20.GL_FRAMEBUFFER_BINDING, mOriginalFramebufferId);
        GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, mFramebufferId);
      }

DistortionRenderer.afterDrawFrame()

     public void afterDrawFrame() {
       GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, mOriginalFramebufferId.array()[0]);
       GLES20.glViewport(0, 0, mHmd.getScreen().getWidth(), mHmd.getScreen().getHeight());
       GLES20.glGetIntegerv(GLES20.GL_VIEWPORT, mViewport);
       GLES20.glGetIntegerv(GLES20.GL_CULL_FACE, mCullFaceEnabled);
       GLES20.glGetIntegerv(GLES20.GL_SCISSOR_TEST, mScissorTestEnabled);
       GLES20.glDisable(GLES20.GL_SCISSOR_TEST);
       GLES20.glDisable(GLES20.GL_CULL_FACE);
       GLES20.glClearColor(0.0F, 0.0F, 0.0F, 1.0F);
       GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
       GLES20.glUseProgram(mProgramHolder.program);
       GLES20.glEnable(GLES20.GL_SCISSOR_TEST);
       GLES20.glScissor(0, 0, mHmd.getScreen().getWidth() / 2, mHmd.getScreen().getHeight());
       renderDistortionMesh(mLeftEyeDistortionMesh);
       GLES20.glScissor(mHmd.getScreen().getWidth() / 2, 0, mHmd.getScreen().getWidth() / 2, mHmd.getScreen().getHeight());
       renderDistortionMesh(mRightEyeDistortionMesh);
       GLES20.glDisableVertexAttribArray(mProgramHolder.aPosition);
       GLES20.glDisableVertexAttribArray(mProgramHolder.aVignette);
       GLES20.glDisableVertexAttribArray(mProgramHolder.aTextureCoord);
       GLES20.glUseProgram(0);
       GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, 0);
       GLES20.glBindBuffer(GLES20.GL_ELEMENT_ARRAY_BUFFER, 0);
       GLES20.glDisable(GLES20.GL_SCISSOR_TEST);
       if (mCullFaceEnabled.array()[0] == 1) {
         GLES20.glEnable(GLES20.GL_CULL_FACE);
       }
       if (mScissorTestEnabled.array()[0] == 1) {
         GLES20.glEnable(GLES20.GL_SCISSOR_TEST);
       }
       GLES20.glViewport(mViewport.array()[0], mViewport.array()[1], mViewport.array()[2], mViewport.array()[3]);
     }
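renderDistortionMesh itself is not shown, but given the interleaved 5-float vertex layout built in the constructor, it presumably binds the buffers along these lines (an assumption-based sketch, not the SDK source; the mProgramHolder attribute handles come from the compiled distortion shader program):

      // Sketch: bind the interleaved VBO (stride = 5 floats = 20 bytes) and draw the strip.
      GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, mesh.mArrayBufferId);
      GLES20.glVertexAttribPointer(mProgramHolder.aPosition, 2, GLES20.GL_FLOAT, false, 20, 0);       // x, y
      GLES20.glVertexAttribPointer(mProgramHolder.aVignette, 1, GLES20.GL_FLOAT, false, 20, 8);       // vignette
      GLES20.glVertexAttribPointer(mProgramHolder.aTextureCoord, 2, GLES20.GL_FLOAT, false, 20, 12);  // u, v
      GLES20.glEnableVertexAttribArray(mProgramHolder.aPosition);
      GLES20.glEnableVertexAttribArray(mProgramHolder.aVignette);
      GLES20.glEnableVertexAttribArray(mProgramHolder.aTextureCoord);
      GLES20.glBindBuffer(GLES20.GL_ELEMENT_ARRAY_BUFFER, mesh.mElementBufferId);
      GLES20.glDrawElements(GLES20.GL_TRIANGLE_STRIP, mesh.nIndices, GLES20.GL_UNSIGNED_INT, 0);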

DistortionRenderer.onProjectionChanged()

      public void onProjectionChanged(HeadMountedDisplay hmd, EyeParams leftEye, EyeParams rightEye, float zNear, float zFar) {
        mHmd = new HeadMountedDisplay(hmd);
        mLeftEyeFov = new FieldOfView(leftEye.getFov());
        mRightEyeFov = new FieldOfView(rightEye.getFov());
        ScreenParams screen = mHmd.getScreen();
        CardboardDeviceParams cdp = mHmd.getCardboard();
        if (mProgramHolder == null) {
          mProgramHolder = createProgramHolder();
        }
        EyeViewport leftEyeViewport = initViewportForEye(leftEye, 0.0F);
        EyeViewport rightEyeViewport = initViewportForEye(rightEye, leftEyeViewport.width);
        leftEye.getFov().toPerspectiveMatrix(zNear, zFar, leftEye.getTransform().getPerspective(), 0);
        rightEye.getFov().toPerspectiveMatrix(zNear, zFar, rightEye.getTransform().getPerspective(), 0);
        float textureWidthM = leftEyeViewport.width + rightEyeViewport.width;
        float textureHeightM = Math.max(leftEyeViewport.height, rightEyeViewport.height);
        float xPxPerM = screen.getWidth() / screen.getWidthMeters();
        float yPxPerM = screen.getHeight() / screen.getHeightMeters();
        int textureWidthPx = Math.round(textureWidthM * xPxPerM);
        int textureHeightPx = Math.round(textureHeightM * yPxPerM);
        float xEyeOffsetMScreen = screen.getWidthMeters() / 2.0F - cdp.getInterpupillaryDistance() / 2.0F;
        float yEyeOffsetMScreen = cdp.getVerticalDistanceToLensCenter() - screen.getBorderSizeMeters();
        mLeftEyeDistortionMesh = createDistortionMesh(leftEye, leftEyeViewport, textureWidthM, textureHeightM, xEyeOffsetMScreen, yEyeOffsetMScreen);
        xEyeOffsetMScreen = screen.getWidthMeters() - xEyeOffsetMScreen;
        mRightEyeDistortionMesh = createDistortionMesh(rightEye, rightEyeViewport, textureWidthM, textureHeightM, xEyeOffsetMScreen, yEyeOffsetMScreen);
        setupRenderTextureAndRenderbuffer(textureWidthPx, textureHeightPx);
      }
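As a rough check of the screen-space offsets computed here (assuming a default interpupillary distance of about 0.06 m): xEyeOffsetMScreen = 0.13156 / 2 - 0.06 / 2 ≈ 0.0358 m for the left eye, and after the mirroring step 0.13156 - 0.0358 ≈ 0.0958 m for the right eye, i.e. the two lens centers sit symmetrically about the middle of the screen, one interpupillary distance apart.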
