ARKit Face Tracking on iPhone X

ARKit has been established as a reliable way to do stable, consumer-level AR since it was first announced at WWDC in June. More recently, during the iPhone X announcement, it was revealed that ARKit would include face tracking features available only on the iPhone X, using the front camera array, which includes a depth camera.

Unity has been working closely with Apple from the beginning to deliver a Unity ARKit plugin alongside the ARKit announcement, so that Unity developers could start using those features as soon as they were available. Since then, we have continued to work closely with Apple to deliver the face tracking features of ARKit as part of the Unity ARKit plugin.

New code and examples for these features are integrated into the Unity ARKit plugin, which you can get from BitBucket or from the Asset Store.

API additions in plugin

There is now a new configuration called ARKitFaceTrackingConfiguration which can be used when running on an iPhone X. There are new RunWithConfig and RunWithConfigAndOptions methods that take the ARKitFaceTrackingConfiguration to start the AR session.

public void RunWithConfig(ARKitFaceTrackingConfiguration config)
public void RunWithConfigAndOptions(ARKitFaceTrackingConfiguration config, UnityARSessionRunOption runOptions)
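As a sketch of how a session might be started with this configuration (the `UnityARSessionNativeInterface` singleton and the configuration's field names follow the plugin, but check them against your plugin version):

```csharp
// A minimal sketch of starting an ARKit face tracking session with the
// Unity ARKit plugin (namespace UnityEngine.XR.iOS). Field names on the
// configuration are assumptions based on the plugin and may vary by version.
using UnityEngine;
using UnityEngine.XR.iOS;

public class FaceTrackingStarter : MonoBehaviour
{
    void Start()
    {
        var config = new ARKitFaceTrackingConfiguration();
        config.alignment = UnityARAlignment.UnityARAlignmentGravity;
        config.enableLightEstimation = true;

        // IsSupported is false on devices without the front depth camera,
        // so only the iPhone X can actually run this configuration.
        if (config.IsSupported)
        {
            UnityARSessionNativeInterface.GetARSessionNativeInterface()
                .RunWithConfig(config);
        }
    }
}
```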

There are also event callbacks for when ARFaceAnchor is added, removed or updated:

public delegate void ARFaceAnchorAdded(ARFaceAnchor anchorData);
public static event ARFaceAnchorAdded ARFaceAnchorAddedEvent;
public delegate void ARFaceAnchorUpdated(ARFaceAnchor anchorData);
public static event ARFaceAnchorUpdated ARFaceAnchorUpdatedEvent;
public delegate void ARFaceAnchorRemoved(ARFaceAnchor anchorData);
public static event ARFaceAnchorRemoved ARFaceAnchorRemovedEvent;

Features Exposed by Face Tracking

There are four main features exposed by Face Tracking in ARKit. Each is described below, along with the corresponding example scene that uses it.

Face Anchor

The basic feature of face tracking is to provide a face anchor when ARKit detects a face with the front camera on the iPhone X. This face anchor is similar to the plane anchor that ARKit usually returns, but it tracks the position and orientation of the center of the head as you move it around. This lets you use the movement of the face as input for your ARKit app, and also lets you use this anchor to attach objects to the face or head so that they move along with your head.

We have made an example scene to demonstrate the use of this, called FaceAnchorScene. It has a GameObject with the component UnityARFaceAnchorManager, which initializes ARKit with ARKitFaceTrackingConfiguration. It also hooks into the FaceAnchor added, updated, and removed events so that it can create, move, and remove the anchored content as the face anchor changes.
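A sketch of what anchoring content to the face looks like, in the spirit of UnityARFaceAnchorManager (the `UnityARMatrixOps` helper and the event names are taken from the plugin, but verify them against your version):

```csharp
// A sketch of anchoring a GameObject to the face anchor: instantiate it when
// a face is detected, follow the anchor's transform every update, and remove
// it when tracking is lost. Assumes the Unity ARKit plugin's static events
// and UnityARMatrixOps helper (UnityEngine.XR.iOS).
using UnityEngine;
using UnityEngine.XR.iOS;

public class AnchoredContent : MonoBehaviour
{
    public GameObject anchoredPrefab; // e.g. a hat or glasses model
    GameObject instance;

    void OnEnable()
    {
        UnityARSessionNativeInterface.ARFaceAnchorAddedEvent += FaceAdded;
        UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent += FaceUpdated;
        UnityARSessionNativeInterface.ARFaceAnchorRemovedEvent += FaceRemoved;
    }

    void OnDisable()
    {
        UnityARSessionNativeInterface.ARFaceAnchorAddedEvent -= FaceAdded;
        UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent -= FaceUpdated;
        UnityARSessionNativeInterface.ARFaceAnchorRemovedEvent -= FaceRemoved;
    }

    void FaceAdded(ARFaceAnchor anchorData)
    {
        instance = Instantiate(anchoredPrefab);
        FaceUpdated(anchorData);
    }

    void FaceUpdated(ARFaceAnchor anchorData)
    {
        // Convert the anchor's 4x4 matrix into a Unity position and rotation.
        instance.transform.position = UnityARMatrixOps.GetPosition(anchorData.transform);
        instance.transform.rotation = UnityARMatrixOps.GetRotation(anchorData.transform);
    }

    void FaceRemoved(ARFaceAnchor anchorData)
    {
        Destroy(instance);
    }
}
```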

This scene also uses the ARCameraTracker component, which updates the Main Unity Camera via the FrameUpdateEvent that a regular ARKit app uses.

Face Mesh Geometry

The face tracking API can also return the geometry of the face it detects as a mesh. We can then use the mesh vertices to create a corresponding mesh in Unity. Then we can use the mesh in Unity with a transparent texture to allow all sorts of face painting and masks. We can also put an occlusion material on this mesh when we attach things to the face anchor so that the attachments occlude properly against the video of the face.
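A sketch of turning that geometry into a Unity mesh (this assumes the plugin's `ARFaceGeometry` exposes `vertices`, `textureCoordinates`, and `triangleIndices` arrays, as its face mesh example does):

```csharp
// A sketch of building and updating a Unity Mesh from the face geometry
// delivered with each face anchor update. The ARFaceGeometry member names
// are assumptions based on the Unity ARKit plugin; check your version.
using UnityEngine;
using UnityEngine.XR.iOS;

[RequireComponent(typeof(MeshFilter))]
public class FaceMeshBuilder : MonoBehaviour
{
    Mesh faceMesh;

    void Awake()
    {
        faceMesh = new Mesh();
        GetComponent<MeshFilter>().mesh = faceMesh;
        UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent += FaceUpdated;
    }

    void FaceUpdated(ARFaceAnchor anchorData)
    {
        ARFaceGeometry geometry = anchorData.faceGeometry;
        faceMesh.vertices = geometry.vertices;
        faceMesh.uv = geometry.textureCoordinates;
        faceMesh.triangles = geometry.triangleIndices;
        faceMesh.RecalculateBounds();
        faceMesh.RecalculateNormals();
    }

    void OnDestroy()
    {
        UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent -= FaceUpdated;
    }
}
```

Swapping the default material on the Mesh Renderer for a transparent texture or an occlusion material gives the face painting and occlusion effects described above.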

The example scene called FaceMeshScene shows how to display the face mesh geometry on top of your face with a default material on it (so it appears grey). It has the usual ARCameraTracker GameObject to move the camera in the scene. In addition, it has an ARFaceMeshManager GameObject, which has a standard Mesh Renderer and an empty Mesh Filter component. This GameObject also has the UnityARFaceMeshManager component, which initializes the face tracking session and creates and updates the mesh in the Mesh Filter from the geometry returned for the face anchor.

Blend Shapes

Another set of data we get from face tracking are coefficients that describe the expressions on your face, which can be mapped onto a virtual face to make it have a similar expression to yours.

Our example scene FaceBlendShapeScene shows this. In the UI, it shows the coefficients of the different blend shape values that are returned by the current expression on your face. See how they change when you change your expressions!

This scene has the same GameObjects as the FaceMeshScene, but in addition also has a BlendshapeOutput GameObject which contains a BlendshapePrinter component. This component extracts the blend shapes from the face anchor, if one exists, and outputs them to the screen UI.
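A sketch of how those coefficients can be read, along the lines of BlendshapePrinter (this assumes the anchor exposes the coefficients as a string-to-float dictionary, with the key names coming from ARKit's blend shape locations):

```csharp
// A sketch of reading blend shape coefficients from a face anchor update.
// Each entry maps an ARKit blend shape name (e.g. "jawOpen") to a value
// in the 0..1 range, where 0 is neutral and 1 is fully expressed.
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.iOS;

public class BlendShapeReader : MonoBehaviour
{
    void OnEnable()
    {
        UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent += FaceUpdated;
    }

    void OnDisable()
    {
        UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent -= FaceUpdated;
    }

    void FaceUpdated(ARFaceAnchor anchorData)
    {
        Dictionary<string, float> shapes = anchorData.blendShapes;
        foreach (KeyValuePair<string, float> shape in shapes)
        {
            Debug.Log(shape.Key + ": " + shape.Value);
        }
    }
}
```

To drive a virtual face, you would map each coefficient onto the corresponding blend shape weight of your character's SkinnedMeshRenderer instead of logging it.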

We are working on a more elaborate example where these values will be mapped on to the facial animation of a virtual head to get a better idea on how this could work in your experience.

Directional Light Estimate

Another interesting set of data that you get with face tracking is a directional light estimate of the scene, based on using your face as a light probe in the scene. The estimate that is generated contains three things: a primary light direction, a primary light intensity, and spherical harmonics coefficients.

The last of these is very interesting for us in Unity, as it is the solution used for dynamic global illumination in our standard rendering pipeline. Knowing this information, we can take advantage of it in our example scene.

The FaceDirectionalLightEstimate scene has an ARCameraTracker and an ARFaceAnchorManager, which moves a standard grey sphere mesh around with your face.  What’s new is the ARKitLightManager GameObject, which has the UnityARKitLightManager component on it that gets the spherical harmonics coefficients from the FrameUpdated event and plugs it into all of the Unity light probes in the scene, including the ambient light probe (which is used when none of the light probes in the scene affect the mesh). This effectively lights the meshes in the scene with the estimated environment lighting dynamically.

Alternatively, if you wish to use your own mechanism to light the scene you can get the raw spherical harmonics coefficients in Unity’s coordinate system via the FrameUpdatedEvent and plug it into your lighting formulas. You may also just want to light with the primary light direction and intensity, which are also available in the same manner.
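A sketch of plugging a raw coefficient array into Unity's ambient light probe (the 27-float, channel-major layout assumed here — 9 coefficients for each of the three color channels — is an assumption; check how your plugin version delivers the array via the FrameUpdatedEvent):

```csharp
// A sketch of applying a 27-float spherical harmonics coefficient array
// (assumed layout: 9 coefficients x 3 color channels, channel-major) to
// Unity's ambient probe, as UnityARKitLightManager does for the whole scene.
using UnityEngine;
using UnityEngine.Rendering;

public static class AmbientProbeApplier
{
    public static void Apply(float[] shCoefficients)
    {
        SphericalHarmonicsL2 probe = new SphericalHarmonicsL2();
        for (int channel = 0; channel < 3; channel++)
        {
            for (int coeff = 0; coeff < 9; coeff++)
            {
                probe[channel, coeff] = shCoefficients[channel * 9 + coeff];
            }
        }

        // Custom ambient mode makes Unity light meshes with this probe.
        RenderSettings.ambientMode = AmbientMode.Custom;
        RenderSettings.ambientProbe = probe;
    }
}
```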

Use Face Tracking in your Apps

As you can see, there are some nice ARKit features available with face tracking on the iPhone X. Unity’s ARKit plugin can help you to easily implement these features within your apps. As usual, show us your creations on @jimmy_jam_jam, and ask any questions on the forums.

Translated from: https://blogs.unity3d.com/2017/11/03/arkit-face-tracking-on-iphone-x/
