A while back I spent some time tinkering with the OpenCV plugin and the posts were well received by many readers. Work kept me from publishing for a stretch, and chatting with readers since then surfaced a few recurring problems, so this article walks through a face-swap workflow built on the approach from the previous post. Thanks to everyone for the support; if this article helps you solve a problem in your own project, please give it a like, a follow, and a bookmark!
Down to business. First download the required plugins: OpenCVForUnity, DlibFaceLandmarkDetector, and FaceMaskExample. Import order does not really matter; even if you see compile errors after importing one of them, they all disappear once all three plugins are in the project (a heads-up prompted by a mix-up one reader ran into).
Next, download the required dependency files; if you cannot get them, they are available in the download resources on my blog. After the plugins and dependency files are imported, the project directory looks like this (remember to move the plugins' StreamingAssets folder to the project root):
Then create a new scene and configure it as follows:

Apart from the Quad, which carries the scripts, none of the other objects have any script components attached:

The Quad carries the following components: WebCamMask, TextureExample, and FpsMonitor (FpsMonitor is optional; if you leave it out, just remove the related logic from the code). The two scripts also rely on the companion components TrackedMeshOverlay and WebCamTextureToMatHelper, which can be attached manually.
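If you would rather script the setup than click it together in the editor, a rough helper like the one below reproduces the same configuration. It is only a sketch: the FaceSwapSceneSetup class and menu path are hypothetical, and the namespaces assumed in the usings vary between plugin versions, so adjust them if it does not compile.

```csharp
// Hypothetical editor helper (not part of any plugin) that rebuilds the Quad described above.
using UnityEngine;
using UnityEditor;
using FaceMaskExample;                   // TrackedMeshOverlay (assumed namespace)
using OpenCVForUnity.UnityUtils.Helper;  // WebCamTextureToMatHelper

public static class FaceSwapSceneSetup
{
    [MenuItem("Tools/FaceSwap/Create Quad Setup")]
    static void CreateQuadSetup()
    {
        GameObject quad = GameObject.CreatePrimitive(PrimitiveType.Quad);
        quad.name = "Quad";

        // RequireComponent on WebCamMask would add these two anyway; listed here for clarity.
        quad.AddComponent<WebCamTextureToMatHelper>();
        quad.AddComponent<TrackedMeshOverlay>();

        quad.AddComponent<WebCamMask>();
        quad.AddComponent<TextureExample>();
        quad.AddComponent<FpsMonitor>();  // optional; remove if you stripped the FpsMonitor logic
    }
}
```

With the scene in place, here is the full WebCamMask script: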

using System;
using System.Collections;
using System.Collections.Generic;
using System.IO;
using UnityEngine;
using UnityEngine.SceneManagement;
using DlibFaceLandmarkDetector;
using OpenCVForUnity.RectangleTrack;
using OpenCVForUnity.UnityUtils.Helper;
using OpenCVForUnity.CoreModule;
using OpenCVForUnity.ObjdetectModule;
using OpenCVForUnity.ImgprocModule;
using Rect = OpenCVForUnity.CoreModule.Rect;
using FaceMaskExample;
using UnityEditor;

/// <summary>
/// WebCamTexture FaceMask Example
/// </summary>
[RequireComponent(typeof(WebCamTextureToMatHelper), typeof(TrackedMeshOverlay))]
public class WebCamMask : MonoBehaviour
{[HeaderAttribute("FaceMaskData")]/// <summary>/// The face mask data list./// </summary>public List<FaceMaskData> faceMaskDatas;[HeaderAttribute("Option")]/// <summary>/// Determines if use dlib face detector./// </summary>public bool useDlibFaceDetecter = false;/// <summary>/// Determines if enables noise filter./// </summary>public bool enableNoiseFilter = true;/// <summary>/// Determines if enables color correction./// </summary>public bool enableColorCorrection = true;/// <summary>/// Determines if filters non frontal faces./// </summary>public bool filterNonFrontalFaces = false;/// <summary>/// The frontal face rate lower limit./// </summary>[Range(0.0f, 1.0f)]public float frontalFaceRateLowerLimit = 0.85f;/// <summary>/// Determines if displays face rects./// </summary>public bool displayFaceRects = false;#region UI/ <summary>/ The toggle for switching face rects display state/ </summary>//public Toggle displayFaceRectsToggle;/ <summary>/ The filter non frontal faces toggle./ </summary>//public Toggle filterNonFrontalFacesToggle;/ <summary>/ The enable color correction toggle./ </summary>//public Toggle enableColorCorrectionToggle;/ <summary>/ The enable noise filter toggle./ </summary>//public Toggle enableNoiseFilterToggle;/ <summary>/ The use dlib face detecter toggle./ </summary>//public Toggle useDlibFaceDetecterToggle;/ <summary>/ The toggle for switching debug face points display state./ </summary>//public Toggle displayDebugFacePointsToggle;#endregion/// <summary>/// Determines if displays debug face points./// </summary>public bool displayDebugFacePoints = false;/// <summary>/// The gray mat./// </summary>Mat grayMat;/// <summary>/// The texture./// </summary>Texture2D texture;/// <summary>/// The cascade./// </summary>CascadeClassifier cascade;/// <summary>/// The detection based tracker./// </summary>RectangleTracker rectangleTracker;/// <summary>/// The web cam texture to mat helper./// </summary>WebCamTextureToMatHelper webCamTextureToMatHelper;/// <summary>/// The face landmark detector./// </summary>FaceLandmarkDetector faceLandmarkDetector;/// <summary>/// The mean points filter dictionary./// </summary>Dictionary<int, LowPassPointsFilter> lowPassFilterDict;/// <summary>/// The optical flow points filter dictionary./// </summary>Dictionary<int, OFPointsFilter> opticalFlowFilterDict;/// <summary>/// The face mask color corrector./// </summary>FaceMaskColorCorrector faceMaskColorCorrector;/// <summary>/// The frontal face checker./// </summary>FrontalFaceChecker frontalFaceChecker;/// <summary>/// The mesh overlay./// </summary>TrackedMeshOverlay meshOverlay;/// <summary>/// The Shader.PropertyToID for "_Fade"./// </summary>int shader_FadeID;/// <summary>/// The Shader.PropertyToID for "_ColorCorrection"./// </summary>int shader_ColorCorrectionID;/// <summary>/// The Shader.PropertyToID for "_LUTTex"./// </summary>int shader_LUTTexID;/// <summary>/// The face mask texture./// </summary>Texture2D faceMaskTexture;/// <summary>/// The face mask mat./// </summary>Mat faceMaskMat;/// <summary>/// The index number of face mask data./// </summary>int faceMaskDataIndex = 0;/// <summary>/// The detected face rect in mask mat./// </summary>UnityEngine.Rect faceRectInMask;/// <summary>/// The detected face landmark points in mask mat./// </summary>List<Vector2> faceLandmarkPointsInMask;/// <summary>/// The haarcascade_frontalface_alt_xml_filepath./// </summary>string haarcascade_frontalface_alt_xml_filepath;/// <summary>/// The sp_human_face_68_dat_filepath./// </summary>string 
sp_human_face_68_dat_filepath;/// <summary>/// The FPS monitor./// </summary>FpsMonitor fpsMonitor;#if UNITY_WEBGL && !UNITY_EDITORIEnumerator getFilePath_Coroutine;
#endif// Use this for initializationvoid Start(){fpsMonitor = GetComponent<FpsMonitor>();webCamTextureToMatHelper = gameObject.GetComponent<WebCamTextureToMatHelper>();#if UNITY_WEBGL && !UNITY_EDITORgetFilePath_Coroutine = GetFilePath ();StartCoroutine (getFilePath_Coroutine);
#elsehaarcascade_frontalface_alt_xml_filepath = OpenCVForUnity.UnityUtils.Utils.getFilePath("haarcascade_frontalface_alt.xml");sp_human_face_68_dat_filepath = DlibFaceLandmarkDetector.UnityUtils.Utils.getFilePath("sp_human_face_68.dat");Run();
#endif}#if UNITY_WEBGL && !UNITY_EDITORprivate IEnumerator GetFilePath (){var getFilePathAsync_0_Coroutine = OpenCVForUnity.UnityUtils.Utils.getFilePathAsync ("haarcascade_frontalface_alt.xml", (result) => {haarcascade_frontalface_alt_xml_filepath = result;});yield return getFilePathAsync_0_Coroutine;var getFilePathAsync_1_Coroutine = DlibFaceLandmarkDetector.UnityUtils.Utils.getFilePathAsync ("sp_human_face_68.dat", (result) => {sp_human_face_68_dat_filepath = result;});yield return getFilePathAsync_1_Coroutine;getFilePath_Coroutine = null;Run ();}
#endifprivate void Run(){meshOverlay = this.GetComponent<TrackedMeshOverlay>();shader_FadeID = Shader.PropertyToID("_Fade");shader_ColorCorrectionID = Shader.PropertyToID("_ColorCorrection");shader_LUTTexID = Shader.PropertyToID("_LUTTex");rectangleTracker = new RectangleTracker();faceLandmarkDetector = new FaceLandmarkDetector(sp_human_face_68_dat_filepath);lowPassFilterDict = new Dictionary<int, LowPassPointsFilter>();opticalFlowFilterDict = new Dictionary<int, OFPointsFilter>();faceMaskColorCorrector = new FaceMaskColorCorrector();#if UNITY_ANDROID && !UNITY_EDITOR// Avoids the front camera low light issue that occurs in only some Android devices (e.g. Google Pixel, Pixel2).webCamTextureToMatHelper.avoidAndroidFrontCameraLowLightIssue = true;
#endifwebCamTextureToMatHelper.Initialize(); }/// <summary>/// Raises the web cam texture to mat helper initialized event./// </summary>public void OnWebCamTextureToMatHelperInitialized(){Debug.Log("OnWebCamTextureToMatHelperInitialized");Mat webCamTextureMat = webCamTextureToMatHelper.GetMat();texture = new Texture2D(webCamTextureMat.cols(), webCamTextureMat.rows(), TextureFormat.RGBA32, false);gameObject.transform.localScale = new Vector3(webCamTextureMat.cols(), webCamTextureMat.rows(), 1);Debug.Log("Screen.width " + Screen.width + " Screen.height " + Screen.height + " Screen.orientation " + Screen.orientation);if (fpsMonitor != null){fpsMonitor.Add("width", webCamTextureMat.width().ToString());fpsMonitor.Add("height", webCamTextureMat.height().ToString());fpsMonitor.Add("orientation", Screen.orientation.ToString());}float width = gameObject.transform.localScale.x;float height = gameObject.transform.localScale.y;float widthScale = (float)Screen.width / width;float heightScale = (float)Screen.height / height;if (widthScale < heightScale){Camera.main.orthographicSize = (width * (float)Screen.height / (float)Screen.width) / 2;}else{Camera.main.orthographicSize = height / 2;}gameObject.GetComponent<Renderer>().material.mainTexture = texture;grayMat = new Mat(webCamTextureMat.rows(), webCamTextureMat.cols(), CvType.CV_8UC1);cascade = new CascadeClassifier(haarcascade_frontalface_alt_xml_filepath);//            if (cascade.empty ()) {//                Debug.LogError ("cascade file is not loaded.Please copy from “FaceTrackerExample/StreamingAssets/” to “Assets/StreamingAssets/” folder. ");//            }frontalFaceChecker = new FrontalFaceChecker(width, height);meshOverlay.UpdateOverlayTransform(gameObject.transform);OnChangeFaceMaskButtonClick();}/// <summary>/// Raises the web cam texture to mat helper disposed event./// </summary>public void OnWebCamTextureToMatHelperDisposed(){Debug.Log("OnWebCamTextureToMatHelperDisposed");grayMat.Dispose();if (texture != null){Texture2D.Destroy(texture);texture = null;}rectangleTracker.Reset();meshOverlay.Reset();foreach (var key in lowPassFilterDict.Keys){lowPassFilterDict[key].Dispose();}lowPassFilterDict.Clear();foreach (var key in opticalFlowFilterDict.Keys){opticalFlowFilterDict[key].Dispose();}opticalFlowFilterDict.Clear();faceMaskColorCorrector.Reset();frontalFaceChecker.Dispose();}/// <summary>/// Raises the web cam texture to mat helper error occurred event./// </summary>/// <param name="errorCode">Error code.</param>public void OnWebCamTextureToMatHelperErrorOccurred(WebCamTextureToMatHelper.ErrorCode errorCode){Debug.Log("OnWebCamTextureToMatHelperErrorOccurred " + errorCode);}GameObject item = null;List<Vector2> landmasks = null;UnityEngine.Rect facemaskrect;public TextureExample textureExample;// Update is called once per framevoid Update(){if (Input.GetKeyDown(KeyCode.Space)){//防止获取视频的图片卡帧或者获取的非当前帧,这里先暂停,不暂停会出现误差。不信你试试。。。webCamTextureToMatHelper.Pause(); //这里是将获取到的图片存入本地。因为我是需要将预制存到本地,所以图片至关重要。// Config.SaveTexture(gameObject.GetComponent<Renderer>().material.mainTexture as Texture2D, Application.dataPath + "/TestForOpenCV/Resources", "wakaka");if (item == null){//加载预制并将FaceMaskData重新赋值。GameObject obj = Instantiate(Resources.Load<GameObject>("TestMask"), GameObject.Find("FaceMaskData").transform);obj.GetComponent<FaceMaskData>().image = gameObject.GetComponent<Renderer>().material.mainTexture as Texture2D;obj.GetComponent<FaceMaskData>().faceRect = facemaskrect;obj.GetComponent<FaceMaskData>().landmarkPoints = 
landmasks;faceMaskDatas.Add(obj.GetComponent<FaceMaskData>());textureExample.faceMaskDatas = faceMaskDatas;textureExample.Run(Resources.Load("头-黑底") as Texture2D);//var prefabInstance = PrefabUtility.GetCorrespondingObjectFromSource(obj); 非打开预制体模式下//if (prefabInstance)//{//    var prefabPath = AssetDatabase.GetAssetPath(prefabInstance);//    // 修改预制体,则只能先Unpack预制体再保存//    PrefabUtility.UnpackPrefabInstance(obj, PrefabUnpackMode.Completely, InteractionMode.UserAction);//    PrefabUtility.SaveAsPrefabAssetAndConnect(obj, prefabPath, InteractionMode.AutomatedAction);//    // 不修改只新增,可以直接保存//    PrefabUtility.SaveAsPrefabAsset(obj, prefabPath);//}//else//{//    // 预制体模式下,从Prefab场景中取得预制体资源位置和根物体,并保存//    // PrefabStage prefabStage = PrefabStageUtility.GetCurrentPrefabStage();//    //预制体原始位置//    string path = Application.dataPath + "/TestForOpenCV/Resources/TestMask.prefab";//prefabStage.prefabAssetPath;//    GameObject root = obj; //prefabStage.prefabContentsRoot;//    PrefabUtility.SaveAsPrefabAsset(root, path);//} }}if (webCamTextureToMatHelper.IsPlaying() && webCamTextureToMatHelper.DidUpdateThisFrame()){Mat rgbaMat = webCamTextureToMatHelper.GetMat();// detect faces.List<Rect> detectResult = new List<Rect>();if (useDlibFaceDetecter){OpenCVForUnityUtils.SetImage(faceLandmarkDetector, rgbaMat);List<UnityEngine.Rect> result = faceLandmarkDetector.Detect();foreach (var unityRect in result){detectResult.Add(new Rect((int)unityRect.x, (int)unityRect.y, (int)unityRect.width, (int)unityRect.height));}}else{// convert image to greyscale.Imgproc.cvtColor(rgbaMat, grayMat, Imgproc.COLOR_RGBA2GRAY);using (Mat equalizeHistMat = new Mat())using (MatOfRect faces = new MatOfRect()){Imgproc.equalizeHist(grayMat, equalizeHistMat);cascade.detectMultiScale(equalizeHistMat, faces, 1.1f, 2, 0 | Objdetect.CASCADE_SCALE_IMAGE, new Size(equalizeHistMat.cols() * 0.15, equalizeHistMat.cols() * 0.15), new Size());detectResult = faces.toList();}// corrects the deviation of a detection result between OpenCV and Dlib.foreach (Rect r in detectResult){r.y += (int)(r.height * 0.1f);}}// face tracking.rectangleTracker.UpdateTrackedObjects(detectResult);List<TrackedRect> trackedRects = new List<TrackedRect>();rectangleTracker.GetObjects(trackedRects, true);// create noise filter.foreach (var openCVRect in trackedRects){if (openCVRect.state == TrackedState.NEW){if (!lowPassFilterDict.ContainsKey(openCVRect.id))lowPassFilterDict.Add(openCVRect.id, new LowPassPointsFilter((int)faceLandmarkDetector.GetShapePredictorNumParts()));if (!opticalFlowFilterDict.ContainsKey(openCVRect.id))opticalFlowFilterDict.Add(openCVRect.id, new OFPointsFilter((int)faceLandmarkDetector.GetShapePredictorNumParts()));}else if (openCVRect.state == TrackedState.DELETED){if (lowPassFilterDict.ContainsKey(openCVRect.id)){lowPassFilterDict[openCVRect.id].Dispose();lowPassFilterDict.Remove(openCVRect.id);}if (opticalFlowFilterDict.ContainsKey(openCVRect.id)){opticalFlowFilterDict[openCVRect.id].Dispose();opticalFlowFilterDict.Remove(openCVRect.id);}}}// create LUT texture.foreach (var openCVRect in trackedRects){if (openCVRect.state == TrackedState.NEW){faceMaskColorCorrector.CreateLUTTex(openCVRect.id);}else if (openCVRect.state == TrackedState.DELETED){faceMaskColorCorrector.DeleteLUTTex(openCVRect.id);}}// detect face landmark points.OpenCVForUnityUtils.SetImage(faceLandmarkDetector, rgbaMat);List<List<Vector2>> landmarkPoints = new List<List<Vector2>>();for (int i = 0; i < trackedRects.Count; i++){TrackedRect tr = trackedRects[i];UnityEngine.Rect rect = 
new UnityEngine.Rect(tr.x, tr.y, tr.width, tr.height);List<Vector2> points = faceLandmarkDetector.DetectLandmark(rect);// apply noise filter.if (enableNoiseFilter){if (tr.state > TrackedState.NEW && tr.state < TrackedState.DELETED){opticalFlowFilterDict[tr.id].Process(rgbaMat, points, points);lowPassFilterDict[tr.id].Process(rgbaMat, points, points);}}landmarkPoints.Add(points);}if (landmarkPoints.Count > 0){landmasks = landmarkPoints[0];facemaskrect = new UnityEngine.Rect(trackedRects[0].x, trackedRects[0].y, trackedRects[0].width, trackedRects[0].height);print(string.Format("当前识别到人脸数量为:{0},标志点为:{1},追踪人数:{2}", landmarkPoints.Count, landmarkPoints[0].Count, trackedRects.Count));}#region  创建遮罩face masking.if (faceMaskTexture != null && landmarkPoints.Count >= 1){ // Apply face masking between detected faces and a face mask image.float maskImageWidth = faceMaskTexture.width;float maskImageHeight = faceMaskTexture.height;TrackedRect tr;for (int i = 0; i < trackedRects.Count; i++){tr = trackedRects[i];if (tr.state == TrackedState.NEW){meshOverlay.CreateObject(tr.id, faceMaskTexture);}if (tr.state < TrackedState.DELETED){MaskFace(meshOverlay, tr, landmarkPoints[i], faceLandmarkPointsInMask, maskImageWidth, maskImageHeight);if (enableColorCorrection){CorrectFaceMaskColor(tr.id, faceMaskMat, rgbaMat, faceLandmarkPointsInMask, landmarkPoints[i]);}}else if (tr.state == TrackedState.DELETED){meshOverlay.DeleteObject(tr.id);}}}else if (landmarkPoints.Count >= 1){ // Apply face masking between detected faces.float maskImageWidth = texture.width;float maskImageHeight = texture.height;TrackedRect tr;for (int i = 0; i < trackedRects.Count; i++){tr = trackedRects[i];if (tr.state == TrackedState.NEW){meshOverlay.CreateObject(tr.id, texture);}if (tr.state < TrackedState.DELETED){MaskFace(meshOverlay, tr, landmarkPoints[i], landmarkPoints[0], maskImageWidth, maskImageHeight);if (enableColorCorrection){CorrectFaceMaskColor(tr.id, rgbaMat, rgbaMat, landmarkPoints[0], landmarkPoints[i]);}}else if (tr.state == TrackedState.DELETED){meshOverlay.DeleteObject(tr.id);}}}#endregion// draw face rects.if (displayFaceRects){for (int i = 0; i < detectResult.Count; i++){UnityEngine.Rect rect = new UnityEngine.Rect(detectResult[i].x, detectResult[i].y, detectResult[i].width, detectResult[i].height);OpenCVForUnityUtils.DrawFaceRect(rgbaMat, rect, new Scalar(255, 0, 0, 255), 2);}for (int i = 0; i < trackedRects.Count; i++){UnityEngine.Rect rect = new UnityEngine.Rect(trackedRects[i].x, trackedRects[i].y, trackedRects[i].width, trackedRects[i].height);OpenCVForUnityUtils.DrawFaceRect(rgbaMat, rect, new Scalar(255, 255, 0, 255), 2);//                        Imgproc.putText (rgbaMat, " " + frontalFaceChecker.GetFrontalFaceAngles (landmarkPoints [i]), new Point (rect.xMin, rect.yMin - 10), Imgproc.FONT_HERSHEY_SIMPLEX, 0.5, new Scalar (255, 255, 255, 255), 2, Imgproc.LINE_AA, false);//                        Imgproc.putText (rgbaMat, " " + frontalFaceChecker.GetFrontalFaceRate (landmarkPoints [i]), new Point (rect.xMin, rect.yMin - 10), Imgproc.FONT_HERSHEY_SIMPLEX, 0.5, new Scalar (255, 255, 255, 255), 2, Imgproc.LINE_AA, false);}}// draw face points.if (displayDebugFacePoints){for (int i = 0; i < landmarkPoints.Count; i++){OpenCVForUnityUtils.DrawFaceLandmark(rgbaMat, landmarkPoints[i], new Scalar(0, 255, 0, 255), 2);}}print(landmarkPoints.Count);// display face mask image.#region 是否显示遮罩//if (faceMaskTexture != null && faceMaskMat != null)//{//    if (displayFaceRects)//    {//        
OpenCVForUnityUtils.DrawFaceRect(faceMaskMat, faceRectInMask, new Scalar(255, 0, 0, 255), 2);//    }//    if (displayDebugFacePoints)//    {//        OpenCVForUnityUtils.DrawFaceLandmark(faceMaskMat, faceLandmarkPointsInMask, new Scalar(0, 255, 0, 255), 2);//    }//    float scale = (rgbaMat.width() / 4f) / faceMaskMat.width();//    float tx = rgbaMat.width() - faceMaskMat.width() * scale;//    float ty = 0.0f;//    Mat trans = new Mat(2, 3, CvType.CV_32F);//1.0, 0.0, tx, 0.0, 1.0, ty);//    trans.put(0, 0, scale);//    trans.put(0, 1, 0.0f);//    trans.put(0, 2, tx);//    trans.put(1, 0, 0.0f);//    trans.put(1, 1, scale);//    trans.put(1, 2, ty);//    Imgproc.warpAffine(faceMaskMat, rgbaMat, trans, rgbaMat.size(), Imgproc.INTER_LINEAR, Core.BORDER_TRANSPARENT, new Scalar(0));//    if (displayFaceRects)//        OpenCVForUnity.UnityUtils.Utils.texture2DToMat(faceMaskTexture, faceMaskMat);//}#endregion//                Imgproc.putText (rgbaMat, "W:" + rgbaMat.width () + " H:" + rgbaMat.height () + " SO:" + Screen.orientation, new Point (5, rgbaMat.rows () - 10), Imgproc.FONT_HERSHEY_SIMPLEX, 0.5, new Scalar (255, 255, 255, 255), 1, Imgproc.LINE_AA, false);OpenCVForUnity.UnityUtils.Utils.fastMatToTexture2D(rgbaMat, texture);}}private void MaskFace(TrackedMeshOverlay meshOverlay, TrackedRect tr, List<Vector2> landmarkPoints, List<Vector2> landmarkPointsInMaskImage, float maskImageWidth = 0, float maskImageHeight = 0){float imageWidth = meshOverlay.width;float imageHeight = meshOverlay.height;if (maskImageWidth == 0)maskImageWidth = imageWidth;if (maskImageHeight == 0)maskImageHeight = imageHeight;TrackedMesh tm = meshOverlay.GetObjectById(tr.id);Vector3[] vertices = tm.meshFilter.mesh.vertices;if (vertices.Length == landmarkPoints.Count){for (int j = 0; j < vertices.Length; j++){vertices[j].x = landmarkPoints[j].x / imageWidth - 0.5f;vertices[j].y = 0.5f - landmarkPoints[j].y / imageHeight;}}Vector2[] uv = tm.meshFilter.mesh.uv;if (uv.Length == landmarkPointsInMaskImage.Count){for (int jj = 0; jj < uv.Length; jj++){uv[jj].x = landmarkPointsInMaskImage[jj].x / maskImageWidth;uv[jj].y = (maskImageHeight - landmarkPointsInMaskImage[jj].y) / maskImageHeight;}}meshOverlay.UpdateObject(tr.id, vertices, null, uv);if (tr.numFramesNotDetected > 3){tm.sharedMaterial.SetFloat(shader_FadeID, 1f);}else if (tr.numFramesNotDetected > 0 && tr.numFramesNotDetected <= 3){tm.sharedMaterial.SetFloat(shader_FadeID, 0.3f + (0.7f / 4f) * tr.numFramesNotDetected);}else{tm.sharedMaterial.SetFloat(shader_FadeID, 0.3f);}if (enableColorCorrection){tm.sharedMaterial.SetFloat(shader_ColorCorrectionID, 1f);}else{tm.sharedMaterial.SetFloat(shader_ColorCorrectionID, 0f);}// filter non frontal faces.if (filterNonFrontalFaces && frontalFaceChecker.GetFrontalFaceRate(landmarkPoints) < frontalFaceRateLowerLimit){tm.sharedMaterial.SetFloat(shader_FadeID, 1f);}}private void CorrectFaceMaskColor(int id, Mat src, Mat dst, List<Vector2> src_landmarkPoints, List<Vector2> dst_landmarkPoints){Texture2D LUTTex = faceMaskColorCorrector.UpdateLUTTex(id, src, dst, src_landmarkPoints, dst_landmarkPoints);TrackedMesh tm = meshOverlay.GetObjectById(id);tm.sharedMaterial.SetTexture(shader_LUTTexID, LUTTex);}/// <summary>/// Raises the destroy event./// </summary>void OnDestroy(){webCamTextureToMatHelper.Dispose();if (cascade != null)cascade.Dispose();if (rectangleTracker != null)rectangleTracker.Dispose();if (faceLandmarkDetector != null)faceLandmarkDetector.Dispose();foreach (var key in 
lowPassFilterDict.Keys){lowPassFilterDict[key].Dispose();}lowPassFilterDict.Clear();foreach (var key in opticalFlowFilterDict.Keys){opticalFlowFilterDict[key].Dispose();}opticalFlowFilterDict.Clear();if (faceMaskColorCorrector != null)faceMaskColorCorrector.Dispose();#if UNITY_WEBGL && !UNITY_EDITORif (getFilePath_Coroutine != null) {StopCoroutine (getFilePath_Coroutine);((IDisposable)getFilePath_Coroutine).Dispose ();}
#endif}/// <summary>/// Raises the back button click event./// </summary>public void OnBackButtonClick(){SceneManager.LoadScene("FaceMaskExample");}/// <summary>/// Raises the play button click event./// </summary>public void OnPlayButtonClick(){webCamTextureToMatHelper.Play();}/// <summary>/// Raises the pause button click event./// </summary>public void OnPauseButtonClick(){webCamTextureToMatHelper.Pause();}/// <summary>/// Raises the change camera button click event./// </summary>public void OnChangeCameraButtonClick(){webCamTextureToMatHelper.requestedIsFrontFacing = !webCamTextureToMatHelper.IsFrontFacing();}/// <summary>/// Raises the change face mask button click event./// </summary>public void OnChangeFaceMaskButtonClick(){RemoveFaceMask();if (faceMaskDatas.Count == 0)return;FaceMaskData maskData = faceMaskDatas[faceMaskDataIndex];faceMaskDataIndex = (faceMaskDataIndex < faceMaskDatas.Count - 1) ? faceMaskDataIndex + 1 : 0;if (maskData == null){Debug.LogError("maskData == null");return;}if (maskData.image == null){Debug.LogError("image == null");return;}if (maskData.landmarkPoints.Count != 68){Debug.LogError("landmarkPoints.Count != 68");return;}faceMaskTexture = maskData.image;faceMaskMat = new Mat(faceMaskTexture.height, faceMaskTexture.width, CvType.CV_8UC4);OpenCVForUnity.UnityUtils.Utils.texture2DToMat(faceMaskTexture, faceMaskMat);if (maskData.isDynamicMode){faceRectInMask = DetectFace(faceMaskMat);faceLandmarkPointsInMask = DetectFaceLandmarkPoints(faceMaskMat, faceRectInMask);maskData.faceRect = faceRectInMask;maskData.landmarkPoints = faceLandmarkPointsInMask;}else{faceRectInMask = maskData.faceRect;faceLandmarkPointsInMask = maskData.landmarkPoints;}if (faceRectInMask.width == 0 && faceRectInMask.height == 0){RemoveFaceMask();Debug.LogError("A face could not be detected from the input image.");} }/// <summary>/// Raises the scan face mask button click event./// </summary>public void OnScanFaceMaskButtonClick(){RemoveFaceMask();// Capture webcam frame.if (webCamTextureToMatHelper.IsPlaying()){Mat rgbaMat = webCamTextureToMatHelper.GetMat();faceRectInMask = DetectFace(rgbaMat);if (faceRectInMask.width == 0 && faceRectInMask.height == 0){Debug.Log("A face could not be detected from the input image.");return;}Rect rect = new Rect((int)faceRectInMask.x, (int)faceRectInMask.y, (int)faceRectInMask.width, (int)faceRectInMask.height);rect.inflate(rect.x / 5, rect.y / 5);rect = rect.intersect(new Rect(0, 0, rgbaMat.width(), rgbaMat.height()));faceMaskTexture = new Texture2D(rect.width, rect.height, TextureFormat.RGBA32, false);faceMaskMat = new Mat(rgbaMat, rect).clone();OpenCVForUnity.UnityUtils.Utils.matToTexture2D(faceMaskMat, faceMaskTexture);Debug.Log("faceMaskMat ToString " + faceMaskMat.ToString());faceRectInMask = DetectFace(faceMaskMat);faceLandmarkPointsInMask = DetectFaceLandmarkPoints(faceMaskMat, faceRectInMask);if (faceRectInMask.width == 0 && faceRectInMask.height == 0){RemoveFaceMask();Debug.Log("A face could not be detected from the input image.");}}}/// <summary>/// Raises the remove face mask button click event./// </summary>public void OnRemoveFaceMaskButtonClick(){RemoveFaceMask();}private void RemoveFaceMask(){faceMaskTexture = null;if (faceMaskMat != null){faceMaskMat.Dispose();faceMaskMat = null;}rectangleTracker.Reset();meshOverlay.Reset();}private UnityEngine.Rect DetectFace(Mat mat){if (useDlibFaceDetecter){OpenCVForUnityUtils.SetImage(faceLandmarkDetector, mat);List<UnityEngine.Rect> result = faceLandmarkDetector.Detect();if (result.Count >= 1)return 
result[0];}else{using (Mat grayMat = new Mat())using (Mat equalizeHistMat = new Mat())using (MatOfRect faces = new MatOfRect()){// convert image to greyscale.Imgproc.cvtColor(mat, grayMat, Imgproc.COLOR_RGBA2GRAY);Imgproc.equalizeHist(grayMat, equalizeHistMat);cascade.detectMultiScale(equalizeHistMat, faces, 1.1f, 2, 0 | Objdetect.CASCADE_SCALE_IMAGE, new Size(equalizeHistMat.cols() * 0.15, equalizeHistMat.cols() * 0.15), new Size());List<Rect> faceList = faces.toList();if (faceList.Count >= 1){UnityEngine.Rect r = new UnityEngine.Rect(faceList[0].x, faceList[0].y, faceList[0].width, faceList[0].height);// corrects the deviation of a detection result between OpenCV and Dlib.r.y += (int)(r.height * 0.1f);return r;}}}return new UnityEngine.Rect();}private List<Vector2> DetectFaceLandmarkPoints(Mat mat, UnityEngine.Rect rect){OpenCVForUnityUtils.SetImage(faceLandmarkDetector, mat);List<Vector2> points = faceLandmarkDetector.DetectLandmark(rect);return points;}
}
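For readers who find the wall of code above hard to scan, the capture step that fires on the Space key boils down to the sketch below. It restates the same branch of Update() with the same assumptions as the full script: a TestMask prefab in Resources carrying a FaceMaskData component, a scene object named FaceMaskData to parent the instance under, and a target photo stored in Resources under the name the author uses.

```csharp
// Condensed restatement of the Space-key branch in Update() above (not a drop-in replacement).
if (Input.GetKeyDown(KeyCode.Space))
{
    // Pause first, otherwise the stored landmarks may not match the frame shown on the Quad.
    webCamTextureToMatHelper.Pause();

    if (item == null)
    {
        // Instantiate the prepared prefab and fill its FaceMaskData with the current frame.
        GameObject obj = Instantiate(Resources.Load<GameObject>("TestMask"),
                                     GameObject.Find("FaceMaskData").transform);
        FaceMaskData data = obj.GetComponent<FaceMaskData>();
        data.image = gameObject.GetComponent<Renderer>().material.mainTexture as Texture2D; // paused webcam frame
        data.faceRect = facemaskrect;    // face rect cached from the last tracked frame
        data.landmarkPoints = landmasks; // 68 landmark points cached from the last tracked frame

        // Register the new mask and apply it to the target photo ("头-黑底" is the author's Resources texture).
        faceMaskDatas.Add(data);
        textureExample.faceMaskDatas = faceMaskDatas;
        textureExample.Run(Resources.Load("头-黑底") as Texture2D);
    }
}
```

Next comes the second script, TextureExample, which applies the captured mask to a still photo: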
using System;
using System.Collections;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using UnityEngine;
using UnityEngine.SceneManagement;
using DlibFaceLandmarkDetector;
using OpenCVForUnity.ObjdetectModule;
using OpenCVForUnity.CoreModule;
using OpenCVForUnity.ImgprocModule;
using Rect = OpenCVForUnity.CoreModule.Rect;
using FaceMaskExample;
using OpenCVForUnity.RectangleTrack;

[RequireComponent(typeof(TrackedMeshOverlay))]
public class TextureExample : MonoBehaviour
{[HeaderAttribute("FaceMaskData")]/// <summary>/// The face mask data list./// </summary>public List<FaceMaskData> faceMaskDatas;/// <summary>/// Determines if use dlib face detector./// </summary>public bool useDlibFaceDetecter = true;/// <summary>/// Determines if enables color correction./// </summary>public bool enableColorCorrection = true;/// <summary>/// Determines if filters non frontal faces./// </summary>public bool filterNonFrontalFaces = false;/// <summary>/// The frontal face rate lower limit./// </summary>[Range(0.0f, 1.0f)]public float frontalFaceRateLowerLimit = 0.85f;/// <summary>/// Determines if displays face rects./// </summary>public bool displayFaceRects = false;/// <summary>/// Determines if displays debug face points./// </summary>public bool displayDebugFacePoints = false;/// <summary>/// The cascade./// </summary>CascadeClassifier cascade;/// <summary>/// The face landmark detector./// </summary>FaceLandmarkDetector faceLandmarkDetector;/// <summary>/// The face mask color corrector./// </summary>FaceMaskColorCorrector faceMaskColorCorrector;/// <summary>/// The mesh overlay./// </summary>TrackedMeshOverlay meshOverlay;/// <summary>/// The haarcascade_frontalface_alt_xml_filepath./// </summary>string haarcascade_frontalface_alt_xml_filepath;/// <summary>/// The sp_human_face_68_dat_filepath./// </summary>string sp_human_face_68_dat_filepath;#if UNITY_WEBGL && !UNITY_EDITORIEnumerator getFilePath_Coroutine;
#endifpublic List<Vector2> tempfacepoint;// Use this for initializationvoid Start(){#if UNITY_WEBGL && !UNITY_EDITORgetFilePath_Coroutine = GetFilePath ();StartCoroutine (getFilePath_Coroutine);
#elsehaarcascade_frontalface_alt_xml_filepath = OpenCVForUnity.UnityUtils.Utils.getFilePath("haarcascade_frontalface_alt.xml");sp_human_face_68_dat_filepath = DlibFaceLandmarkDetector.UnityUtils.Utils.getFilePath("sp_human_face_68.dat");//Run();
#endif}#if UNITY_WEBGL && !UNITY_EDITORprivate IEnumerator GetFilePath (){var getFilePathAsync_0_Coroutine = OpenCVForUnity.UnityUtils.Utils.getFilePathAsync ("haarcascade_frontalface_alt.xml", (result) => {haarcascade_frontalface_alt_xml_filepath = result;});yield return getFilePathAsync_0_Coroutine;var getFilePathAsync_1_Coroutine = DlibFaceLandmarkDetector.UnityUtils.Utils.getFilePathAsync ("sp_human_face_68.dat", (result) => {sp_human_face_68_dat_filepath = result;});yield return getFilePathAsync_1_Coroutine;getFilePath_Coroutine = null;//Run ();}
#endifRectangleTracker rectangleTracker;public void Run(Texture2D imgTexture){rectangleTracker = new RectangleTracker();meshOverlay = this.GetComponent<TrackedMeshOverlay>();  if (imgTexture == null)imgTexture = Config.GetTexture2d(Application.streamingAssetsPath + "/mPhoto.png");//Resources.Load ("family") as Texture2D;gameObject.transform.localScale = new Vector3(imgTexture.width, imgTexture.height, 1);Debug.Log("Screen.width " + Screen.width + " Screen.height " + Screen.height + " Screen.orientation " + Screen.orientation);meshOverlay.UpdateOverlayTransform(gameObject.transform);meshOverlay.Reset();float width = 0;float height = 0;width = gameObject.transform.localScale.x;height = gameObject.transform.localScale.y; float widthScale = (float)Screen.width / width;float heightScale = (float)Screen.height / height;if (widthScale < heightScale){Camera.main.orthographicSize = (width * (float)Screen.height / (float)Screen.width) / 2;}else{Camera.main.orthographicSize = height / 2;}Mat rgbaMat = new Mat(imgTexture.height, imgTexture.width, CvType.CV_8UC4);OpenCVForUnity.UnityUtils.Utils.texture2DToMat(imgTexture, rgbaMat);Debug.Log("rgbaMat ToString " + rgbaMat.ToString());if (faceLandmarkDetector == null)faceLandmarkDetector = new FaceLandmarkDetector(sp_human_face_68_dat_filepath);faceMaskColorCorrector = faceMaskColorCorrector ?? new FaceMaskColorCorrector();FrontalFaceChecker frontalFaceChecker = new FrontalFaceChecker(width, height);// detect faces.List<Rect> detectResult = new List<Rect>();if (useDlibFaceDetecter){OpenCVForUnityUtils.SetImage(faceLandmarkDetector, rgbaMat);List<UnityEngine.Rect> result = faceLandmarkDetector.Detect();foreach (var unityRect in result){detectResult.Add(new Rect((int)unityRect.x, (int)unityRect.y, (int)unityRect.width, (int)unityRect.height));}}else{if (cascade == null)cascade = new CascadeClassifier(haarcascade_frontalface_alt_xml_filepath);//                if (cascade.empty ()) {//                    Debug.LogError ("cascade file is not loaded.Please copy from “FaceTrackerExample/StreamingAssets/” to “Assets/StreamingAssets/” folder. 
");//                }// convert image to greyscale.Mat gray = new Mat();Imgproc.cvtColor(rgbaMat, gray, Imgproc.COLOR_RGBA2GRAY);MatOfRect faces = new MatOfRect();Imgproc.equalizeHist(gray, gray);cascade.detectMultiScale(gray, faces, 1.1f, 2, 0 | Objdetect.CASCADE_SCALE_IMAGE, new Size(gray.cols() * 0.05, gray.cols() * 0.05), new Size());//Debug.Log ("faces " + faces.dump ());detectResult = faces.toList();// corrects the deviation of a detection result between OpenCV and Dlib.foreach (Rect r in detectResult){r.y += (int)(r.height * 0.1f);}gray.Dispose();}// face tracking.rectangleTracker.UpdateTrackedObjects(detectResult);List<TrackedRect> trackedRects = new List<TrackedRect>();rectangleTracker.GetObjects(trackedRects, true);// detect face landmark points.OpenCVForUnityUtils.SetImage(faceLandmarkDetector, rgbaMat);List<List<Vector2>> landmarkPoints = new List<List<Vector2>>();foreach (var openCVRect in detectResult){UnityEngine.Rect rect = new UnityEngine.Rect(openCVRect.x, openCVRect.y, openCVRect.width, openCVRect.height);Debug.Log("face : " + rect);//OpenCVForUnityUtils.DrawFaceRect(imgMat, rect, new Scalar(255, 0, 0, 255), 2);if (rect.x==534||rect.y==659)tempfacepoint = faceLandmarkDetector.DetectLandmark(rect);List<Vector2> points = faceLandmarkDetector.DetectLandmark(rect);//OpenCVForUnityUtils.DrawFaceLandmark(imgMat, points, new Scalar(0, 255, 0, 255), 2);landmarkPoints.Add(points);}// mask faces.Creatmaskface(landmarkPoints, trackedRects, frontalFaceChecker,imgTexture,rgbaMat);#region CreatMaskFaces获取识别到的脸部特征人脸数量//int[] face_nums = new int[landmarkPoints.Count];排序//for (int i = 0; i < face_nums.Length; i++)//{//    face_nums[i] = i;//}//face_nums = face_nums.OrderBy (i => System.Guid.NewGuid ()).ToArray ();//float imageWidth = meshOverlay.width;//float imageHeight = meshOverlay.height;//float maskImageWidth = imgTexture.width;//float maskImageHeight = imgTexture.height;//TrackedMesh tm;//int aaindex = -1;//for (int i = 0; i < face_nums.Length; i++)//{//    //创建遮罩//    meshOverlay.CreateObject(i, imgTexture);//    tm = meshOverlay.GetObjectById(i);//    Vector3[] vertices = tm.meshFilter.mesh.vertices;//    if (vertices.Length == landmarkPoints[face_nums[i]].Count)//    {//        for (int j = 0; j < vertices.Length; j++)//        {//            vertices[j].x = landmarkPoints[face_nums[i]][j].x / imageWidth - 0.5f;//            vertices[j].y = 0.5f - landmarkPoints[face_nums[i]][j].y / imageHeight;//        }//    }//    Vector2[] uv = tm.meshFilter.mesh.uv;//    if (uv.Length == landmarkPoints[face_nums[0]].Count)//    {//        for (int jj = 0; jj < uv.Length; jj++)//        {//            uv[jj].x = landmarkPoints[face_nums[aaindex]][jj].x / maskImageWidth;//            uv[jj].y = (maskImageHeight - landmarkPoints[face_nums[aaindex]][jj].y) / maskImageHeight;//        }//    }//    meshOverlay.UpdateObject(i, vertices, null, uv);//    if (enableColorCorrection)//    {//        faceMaskColorCorrector.CreateLUTTex(i);//        Texture2D LUTTex = faceMaskColorCorrector.UpdateLUTTex(i, rgbaMat, rgbaMat, landmarkPoints[face_nums[0]], landmarkPoints[face_nums[i]]);//        tm.sharedMaterial.SetTexture("_LUTTex", LUTTex);//        tm.sharedMaterial.SetFloat("_ColorCorrection", 1f);//    }//    else//    {//        tm.sharedMaterial.SetFloat("_ColorCorrection", 0f);//    }//    // filter non frontal faces.//    if (filterNonFrontalFaces && frontalFaceChecker.GetFrontalFaceRate(landmarkPoints[i]) < frontalFaceRateLowerLimit)//    {//        tm.sharedMaterial.SetFloat("_Fade", 1f);//   
 }//    else//    {//        tm.sharedMaterial.SetFloat("_Fade", 0.3f);//    }//} #endregion// draw face rects.//if (displayFaceRects)//{//    int ann = face_nums[0];//    UnityEngine.Rect rect_ann = new UnityEngine.Rect(detectResult[ann].x, detectResult[ann].y, detectResult[ann].width, detectResult[ann].height);//    OpenCVForUnityUtils.DrawFaceRect(rgbaMat, rect_ann, new Scalar(255, 255, 0, 255), 2);//    int bob = 0;//    for (int i = 1; i < face_nums.Length; i++)//    {//        bob = face_nums[i];//        UnityEngine.Rect rect_bob = new UnityEngine.Rect(detectResult[bob].x, detectResult[bob].y, detectResult[bob].width, detectResult[bob].height);//        OpenCVForUnityUtils.DrawFaceRect(rgbaMat, rect_bob, new Scalar(255, 0, 0, 255), 2);//    }//}// draw face points.if (displayDebugFacePoints){for (int i = 0; i < landmarkPoints.Count; i++){OpenCVForUnityUtils.DrawFaceLandmark(rgbaMat, landmarkPoints[i], new Scalar(0, 255, 0, 255), 2);}}Texture2D texture = new Texture2D(rgbaMat.cols(), rgbaMat.rows(), TextureFormat.RGBA32, false);OpenCVForUnity.UnityUtils.Utils.matToTexture2D(rgbaMat, texture);gameObject.transform.GetComponent<Renderer>().material.mainTexture = texture;frontalFaceChecker.Dispose();rgbaMat.Dispose();}/// <summary>/// The face mask texture./// </summary>Texture2D faceMaskTexture;/// <summary>/// The Shader.PropertyToID for "_Fade"./// </summary>int shader_FadeID;/// <summary>/// The Shader.PropertyToID for "_ColorCorrection"./// </summary>int shader_ColorCorrectionID;/// <summary>/// The Shader.PropertyToID for "_LUTTex"./// </summary>int shader_LUTTexID;public void Creatmaskface(List<List<Vector2>> landmarkPoints, List<TrackedRect> trackedRects, FrontalFaceChecker frontalFaceChecker,Texture2D texture, Mat rgbaMat){FaceMaskData maskData = faceMaskDatas[0];//faceMaskDataIndex = (faceMaskDataIndex < faceMaskDatas.Count - 1) ? 
faceMaskDataIndex + 1 : 0;List<Vector2> faceLandmarkPointsInMask = maskData.landmarkPoints;if (maskData == null){Debug.LogError("maskData == null");return;}if (maskData.image == null){Debug.LogError("image == null");return;}if (maskData.landmarkPoints.Count != 68){Debug.LogError("landmarkPoints.Count != 68");return;}faceMaskTexture = maskData.image;Mat faceMaskMat = new Mat(faceMaskTexture.height, faceMaskTexture.width, CvType.CV_8UC4);OpenCVForUnity.UnityUtils.Utils.texture2DToMat(faceMaskTexture, faceMaskMat);#region  创建遮罩face masking.if (faceMaskTexture != null && landmarkPoints.Count >= 1){ // Apply face masking between detected faces and a face mask image.float maskImageWidth = faceMaskTexture.width;float maskImageHeight = faceMaskTexture.height;TrackedRect tr;for (int i = 0; i < trackedRects.Count; i++){tr = trackedRects[i];if (tr.state == TrackedState.NEW){meshOverlay.CreateObject(tr.id, faceMaskTexture);}if (tr.state < TrackedState.DELETED){MaskFace(frontalFaceChecker,meshOverlay, tr, landmarkPoints[i], faceLandmarkPointsInMask, maskImageWidth, maskImageHeight);if (enableColorCorrection){CorrectFaceMaskColor(tr.id, faceMaskMat, rgbaMat, faceLandmarkPointsInMask, landmarkPoints[i]);}}else if (tr.state == TrackedState.DELETED){meshOverlay.DeleteObject(tr.id);}}}else if (landmarkPoints.Count >= 1){ // Apply face masking between detected faces.float maskImageWidth = texture.width;float maskImageHeight = texture.height;TrackedRect tr;for (int i = 0; i < trackedRects.Count; i++){tr = trackedRects[i];if (tr.state == TrackedState.NEW){meshOverlay.CreateObject(tr.id, texture);}if (tr.state < TrackedState.DELETED){MaskFace(frontalFaceChecker, meshOverlay, tr, landmarkPoints[i], landmarkPoints[0], maskImageWidth, maskImageHeight);if (enableColorCorrection){CorrectFaceMaskColor(tr.id, rgbaMat, rgbaMat, landmarkPoints[0], landmarkPoints[i]);}}else if (tr.state == TrackedState.DELETED){meshOverlay.DeleteObject(tr.id);}}}#endregion}private void CorrectFaceMaskColor(int id, Mat src, Mat dst, List<Vector2> src_landmarkPoints, List<Vector2> dst_landmarkPoints){Texture2D LUTTex = faceMaskColorCorrector.UpdateLUTTex(id, src, dst, src_landmarkPoints, dst_landmarkPoints);TrackedMesh tm = meshOverlay.GetObjectById(id);tm.sharedMaterial.SetTexture(shader_LUTTexID, LUTTex);}private void MaskFace(FrontalFaceChecker frontalFaceChecker,TrackedMeshOverlay meshOverlay, TrackedRect tr, List<Vector2> landmarkPoints, List<Vector2> landmarkPointsInMaskImage, float maskImageWidth = 0, float maskImageHeight = 0){shader_LUTTexID = Shader.PropertyToID("_LUTTex");shader_FadeID = Shader.PropertyToID("_Fade");shader_ColorCorrectionID = Shader.PropertyToID("_ColorCorrection");float imageWidth = meshOverlay.width;float imageHeight = meshOverlay.height;if (maskImageWidth == 0)maskImageWidth = imageWidth;if (maskImageHeight == 0)maskImageHeight = imageHeight;TrackedMesh tm = meshOverlay.GetObjectById(tr.id);Vector3[] vertices = tm.meshFilter.mesh.vertices;if (vertices.Length == landmarkPoints.Count){for (int j = 0; j < vertices.Length; j++){vertices[j].x = landmarkPoints[j].x / imageWidth - 0.5f;vertices[j].y = 0.5f - landmarkPoints[j].y / imageHeight;}}Vector2[] uv = tm.meshFilter.mesh.uv;if (uv.Length == landmarkPointsInMaskImage.Count){for (int jj = 0; jj < uv.Length; jj++){uv[jj].x = landmarkPointsInMaskImage[jj].x / maskImageWidth;uv[jj].y = (maskImageHeight - landmarkPointsInMaskImage[jj].y) / maskImageHeight;}}meshOverlay.UpdateObject(tr.id, vertices, null, uv);if (tr.numFramesNotDetected > 
3){tm.sharedMaterial.SetFloat(shader_FadeID, 1f);}else if (tr.numFramesNotDetected > 0 && tr.numFramesNotDetected <= 3){tm.sharedMaterial.SetFloat(shader_FadeID, 0.3f + (0.7f / 4f) * tr.numFramesNotDetected);}else{tm.sharedMaterial.SetFloat(shader_FadeID, 0.3f);}if (enableColorCorrection){tm.sharedMaterial.SetFloat(shader_ColorCorrectionID, 1f);}else{tm.sharedMaterial.SetFloat(shader_ColorCorrectionID, 0f);}// filter non frontal faces.if (filterNonFrontalFaces && frontalFaceChecker.GetFrontalFaceRate(landmarkPoints) < frontalFaceRateLowerLimit){tm.sharedMaterial.SetFloat(shader_FadeID, 1f);}}/// <summary>/// Raises the destroy event./// </summary>void OnDestroy(){if (faceMaskColorCorrector != null)faceMaskColorCorrector.Dispose();if (faceLandmarkDetector != null)faceLandmarkDetector.Dispose();if (cascade != null)cascade.Dispose();#if UNITY_WEBGL && !UNITY_EDITORif (getFilePath_Coroutine != null) {StopCoroutine (getFilePath_Coroutine);((IDisposable)getFilePath_Coroutine).Dispose ();}
#endif}
}
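The actual swap happens in MaskFace(): the 68 landmark points detected on the face in the target photo become the overlay mesh's vertex positions (normalized to the photo), while the UVs are taken from the landmark points of the mask image, so the captured face texture is stretched onto the target face. In isolation, and assuming the mesh's vertex and UV counts both match the 68 landmarks (as the example's TrackedMesh does), the mapping looks roughly like this:

```csharp
// Sketch of the landmark-to-mesh mapping used by MaskFace() above.
// landmarkPoints            : 68 points detected in the target image (pixel coordinates)
// landmarkPointsInMaskImage : 68 points of the face in the mask image (pixel coordinates)
for (int j = 0; j < vertices.Length; j++)
{
    // Vertex positions: target-image pixels mapped into the overlay's [-0.5, 0.5] local space (y flipped).
    vertices[j].x = landmarkPoints[j].x / imageWidth - 0.5f;
    vertices[j].y = 0.5f - landmarkPoints[j].y / imageHeight;

    // UVs: mask-image pixels mapped into [0, 1] texture space (y flipped), so the mask texture follows the target face.
    uv[j].x = landmarkPointsInMaskImage[j].x / maskImageWidth;
    uv[j].y = (maskImageHeight - landmarkPointsInMaskImage[j].y) / maskImageHeight;
}
meshOverlay.UpdateObject(tr.id, vertices, null, uv);
```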

That covers the code logic of the two scripts. Next, the relevant event methods need to be bound to the components manually:
![](https://img-blog.csdnimg.cn/2226064ba29b449caa36b7612ab86d0d.png)

These bindings can also be found ready-made in the prefab under the FaceMaskExample/FaceMaskPrefab folder;
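If you prefer binding these in code rather than in the inspector, WebCamTextureToMatHelper exposes the same hooks as UnityEvents in the OpenCVForUnity examples. The field names below (onInitialized, onDisposed, onErrorOccurred) are an assumption based on recent plugin versions, so treat this as a version-dependent sketch and check the component in the inspector if it does not compile.

```csharp
// Hedged sketch: binding the helper callbacks from code instead of the inspector.
// The UnityEvent field names may differ between OpenCVForUnity versions.
using UnityEngine;
using OpenCVForUnity.UnityUtils.Helper;

public class WebCamMaskEventBinder : MonoBehaviour
{
    void Awake()
    {
        var helper = GetComponent<WebCamTextureToMatHelper>();
        var mask = GetComponent<WebCamMask>();

        helper.onInitialized.AddListener(mask.OnWebCamTextureToMatHelperInitialized);
        helper.onDisposed.AddListener(mask.OnWebCamTextureToMatHelperDisposed);
        helper.onErrorOccurred.AddListener(mask.OnWebCamTextureToMatHelperErrorOccurred);
    }
}
```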


The Config file is just a few static helper methods. Some readers found it intimidating, but a quick search turns up plenty of examples like it; they are simply utilities for saving an image, or for converting a local image file into a texture or Sprite.

using System.IO;
using UnityEngine;

public class Config
{
    public static string _Path = string.Format("{0}/Facefolder/{1}.png", Application.streamingAssetsPath, "mPhoto");

    /// <summary>
    /// Returns the image at the given path as a texture.
    /// </summary>
    /// <param name="_strImagePath">image path</param>
    /// <returns>the loaded texture</returns>
    public static Texture2D GetTexture2d(string _strImagePath)
    {
        Texture2D tx = new Texture2D(100, 100);
        tx.LoadImage(GetImageByte(_strImagePath));
        // Sprite sp = Sprite.Create(tx, new Rect(0, 0, tx.width, tx.height), Vector2.zero);
        return tx;
    }

    /// <summary>
    /// Reads the image file at the given path and returns its raw bytes.
    /// </summary>
    /// <param name="imagePath">image path</param>
    /// <returns>the file's bytes</returns>
    public static byte[] GetImageByte(string imagePath)
    {
        FileStream files = new FileStream(imagePath, FileMode.Open);
        byte[] imgByte = new byte[files.Length];
        files.Read(imgByte, 0, imgByte.Length);
        files.Close();
        return imgByte;
    }
}
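Usage is straightforward. The call below is the same fallback TextureExample uses when Run() receives a null texture; note that it loads mPhoto.png directly from StreamingAssets, whereas the _Path constant above points at a Facefolder subfolder, so place the file to match whichever path you actually use.

```csharp
// Minimal usage sketch of the Config helper.
Texture2D photo = Config.GetTexture2d(Application.streamingAssetsPath + "/mPhoto.png");
GetComponent<Renderer>().material.mainTexture = photo;
```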

That's it. Prepare a template image of your own; name it whatever you like, or rename it to mPhoto.png if you want to match my setup. Put it somewhere the code can load it from, and then take a look at the result.

The result is less convincing on faces with less pronounced features, while the built-in sample face (a Westerner's) comes out great; either way, it does not affect the logic or the implementation. That's the end of this article. I hope it helps more people who, like me, are exploring their way along the technical road. If you liked it, please like, follow, and bookmark. Thanks!
