Real-Time Multi-Face Recognition in C/S Mode with the ArcSoft SDK (C#)

I. Preface
Since ArcSoft released its face SDK, its free-to-use licensing model has substantially lowered the cost for small and medium-sized businesses to adopt face recognition. For .NET developers, however, ArcSoft does not ship a C# SDK that can be called directly (why does Java get one?!); instead it suggests wrapping the C++ version yourself. Veteran C-family programmers usually started out with C, but for younger developers pointer manipulation can be daunting, which raises the barrier to using the ArcSoft SDK considerably. This is especially true in multi-person scenarios, where pointer arithmetic is needed to walk through all of the detected face data. This article builds a client/server multi-face real-time recognition demo on .NET 5, in the hope that it serves as a useful reference for .NET developers working with the ArcSoft SDK.

II. Project Structure
  1. Development environment: .NET 5 (GA) or .NET Core 3.1 (for both the client and the server!)
  2. Client (WPF)

    w1.png
  3. Server (gRPC)

    S1.png
  4. Development tools / platform
    VS2019 / Win10

III. Project Dependencies (NuGet)

  1. Client

    w2.png
  2. Server

    s2.png

IV. Main Workflow
Step 1. The client captures frames from the camera and extracts the face features of every face in the current frame.
Step 2. The client packs the face features into the gRPC request stream and sends them to the server.
Step 3. The server reads the face features from the request stream one by one and matches each against the registered face library.
Step 4. The server writes the match results into the response stream.
Step 5. The client reads the response stream and displays each result.
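The five steps above map onto a single bidirectional-streaming gRPC call. The article does not show the .proto contract, but based on the client and server code in section V it looks roughly like the sketch below (the service name is an assumption; the method, message and field names are taken from the code):

syntax = "proto3";

// Assumed service name; adjust to the actual .proto in the repository.
service FaceRecognition {
  // The client streams face features; the server streams back one match result per feature.
  rpc RecongnizationByFace (stream FaceRequest) returns (stream FaceReply);
}

message FaceRequest {
  bytes FaceFeature = 1;        // 1032-byte ArcSoft face feature
}

message FaceReply {
  string PersonName = 1;        // matched person, or an error / "no match" message
  float ConfidenceLevel = 2;    // similarity returned by ASFFaceFeatureCompare
}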

V. Core Code Walkthrough

  1. Wrapping the C++ DLLs
    It is recommended to wrap the ArcSoft DLLs in a single .NET (Core) class library so that both the client and the server can reference it. The library below consists of three files: the raw P/Invoke declarations (Arcsoft_Face_3_0), an engine-pool helper built on top of them (Arcsoft_Face_Action), and an instance-based variant (FaceAction).
    1.png
using System;
using System.Runtime.InteropServices;

namespace ArcSoft
{
    public class Arcsoft_Face_3_0
    {
        public const string Dll_PATH = "libarcsoft_face_engine.dll";

        /// <summary>
        /// 获取激活文件信息。
        /// </summary>
        /// <param name="activeFileInfo">激活文件信息</param>
        /// <returns></returns>
        [DllImport(Dll_PATH, CallingConvention = CallingConvention.Cdecl)]
        public static extern int ASFGetActiveFileInfo(IntPtr activeFileInfo);

        /// <summary>
        /// 用于在线激活SDK。
        /// </summary>
        /// <param name="appId">官网获取的APPID</param>
        /// <param name="sdkKey">官网获取的SDKKEY</param>
        /// <returns></returns>
        [DllImport(Dll_PATH, CallingConvention = CallingConvention.Cdecl)]
        public static extern int ASFOnlineActivation(string appId, string sdkKey);

        /// <summary>
        /// 激活人脸识别SDK引擎函数,ASFActivation 接口与ASFOnlineActivation 功能一致,用于兼容老用户。
        /// </summary>
        /// <param name="appId">SDK对应的AppID</param>
        /// <param name="sdkKey">SDK对应的SDKKey</param>
        /// <returns>调用结果</returns>
        [DllImport(Dll_PATH, CallingConvention = CallingConvention.Cdecl)]
        public static extern int ASFActivation(string appId, string sdkKey);

        /// <summary>
        /// 初始化引擎
        /// </summary>
        /// <param name="detectMode">AF_DETECT_MODE_VIDEO 视频模式 | AF_DETECT_MODE_IMAGE 图片模式</param>
        /// <param name="detectFaceOrientPriority">检测脸部的角度优先值,推荐:ASF_OrientPriority.ASF_OP_0_HIGHER_EXT</param>
        /// <param name="detectFaceScaleVal">用于数值化表示的最小人脸尺寸</param>
        /// <param name="detectFaceMaxNum">最大需要检测的人脸个数</param>
        /// <param name="combinedMask">用户选择需要检测的功能组合,可单个或多个</param>
        /// <param name="hEngine">初始化返回的引擎handle</param>
        /// <returns>调用结果</returns>
        [DllImport(Dll_PATH, CallingConvention = CallingConvention.Cdecl)]
        public static extern int ASFInitEngine(uint detectMode, int detectFaceOrientPriority, int detectFaceScaleVal, int detectFaceMaxNum, int combinedMask, ref IntPtr hEngine);

        /// <summary>
        /// 人脸检测
        /// </summary>
        /// <param name="hEngine">引擎handle</param>
        /// <param name="width">图像宽度</param>
        /// <param name="height">图像高度</param>
        /// <param name="format">图像颜色空间</param>
        /// <param name="imgData">图像数据</param>
        /// <param name="detectedFaces">人脸检测结果</param>
        /// <param name="detectModel">预留字段,当前版本使用默认参数即可</param>
        /// <returns>调用结果</returns>
        [DllImport(Dll_PATH, CallingConvention = CallingConvention.Cdecl)]
        public static extern int ASFDetectFaces(IntPtr hEngine, int width, int height, int format, IntPtr imgData, IntPtr detectedFaces, int detectModel);

        /// <summary>
        /// 检测人脸信息。
        /// </summary>
        /// <param name="hEngine">引擎句柄</param>
        /// <param name="ImgData">图像数据</param>
        /// <param name="detectedFaces">检测到的人脸信息</param>
        /// <param name="detectModel">预留字段,当前版本使用默认参数即可</param>
        /// <returns>人脸信息</returns>
        [DllImport(Dll_PATH, CallingConvention = CallingConvention.Cdecl)]
        public static extern int ASFDetectFacesEx(IntPtr hEngine, IntPtr ImgData, out IntPtr detectedFaces, int detectModel);

        /// <summary>
        /// 单人脸特征提取
        /// </summary>
        /// <param name="hEngine">引擎handle</param>
        /// <param name="width">图像宽度,为4的倍数</param>
        /// <param name="height">图像高度,YUYV/I420/NV21/NV12格式为2的倍数;BGR24/GRAY/DEPTH_U16格式无限制</param>
        /// <param name="format">图像颜色空间</param>
        /// <param name="imgData">图像数据</param>
        /// <param name="faceInfo">单人脸信息(人脸框、人脸角度)</param>
        /// <param name="faceFeature">提取到的人脸特征信息</param>
        /// <returns>人脸特征信息</returns>
        [DllImport(Dll_PATH, CallingConvention = CallingConvention.Cdecl)]
        public static extern int ASFFaceFeatureExtract(IntPtr hEngine, int width, int height, int format, IntPtr imgData, IntPtr faceInfo, IntPtr faceFeature);

        /// <summary>
        /// 单人特征提取。
        /// </summary>
        /// <param name="hEngine">引擎句柄</param>
        /// <param name="imgData">图像数据</param>
        /// <param name="faceInfo">单人脸信息(人脸框、人脸角度)</param>
        /// <param name="feature">提取到的人脸特征信息</param>
        /// <returns></returns>
        [DllImport(Dll_PATH, CallingConvention = CallingConvention.Cdecl)]
        public static extern int ASFFaceFeatureExtractEx(IntPtr hEngine, IntPtr imgData, IntPtr faceInfo, IntPtr feature);

        /// <summary>
        /// 人脸特征比对,输出比对相似度。
        /// </summary>
        /// <param name="hEngine">引擎句柄</param>
        /// <param name="feature1">人脸特征</param>
        /// <param name="feature2">人脸特征</param>
        /// <param name="confidenceLevel">比对相似度</param>
        /// <param name="compareModel">选择人脸特征比对模型,默认为ASF_LIFE_PHOTO。
        /// 1. ASF_LIFE_PHOTO:用于生活照之间的特征比对,推荐阈值0.80;
        /// 2. ASF_ID_PHOTO:用于证件照或证件照和生活照之间的特征比对,推荐阈值0.82;</param>
        /// <returns>比对相似度</returns>
        [DllImport(Dll_PATH, CallingConvention = CallingConvention.Cdecl)]
        public static extern int ASFFaceFeatureCompare(IntPtr hEngine, IntPtr feature1, IntPtr feature2, ref float confidenceLevel, int compareModel);

        /// <summary>
        /// 设置RGB/IR活体阈值,若不设置内部默认RGB:0.5 IR:0.7。
        /// </summary>
        /// <param name="hEngine">引擎句柄</param>
        /// <param name="threshold">活体阈值,推荐RGB:0.5 IR:0.7</param>
        /// <returns>设置状态</returns>
        [DllImport(Dll_PATH, CallingConvention = CallingConvention.Cdecl)]
        public static extern int ASFSetLivenessParam(IntPtr hEngine, IntPtr threshold);

        /// <summary>
        /// 人脸属性检测
        /// </summary>
        /// <param name="hEngine">引擎句柄</param>
        /// <param name="width">图片宽度,为4的倍数</param>
        /// <param name="height">图片高度,YUYV/I420/NV21/NV12格式为2的倍数;BGR24格式无限制;</param>
        /// <param name="format">支持YUYV/I420/NV21/NV12/BGR24</param>
        /// <param name="imgData">图像数据</param>
        /// <param name="detectedFaces">多人脸信息</param>
        /// <param name="combinedMask">1.检测的属性(ASF_AGE、ASF_GENDER、 ASF_FACE3DANGLE、ASF_LIVENESS),支持多选
        /// 2.检测的属性须在引擎初始化接口的combinedMask参数中启用</param>
        /// <returns>检测状态</returns>
        [DllImport(Dll_PATH, CallingConvention = CallingConvention.Cdecl)]
        public static extern int ASFProcess(IntPtr hEngine, int width, int height, int format, IntPtr imgData, IntPtr detectedFaces, int combinedMask);

        /// <summary>
        /// 人脸信息检测(年龄/性别/人脸3D角度),最多支持4张人脸信息检测,超过部分返回未知(活体仅支持单张人脸检测,超出返回未知),接口不支持IR图像检测。
        /// </summary>
        /// <param name="hEngine">引擎句柄</param>
        /// <param name="imgData">图像数据</param>
        /// <param name="detectedFaces">多人脸信息</param>
        /// <param name="combinedMask">1.检测的属性(ASF_AGE、ASF_GENDER、 ASF_FACE3DANGLE、ASF_LIVENESS),支持多选
        /// 2.检测的属性须在引擎初始化接口的combinedMask参数中启用</param>
        /// <returns>检测状态</returns>
        [DllImport(Dll_PATH, CallingConvention = CallingConvention.Cdecl)]
        public static extern int ASFProcessEx(IntPtr hEngine, IntPtr imgData, IntPtr detectedFaces, int combinedMask);

        /// <summary>
        /// 获取年龄信息
        /// </summary>
        /// <param name="hEngine">引擎handle</param>
        /// <param name="ageInfo">检测到的年龄信息</param>
        /// <returns>调用结果</returns>
        [DllImport(Dll_PATH, CallingConvention = CallingConvention.Cdecl)]
        public static extern int ASFGetAge(IntPtr hEngine, IntPtr ageInfo);

        /// <summary>
        /// 获取性别信息
        /// </summary>
        /// <param name="hEngine">引擎handle</param>
        /// <param name="genderInfo">检测到的性别信息</param>
        /// <returns>调用结果</returns>
        [DllImport(Dll_PATH, CallingConvention = CallingConvention.Cdecl)]
        public static extern int ASFGetGender(IntPtr hEngine, IntPtr genderInfo);

        /// <summary>
        /// 获取3D角度信息
        /// </summary>
        /// <param name="hEngine">引擎handle</param>
        /// <param name="p3DAngleInfo">检测到脸部3D角度信息</param>
        /// <returns>调用结果</returns>
        [DllImport(Dll_PATH, CallingConvention = CallingConvention.Cdecl)]
        public static extern int ASFGetFace3DAngle(IntPtr hEngine, IntPtr p3DAngleInfo);

        /// <summary>
        /// 获取RGB活体信息。
        /// </summary>
        /// <param name="hEngine">引擎句柄</param>
        /// <param name="livenessInfo">检测到的活体信息</param>
        /// <returns>调用结果</returns>
        [DllImport(Dll_PATH, CallingConvention = CallingConvention.Cdecl)]
        public static extern int ASFGetLivenessScore(IntPtr hEngine, IntPtr livenessInfo);

        /// <summary>
        /// 该接口仅支持单人脸IR 活体检测,超出返回未知。
        /// </summary>
        /// <param name="hEngine">引擎句柄</param>
        /// <param name="width">图片宽度,为4的倍数</param>
        /// <param name="height">图片高度</param>
        /// <param name="format">图像颜色格式</param>
        /// <param name="imgData">图像数据</param>
        /// <param name="detectedFaces">多人脸信息</param>
        /// <param name="combinedMask">目前仅支持ASF_IR_LIVENESS</param>
        /// <returns>调用结果</returns>
        [DllImport(Dll_PATH, CallingConvention = CallingConvention.Cdecl)]
        public static extern int ASFProcess_IR(IntPtr hEngine, int width, int height, int format, IntPtr imgData, IntPtr detectedFaces, int combinedMask);

        /// <summary>
        /// 该接口仅支持单人脸IR 活体检测,超出返回未知。
        /// </summary>
        /// <param name="hEngine">引擎句柄</param>
        /// <param name="imgData">图像数据</param>
        /// <param name="detectedFaces">多人脸信息</param>
        /// <param name="combinedMask">目前仅支持ASF_IR_LIVENESS</param>
        /// <returns>调用结果</returns>
        [DllImport(Dll_PATH, CallingConvention = CallingConvention.Cdecl)]
        public static extern int ASFProcessEx_IR(IntPtr hEngine, IntPtr imgData, IntPtr detectedFaces, int combinedMask);

        /// <summary>
        /// 获取IR活体信息。
        /// </summary>
        /// <param name="hEngine">引擎句柄</param>
        /// <param name="livenessInfo">检测到的IR活体信息</param>
        /// <returns></returns>
        [DllImport(Dll_PATH, CallingConvention = CallingConvention.Cdecl)]
        public static extern int ASFGetLivenessScore_IR(IntPtr hEngine, IntPtr livenessInfo);

        /// <summary>
        /// 获取SDK版本信息。
        /// </summary>
        /// <returns>成功返回版本信息,失败返回Null。</returns>
        [DllImport(Dll_PATH, CallingConvention = CallingConvention.Cdecl)]
        public static extern ASF_VERSION ASFGetVersion();

        /// <summary>
        /// 销毁SDK引擎。
        /// </summary>
        /// <param name="pEngine">引擎handle</param>
        /// <returns>调用结果</returns>
        [DllImport(Dll_PATH, CallingConvention = CallingConvention.Cdecl)]
        public static extern int ASFUninitEngine(IntPtr pEngine);
    }

    /////////////////////////////////参数枚举/////////////////////////////////
    /// <summary>
    /// 检测模式
    /// </summary>
    public struct ASF_DetectMode
    {
        /// <summary>
        /// Video模式,一般用于多帧连续检测
        /// </summary>
        public const uint ASF_DETECT_MODE_VIDEO = 0x00000000;

        /// <summary>
        /// Image模式,一般用于静态图的单次检测
        /// </summary>
        public const uint ASF_DETECT_MODE_IMAGE = 0xFFFFFFFF;
    }

    /// <summary>
    /// 人脸检测方向
    /// </summary>
    public struct ArcSoftFace_OrientPriority
    {
        /// <summary>
        /// 常规预览下正方向
        /// </summary>
        public const int ASF_OP_0_ONLY = 0x1;

        /// <summary>
        /// 基于0°逆时针旋转90°的方向
        /// </summary>
        public const int ASF_OP_90_ONLY = 0x2;

        /// <summary>
        /// 基于0°逆时针旋转270°的方向
        /// </summary>
        public const int ASF_OP_270_ONLY = 0x3;

        /// <summary>
        /// 基于0°旋转180°的方向(逆时针、顺时针效果一样)
        /// </summary>
        public const int ASF_OP_180_ONLY = 0x4;

        /// <summary>
        /// 全角度
        /// </summary>
        public const int ASF_OP_0_HIGHER_EXT = 0x5;
    }

    /// <summary>
    /// 检测到的人脸角度
    /// </summary>
    public struct ArcSoftFace_OrientCode
    {
        public const int ASF_OC_0 = 0x1; // 0度
        public const int ASF_OC_90 = 0x2; // 90度
        public const int ASF_OC_270 = 0x3; // 270度
        public const int ASF_OC_180 = 0x4; // 180度
        public const int ASF_OC_30 = 0x5; // 30度
        public const int ASF_OC_60 = 0x6; // 60度
        public const int ASF_OC_120 = 0x7; // 120度
        public const int ASF_OC_150 = 0x8; // 150度
        public const int ASF_OC_210 = 0x9; // 210度
        public const int ASF_OC_240 = 0xa; // 240度
        public const int ASF_OC_300 = 0xb; // 300度
        public const int ASF_OC_330 = 0xc; // 330度
    }

    /// <summary>
    /// 检测模型
    /// </summary>
    public struct ASF_DetectModel
    {
        public const int ASF_DETECT_MODEL_RGB = 0x1; //RGB图像检测模型
        //预留扩展其他检测模型
    }

    /// <summary>
    /// 人脸比对可选的模型
    /// </summary>
    public struct ASF_CompareModel
    {
        public const int ASF_LIFE_PHOTO = 0x1;  //用于生活照之间的特征比对,推荐阈值0.80
        public const int ASF_ID_PHOTO = 0x2;    //用于证件照或生活照与证件照之间的特征比对,推荐阈值0.82
    }

    /// <summary>
    /// 支持的颜色空间颜色格式
    /// </summary>
    public struct ASF_ImagePixelFormat
    {
        //8-bit Y 通道,8-bit 2x2 采样 V 与 U 分量交织通道
        public const int ASVL_PAF_NV21 = 2050;
        //8-bit Y 通道,8-bit 2x2 采样 U 与 V 分量交织通道
        public const int ASVL_PAF_NV12 = 2049;
        //RGB 分量交织,按 B, G, R, B 字节序排布
        public const int ASVL_PAF_RGB24_B8G8R8 = 513;
        //8-bit Y 通道, 8-bit 2x2 采样 U 通道, 8-bit 2x2 采样 V 通道
        public const int ASVL_PAF_I420 = 1537;
        //YUV 分量交织, V 与 U 分量 2x1 采样,按 Y0, U0, Y1, V0 字节序排布
        public const int ASVL_PAF_YUYV = 1289;
        //8-bit IR图像
        public const int ASVL_PAF_GRAY = 1793;
        //16-bit IR图像,ASVL_PAF_DEPTH_U16 只是预留。
        public const int ASVL_PAF_DEPTH_U16 = 3074;
    }

    /// <summary>
    /// 算法功能常量值
    /// </summary>
    public struct FaceEngineMask
    {
        //人脸检测
        public const int ASF_FACE_DETECT = 0x00000001;
        //人脸特征
        public const int ASF_FACERECOGNITION = 0x00000004;
        //年龄
        public const int ASF_AGE = 0x00000008;
        //性别
        public const int ASF_GENDER = 0x00000010;
        //3D角度
        public const int ASF_FACE3DANGLE = 0x00000020;
        //RGB活体
        public const int ASF_LIVENESS = 0x00000080;
        //IR活体
        public const int ASF_IR_LIVENESS = 0x00000400;
    }

    /////////////////////////////////数据结构/////////////////////////////////
    /// <summary>
    /// SDK版本信息。
    /// </summary>
    [StructLayout(LayoutKind.Sequential, CharSet = CharSet.Ansi)]
    public struct ASF_VERSION
    {
        //版本号
        public IntPtr Version;
        //构建日期
        public IntPtr BuildDate;
        //版权说明
        public IntPtr CopyRight;
    }

    /// <summary>
    /// 激活文件信息。
    /// </summary>
    [StructLayout(LayoutKind.Sequential, CharSet = CharSet.Ansi)]
    public struct ASF_ActiveFileInfo
    {
        /// <summary>
        /// 开始时间
        /// </summary>
        public IntPtr startTime;

        /// <summary>
        /// 截止时间
        /// </summary>
        public IntPtr endTime;

        /// <summary>
        /// 平台
        /// </summary>
        public IntPtr platform;

        /// <summary>
        /// sdk类型
        /// </summary>
        public IntPtr sdkType;

        /// <summary>
        /// APPID
        /// </summary>
        public IntPtr appId;

        /// <summary>
        /// SDKKEY
        /// </summary>
        public IntPtr sdkKey;

        /// <summary>
        /// SDK版本号
        /// </summary>
        public IntPtr sdkVersion;

        /// <summary>
        /// 激活文件版本号
        /// </summary>
        public IntPtr fileVersion;
    }

    /// <summary>
    /// 人脸框信息。
    /// </summary>
    public struct MRECT
    {
        public int left;
        public int top;
        public int right;
        public int bottom;
    }

    /// <summary>
    /// 单人脸信息。
    /// </summary>
    public struct ASF_SingleFaceInfo
    {
        // 人脸框
        public MRECT faceRect;
        //人脸角度
        public int faceOrient;
    }

    /// <summary>
    /// 多人脸信息。
    /// </summary>
    public struct ASF_MultiFaceInfo
    {
        // 人脸框数组
        public IntPtr faceRects;
        // 人脸角度数组
        public IntPtr faceOrients;
        // 检测到的人脸数
        public int faceNum;
        // 一张人脸从进入画面直到离开画面,faceID不变。在VIDEO模式下有效,IMAGE模式下为空。
        public IntPtr faceID;
    }

    /// <summary>
    /// 人脸特征。
    /// </summary>
    public struct ASF_FaceFeature
    {
        // 人脸特征
        public IntPtr feature;
        // 人脸特征长度
        public int featureSize;
    }

    /// <summary>
    /// 年龄信息。
    /// </summary>
    public struct ASF_AgeInfo
    {
        //0:未知; >0:年龄
        IntPtr ageArray;
        //检测的人脸数
        int num;
    }

    /// <summary>
    /// 性别信息。
    /// </summary>
    public struct ASF_GenderInfo
    {
        //0:男性; 1:女性; -1:未知
        IntPtr genderArray;
        //检测的人脸数
        int num;
    }

    /// <summary>
    /// 3D角度信息。
    /// </summary>
    public struct ASF_Face3DAngle
    {
        //横滚角
        public IntPtr roll;
        //偏航角
        public IntPtr yaw;
        //俯仰角
        public IntPtr pitch;
        //0:正常; 非0:异常
        public IntPtr status;
        //检测的人脸个数
        public IntPtr num;
    }

    /// <summary>
    /// 活体置信度。
    /// </summary>
    public struct ASF_LivenessThreshold
    {
        // BGR活体检测阈值设置,默认值0.5
        float thresholdmodel_BGR;
        // IR活体检测阈值设置,默认值0.7
        float thresholdmodel_IR;
    }

    /// <summary>
    /// 活体信息。
    /// </summary>
    public struct ASF_LivenessInfo
    {
        //0:非真人; 1:真人;-1:不确定; -2:传入人脸数 > 1;-3: 人脸过小;-4: 角度过大;-5: 人脸超出边界
        public IntPtr isLive;
        //检测的人脸个数
        public int num;
    }

    /// <summary>
    /// 图像数据信息。
    /// </summary>
    public struct ASVLOFFSCREEN
    {
        public uint u32PixelArrayFormat;
        public int i32Width;
        public int i32Height;
        [MarshalAs(UnmanagedType.ByValArray, SizeConst = 4, ArraySubType = UnmanagedType.SysUInt)]
        public IntPtr[] ppu8Plane;
        [MarshalAs(UnmanagedType.ByValArray, SizeConst = 4, ArraySubType = UnmanagedType.I4)]
        public int[] pi32Pitch;
    }
}
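With the P/Invoke declarations in place, activating the SDK and creating an engine takes only a few calls. Below is a minimal sketch (APP_ID/SDK_KEY are placeholders for the credentials from the ArcSoft developer portal, the native libarcsoft_face*.dll files must sit next to the executable, and error code 90114 means the SDK is already activated on this machine):

using System;
using ArcSoft;

class EngineBootstrapDemo
{
    static void Main()
    {
        // Placeholder credentials issued on the ArcSoft developer portal.
        int ret = Arcsoft_Face_3_0.ASFOnlineActivation("APP_ID", "SDK_KEY");
        if (ret != 0 && ret != 90114)   // 90114: already activated
            throw new Exception("SDK activation failed, error code: " + ret);

        IntPtr engine = IntPtr.Zero;
        int mask = FaceEngineMask.ASF_FACE_DETECT | FaceEngineMask.ASF_FACERECOGNITION;
        ret = Arcsoft_Face_3_0.ASFInitEngine(
            ASF_DetectMode.ASF_DETECT_MODE_IMAGE,           // single-image mode
            ArcSoftFace_OrientPriority.ASF_OP_0_HIGHER_EXT, // detect faces at any orientation
            32,                                             // detectFaceScaleVal, range [2,32] in IMAGE mode
            10,                                             // detect at most 10 faces
            mask,
            ref engine);
        if (ret != 0)
            throw new Exception("Engine initialization failed, error code: " + ret);

        Console.WriteLine("Engine handle: " + engine);
        Arcsoft_Face_3_0.ASFUninitEngine(engine);
    }
}

The second file in the library, Arcsoft_Face_Action, wraps exactly this activation/initialization sequence and adds a simple engine pool on top of it: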
using ArcSoft.Utilities;
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Drawing;
using System.Drawing.Imaging;
using System.IO;
using System.Runtime.InteropServices;


namespace ArcSoft
{
    public class Arcsoft_Face_Action : Arcsoft_Face_3_0, IEnginePoor
    {
        public string AppID { get; }
        public string AppKey { get; }
        public int FaceEngineNums { get; set; }
        public int IDEngineNums { get; set; }
        public int AIEngineNums { get; set; }
        public ConcurrentQueue<IntPtr> FaceEnginePoor { get; set; }
        public ConcurrentQueue<IntPtr> IDEnginePoor { get; set; }
        public ConcurrentQueue<IntPtr> AIEnginePoor { get; set; }

        public Arcsoft_Face_Action()
        {

        }

        public Arcsoft_Face_Action(string appId, string appKey)
        {
            int retCode = -1;
            try
            {
                retCode = ASFOnlineActivation(appId, appKey);
                if (retCode == 0)
                {
                }
                else if (retCode == 90114)
                {
                }
                else
                {
                    throw new Exception("SDK激活失败,错误码:" + retCode);
                }
                AppID = appId;
                AppKey = appKey;
            }
            catch (Exception ex)
            {
                throw new Exception($"Arcsoft_Face_Action 初始化失败,异常:{ex.Message}");
            }
        }

        public IntPtr InitASFEnginePtr(int faceMask, bool isImageMode = true)
        {
            IntPtr pEngines = IntPtr.Zero;
            int retCode = -1;
            try
            {
                if (isImageMode)
                {
                    retCode = ASFInitEngine(ASF_DetectMode.ASF_DETECT_MODE_IMAGE, ArcSoftFace_OrientPriority.ASF_OP_0_HIGHER_EXT, ParmsBestPractice.detectFaceScaleVal_Image, ParmsBestPractice.detectFaceMaxNum, faceMask, ref pEngines);
                }
                else
                {
                    retCode = ASFInitEngine(ASF_DetectMode.ASF_DETECT_MODE_VIDEO, ArcSoftFace_OrientPriority.ASF_OP_0_HIGHER_EXT, ParmsBestPractice.detectFaceScaleVal_Video, ParmsBestPractice.detectFaceMaxNum, faceMask, ref pEngines);
                }
                if (retCode == 0)
                {
                }
                else
                {
                    throw new Exception("SDK初始化失败,错误码:" + retCode);
                }
                return pEngines;
            }
            catch (Exception ex)
            {
                throw new Exception("ASFFunctions->ASFFunctions, generate exception as: " + ex);
            }
        }

        public static ASF_MultiFaceInfo DetectMultipleFace(IntPtr pEngine, ImageInfo imageInfo)
        {
            ASF_MultiFaceInfo multiFaceInfo = new ASF_MultiFaceInfo();
            IntPtr pMultiFaceInfo = Marshal.AllocHGlobal(Marshal.SizeOf<ASF_MultiFaceInfo>());
            try
            {
                int retCode = ASFDetectFaces(pEngine, imageInfo.width, imageInfo.height, imageInfo.format, imageInfo.imgData, pMultiFaceInfo, ASF_DetectModel.ASF_DETECT_MODEL_RGB);
                multiFaceInfo = Marshal.PtrToStructure<ASF_MultiFaceInfo>(pMultiFaceInfo);
                return multiFaceInfo;
            }
            catch
            {
                return multiFaceInfo;
            }
            finally
            {
                Marshal.FreeHGlobal(pMultiFaceInfo);
            }
        }

        public static List<MarkFaceInfor> DetectMultipleFaceAllInformation(IntPtr pEngine, ImageInfo imageInfo, bool extractFaceData = false)
        {
            List<MarkFaceInfor> infors = new List<MarkFaceInfor>();
            ASF_MultiFaceInfo multiFaceInfo = new ASF_MultiFaceInfo();
            IntPtr pMultiFaceInfo = Marshal.AllocHGlobal(Marshal.SizeOf<ASF_MultiFaceInfo>());
            try
            {
                int retCode = ASFDetectFaces(pEngine, imageInfo.width, imageInfo.height, imageInfo.format, imageInfo.imgData, pMultiFaceInfo, ASF_DetectModel.ASF_DETECT_MODEL_RGB);
                multiFaceInfo = Marshal.PtrToStructure<ASF_MultiFaceInfo>(pMultiFaceInfo);
                for (int faceIndex = 0; faceIndex < multiFaceInfo.faceNum; faceIndex++)
                {
                    ASF_SingleFaceInfo singleFaceInfo = new ASF_SingleFaceInfo();
                    singleFaceInfo.faceRect = Marshal.PtrToStructure<MRECT>(multiFaceInfo.faceRects + Marshal.SizeOf<MRECT>() * faceIndex);
                    singleFaceInfo.faceOrient = Marshal.PtrToStructure<int>(multiFaceInfo.faceOrients + Marshal.SizeOf<int>() * faceIndex);
                    MarkFaceInfor markFaceInfor = new MarkFaceInfor(singleFaceInfo.faceRect.left, singleFaceInfo.faceRect.top, singleFaceInfo.faceRect.right - singleFaceInfo.faceRect.left, singleFaceInfo.faceRect.bottom - singleFaceInfo.faceRect.top);
                    markFaceInfor.faceID = Marshal.PtrToStructure<int>(multiFaceInfo.faceID + Marshal.SizeOf<int>() * faceIndex);
                    if (extractFaceData)
                    {
                        markFaceInfor.faceFeatureData = ExtractSingleFaceFeature(pEngine, imageInfo, singleFaceInfo.faceRect, singleFaceInfo.faceOrient);
                    }
                    infors.Add(markFaceInfor);
                }
                return infors;
            }
            catch (Exception ex)
            {
                throw new Exception($"Arcsoft_Face_Action-->DetectMultipleFaceAllInformation 异常,异常信息:{ex.Message}");
            }
            finally
            {
                Marshal.FreeHGlobal(pMultiFaceInfo);
            }
        }

        public static bool ExtractFeaturesFromMemoryStream(Stream ms, IntPtr engine, out List<byte[]> facesFeature, out string errorString)
        {
            facesFeature = new List<byte[]>();
            errorString = null;
            try
            {
                ImageInfo imageInfo = new ImageInfo();
                ASF_MultiFaceInfo facesInfo = new ASF_MultiFaceInfo();
                imageInfo = ImageHelper.ReadBMPFormStream(ms);
                facesInfo = DetectMultipleFace(engine, imageInfo);
                if (facesInfo.faceNum == 0)
                {
                    errorString = "检测到人脸数量为0,请免冠正对镜头重新识别!";
                    return false;
                }
                if (facesInfo.faceNum > 1)
                {
                    errorString = "检测到多张人脸,请多余人员退出识别区,再重新识别!";
                    return false;
                }
                facesFeature = ExtractAllFeatures(engine, imageInfo, facesInfo);
                return true;
            }
            catch
            {
                errorString = "算法错误,请检查输入后重试!";
                return false;
            }
            finally
            {
                GC.Collect();
            }
        }

        private static byte[] ExtractSingleFaceFeature(IntPtr pEngine, ImageInfo imageInfo, MRECT rect, int faceOrient)
        {
            var singleFaceInfo = new ASF_SingleFaceInfo();
            singleFaceInfo.faceRect = rect;
            singleFaceInfo.faceOrient = faceOrient;
            IntPtr pSingleFaceInfo = Marshal.AllocHGlobal(Marshal.SizeOf<ASF_SingleFaceInfo>());
            Marshal.StructureToPtr(singleFaceInfo, pSingleFaceInfo, false);
            IntPtr pFaceFeature = Marshal.AllocHGlobal(Marshal.SizeOf<ASF_FaceFeature>());
            try
            {
                int retCode = ASFFaceFeatureExtract(pEngine, imageInfo.width, imageInfo.height, imageInfo.format, imageInfo.imgData, pSingleFaceInfo, pFaceFeature);
                if (retCode == 0)
                {
                    ASF_FaceFeature faceFeature = Marshal.PtrToStructure<ASF_FaceFeature>(pFaceFeature);
                    byte[] feature = new byte[faceFeature.featureSize];
                    Marshal.Copy(faceFeature.feature, feature, 0, faceFeature.featureSize);
                    return feature;
                }
                if (retCode == 81925)
                {
                    throw new Exception("人脸特征检测结果置信度低!");
                }
                else
                {
                    return null;
                }
            }
            catch (Exception ex)
            {
                throw new Exception($"Arcsoft_Face_Action-->ExtractSingleFaceFeature exception: {ex.Message}");
            }
            finally
            {
                Marshal.FreeHGlobal(pSingleFaceInfo);
                Marshal.FreeHGlobal(pFaceFeature);
            }
        }

        public static List<byte[]> ExtractAllFeatures(IntPtr pEngine, ImageInfo imageInfo, ASF_MultiFaceInfo multiFaceInfo)
        {
            try
            {
                ASF_SingleFaceInfo singleFaceInfo = new ASF_SingleFaceInfo();
                List<byte[]> results = new List<byte[]>();
                for (int index = 0; index < multiFaceInfo.faceNum; index++)
                {
                    singleFaceInfo.faceRect = Marshal.PtrToStructure<MRECT>(multiFaceInfo.faceRects + Marshal.SizeOf<MRECT>() * index);
                    singleFaceInfo.faceOrient = Marshal.PtrToStructure<int>(multiFaceInfo.faceOrients + Marshal.SizeOf<int>() * index);
                    byte[] singleFaceFeature = ExtractSingleFaceFeature(pEngine, imageInfo, singleFaceInfo.faceRect, singleFaceInfo.faceOrient);
                    if (singleFaceFeature != null)
                    {
                        results.Add(singleFaceFeature);
                    }
                }
                return results;
            }
            catch (Exception ex)
            {
                throw new Exception("Arcsoft_Face_Action-->ExtractAllFeatures exception " + ex);
            }
            finally
            {
                Marshal.FreeHGlobal(imageInfo.imgData);
            }
        }

        public static IntPtr GetBMP_Ptr(Bitmap image, out int width, out int height, out int pitch)
        {
            IntPtr imageDataPtr = IntPtr.Zero;
            try
            {
                width = -1;
                height = -1;
                pitch = -1;
                byte[] imageData = ReadBMP(image, ref width, ref height, ref pitch);
                imageDataPtr = Marshal.AllocHGlobal(imageData.Length);
                Marshal.Copy(imageData, 0, imageDataPtr, imageData.Length);
                return imageDataPtr;
            }
            catch (Exception ex)
            {
                Marshal.FreeHGlobal(imageDataPtr);
                throw new Exception($"Arcsoft_Face_Action-->GetBMP_Ptr exception as:{ex.Message}");
            }
        }

        public static byte[] ReadBMP(Bitmap image, ref int width, ref int height, ref int pitch)
        {
            // Lock the Bitmap into system memory and get its BitmapData
            BitmapData data = image.LockBits(new Rectangle(0, 0, image.Width, image.Height), ImageLockMode.ReadOnly, PixelFormat.Format24bppRgb);
            // Address of the first pixel, i.e. the first scan line of the bitmap
            IntPtr ptr = data.Scan0;
            // Size of the source buffer
            int sourceBitArrayLength = data.Height * Math.Abs(data.Stride);
            byte[] sourceBitArray = new byte[sourceBitArrayLength];
            // Copy the locked bitmap content into sourceBitArray
            Marshal.Copy(ptr, sourceBitArray, 0, sourceBitArrayLength);
            width = data.Width;
            height = data.Height;
            pitch = Math.Abs(data.Stride);
            int line = width * 3;
            int bgr_len = line * height;
            byte[] destBitArray = new byte[bgr_len];
            for (int i = 0; i < height; ++i)
            {
                Array.Copy(sourceBitArray, i * pitch, destBitArray, i * line, line);
            }
            pitch = line;
            image.UnlockBits(data);
            return destBitArray;
        }

        public static ASVLOFFSCREEN ChangeMat2ASVLOFFSCREEN(Bitmap image)
        {
            int width = -1;
            int height = -1;
            int pitch = -1;
            IntPtr imagePtr = GetBMP_Ptr(image, out width, out height, out pitch);
            ASVLOFFSCREEN offInput = new ASVLOFFSCREEN();
            offInput.u32PixelArrayFormat = 513;
            offInput.ppu8Plane = new IntPtr[4];
            offInput.ppu8Plane[0] = imagePtr;
            offInput.i32Width = width;
            offInput.i32Height = height;
            offInput.pi32Pitch = new int[4];
            offInput.pi32Pitch[0] = pitch;
            return offInput;
        }

        public static IntPtr PutFeatureByteIntoFeatureIntPtr(byte[] data)
        {
            try
            {
                if (data.Length > 0)
                {
                    ASF_FaceFeature localFeature = new ASF_FaceFeature();
                    localFeature.featureSize = data.Length;
                    localFeature.feature = Marshal.AllocHGlobal(localFeature.featureSize);
                    Marshal.Copy(data, 0, localFeature.feature, data.Length);
                    IntPtr intPtrFeature = Marshal.AllocHGlobal(Marshal.SizeOf<ASF_FaceFeature>());
                    Marshal.StructureToPtr(localFeature, intPtrFeature, false);
                    return intPtrFeature;
                }
                else
                {
                    return IntPtr.Zero;
                }
            }
            catch
            {
                return IntPtr.Zero;
            }
        }

        private int InitEnginePool()
        {
            try
            {
                for (int index = 0; index < FaceEngineNums; index++)
                {
                    IntPtr enginePtr = IntPtr.Zero;
                    Arcsoft_Face_Action faceAction = new Arcsoft_Face_Action(AppID, AppKey);
                    enginePtr = faceAction.InitASFEnginePtr(ParmsBestPractice.faceBaseMask);
                    PutEngine(FaceEnginePoor, enginePtr);
                    Console.WriteLine($"FaceEnginePoor add {enginePtr}");
                }
                for (int index = 0; index < IDEngineNums; index++)
                {
                    IntPtr enginePtr = IntPtr.Zero;
                    Arcsoft_Face_Action faceAction = new Arcsoft_Face_Action(AppID, AppKey);
                    enginePtr = faceAction.InitASFEnginePtr(ParmsBestPractice.faceBaseMask);
                    PutEngine(IDEnginePoor, enginePtr);
                    Console.WriteLine($"IDEnginePoor add {enginePtr}");
                }
                for (int index = 0; index < AIEngineNums; index++)
                {
                    IntPtr enginePtr = IntPtr.Zero;
                    int aiMask = FaceEngineMask.ASF_AGE | FaceEngineMask.ASF_GENDER | FaceEngineMask.ASF_FACE3DANGLE | FaceEngineMask.ASF_LIVENESS;
                    Arcsoft_Face_Action faceAction = new Arcsoft_Face_Action(AppID, AppKey);
                    enginePtr = faceAction.InitASFEnginePtr(ParmsBestPractice.faceBaseMask | aiMask);
                    PutEngine(AIEnginePoor, enginePtr);
                    Console.WriteLine($"AIEnginePoor add {enginePtr}");
                }
                return 0;
            }
            catch (Exception ex)
            {
                throw new Exception($"InitEnginePool--> exception {ex}");
            }
        }

        public IntPtr GetEngine(ConcurrentQueue<IntPtr> queue)
        {
            IntPtr item = IntPtr.Zero;
            if (queue.TryDequeue(out item))
            {
                return item;
            }
            else
            {
                return IntPtr.Zero;
            }
        }

        public void PutEngine(ConcurrentQueue<IntPtr> queue, IntPtr item)
        {
            if (item != IntPtr.Zero)
            {
                queue.Enqueue(item);
            }
        }

        public void Arcsoft_EnginePool(int faceEngineNums = 1, int idEngineNums = 0, int aiEngineNums = 0)
        {
            FaceEnginePoor = new ConcurrentQueue<IntPtr>();
            IDEnginePoor = new ConcurrentQueue<IntPtr>();
            AIEnginePoor = new ConcurrentQueue<IntPtr>();
            try
            {
                FaceEngineNums = faceEngineNums;
                IDEngineNums = idEngineNums;
                AIEngineNums = aiEngineNums;
                int status = InitEnginePool();
                if (status != 0)
                {
                    throw new Exception("引擎池初始化失败!");
                }
            }
            catch (Exception ex)
            {
                throw new Exception($"ArcSoft_EnginePool-->ArcSoft_EnginePool exception as: {ex}");
            }
        }
    }

    public struct ParmsBestPractice
    {
        // VIDEO mode: valid range [2,32], recommended value 16
        public const int detectFaceScaleVal_Video = 16;

        // IMAGE mode: valid range [2,32], recommended value 30
        public const int detectFaceScaleVal_Image = 32;

        // Maximum number of faces to detect, valid range [1,50]
        public const int detectFaceMaxNum = 50;

        // Minimum feature set for face recognition.
        public const int faceBaseMask = FaceEngineMask.ASF_FACE_DETECT | FaceEngineMask.ASF_FACERECOGNITION;

        // RGB liveness detection
        public const int faceLivingMask = FaceEngineMask.ASF_FACE_DETECT | FaceEngineMask.ASF_FACERECOGNITION | FaceEngineMask.ASF_LIVENESS;

        // Attribute combination accepted by ASFProcess: ASF_AGE, ASF_GENDER, ASF_FACE3DANGLE and ASF_LIVENESS
        public const int processSupportMask = FaceEngineMask.ASF_AGE | FaceEngineMask.ASF_GENDER | FaceEngineMask.ASF_FACE3DANGLE | FaceEngineMask.ASF_LIVENESS;
    }
}
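The engine pool above is meant to be consumed with GetEngine/PutEngine around each unit of work. A minimal usage sketch (APP_ID/SDK_KEY are again placeholders):

using System;
using System.Threading.Tasks;
using ArcSoft;

class EnginePoolDemo
{
    static async Task Main()
    {
        var pool = new Arcsoft_Face_Action("APP_ID", "SDK_KEY");   // activates the SDK
        pool.Arcsoft_EnginePool(faceEngineNums: 2);                // pre-create two detect/recognize engines

        IntPtr engine = pool.GetEngine(pool.FaceEnginePoor);
        while (engine == IntPtr.Zero)                              // pool exhausted: wait briefly and retry
        {
            await Task.Delay(10);
            engine = pool.GetEngine(pool.FaceEnginePoor);
        }
        try
        {
            // ... use the engine: DetectMultipleFaceAllInformation, ASFFaceFeatureCompare, ...
        }
        finally
        {
            pool.PutEngine(pool.FaceEnginePoor, engine);           // always hand the engine back
        }
    }
}

The third file, FaceAction, provides instance-method variants of the same detection and feature-extraction helpers: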
using System;
using System.Collections.Generic;
using System.Drawing;
using System.Drawing.Imaging;
using System.Runtime.InteropServices;

namespace ArcSoft
{
    public class FaceAction : Arcsoft_Face_3_0
    {
        public FaceAction(string appId, string appKey, ref IntPtr pEngines, bool isImageMode = false, int faceMask = (FaceEngineMask.ASF_FACE_DETECT | FaceEngineMask.ASF_FACERECOGNITION))
        {
            int retCode = -1;
            try
            {
                retCode = ASFActivation(appId, appKey);
                if ((retCode == 0) || (retCode == 90114))
                {
                }
                else
                {
                    throw new Exception("SDK激活失败,错误码:" + retCode);
                }
                if (isImageMode)
                {
                    retCode = ASFInitEngine(ASF_DetectMode.ASF_DETECT_MODE_IMAGE, ArcSoftFace_OrientPriority.ASF_OP_0_ONLY, ParmsBestPractice.detectFaceScaleVal_Image, ParmsBestPractice.detectFaceMaxNum, faceMask, ref pEngines);
                }
                else
                {
                    retCode = ASFInitEngine(ASF_DetectMode.ASF_DETECT_MODE_VIDEO, ArcSoftFace_OrientPriority.ASF_OP_0_HIGHER_EXT, ParmsBestPractice.detectFaceScaleVal_Video, ParmsBestPractice.detectFaceMaxNum, faceMask, ref pEngines);
                }
                if ((retCode == 0))
                {
                }
                else
                {
                    throw new Exception("SDK初始化失败,错误码:" + retCode);
                }
            }
            catch (Exception ex)
            {
                return;
            }
        }

        public IntPtr InitASFEnginePtr(string appId, string appKey, bool isImageMode = false, int faceMask = (FaceEngineMask.ASF_FACE_DETECT | FaceEngineMask.ASF_FACERECOGNITION))
        {
            IntPtr pEngines = IntPtr.Zero;
            int retCode = -1;
            try
            {
                retCode = ASFOnlineActivation(appId, appKey);
                if ((retCode == 0) || (retCode == 90114))
                {
                }
                else
                {
                    throw new Exception("SDK激活失败,错误码:" + retCode);
                }
                if (isImageMode)
                {
                    retCode = ASFInitEngine(ASF_DetectMode.ASF_DETECT_MODE_IMAGE, ArcSoftFace_OrientPriority.ASF_OP_0_ONLY, ParmsBestPractice.detectFaceScaleVal_Image, ParmsBestPractice.detectFaceMaxNum, faceMask, ref pEngines);
                }
                else
                {
                    retCode = ASFInitEngine(ASF_DetectMode.ASF_DETECT_MODE_VIDEO, ArcSoftFace_OrientPriority.ASF_OP_0_HIGHER_EXT, ParmsBestPractice.detectFaceScaleVal_Video, ParmsBestPractice.detectFaceMaxNum, faceMask, ref pEngines);
                }
                if ((retCode == 0))
                {
                }
                else
                {
                    throw new Exception("SDK初始化失败,错误码:" + retCode);
                }
                return pEngines;
            }
            catch (Exception ex)
            {
                return pEngines;
            }
        }

        public ASF_MultiFaceInfo DetectMultipleFace(IntPtr pEngine, ImageInfo imageInfo)
        {
            ASF_MultiFaceInfo multiFaceInfo = new ASF_MultiFaceInfo();
            IntPtr pMultiFaceInfo = Marshal.AllocHGlobal(Marshal.SizeOf<ASF_MultiFaceInfo>());
            try
            {
                int retCode = ASFDetectFaces(pEngine, imageInfo.width, imageInfo.height, imageInfo.format, imageInfo.imgData, pMultiFaceInfo, ASF_DetectModel.ASF_DETECT_MODEL_RGB);
                multiFaceInfo = Marshal.PtrToStructure<ASF_MultiFaceInfo>(pMultiFaceInfo);
                return multiFaceInfo;
            }
            catch
            {
                return multiFaceInfo;
            }
            finally
            {
                Marshal.FreeHGlobal(pMultiFaceInfo);
            }
        }

        public List<MarkFaceInfor> DetectMultipleFaceAllInformation(IntPtr pEngine, ImageInfo imageInfo, bool extractFaceData = false)
        {
            List<MarkFaceInfor> infors = new List<MarkFaceInfor>();
            ASF_MultiFaceInfo multiFaceInfo = new ASF_MultiFaceInfo();
            IntPtr pMultiFaceInfo = Marshal.AllocHGlobal(Marshal.SizeOf<ASF_MultiFaceInfo>());
            try
            {
                int retCode = ASFDetectFaces(pEngine, imageInfo.width, imageInfo.height, imageInfo.format, imageInfo.imgData, pMultiFaceInfo, ASF_DetectModel.ASF_DETECT_MODEL_RGB);
                multiFaceInfo = Marshal.PtrToStructure<ASF_MultiFaceInfo>(pMultiFaceInfo);
                for (int faceIndex = 0; faceIndex < multiFaceInfo.faceNum; faceIndex++)
                {
                    ASF_SingleFaceInfo singleFaceInfo = new ASF_SingleFaceInfo();
                    singleFaceInfo.faceRect = Marshal.PtrToStructure<MRECT>(multiFaceInfo.faceRects + Marshal.SizeOf<MRECT>() * faceIndex);
                    singleFaceInfo.faceOrient = Marshal.PtrToStructure<int>(multiFaceInfo.faceOrients + Marshal.SizeOf<int>() * faceIndex);
                    MarkFaceInfor markFaceInfor = new MarkFaceInfor(singleFaceInfo.faceRect.left, singleFaceInfo.faceRect.top, singleFaceInfo.faceRect.right - singleFaceInfo.faceRect.left, singleFaceInfo.faceRect.bottom - singleFaceInfo.faceRect.top);
                    markFaceInfor.faceID = Marshal.PtrToStructure<int>(multiFaceInfo.faceID + Marshal.SizeOf<int>() * faceIndex);
                    if (extractFaceData)
                    {
                        markFaceInfor.faceFeatureData = ExtractSingleFaceFeature(pEngine, imageInfo, singleFaceInfo.faceRect, singleFaceInfo.faceOrient);
                    }
                    infors.Add(markFaceInfor);
                }
                return infors;
            }
            catch
            {
                return null;
            }
            finally
            {
                Marshal.FreeHGlobal(pMultiFaceInfo);
            }
        }

        private byte[] ExtractSingleFaceFeature(IntPtr pEngine, ImageInfo imageInfo, MRECT rect, int faceOrient)
        {
            var singleFaceInfo = new ASF_SingleFaceInfo();
            singleFaceInfo.faceRect = rect;
            singleFaceInfo.faceOrient = faceOrient;
            IntPtr pSingleFaceInfo = Marshal.AllocHGlobal(Marshal.SizeOf<ASF_SingleFaceInfo>());
            Marshal.StructureToPtr(singleFaceInfo, pSingleFaceInfo, false);
            IntPtr pFaceFeature = Marshal.AllocHGlobal(Marshal.SizeOf<ASF_FaceFeature>());
            try
            {
                int retCode = ASFFaceFeatureExtract(pEngine, imageInfo.width, imageInfo.height, imageInfo.format, imageInfo.imgData, pSingleFaceInfo, pFaceFeature);
                if (retCode == 0)
                {
                    ASF_FaceFeature faceFeature = Marshal.PtrToStructure<ASF_FaceFeature>(pFaceFeature);
                    byte[] feature = new byte[faceFeature.featureSize];
                    Marshal.Copy(faceFeature.feature, feature, 0, faceFeature.featureSize);
                    return feature;
                }
                else
                {
                    return null;
                }
            }
            catch
            {
                return null;
            }
            finally
            {
                Marshal.FreeHGlobal(pSingleFaceInfo);
                Marshal.FreeHGlobal(pFaceFeature);
            }
        }

        public List<byte[]> ExtractAllFeatures(IntPtr pEngine, ImageInfo imageInfo, ASF_MultiFaceInfo multiFaceInfo)
        {
            try
            {
                List<byte[]> results = new List<byte[]>();
                ASF_SingleFaceInfo singleFaceInfo = new ASF_SingleFaceInfo();
                for (int index = 0; index < multiFaceInfo.faceNum; index++)
                {
                    singleFaceInfo.faceRect = Marshal.PtrToStructure<MRECT>(multiFaceInfo.faceRects + Marshal.SizeOf<MRECT>() * index);
                    singleFaceInfo.faceOrient = Marshal.PtrToStructure<int>(multiFaceInfo.faceOrients + Marshal.SizeOf<int>() * index);
                    byte[] singleFaceFeature = ExtractSingleFaceFeature(pEngine, imageInfo, singleFaceInfo.faceRect, singleFaceInfo.faceOrient);
                    if (singleFaceFeature != null)
                    {
                        results.Add(singleFaceFeature);
                    }
                }
                return results;
            }
            catch (Exception ex)
            {
                throw new Exception("Arcsoft2-->ExtractAllFeatures exception " + ex);
            }
            finally
            {
                Marshal.FreeHGlobal(imageInfo.imgData);
            }
        }

        public IntPtr GetBMP_Ptr(Bitmap image, out int width, out int height, out int pitch)
        {
            width = -1;
            height = -1;
            pitch = -1;
            byte[] imageData = ReadBMP(image, ref width, ref height, ref pitch);
            IntPtr imageDataPtr = Marshal.AllocHGlobal(imageData.Length);
            Marshal.Copy(imageData, 0, imageDataPtr, imageData.Length);
            return imageDataPtr;
        }

        public byte[] ReadBMP(Bitmap image, ref int width, ref int height, ref int pitch)
        {
            // Lock the Bitmap into system memory and get its BitmapData
            BitmapData data = image.LockBits(new Rectangle(0, 0, image.Width, image.Height), ImageLockMode.ReadOnly, PixelFormat.Format24bppRgb);
            // Address of the first pixel, i.e. the first scan line of the bitmap
            IntPtr ptr = data.Scan0;
            // Size of the source buffer
            int sourceBitArrayLength = data.Height * Math.Abs(data.Stride);
            byte[] sourceBitArray = new byte[sourceBitArrayLength];
            // Copy the locked bitmap content into sourceBitArray
            Marshal.Copy(ptr, sourceBitArray, 0, sourceBitArrayLength);
            width = data.Width;
            height = data.Height;
            pitch = Math.Abs(data.Stride);
            int line = width * 3;
            int bgr_len = line * height;
            byte[] destBitArray = new byte[bgr_len];
            for (int i = 0; i < height; ++i)
            {
                Array.Copy(sourceBitArray, i * pitch, destBitArray, i * line, line);
            }
            pitch = line;
            image.UnlockBits(data);
            return destBitArray;
        }

        public ASVLOFFSCREEN ChangeMat2ASVLOFFSCREEN(Bitmap image)
        {
            int width = -1;
            int height = -1;
            int pitch = -1;
            IntPtr imagePtr = GetBMP_Ptr(image, out width, out height, out pitch);
            ASVLOFFSCREEN offInput = new ASVLOFFSCREEN();
            offInput.u32PixelArrayFormat = 513;
            offInput.ppu8Plane = new IntPtr[4];
            offInput.ppu8Plane[0] = imagePtr;
            offInput.i32Width = width;
            offInput.i32Height = height;
            offInput.pi32Pitch = new int[4];
            offInput.pi32Pitch[0] = pitch;
            return offInput;
        }
    }
}
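Regardless of which wrapper is used, client-side feature extraction always follows the same pattern: copy the Bitmap pixels into unmanaged memory, describe them with an ImageInfo, run detection, and free the buffer afterwards. A sketch, assuming ImageInfo exposes the public width/height/format/imgData members that the wrapper methods read:

using System;
using System.Drawing;
using System.Runtime.InteropServices;
using ArcSoft;

static class DetectDemo
{
    static void DetectAllFaces(IntPtr engine, Bitmap frame)
    {
        // Copy the 24-bit BGR pixels into an unmanaged buffer the SDK can read.
        IntPtr pixelPtr = Arcsoft_Face_Action.GetBMP_Ptr(frame, out int width, out int height, out int pitch);
        var imageInfo = new ImageInfo
        {
            width = width,
            height = height,
            format = ASF_ImagePixelFormat.ASVL_PAF_RGB24_B8G8R8,
            imgData = pixelPtr
        };
        try
        {
            // Detect every face and extract its 1032-byte feature in one pass.
            var faces = Arcsoft_Face_Action.DetectMultipleFaceAllInformation(engine, imageInfo, extractFaceData: true);
            foreach (var face in faces)
            {
                Console.WriteLine($"faceID={face.faceID}, feature bytes={face.faceFeatureData?.Length ?? 0}");
            }
        }
        finally
        {
            Marshal.FreeHGlobal(pixelPtr);   // release the unmanaged pixel buffer
        }
    }
}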

  2. Client: multi-face feature extraction and request streaming
private async void DetectFaceTick(object sender, ElapsedEventArgs e)
        {
            Mat currentMat;
            lock (_obj)
            {
                currentMat = mat;
            }
            List<MarkFaceInfor> markFaceInfors = ExtractFaceData(currentMat, _enginePool);
            if (markFaceInfors == null || markFaceInfors.Count == 0)
            {
                return;
            }
            // Wait until the previous round-trip has finished before starting a new one.
            while (!_complete)
            {
                Task.Delay(10).Wait();
            }
            _complete = false;
            var regFace = _client.RecongnizationByFace();

            // Define how responses are consumed as they stream back
            var regFaceResponseTask = Task.Run(async () =>
            {
                WriteReceiveMsgAsync(string.Format("当前接收时间{0}", DateTime.Now.ToString("HH:mm:ss:fff")));
                await foreach (var resp in regFace.ResponseStream.ReadAllAsync())
                {                  
                    WriteReceiveMsgAsync($"姓名:{resp.PersonName},相似度:{resp.ConfidenceLevel}");
                }
            });

            // Start streaming the requests
            WriteSendMsgAsync(string.Format("开始发送时间{0}", DateTime.Now.ToString("HH:mm:ss:fff")));
            for (int index = 0; index < markFaceInfors.Count; index++)
            {
                WriteSendMsgAsync($"发送编号:{index}");
                await regFace.RequestStream.WriteAsync(new FaceRequest()
                {
                    FaceFeature = Google.Protobuf.ByteString.CopyFrom(markFaceInfors[index].faceFeatureData)
                });
            }           
            await regFace.RequestStream.CompleteAsync();
            
            // Wait for all results to come back
            await regFaceResponseTask;
            _complete = true;
        }
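For completeness: _client above is a generated gRPC client created once at start-up, and DetectFaceTick is driven by a System.Timers.Timer. A sketch of that wiring, assuming the generated client type is FaceRecognition.FaceRecognitionClient (the real name depends on the .proto) and that the server listens on https://localhost:5001:

// Inside the same WPF window class as DetectFaceTick (sketch).
private FaceRecognition.FaceRecognitionClient _client;

private void InitGrpcAndTimer()
{
    var channel = Grpc.Net.Client.GrpcChannel.ForAddress("https://localhost:5001"); // assumed server address
    _client = new FaceRecognition.FaceRecognitionClient(channel);

    var detectTimer = new System.Timers.Timer(500);   // sample the shared Mat every 500 ms
    detectTimer.Elapsed += DetectFaceTick;            // the handler shown above
    detectTimer.Start();
}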
  3. Server: multi-face matching and response streaming
public override async Task RecongnizationByFace(IAsyncStreamReader<FaceRequest> requestStream, IServerStreamWriter<FaceReply> responseStream, ServerCallContext context)
        {
            var faceQueue = new Queue<Google.Protobuf.ByteString>();
            IntPtr featurePoint = IntPtr.Zero;
            IntPtr engine = FaceProcess.GetEngine(FaceProcess.FaceEnginePoor);
            FaceReply faceReply = new FaceReply();

            while (await requestStream.MoveNext())
            {
                // Recognition: compare the incoming feature against every registered face
                byte[] featureByte = requestStream.Current.FaceFeature.ToByteArray();
                if (featureByte.Length != 1032)
                {
                    continue;
                }
                featurePoint = Arcsoft_Face_Action.PutFeatureByteIntoFeatureIntPtr(featureByte);
                float maxScore = 0f;

                // If the pool was empty, wait briefly and retry until an engine becomes available.
                while (engine == IntPtr.Zero)
                {
                    Task.Delay(10).Wait();
                    engine = FaceProcess.GetEngine(FaceProcess.FaceEnginePoor);
                }
                foreach (var f in StaticDataForTestUse.dbFaceInfor)
                {
                    float result = 0;
                    int compareStatus = Arcsoft_Face_3_0.ASFFaceFeatureCompare(engine, featurePoint, f.Key, ref result, 1);
                    if (compareStatus == 0)
                    {
                        if (result >= maxScore)
                        {
                            maxScore = result;
                        }
                        if (result >= _faceMix && result >= maxScore)
                        {
                            faceReply.PersonName = f.Value;
                            faceReply.ConfidenceLevel = result;
                        }
                    }
                    else
                    {
                        faceReply.PersonName = $"对比异常 error code={compareStatus}";
                        faceReply.ConfidenceLevel = result;
                    }
                }
                if (maxScore < _faceMix)
                {
                    faceReply.PersonName = $"未找到匹配者";
                    faceReply.ConfidenceLevel = maxScore;
                }
                Marshal.FreeHGlobal(featurePoint);
                await responseStream.WriteAsync(faceReply);
            }
            FaceProcess.PutEngine(FaceProcess.FaceEnginePoor, engine);
        }
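The comparison loop walks StaticDataForTestUse.dbFaceInfor, which the service is expected to fill at start-up with one unmanaged feature pointer per registered person. A sketch of such a loader, assuming dbFaceInfor is a Dictionary<IntPtr, string> (ASF_FaceFeature pointer mapped to person name) and that registered features are stored as 1032-byte .dat files named after each person:

using System;
using System.Collections.Generic;
using System.IO;
using ArcSoft;

public static class StaticDataForTestUse
{
    // Unmanaged ASF_FaceFeature pointer -> person name, as consumed by RecongnizationByFace.
    public static Dictionary<IntPtr, string> dbFaceInfor = new Dictionary<IntPtr, string>();

    public static void Load(string featureFolder)
    {
        foreach (string file in Directory.GetFiles(featureFolder, "*.dat"))
        {
            byte[] feature = File.ReadAllBytes(file);   // 1032-byte feature saved at registration time
            IntPtr pFeature = Arcsoft_Face_Action.PutFeatureByteIntoFeatureIntPtr(feature);
            if (pFeature != IntPtr.Zero)
            {
                dbFaceInfor[pFeature] = Path.GetFileNameWithoutExtension(file);   // file name as person name
            }
        }
    }
}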

VI. Demo

  1. Client:

    r1.png
  2. Server:

    2.png

VII. Source Code and Usage
The source code is published on GitHub:
https://github.com/18628271760/MultipleFacesProcess

For detailed usage instructions, see the project's ReadMe.docx (with step-by-step screenshots).
