Using the Baidu Offline Face Recognition SDK


1.1 Download and run the SDK

1.2 Configure the environment

Add the SDK to your project, either by dragging and copying the files or by adding them as a class library reference. Place the face-resource folder one level above the project's output root directory.

Put the activation file and all referenced DLLs in the output root directory. If you find this tedious, you can simply drag every file from the SDK's root directory straight into the project's root directory.

1.3 Usage

The Baidu offline face recognition SDK only runs in Release builds, so set the project configuration to Release as well (otherwise you will get errors that BaiduFaceApi… and a series of other DLLs cannot be found). The SDK uses the OpenCvSharp3-AnyCPU package.

using System;
using System.Runtime.InteropServices;
using System.Windows.Forms;

    static class Program
    {
        [UnmanagedFunctionPointer(CallingConvention.Cdecl)]
        delegate void FaceCallback(IntPtr bytes, int size, String res);

        // SDK initialization
        [DllImport("BaiduFaceApi.dll", EntryPoint = "sdk_init", CharSet = CharSet.Ansi
             , CallingConvention = CallingConvention.Cdecl)]
        private static extern int sdk_init(bool id_card);

        // Whether the SDK is authorized (activated)
        [DllImport("BaiduFaceApi.dll", EntryPoint = "is_auth", CharSet = CharSet.Ansi
                , CallingConvention = CallingConvention.Cdecl)]
        private static extern bool is_auth();

        // Get the device fingerprint
        [DllImport("BaiduFaceApi.dll", EntryPoint = "get_device_id", CharSet = CharSet.Ansi
                 , CallingConvention = CallingConvention.Cdecl)]
        private static extern IntPtr get_device_id();

        // SDK teardown
        [DllImport("BaiduFaceApi.dll", EntryPoint = "sdk_destroy", CharSet = CharSet.Ansi
             , CallingConvention = CallingConvention.Cdecl)]
        private static extern void sdk_destroy();

        // Test fetching the device fingerprint (device_id)
        public static void test_get_device_id()
        {
            IntPtr ptr = get_device_id();
            string buf = Marshal.PtrToStringAnsi(ptr);
            Console.WriteLine("device id is:" + buf);
        }

        /// <summary>
        /// Main entry point of the application.
        /// </summary>
        [STAThread]
        static void Main()
        {
            bool id = false;

            int n = sdk_init(id);
            if (n != 0)
            {
                // Non-zero means initialization/authorization failed
                Console.WriteLine("auth result is {0:D}", n);
                Console.ReadLine();
                return;
            }

            // Check whether the SDK is authorized
            bool authed = is_auth();
            Console.WriteLine("authed res is:" + authed);
            test_get_device_id();

            Application.EnableVisualStyles();
            Application.SetCompatibleTextRenderingDefault(false);
            Application.Run(new Form1());
        }
    }

Modify the Program class as shown above before use, otherwise the face detection interfaces cannot be called; the error reported is an illegal-parameter return value of -1.
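The get_device_id/PtrToStringAnsi pattern above recurs throughout the SDK: a native export returns an `IntPtr` to an ANSI buffer, and the caller converts it to a managed string. The pattern can be exercised without BaiduFaceApi.dll by round-tripping a string through unmanaged memory. A minimal sketch — `FakeNativeResult` is a stand-in for a real export, and note the ownership caveat in the comments:

```csharp
using System;
using System.Runtime.InteropServices;

static class NativeStringDemo
{
    // Stand-in for a native SDK function: hands back a pointer to an
    // ANSI buffer, just like get_device_id() or track() would.
    static IntPtr FakeNativeResult(string payload)
    {
        // StringToHGlobalAnsi allocates unmanaged memory, like the DLL would.
        return Marshal.StringToHGlobalAnsi(payload);
    }

    // The conversion step used after every SDK call in this article.
    public static string ReadNativeString(IntPtr ptr)
    {
        return Marshal.PtrToStringAnsi(ptr);
    }

    static void Main()
    {
        IntPtr ptr = FakeNativeResult("device id is: demo-123");
        Console.WriteLine(ReadNativeString(ptr));
        // We allocated this buffer, so we free it. Buffers returned by
        // BaiduFaceApi.dll are owned by the SDK and must NOT be freed here.
        Marshal.FreeHGlobal(ptr);
    }
}
```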

1.3.1 Detecting faces in an image file

[DllImport("BaiduFaceApi.dll", EntryPoint = "track", CharSet = CharSet.Ansi
           , CallingConvention = CallingConvention.Cdecl)]
private static extern IntPtr track(string file_name, int max_track_num);

public void test_track()
{
    // Absolute path of the image file to check
    string file_name = "d:\\kehu2.jpg";
    int max_track_num = 1; // max faces to detect (multi-face); default 1, up to 10
    IntPtr ptr = track(file_name, max_track_num);
    string buf = Marshal.PtrToStringAnsi(ptr);
    Console.WriteLine("track res is:" + buf);
}

This is the face detection interface. The test method expects an image named "kehu2.jpg" in the root of drive D and checks whether it contains a face.
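The string returned by track() is JSON. As a sketch of how such a result can be parsed with Newtonsoft.Json (which this project already uses): note the sample payload and its field names (`errno`, `face_num`, `face_list`) are hypothetical stand-ins here — the real field names come from the Baidu documentation.

```csharp
using System;
using Newtonsoft.Json.Linq;

static class TrackResultDemo
{
    // Hypothetical payload in the general style of the SDK's JSON results;
    // consult the Baidu docs for the actual schema.
    public const string Sample = @"{""errno"":0,""msg"":""ok"",
        ""data"":{""face_num"":1,
                  ""face_list"":[{""score"":0.98,""width"":120.5}]}}";

    public static int FaceCount(string json)
    {
        JObject root = JObject.Parse(json);
        if ((int)root["errno"] != 0) return 0;   // non-zero errno = failure
        return (int)root["data"]["face_num"];
    }

    static void Main()
    {
        Console.WriteLine("faces detected: " + FaceCount(Sample));
    }
}
```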

1.3.2 Real-time face detection from a camera

public void usb_csharp_track_face(int dev)
{
    using (var window = new Window("face"))
    using (VideoCapture cap = VideoCapture.FromCamera(dev))
    {
        if (!cap.IsOpened())
        {
            Console.WriteLine("open camera error");
            return;
        }
        // Frame image buffer
        Mat image = new Mat();
        // When movie playback reaches the end, Mat.data becomes NULL.
        while (true)
        {
            RotatedRect box;
            cap.Read(image); // same as cvQueryFrame
            if (!image.Empty())
            {
                int ilen = 2; // number of face slots passed in
                TrackFaceInfo[] track_info = new TrackFaceInfo[ilen];
                for (int i = 0; i < ilen; i++)
                {
                    track_info[i] = new TrackFaceInfo();
                    track_info[i].landmarks = new int[144];
                    track_info[i].headPose = new float[3];
                    track_info[i].face_id = 0;
                    track_info[i].score = 0;
                }
                int sizeTrack = Marshal.SizeOf(typeof(TrackFaceInfo));
                IntPtr ptT = Marshal.AllocHGlobal(sizeTrack * ilen);

                /*  track_mat
                 *  In:  maxTrackObjNum: number of face slots allocated by the caller;
                 *       the matching amount of memory must be allocated.
                 *  Out: number of faces actually detected.
                 *  Returns: min(slots passed in, faces detected), i.e. the faces
                 *  actually returned.
                 */
                int faceSize = ilen; // faces returned: min(allocated, detected)
                int curSize = ilen;  // in: allocated slots; out: faces actually detected
                faceSize = track_mat(ptT, image.CvPtr, ref curSize);
                for (int index = 0; index < faceSize; index++)
                {
                    IntPtr ptr = new IntPtr();
                    if (8 == IntPtr.Size)
                    {
                        ptr = (IntPtr)(ptT.ToInt64() + sizeTrack * index);
                    }
                    else if (4 == IntPtr.Size)
                    {
                        ptr = (IntPtr)(ptT.ToInt32() + sizeTrack * index);
                    }

                    track_info[index] = (TrackFaceInfo)Marshal.PtrToStructure(ptr, typeof(TrackFaceInfo));
                    Console.WriteLine("in Liveness::usb_track face_id is {0}:", track_info[index].face_id);
                    Console.WriteLine("in Liveness::usb_track landmarks is:");
                    // Print the first 10 landmark coordinates
                    for (int k = 0; k < 1; k++)
                    {
                        Console.WriteLine("{0},{1},{2},{3},{4},{5},{6},{7},{8},{9},",
                            track_info[index].landmarks[k], track_info[index].landmarks[k + 1],
                            track_info[index].landmarks[k + 2], track_info[index].landmarks[k + 3],
                            track_info[index].landmarks[k + 4], track_info[index].landmarks[k + 5],
                            track_info[index].landmarks[k + 6], track_info[index].landmarks[k + 7],
                            track_info[index].landmarks[k + 8], track_info[index].landmarks[k + 9]
                            );
                    }

                    for (int k = 0; k < track_info[index].headPose.Length; k++)
                    {
                        Console.WriteLine("in Liveness::usb_track angle is:{0:f}", track_info[index].headPose[k]);
                    }
                    Console.WriteLine("in Liveness::usb_track score is:{0:f}", track_info[index].score);
                    // Head angle
                    Console.WriteLine("in Liveness::usb_track mAngle is:{0:f}", track_info[index].box.mAngle);
                    // Face width
                    Console.WriteLine("in Liveness::usb_track mWidth is:{0:f}", track_info[index].box.mWidth);
                    // Center point X and Y coordinates
                    Console.WriteLine("in Liveness::usb_track mCenter_x is:{0:f}", track_info[index].box.mCenter_x);
                    Console.WriteLine("in Liveness::usb_track mCenter_y is:{0:f}", track_info[index].box.mCenter_y);
                    // Draw the face bounding box
                    box = bounding_box(track_info[index].landmarks, track_info[index].landmarks.Length);
                    draw_rotated_box(ref image, ref box, new Scalar(0, 255, 0));
                    // Checking face attributes/quality every frame may stutter the
                    // video; if really needed, consider checking every Nth frame.
                    // Get face attributes (by passing in the face info):
                    //IntPtr ptrAttr = FaceAttr.face_attr_by_face(image.CvPtr, ref track_info[index]);
                    //string buf = Marshal.PtrToStringAnsi(ptrAttr);
                    //Console.WriteLine("attr res is:" + buf);
                    // Get face quality (by passing in the face info):
                    //IntPtr ptrQua = FaceQuality.face_quality_by_face(image.CvPtr, ref track_info[index]);
                    //buf = Marshal.PtrToStringAnsi(ptrQua);
                    //Console.WriteLine("quality res is:" + buf);
                    // Extracting face features every frame may also stutter the
                    // video; if really needed, consider extracting every Nth frame.
                    //float[] feature = new float[512];
                    //IntPtr ptrfea = new IntPtr();
                    //int count = FaceCompare.get_face_feature_by_face(image.CvPtr, ref track_info[index], ref ptrfea);
                    // A return value of 512 means the feature vector was obtained.
                    //if (ptrfea == IntPtr.Zero)
                    //{
                    //    Console.WriteLine("Get feature failed!");
                    //    continue;
                    //}
                    //if (count == 512)
                    //{
                    //    for (int i = 0; i < count; i++)
                    //    {
                    //        // Offset by i floats into the feature buffer
                    //        IntPtr floptr = IntPtr.Add(ptrfea, i * Marshal.SizeOf(typeof(float)));
                    //        feature[i] = (float)Marshal.PtrToStructure(floptr, typeof(float));
                    //        Console.WriteLine("feature {0} is: {1:e8}", i, feature[i]);
                    //    }
                    //    Console.WriteLine("feature count is:{0}", count);
                    //}
                }
                Marshal.FreeHGlobal(ptT);
                window.ShowImage(image);
                if (Cv2.WaitKey(1) == 27) break; // press ESC to quit
                Console.WriteLine("mat not empty");
            }
            else
            {
                Console.WriteLine("mat is empty");
            }
        }
        image.Release();
    }
}

The above is the SDK's original method; it only detects faces and performs no liveness check.
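The allocate / native-call / read-back pattern above — `AllocHGlobal(sizeTrack * ilen)` followed by per-index `PtrToStructure` — can be exercised without the SDK. A self-contained sketch using a stand-in struct (`DemoFace` replaces TrackFaceInfo, whose real layout needs the SDK headers); it also uses `IntPtr.Add`, which does the same element-address arithmetic as the `ToInt64()`/`ToInt32()` branching, portably:

```csharp
using System;
using System.Runtime.InteropServices;

// Stand-in for TrackFaceInfo: a small blittable struct.
[StructLayout(LayoutKind.Sequential)]
struct DemoFace
{
    public int face_id;
    public float score;
}

static class StructArrayDemo
{
    static readonly int Size = Marshal.SizeOf(typeof(DemoFace));

    // Write one struct into slot `index` of an unmanaged array,
    // as the native side would fill the buffer.
    public static void WriteAt(IntPtr buf, int index, DemoFace f)
    {
        Marshal.StructureToPtr(f, IntPtr.Add(buf, Size * index), false);
    }

    // Read slot `index` back, as the loop over faceSize does.
    public static DemoFace ReadAt(IntPtr buf, int index)
    {
        return (DemoFace)Marshal.PtrToStructure(IntPtr.Add(buf, Size * index), typeof(DemoFace));
    }

    static void Main()
    {
        const int ilen = 2;
        IntPtr ptT = Marshal.AllocHGlobal(Size * ilen); // like AllocHGlobal(sizeTrack * ilen)
        try
        {
            for (int i = 0; i < ilen; i++)
                WriteAt(ptT, i, new DemoFace { face_id = i, score = 0.9f });
            for (int i = 0; i < ilen; i++)
                Console.WriteLine("face_id={0}", ReadAt(ptT, i).face_id);
        }
        finally
        {
            Marshal.FreeHGlobal(ptT); // always free the unmanaged buffer
        }
    }
}
```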

static VideoCapture camera1 = null;
static VideoCapture camera2 = null;

public bool usb_csharp_track_face(int dev)
{
    string buf = "";
    int faceSize = 0;
    long ir_time = 0;
    // Index 0 is the first USB camera the PC enumerates; in this demo,
    // 0 is the IR camera. Enumeration can differ between cameras and
    // machines; indices generally run from 0 to 10.
    int device = select_usb_device_id();
    camera1 = VideoCapture.FromCamera(device);
    if (!camera1.IsOpened())
    {
        Console.WriteLine("camera1 open error");
        return false;
    }
    camera1.Set(CaptureProperty.FrameWidth, 1440);
    camera1.Set(CaptureProperty.FrameHeight, 1080);
    camera2 = VideoCapture.FromCamera(device + 1);
    if (!camera2.IsOpened())
    {
        Console.WriteLine("camera2 open error");
        return false;
    }

    RotatedRect box;
    Mat frame1 = new Mat();
    Mat frame2 = new Mat();
    Mat rgb_mat = new Mat();
    Mat ir_mat = new Mat();
    while (true)
    {
        camera1.Read(frame1);
        camera2.Read(frame2);

        if (!frame1.Empty() && !frame2.Empty())
        {
            int ilen = 2; // number of face slots passed in
            TrackFaceInfo[] track_info = new TrackFaceInfo[ilen];
            for (int i = 0; i < ilen; i++)
            {
                track_info[i] = new TrackFaceInfo();
                track_info[i].landmarks = new int[144];
                track_info[i].headPose = new float[3];
                track_info[i].face_id = 0;
                track_info[i].score = 0;
            }
            int sizeTrack = Marshal.SizeOf(typeof(TrackFaceInfo));
            IntPtr ptT = Marshal.AllocHGlobal(sizeTrack * ilen);

            // The larger frame is the RGB camera; the smaller one is the IR camera.
            if (frame1.Size(0) > frame2.Size(0))
            {
                rgb_mat = frame1;
                ir_mat = frame2;
            }
            else
            {
                rgb_mat = frame2;
                ir_mat = frame1;
            }
            float rgb_score = 0;
            float ir_score = 0;

            /*  track_mat
             *  In:  maxTrackObjNum: slots allocated by the caller
             *       (matching memory must be allocated).
             *  Out: number of faces actually detected.
             *  Returns: min(slots passed in, faces detected).
             */
            faceSize = ilen;    // faces returned: min(allocated, detected)
            int curSize = ilen; // in: allocated slots; out: faces detected
            faceSize = track_mat(ptT, frame1.CvPtr, ref curSize);
            // RGB + IR liveness check on the tracked faces
            IntPtr ptr1 = rgb_ir_liveness_check_faceinfo(rgb_mat.CvPtr, ir_mat.CvPtr,
                ref rgb_score, ref ir_score, ref faceSize, ref ir_time, ptT);
            curSize = ilen;
            faceSize = track_mat(ptT, frame1.CvPtr, ref curSize);
            for (int index = 0; index < faceSize; index++)
            {
                IntPtr ptr = new IntPtr();
                if (8 == IntPtr.Size)
                {
                    ptr = (IntPtr)(ptT.ToInt64() + sizeTrack * index);
                }
                else if (4 == IntPtr.Size)
                {
                    ptr = (IntPtr)(ptT.ToInt32() + sizeTrack * index);
                }
                track_info[index] = (TrackFaceInfo)Marshal.PtrToStructure(ptr, typeof(TrackFaceInfo));
                IntPtr ptrQua = FaceQuality.face_quality_by_face(frame1.CvPtr, ref track_info[index]);
                buf = Marshal.PtrToStringAnsi(ptrQua);
            }
            textBox2.Text = buf;
            Marshal.FreeHGlobal(ptT);
            // Render into the pictureBox instead of an OpenCV window
            pictureBox1.Image = frame1.ToBitmap();
            Cv2.WaitKey(1);

            if (rgb_score != 0 && ir_score != 0 && faceSize >= 1)
            {
                JObject jObject = (JObject)JsonConvert.DeserializeObject(buf);
                var json = (JObject)JsonConvert.DeserializeObject(jObject["data"].ToString());
                string result = json["result"].ToString();
                var json1 = (JObject)JsonConvert.DeserializeObject(result);
                // Quality gate: tune these thresholds to your project's face standard.
                if (double.Parse(json1["bluriness"].ToString()) <= 0.00001
                    && double.Parse(json1["occl_r_eye"].ToString()) < 0.1
                    && double.Parse(json1["occl_l_eye"].ToString()) < 0.1
                    && double.Parse(json1["occl_mouth"].ToString()) <= 0
                    && double.Parse(json1["occl_nose"].ToString()) < 0.11
                    && double.Parse(json1["occl_r_contour"].ToString()) < 0.1
                    && double.Parse(json1["occl_chin"].ToString()) <= 0
                    && double.Parse(json1["occl_l_contour"].ToString()) < 0.1)
                {
                    // Save the frame and hand it to the comparison step
                    string fileName = DateTime.Now.ToString("yyyyMMdd") + ".jpg";
                    string path = AppDomain.CurrentDomain.BaseDirectory;
                    string picture = path + @"img" + "\\" + fileName;

                    frame1.SaveImage(picture);
                    falnem = picture;
                    if (Sdistinguish() != "0")
                    {
                        Class1 c = new Class1();
                    }
                    Thread.Sleep(2000);
                    dtTo = new TimeSpan(0, 0, 2);
                    camera1.Release();
                    camera2.Release();
                    thread.Abort();
                    return true;
                }
            }
        }
        else
        {
            Console.WriteLine("mat is empty");
        }
    }
}

The above is my modified version.

 

private string distinguish()
{
    dz = "1";
    string fileName = falnem; // path of the face image saved during detection

    FaceCompare faceCompare = new FaceCompare();
    // 1:N search within the given group
    JObject jObject = (JObject)JsonConvert.DeserializeObject(faceCompare.test_identify(fileName, "group", ""));
    var json = (JObject)JsonConvert.DeserializeObject(jObject["data"].ToString());

    // "result" is a serialized single-element array; strip the surrounding
    // brackets and parse the remaining object.
    string result = json["result"].ToString().Substring(1, json["result"].ToString().Length - 2);
    var json1 = (JObject)JsonConvert.DeserializeObject(result);
    //textBox2.Text = "score: " + json1["score"].ToString() + "\r\nuser: " + json1["user_id"].ToString();
    string user = json1["user_id"].ToString();

    string lockes = "0";
    #region Query the database
    using (var db = new ces(dbPath))
    {
        var testFaces = db.testFaces.Where(x => x.User == user).ToList();
        lockes = testFaces[0].Locks;
    }

    int lo = int.Parse(lockes);
    if (int.Parse(lockes) > 100)
    {
        // Values above 100 encode a zone in the hundreds digit:
        // d = number of hundreds, dz = "0" + 2^(d-1), remainder = lock number.
        int d = 0;
        while (lo > 100)
        {
            lo -= 100;
            d++;
        }
        int z = 1;
        for (int i = 0; i < d - 1; i++)
        {
            z = z * 2;
        }
        dz = "0" + z.ToString();
    }
    #endregion

    double score = double.Parse(json1["score"].ToString());

    lockes = lo.ToString();

    if (score > 80)
    {
        // Pad to two digits when the match is accepted
        if (int.Parse(lockes) < 10) lockes = "0" + lockes;
        return lockes;
    }
    else
    {
        return lockes;
    }
}
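The lock-number arithmetic inside distinguish() can be isolated as a pure helper. This is my reading of the original loop (the encoding itself is specific to this project): the hundreds count d selects a zone, encoded as dz = "0" + 2^(d-1), and the remainder below 100 is the lock number.

```csharp
using System;

static class LockCodeDemo
{
    // Decode a stored lock value into (lock number, zone code dz).
    public static void Decode(int stored, out int lockNo, out string dz)
    {
        dz = "1";            // default when stored <= 100
        lockNo = stored;
        if (stored > 100)
        {
            int d = 0;       // count the hundreds
            while (lockNo > 100) { lockNo -= 100; d++; }
            int z = 1;       // z = 2^(d-1)
            for (int i = 0; i < d - 1; i++) z *= 2;
            dz = "0" + z;
        }
    }

    static void Main()
    {
        int lockNo; string dz;
        Decode(250, out lockNo, out dz);
        Console.WriteLine("lock={0} dz={1}", lockNo, dz); // lock=50 dz=02
    }
}
```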
// 1:N comparison
public string test_identify(string str, string usr_grp, string usr_id)
{
    string file1 = str;          // e.g. "d:\\6.jpg"
    string user_group = usr_grp; // e.g. "test_group"
    string user_id = usr_id;     // e.g. "test_user"
    IntPtr ptr = identify(file1, user_group, user_id);
    string buf = Marshal.PtrToStringAnsi(ptr);
    return buf;
}

The distinguish() method looks up the matching face record in the database after recognition. I merged real-time liveness detection and real-time face detection into a single method, dropped the OpenCV window, and render into a pictureBox control driven by a worker thread (otherwise the infinite loop freezes the UI); the cameras are closed with Release().

All results are JSON; see the Baidu documentation for the exact fields. Adjust the threshold conditions to whatever face-quality standard your project requires.

The recognition flow is:
start the thread -> open the cameras -> detect a face -> save the face image -> pass the image to the face comparison method.
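The identify() payload parsed in distinguish() is doubly nested: "data" holds a JSON string whose "result" field is itself a serialized array, hence the double DeserializeObject and the Substring(1, len - 2) trick. A sketch with a hypothetical sample payload (field names are assumptions; the real schema is in the Baidu docs) — parsing "result" as a `JArray` avoids the brittle bracket-stripping:

```csharp
using System;
using Newtonsoft.Json.Linq;

static class IdentifyResultDemo
{
    // Hypothetical payload: "data" is itself a JSON string.
    public const string Sample =
        @"{""errno"":0,""data"":""{\""result\"":[{\""score\"":91.2,\""user_id\"":\""u1\""}]}""}";

    public static string BestUser(string json, out double score)
    {
        JObject root = JObject.Parse(json);
        // Second parse: "data" contains serialized JSON, not an object.
        JObject data = JObject.Parse((string)root["data"]);
        // Treat "result" as the array it is, instead of Substring-stripping.
        JArray result = (JArray)data["result"];
        score = (double)result[0]["score"];
        return (string)result[0]["user_id"];
    }

    static void Main()
    {
        double score;
        string user = BestUser(Sample, out score);
        Console.WriteLine("user={0} score={1}", user, score);
    }
}
```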

1.3.3 Face registration

string user_id = User.Text;
string group_id = "group";
string file_name = falnem;
string lockse = group.Text;
string user_info = info.Text;
string zc = faceManager.test_user_add(user_id, group_id, file_name, user_info);
JObject jObjectzc = (JObject)JsonConvert.DeserializeObject(zc);
if (jObjectzc["msg"].ToString() == "success")
{
    // Mirror the registration into the extension database
    List<TestFaceDB> testFaceDBs = new List<TestFaceDB>()
    {
        new TestFaceDB() { User = user_id, Locks = lockse, Phone = user_info }
    };
    textBox1.Text = dbPath;
    using (var db = new ces(dbPath))
    {
        int count = db.InsertAll(testFaceDBs);
    }
    msg msg = new msg();
    msg.ShowDialog();
    Thread.Sleep(5000);

    dtTo = new TimeSpan(0, 0, 2);
    camera1.Release();
    camera2.Release();

    return true;
}
Thread.Sleep(3000);

Same approach as above: first confirm the camera sees a live face and save the image to a file, then call face registration with that file.

// Face registration
public string test_user_add(string uid, string gid, string fil, string uin)
{
    string user_id = uid;   // e.g. "test_user"
    string group_id = gid;  // e.g. "test_group"
    string file_name = fil; // e.g. "d:\\2.jpg"
    string user_info = uin;
    IntPtr ptr = user_add(user_id, group_id, file_name, user_info);
    string buf = Marshal.PtrToStringAnsi(ptr);
    return buf;
}

I modified this method to accept parameters.

Because the SQLite database bundled with the SDK has very few fields, and I did not want to spend time modifying the database it creates, I created a new database joined to the SDK's database on a single shared field, effectively as an extension database. At registration time, also write a record to the new database with whatever fields you need. When looking up a recognized face, read the shared field from the SDK's database, then fetch the corresponding extra fields from the new database.
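The two-database join described above can be sketched in memory: dictionaries stand in for the two SQLite tables, and all names and fields here are illustrative, not the SDK's real schema.

```csharp
using System;
using System.Collections.Generic;

// Extra fields stored in the user-defined extension table (illustrative).
class ExtRow
{
    public string Locks;
    public string Phone;
}

static class ExtensionDbDemo
{
    // Stand-in for the SDK's SQLite table: user_id -> stored face record.
    static readonly Dictionary<string, string> sdkDb = new Dictionary<string, string>();
    // Stand-in for the extension database, joined on the same user_id key.
    static readonly Dictionary<string, ExtRow> extDb = new Dictionary<string, ExtRow>();

    public static void Register(string userId, string feature, string locks, string phone)
    {
        sdkDb[userId] = feature;                                      // what user_add() stores
        extDb[userId] = new ExtRow { Locks = locks, Phone = phone };  // extra fields, same key
    }

    public static string LookupLocks(string userId)
    {
        // Recognition yields user_id from the SDK side; the extra fields
        // are then fetched from the extension table via the shared key.
        ExtRow row;
        return extDb.TryGetValue(userId, out row) ? row.Locks : null;
    }

    static void Main()
    {
        Register("u1", "feature-blob", "07", "13800000000");
        Console.WriteLine(LookupLocks("u1")); // prints 07
    }
}
```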

1.4 Packaging


The simple way to package: copy everything from the output root directory into a folder, add it to the installer's Application Folder, and put face-resource into the same directory. Do not install to the system drive, or the interfaces will fail; I do not know why.

When installing on a machine you must activate with a Baidu offline recognition SDK serial number, or the SDK cannot be used: it ties activation to the machine's hardware fingerprint. If LicenseTool.exe will not open and complains about a missing system DLL, download that component from the internet; this can happen when a pirated Windows installation is missing components. You can also download Baidu's activation package.

When adding project outputs, include the SDK files as localized resources and the project as the primary output, built as Release.

 

 

「碼農_半只龍蝦」 (that is also me)

Copyright notice: this is an original article by CSDN blogger 「碼農_半只龍蝦」, licensed under CC 4.0 BY-SA; reposts must include the original source link and this notice.
CSDN link: https://blog.csdn.net/weixin_44986330/article/details/106297549

 

