3D Reconstruction (Distance Measurement) with Line Structured Light

A line laser scans the object while a camera photographs it, producing images that contain the projected light stripe. The 3D coordinates of the points on the stripe are extracted from each image; as the laser sweeps over the whole object, the 3D coordinates of all surface points can be recovered, i.e. the object surface is reconstructed in 3D, and the distance between any two points on the surface can then be measured.

 

Prerequisites:

Conversion between the four coordinate systems

World coordinate system -- camera coordinate system

The camera's optical center is taken as the origin Oc of the camera coordinate system and the optical axis as the Zc axis; the Xc and Yc axes are parallel to the x and y axes of the image coordinate system. A world point (Xw, Yw, Zw) is mapped to camera coordinates by

[Xc, Yc, Zc]^T = R [Xw, Yw, Zw]^T + t

where R is a 3x3 orthonormal (rotation) matrix and t is the translation vector; both are camera extrinsic parameters.

Image (physical) coordinate system -- pixel coordinate system

The x and y axes of the image are parallel to the u and v axes of the pixel grid:

u = x/dx + u0,  v = y/dy + v0

where dx and dy are the physical width and height of one pixel and (u0, v0) is the principal point.

Image (physical) coordinate system -- camera coordinate system

By the pinhole projection model,

x = f * Xc / Zc,  y = f * Yc / Zc

where f is the focal length.

Pixel coordinate system -- world coordinate system

Chaining the three transforms above gives

Zc * [u, v, 1]^T = K [R | t] [Xw, Yw, Zw, 1]^T,  with K = [fx 0 u0; 0 fy v0; 0 0 1],  fx = f/dx,  fy = f/dy

fx, fy, u0, v0 are the camera intrinsic parameters; R and t are the extrinsic parameters, as in the sketch below.
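For reference, this chained projection can be written out directly in code. The following is a minimal sketch, assuming the intrinsic matrix K and the extrinsics R, t are already known (the function name is illustrative):

#include <opencv2/core.hpp>

// Project a world point to pixel coordinates: Zc * [u, v, 1]^T = K * (R * Xw + t)
cv::Point2d projectToPixel(const cv::Point3d& Xw, const cv::Matx33d& K,
                           const cv::Matx33d& R, const cv::Vec3d& t)
{
    cv::Vec3d Xc  = R * cv::Vec3d(Xw.x, Xw.y, Xw.z) + t;   // world -> camera
    cv::Vec3d uvw = K * Xc;                                 // camera -> homogeneous pixel
    return cv::Point2d(uvw[0] / uvw[2], uvw[1] / uvw[2]);   // divide by Zc
}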

System measurement model

Point P lies both on the ray OcP' and on the light plane (the plane formed by the structured-light projector and the light stripe on the object surface). Since the relative position of the camera and the projector does not change, the light-plane equation does not change either. Let the light-plane equation in the camera coordinate system be

a*Xc + b*Yc + c*Zc = d

Let the image coordinates of P' in the camera coordinate system be (x, y, 1) and the coordinates of P be (Xc, Yc, Zc). The equation of the ray OcP is then

Xc / x = Yc / y = Zc / 1

Combining the two equations gives

Zc = d / (a*x + b*y + c),  Xc = x * Zc,  Yc = y * Zc

To obtain the 3D camera coordinates of a spatial point we therefore need the light-plane equation and the image coordinates of P'; computing the image coordinates of P' requires the camera intrinsics. Hence both camera calibration and light-plane calibration are needed.
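A minimal sketch of this back-projection, assuming the intrinsic matrix K and the plane coefficients a, b, c, d have already been obtained (the function name is illustrative):

#include <opencv2/core.hpp>

// Back-project a stripe pixel (u, v) to camera coordinates using the light plane
// a*Xc + b*Yc + c*Zc = d.
cv::Point3d stripePixelToCamera(double u, double v, const cv::Matx33d& K,
                                double a, double b, double c, double d)
{
    double x  = (u - K(0, 2)) / K(0, 0);   // normalized image coordinates of the viewing ray
    double y  = (v - K(1, 2)) / K(1, 1);
    double Zc = d / (a * x + b * y + c);   // intersect the ray (x*Zc, y*Zc, Zc) with the plane
    return cv::Point3d(x * Zc, y * Zc, Zc);
}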

 

System implementation plan

1. Fix the camera and the laser, move the chessboard, and capture images of it both with and without the light stripe, from several different angles.
2. Extract the pixel coordinates of the chessboard corners in each image (Harris corner detection).
3. Calibrate the camera following Zhang Zhengyou's calibration method.
4. Extract the equation of the light stripe, combine it with the equations of the lines through the image corners, and compute the intersections of the corner lines with the stripe.
5. From the pixel coordinates of the corners and of these intersection points, use the invariance of the cross ratio to obtain the camera coordinates of a series of points on the stripe, then fit the light-plane equation in the camera coordinate system by least squares.
6. With the light-plane equation, the camera coordinates of any point on the light stripe can be obtained.

 

Camera calibration and light-plane implementation

Environment: Qt + OpenCV.

Prototype of calibrateCamera:

double calibrateCamera(InputArrayOfArrays objectPoints,
                       InputArrayOfArrays imagePoints,
                       Size imageSize,
                       InputOutputArray cameraMatrix,
                       InputOutputArray distCoeffs,
                       OutputArrayOfArrays rvecs,
                       OutputArrayOfArrays tvecs,
                       int flags = 0);

Parameters:

objectPoints: the 3D coordinates of all corners in the world coordinate system; pass a vector<vector<Point3f>>.

imagePoints: the corresponding 2D corner coordinates in the pixel coordinate system; pass a vector<vector<Point2f>>.

imageSize: the image size.

cameraMatrix: the camera intrinsic matrix; a cv::Mat is sufficient.

distCoeffs: the distortion coefficients; a cv::Mat is sufficient.

rvecs: the rotation vectors, one per view; pass a vector<cv::Mat>.

tvecs: the translation vectors, one per view; pass a vector<cv::Mat>.

flags: the options used during calibration; zero or more of the following may be combined:

CV_CALIB_USE_INTRINSIC_GUESS: cameraMatrix must already contain estimates of fx, fy, cx, cy. Otherwise (cx, cy) is initialized to the image center and fx, fy are estimated in a least-squares sense. If the intrinsic matrix and the distortion coefficients are already known, the extrinsic parameters should instead be computed with solvePnP().

CV_CALIB_FIX_PRINCIPAL_POINT: the principal point is not changed during the optimization; it stays at the image center, or at the supplied value when CV_CALIB_USE_INTRINSIC_GUESS is also set.

CV_CALIB_FIX_ASPECT_RATIO: the ratio fx/fy is kept fixed and only fy is optimized. If CV_CALIB_USE_INTRINSIC_GUESS is not set, the input values of fx and fy are ignored; only their ratio is used in the computation.

CV_CALIB_ZERO_TANGENT_DIST: the tangential distortion coefficients (p1, p2) are set to zero and kept fixed.

CV_CALIB_FIX_K1, ..., CV_CALIB_FIX_K6: the corresponding radial distortion coefficient is not changed during the optimization; if CV_CALIB_USE_INTRINSIC_GUESS is set, the value from the supplied distCoeffs is used, otherwise it is set to zero.

CV_CALIB_RATIONAL_MODEL: the coefficients k4, k5 and k6 are also estimated; if the flag is not set, only the other five distortion coefficients are computed.

 

Camera calibration needs the world coordinates of the corners. Taking the chessboard surface as the x-y plane, every corner has z = 0, so once the actual distance between adjacent corners (the square size) is known, the world coordinates of all corners follow.

Camera calibration also needs the pixel coordinates of the corners, which can be extracted with the OpenCV function findChessboardCorners().

Passing these two sets of points to calibrateCamera yields the camera intrinsic parameters and the extrinsic parameters of every view.
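For reference, here is a minimal, self-contained sketch of this step; the image path pattern, the chessboard dimensions and the square size below are illustrative assumptions, not the values used in this project:

#include <opencv2/opencv.hpp>
#include <iostream>
#include <string>
#include <vector>

int main()
{
    cv::Size boardSize(9, 6);   // inner corners per row and per column (assumed)
    float squareSize = 10.f;    // size of one chessboard square in mm (assumed)

    std::vector<std::vector<cv::Point3f>> objectPoints;
    std::vector<std::vector<cv::Point2f>> imagePoints;
    cv::Size imageSize;

    for (int k = 1; k <= 14; ++k) {
        cv::Mat img = cv::imread("calib/" + std::to_string(k) + ".bmp", cv::IMREAD_GRAYSCALE);
        if (img.empty()) continue;
        imageSize = img.size();

        std::vector<cv::Point2f> corners;
        if (!cv::findChessboardCorners(img, boardSize, corners)) continue;
        cv::cornerSubPix(img, corners, cv::Size(5, 5), cv::Size(-1, -1),
                         cv::TermCriteria(cv::TermCriteria::EPS + cv::TermCriteria::COUNT, 30, 0.01));
        imagePoints.push_back(corners);

        // world coordinates of the corners: the chessboard plane is z = 0
        std::vector<cv::Point3f> obj;
        for (int i = 0; i < boardSize.height; ++i)
            for (int j = 0; j < boardSize.width; ++j)
                obj.emplace_back(j * squareSize, i * squareSize, 0.f);
        objectPoints.push_back(obj);
    }

    cv::Mat cameraMatrix, distCoeffs;
    std::vector<cv::Mat> rvecs, tvecs;
    double rms = cv::calibrateCamera(objectPoints, imagePoints, imageSize,
                                     cameraMatrix, distCoeffs, rvecs, tvecs);
    std::cout << "RMS reprojection error: " << rms << "\n" << cameraMatrix << std::endl;
    return 0;
}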

 

Light-plane calibration

1. Extract the center-line equation of the light stripe. The stripe pixels have values close to 255, so a binary threshold leaves an image containing only the stripe; goodFeaturesToTrack() picks points on the stripe, and feeding these points to fitLine() gives the equation of the stripe line.
2. Fit a line through each row of corners and intersect it with the stripe center line; this yields the pixel coordinates of a series of points on the light stripe.
3. From the world coordinates of the corners and the extrinsic parameters of each view obtained during calibration, compute the camera coordinates of the corners in every image.
4. Using the camera intrinsics, convert the pixel coordinates of the corners to image (physical) coordinates.
5. With corners A, B, C on one row and D the intersection of that corner line with the stripe, and A', B', C', D' their image points: given the image (or pixel) coordinates of A', B', C', D' and the camera coordinates of A, B, C, the camera coordinates of D follow from the invariance of the cross ratio (see the sketch after this list).
6. Collect the stripe/corner-line intersection points from all the images and fit the light-plane equation to them by least squares.
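To make step 5 concrete, here is a small sketch of the cross-ratio computation; it mirrors the calCornersInCamera function in the appended source code, with illustrative names and cv::Point3f standing in for the cv::Mat camera-frame points:

#include <opencv2/core.hpp>

// A, B, C are three corners on one row and D is the intersection of that corner
// line with the light stripe, all in pixel coordinates; Ac, Bc, Cc are the camera
// coordinates of A, B, C. Returns the camera coordinates of D.
cv::Point3f crossRatioPoint(const cv::Point2f& A, const cv::Point2f& B,
                            const cv::Point2f& C, const cv::Point2f& D,
                            const cv::Point3f& Ac, const cv::Point3f& Bc,
                            const cv::Point3f& Cc)
{
    // cross ratio measured in the image; perspective projection preserves it,
    // so the same value holds for the corresponding points in the camera frame
    float cr = (D.x - B.x) * (A.x - C.x) / ((A.x - B.x) * (D.x - C.x));
    float m  = cr * (Ac.x - Bc.x) / (Ac.x - Cc.x);
    return cv::Point3f((Bc.x - m * Cc.x) / (1 - m),
                       (Bc.y - m * Cc.y) / (1 - m),
                       (Bc.z - m * Cc.z) / (1 - m));
}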

 

Measuring distance with the light-plane equation

Click any two points on the light stripe with the mouse to get their pixel coordinates (u, v), and convert each of them to image coordinates with

x = (u - u0) / fx,  y = (v - v0) / fy

where fx, fy, u0, v0 are camera intrinsic parameters (with the focal length normalized to 1, fx = 1/dx and fy = 1/dy).

The image coordinates are then converted to camera coordinates by intersecting the viewing ray with the light plane:

Zc = d / (a*x + b*y + c),  Xc = x * Zc,  Yc = y * Zc

Finally, the two-point distance formula gives the actual distance between the two chosen points on the light stripe.

 

 

 

 

 

Appendix:

Least-squares plane fitting

 

CvMat*points_mat = cvCreateMat(X_vector.size(), 3, CV_32FC1);//N x 3 matrix holding the points to be fitted

for (int i=0;i < X_vector.size(); ++i)

{

points_mat->data.fl[i*3+0] = X_vector[i];//X coordinate

points_mat->data.fl[i * 3 + 1] = Y_vector[i];//Y coordinate

points_mat->data.fl[i * 3 + 2] = Z_vector[i];//Z coordinate

}

float plane12[4] = { 0 };//array holding the fitted plane parameters

cvFitPlane(points_mat, plane12);//fit the plane

//  plane12[0..3] hold A, B, C, D of the fitted plane
//  A*x + B*y + C*z = D

        void cvFitPlane(const CvMat* points, float* plane){

            // Estimate geometric centroid.

            int nrows = points->rows;

            int ncols = points->cols;

            int type = points->type;

            CvMat* centroid = cvCreateMat(1, ncols, type);

            cvSet(centroid, cvScalar(0));

            for (int c = 0; c<ncols; c++){

                for (int r = 0; r < nrows; r++)

                {

                centroid->data.fl[c] += points->data.fl[ncols*r + c];

                }

                centroid->data.fl[c] /= nrows;

            }

            // Subtract geometric centroid from each point.

            CvMat* points2 = cvCreateMat(nrows, ncols, type);

            for (int r = 0; r<nrows; r++)

                for (int c = 0; c<ncols; c++)

                    points2->data.fl[ncols*r + c] = points->data.fl[ncols*r + c] - centroid->data.fl[c];

            // Evaluate SVD of covariance matrix.

            CvMat* A = cvCreateMat(ncols, ncols, type);

            CvMat* W = cvCreateMat(ncols, ncols, type);

            CvMat* V = cvCreateMat(ncols, ncols, type);

            cvGEMM(points2, points, 1, NULL, 0, A, CV_GEMM_A_T);

            cvSVD(A, W, NULL, V, CV_SVD_V_T);

            // Assign plane coefficients by singular vector corresponding to smallest singular value.

            plane[ncols] = 0;

            for (int c = 0; c<ncols; c++){

                plane[c] = V->data.fl[ncols*(ncols - 1) + c];

                plane[ncols] += plane[c] * centroid->data.fl[c];

            }

            // Release allocated resources.

            cvReleaseMat(&centroid);

            cvReleaseMat(&points2);

            cvReleaseMat(&A);

            cvReleaseMat(&W);

            cvReleaseMat(&V);

        }

 

 

 

 

Least-squares line fitting:

The function prototype is:

void fitLine( InputArray points,
    OutputArray line,
    int distType,
    double param,
    double reps,
    double aeps );

distType specifies the distance type used by the fit, e.g. CV_DIST_L2.

param is the constant C in the CV_DIST_FAIR, CV_DIST_WELSCH and CV_DIST_HUBER formulas; if it is 0, a suitable value is chosen automatically.

reps is the accuracy of the distance between the line and the origin; 0.01 is a reasonable choice.
aeps is the accuracy of the line angle; 0.01 is a reasonable choice.

The fitted line is returned in line as a cv::Vec4f: line[0] and line[1] hold the direction vector of the line, line[2] and line[3] hold a point on the line.

If the line is written as y = kx + b, then k = line[1]/line[0] and b = line[3] - k * line[2].
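A small usage example (the points below are made-up, roughly collinear values for illustration):

#include <opencv2/imgproc.hpp>
#include <iostream>
#include <vector>

int main()
{
    // a few roughly collinear points standing in for the extracted stripe points
    std::vector<cv::Point2f> pts = { {10, 12}, {20, 21}, {30, 33}, {40, 41}, {50, 52} };
    cv::Vec4f line;
    cv::fitLine(pts, line, cv::DIST_L2, 0, 0.01, 0.01);
    double k = line[1] / line[0];       // slope from the direction vector
    double b = line[3] - k * line[2];   // intercept from the point on the line
    std::cout << "y = " << k << " * x + " << b << std::endl;
    return 0;
}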

 

Appendix: partial source code

Note that parts of it are UI plumbing plus an ad-hoc routine for suppressing the light stripe in this particular set of images, so it should not be copied blindly.

#ifndef MAINWINDOW_H

#define MAINWINDOW_H

 

#include <QMainWindow>

#include <iostream>

#include <fstream>

#include<QFileDialog>

#include<QString>

#include<QImage>

#include<QPixmap>

#include <QWidget>

#include<QMouseEvent>

#include<QEvent>

#include<QDebug>

#include<QMessageBox>

#include "opencv2/core/core.hpp"

#include "opencv2/imgproc/imgproc.hpp"

#include "opencv2/calib3d/calib3d.hpp"

#include "opencv2/highgui/highgui.hpp"

#include <opencv2/imgproc/types_c.h>

using namespace cv;

using namespace std;

#define IMGCOUNT 14

namespace Ui {

class MainWindow;

}

 

class MainWindow : public QMainWindow

{

    Q_OBJECT

 

public:

    explicit MainWindow(QWidget *parent = 0);

    ~MainWindow();

 

private slots:

    void cameraCalibrate();//camera calibration without structured light

    void squareCalibrate();//camera calibration with structured light plus light-plane calibration

    Point2f getcrosspoint(Vec4f lineA,Vec4f lineB);//intersection point of two lines

    void GetCrossPointAll();//stripe/corner-line intersections of all images

    void calMatrix_M();//compute the matrix M relating the world and camera coordinate systems

    void calCameraCornerPoints();//corner coordinates in the camera frame

    void calCameraCrossPoints();//stripe intersection coordinates in the camera frame

    void calCornersInCamera(Point2f A,Point2f B,Point2f C,Point2f D,Mat A_,Mat B_,Mat C_);

    void fitting_light_surface();

    void cvFitPlane(const CvMat* points, float* plane);

    void calDistance();

    void pushbutton1();

public slots:

    void open();

protected:

    void mousePressEvent(QMouseEvent *m);//override mousePressEvent

private:

   double zoom=1;

   vector<Point2f> mousePoint;

   QString path;

private:

    Ui::MainWindow *ui;

    int imageCount;

    string file;

    Size image_size;//image size

    Size board_size;     //number of corners per column / row on the board; size.width is the column count, size.height the row count

    vector<Point2f> image_points_buf;  //buffer for the corners detected in each image

    vector<vector<Point3f>> object_points; //3D coordinates of the board corners, first argument of the calibration function

    vector<vector<Point2f>> image_points_seq; //all detected corners

   Size square_size;//measured size of one chessboard square; here it was not actually measured, a value is simply assumed

   Mat cameraMatrix;//camera intrinsic matrix

   Mat  distCoeffs;//distortion coefficients

   vector<Mat> tvecsMat;//translation vector of each image

   vector<Mat> R_matrix;//rotation matrix of each image

   vector<Mat> rvecsMat;//rotation vector of each image

   vector<Point2f>corners;//points on the light stripe

   Vec4f line_para; //fitted stripe line

   vector<Mat> M;//world-to-camera transform of each image

   vector<vector<Mat>>corners_in_camera;//corner coordinates in the camera frame

   vector<vector<Point2f>>crossPointAll;//stripe/corner-line intersections of all images

   //3D coordinates of the points used for the plane fit

   vector<double>X_vector;

   vector<double>Y_vector;

   vector<double>Z_vector;

   float plane12[4] = { 0 };//array holding the fitted light-plane parameters

};

 

#endif // MAINWINDOW_H

 

 

 

#include "mainwindow.h"

#include "ui_mainwindow.h"

 

MainWindow::MainWindow(QWidget *parent) :

    QMainWindow(parent),

    ui(new Ui::MainWindow)

{

    ui->setupUi(this);

     connect(ui->actioncalibrateCamera,SIGNAL(triggered()),this,SLOT(cameraCalibrate()));

     connect(ui->pushButton,SIGNAL(clicked()),this,SLOT(pushbutton1()));

     connect(ui->pushButton_2,SIGNAL(clicked()),this,SLOT(open()));

     ui->rows->setValue(7);ui->cols->setValue(7);

     ui->filename->setText("D:\\QT projects\\PIC\\struct_image_copy");

     ui->dis->setValue(10);

     ui->pic->setValue(11);

}

 

MainWindow::~MainWindow()

{

    delete ui;

}

 

void MainWindow::cameraCalibrate()

{

           board_size.width=ui->cols->value();board_size.height=ui->rows->value();

           QString f="D:\\QT projects\\PIC\\cal_image_copy";

           for (int image_num = 1; image_num <= IMGCOUNT; image_num++)

           {

               QString n("\\%1.bmp");n=n.arg(image_num);

               QString m=f+n;

               string file=string((const char *)m.toLocal8Bit());

                Mat imageInput = imread(file);

               if (!findChessboardCorners(imageInput, board_size, image_points_buf))

                      {

                          cout << "can not find chessboard corners!\n";//corners not found

                          return;

                      }

                      else

                      {

                          Mat view_gray;//grayscale image

                          cvtColor(imageInput, view_gray, CV_RGB2GRAY);

                          /*sub-pixel refinement*/

                          find4QuadCornerSubpix(view_gray, image_points_buf, Size(5, 5));//refine the coarsely extracted corners

                          drawChessboardCorners(view_gray, board_size, image_points_buf, true);//mark the corners in the image

                          image_points_seq.push_back(image_points_buf);//store the sub-pixel corners

                          imshow("Camera Calibration", view_gray);//show the image

                          waitKey(500);//wait half a second

                      }

                      image_size.width = imageInput.cols;

                      image_size.height = imageInput.rows;

                      imageInput.release();

           }

/*camera calibration*/

                      for (int t = 0; t<IMGCOUNT; t++)

                         {

                                  vector<Point3f> tempPointSet; //world coordinate points

                             for (int i = 0; i<board_size.width; i++)//each column; size.width is the number of columns

                             {

                                 for (int j = 0; j<board_size.height; j++)//each row

                                 {

                                     Point3f realPoint;

                                     //assume the board lies in the z=0 plane of the world coordinate system

                                     realPoint.x = j*square_size.width;

                                     realPoint.y = i*square_size.height;

                                     realPoint.z = 0;

                                     tempPointSet.push_back(realPoint);

                                 }

                             }

                             object_points.push_back(tempPointSet);

                         }

                      //intrinsic and extrinsic parameter objects

                         cameraMatrix = Mat(3, 3,CV_32FC1, Scalar::all(0));//camera intrinsic matrix

                        // vector<int> point_counts;// number of corners in each image

                         distCoeffs = Mat(1, 5, CV_64F, Scalar::all(0));//the 5 distortion coefficients: k1,k2,p1,p2,k3

 

                         calibrateCamera(object_points, image_points_seq, image_size, cameraMatrix, distCoeffs, rvecsMat, tvecsMat, 0);//camera calibration

 

/*save the intrinsic and extrinsic parameters*/

                             ofstream fout("caliberation_result.txt");//file holding the calibration results

                             fout << "camera intrinsic matrix:" << endl;

                             fout << cameraMatrix << endl << endl;

                             fout << "distortion coefficients:\n";

                             fout << distCoeffs << endl << endl << endl;

                              waitKey(0);//wait for a key press

 

}

 

void MainWindow::squareCalibrate()

{

 

   object_points.clear();image_points_seq.clear();

  for (int image_num = 1; image_num <= imageCount; image_num++)

  {

      QString n("\\%1.bmp");n=n.arg(image_num);

      QString m=path+n;

       file=string((const char *)m.toLocal8Bit());

      //sprintf(filenames, "D:\\QT projects\\PIC\\struct_image_copy\\%d.bmp", image_num);

       Mat origin_image = imread(file);

           Mat grey_image;

           cvtColor(origin_image, grey_image, CV_RGB2GRAY);

           CV_Assert(grey_image.depth() == CV_8U);//expect an 8-bit image

           int row=grey_image.rows;

           int col=grey_image.cols;

           uchar*p;

/*reduce the highlighted (stripe) pixels in stages*/

           for(int i=0;i<row;i++)

           {

               p=grey_image.ptr<uchar>(i);

               for(int j=0;j<col;j++)

               {

                   if(p[j]>250)p[j]-=100;

                   if(p[j]>220)p[j]-=70;

                   if(p[j]>190)p[j]-=30;

                   if(p[j]>170)p[j]-=10;

 

               }

           }

/*remove bright spots on the black squares*/

           for(int i=0;i<row;i++)

           {

               p=grey_image.ptr<uchar>(i);

               for(int j=0;j<col;j++)

               {

                   if(p[j]>140){

                       if(j<15&&p[j+15]<100)

                       p[j]=p[j+15];

                       if((j>15)&&(j<col-15))

                       {if(p[j+15]<100)p[j]=p[j+15];

                       if(p[j-15]<100)p[j]=p[j-15];

                       }

                       if(j>col-15&&p[j-15]<100)//near the right edge, use the pixel 15 to the left

                           p[j]=p[j-15];

                   }

               }

           }

/*erosion to further clean up the remaining bright pixels*/

           Mat ele = getStructuringElement(MORPH_RECT,Size(4,4));

            Mat ele2 = getStructuringElement(MORPH_RECT,Size(2,2));

              erode(grey_image,grey_image,ele);//erode the image

              dilate(grey_image,grey_image,ele2);//dilate to improve accuracy

             // imshow("after erode operation",grey_image);

 

              Mat imageInput=grey_image.clone();

              if (!findChessboardCorners(imageInput, board_size, image_points_buf,CV_CALIB_CB_FILTER_QUADS))

                     {

                         cout << "can not find chessboard corners!\n";//corners not found

                         return;

                     }

                     else

                     {

 

                         Mat view_gray=grey_image.clone();

                         /*sub-pixel refinement*/

                         find4QuadCornerSubpix(view_gray, image_points_buf, Size(5, 5));//refine the coarsely extracted corners

                         drawChessboardCorners(view_gray, board_size, image_points_buf, true);//mark the corners in the image

                         image_points_seq.push_back(image_points_buf);//store the sub-pixel corners

                        // imshow("Camera Calibration", view_gray);//show the image

                         waitKey(100);

                     }

                     image_size.width = imageInput.cols;

                     image_size.height = imageInput.rows;

                     imageInput.release();

 }

/*camera calibration*/

                    for (int t = 0; t<imageCount; t++)

                        {

                          vector<Point3f> tempPointSet;

                            for (int i = 0; i<board_size.width; i++)//each column

                            {

 

                                for (int j = 0; j<board_size.height; j++)//each row

                                {

                                    Point3f realPoint;

                                    //assume the board lies in the z=0 plane of the world coordinate system

                                    realPoint.x = j*square_size.width;//width of one square

                                    realPoint.y = i*square_size.height;

                                    realPoint.z = 0;

                                    tempPointSet.push_back(realPoint);

                                }

                            }

                            object_points.push_back(tempPointSet);

                        }

                     //intrinsic and extrinsic parameter objects

                        cameraMatrix = Mat(3, 3, CV_32FC1, Scalar::all(0));//camera intrinsic matrix

                        //int point_counts;// number of corners in each image

                        distCoeffs = Mat(1, 5, CV_32FC1, Scalar::all(0));//the 5 distortion coefficients: k1,k2,p1,p2,k3

                        calibrateCamera(object_points, image_points_seq, image_size, cameraMatrix, distCoeffs, rvecsMat, tvecsMat, 0);//camera calibration

 

/*binarization*/

                                      Mat origin_image=imread(file);//

                                      Mat grey_image;

                                      cvtColor(origin_image, grey_image, CV_RGB2GRAY);

                                      // imshow("origin",origin_image);

                                      // imshow("grey",grey_image);

                                      //waitKey(100);

                                      Mat binary_image;

                                      threshold(grey_image,binary_image,240, 255, CV_THRESH_BINARY);

                                      //imshow("binary",binary_image);

                                      //find and store feature points on the binarized stripe image

                                      goodFeaturesToTrack(binary_image,corners,18,0.01,10,Mat());

                                      for(unsigned int i=0;i<corners.size();i++)

 

                                          {

 

                                              circle(origin_image,corners[i],2,Scalar(255,0,0),2);

 

                                          }

                                     // imshow("binary points",origin_image);

 

 

 

/*line fitting*/

                                         fitLine(corners, line_para, cv::DIST_L2, 0, 1e-2, 1e-2);

                                         cv::Point point0;

                                         point0.x = line_para[2];

                                         point0.y = line_para[3];

                                         double k = line_para[1] / line_para[0];

                                         //compute the line endpoints (y = k(x - x0) + y0)

                                         Point point1, point2;

                                         point1.x = 0;

                                         point1.y = k * (0 - point0.x) + point0.y;

                                         point2.x = origin_image.cols;

                                         point2.y = k * (origin_image.cols - point0.x) + point0.y;

                                         cv::line(origin_image, point1, point2, cv::Scalar(255, 255, 0), 2, 8, 0);

                                        // cv::imshow("image", origin_image);

/*get the intersections of the stripe with the corner lines*/

                                         GetCrossPointAll();

/*convert the rotation vectors into rotation matrices*/                 for(int i=0;i<imageCount;i++)

                                         {   Mat temp;

                                             Rodrigues(rvecsMat[i],temp );

                                             R_matrix.push_back(temp);

                                         }

/*compute the M matrices*/

                                         calMatrix_M();

/*camera coordinates of the corners*/

                                         calCameraCornerPoints();

/*camera coordinates of the stripe/corner-line intersections via cross-ratio invariance*/

                                         calCameraCrossPoints();

/*fit the light plane*/

                                         fitting_light_surface();

/*save the intrinsic and extrinsic parameters*/

                            ofstream fout("struct_caliberation_result.txt");//file holding the calibration results

                            fout << "camera intrinsic matrix:" << endl;

                            fout << cameraMatrix << endl << endl;

                            fout << "distortion coefficients:\n";

                            fout<<distCoeffs<<endl;

                            fout<<"corner pixel coordinates of the first image"<<endl<<image_points_seq[0]<<endl;

                            fout<<"corner world coordinates of the first image"<<endl<<object_points[0]<<endl;

                            fout<<"intersection pixel coordinates of the first image"<<endl<<crossPointAll[0]<<endl;

                            fout<<"a b c d"<<image_points_seq[10][0]<<endl<<image_points_seq[10][3]<<endl<<image_points_seq[10][6]<<endl<<crossPointAll[1][0]<<endl;

                            fout<<"camera"<<endl<<corners_in_camera[6][0]<<endl<<corners_in_camera[6][3]<<endl<<corners_in_camera[6][6]<<endl;

                            fout<<plane12[0]<<"    "<<plane12[1]<<"    "<<plane12[2]<<"    "<<plane12[3]<<endl;

                            for(int i=0;i<50;i++)

                            fout<<"X_vector"<<"  "<<X_vector[i]<<"   "<<"Y_vector"<<"  "<<Y_vector[i]<<"   "<<"Z_vector"<<"  "<<Z_vector[i]<<endl;

                             waitKey(0);//wait for a key press

 

 

}

 

 

Point2f MainWindow:: getcrosspoint(Vec4f lineA,Vec4f lineB)

    {  //intersection point of two lines

        double ka=lineA[1]/lineA[0];

        double kb=lineB[1]/lineB[0];

        Point2f cross_point;

        cross_point.x=(lineB[3]-lineA[3]+ka*lineA[2]-kb*lineB[2])/(ka-kb);

        cross_point.y=ka*(cross_point.x-lineA[2])+lineA[3];

        return cross_point;

    }

 

 

void MainWindow:: GetCrossPointAll()

  {

     for(int i=0;i<imageCount;i++)//for each image

      {

          vector<Point2f>crossPointPerPic;

          for(int j=0;j<board_size.height;j++)//iterate over each corner row

          {

              vector<Point2f>temp;//corners of this row

              for(int m=0;m<board_size.width;m++)

                  temp.push_back(image_points_seq[i][j*7+m]);

              Vec4f para;

              fitLine(temp,para,DIST_L2,0,1e-2,1e-2);//fit a line through this row of corners

              Point2f temp_point=getcrosspoint(para,line_para);//intersection of the corner line with the stripe line

              crossPointPerPic.push_back(temp_point);

 

          }

          crossPointAll.push_back(crossPointPerPic);

      }

 

  }

 

void MainWindow::calMatrix_M()

{

//use the rotation matrix and translation vector to build the matrix M relating the world and camera coordinate systems

    for(int k=0;k<imageCount;k++)

    {  Mat temp(4,4,CV_32F);

        for(int i=0;i<3;i++)

         for(int j=0;j<3;j++)

         {temp.at<float>(i,j)=R_matrix[k].at<double>(i,j);

          temp.at<float>(0,3)=tvecsMat[k].at<double>(0);

          temp.at<float>(1,3)=tvecsMat[k].at<double>(1);

          temp.at<float>(2,3)=tvecsMat[k].at<double>(2);

          temp.at<float>(3,0)=0;temp.at<float>(3,1)=0;temp.at<float>(3,2)=0;temp.at<float>(3,3)=1;

         }

      M.push_back(temp);

    }

}

 

 

 

void MainWindow::calCameraCornerPoints()

{//compute the corner coordinates in the camera frame

for(int i=0;i<imageCount;i++)

{

    vector<Mat>temp;

    for(int j=0;j<board_size.height;j++)

        for(int k=0;k<board_size.width;k++)

        {  Mat temp2(4,1,CV_32F);

           Mat world(4,1,CV_32F);world.at<float>(0)=object_points[i][j*7+k].x;//corner in world coordinates (homogeneous)

           world.at<float>(1)=object_points[i][j*7+k].y;world.at<float>(2)=object_points[i][j*7+k].z;

           world.at<float>(3)=1;

            temp2=M[i]*world;//transform into the camera frame

            temp.push_back(temp2);

        }

    corners_in_camera.push_back(temp);

}

}

 

void MainWindow::calCameraCrossPoints()

{

    for(int i=0;i<imageCount;i++)

    {

        for(int j=0;j<board_size.height;j++)

        {     //take three corners from each row

                Point2f A=image_points_seq[i][j*7];//the 1st, 4th and 7th corner of the row

                Point2f B=image_points_seq[i][j*7+3];

                Point2f C=image_points_seq[i][j*7+6];

                Point2f D=crossPointAll[i][j];//stripe intersection of this row

                Mat A_=corners_in_camera[i][j*7];//camera coordinates of the corresponding corners

                Mat B_=corners_in_camera[i][j*7+3];

                Mat C_=corners_in_camera[i][j*7+6];

                calCornersInCamera(A,B,C,D,A_,B_,C_);//recover the camera coordinates of the intersection point

        }

 

    }

}

 

 void MainWindow::calCornersInCamera(Point2f A,Point2f B,Point2f C,Point2f D,Mat A_,Mat B_,Mat C_)

 {

        double CR=(D.x-B.x)*(A.x-C.x)/((A.x-B.x)*(D.x-C.x));//cross ratio of the four collinear image points

        //the cross ratio computed in pixel coordinates is preserved by projection, so from the pixel coordinates of A,B,C,D and the camera coordinates of A_,B_,C_ the camera coordinates D_ can be found

        //since pixel and image (physical) coordinates are related linearly, the conversion to image coordinates can be skipped: the cross ratio is the same in both

        double m=CR*(A_.at<float>(0)-B_.at<float>(0))/(A_.at<float>(0)-C_.at<float>(0));

        double x=(B_.at<float>(0)-m*C_.at<float>(0))/(1-m);

        double y=(B_.at<float>(1)-m*C_.at<float>(1))/(1-m);

        double z=(B_.at<float>(2)-m*C_.at<float>(2))/(1-m);

        X_vector.push_back(x);  Y_vector.push_back(y);  Z_vector.push_back(z);

 

 

 }

 

 

 void MainWindow::fitting_light_surface()

 {

     CvMat*points_mat = cvCreateMat(X_vector.size(), 3, CV_32FC1);

     //N x 3 matrix holding the points to be fitted

      for (unsigned int i=0;i < X_vector.size(); ++i)

             {

              points_mat->data.fl[i*3+0] = X_vector[i];//X coordinate

             points_mat->data.fl[i * 3 + 1] = Y_vector[i];//Y coordinate

             points_mat->data.fl[i * 3 + 2] = Z_vector[i];//Z coordinate

      }

             cvFitPlane(points_mat, plane12);//fit the plane

 }

 

 

 

 

 void MainWindow::cvFitPlane(const CvMat* points, float* plane)

 {

     int nrows = points->rows;

     int ncols = points->cols;

     int type = points->type;

     CvMat* centroid = cvCreateMat(1, ncols, type);

     cvSet(centroid, cvScalar(0));

     for (int c = 0; c<ncols; c++)

     {

      for (int r = 0; r < nrows; r++)

          {

           centroid->data.fl[c] += points->data.fl[ncols*r + c];

          }

      centroid->data.fl[c] /= nrows;

      }

      // Subtract geometric centroid from each point.

      CvMat* points2 = cvCreateMat(nrows, ncols, type);

      for (int r = 0; r<nrows; r++)

         for (int c = 0; c<ncols; c++)

             points2->data.fl[ncols*r + c] = points->data.fl[ncols*r + c] - centroid->data.fl[c];

    // Evaluate SVD of covariance matrix.

    CvMat* A = cvCreateMat(ncols, ncols, type);

    CvMat* W = cvCreateMat(ncols, ncols, type);

    CvMat* V = cvCreateMat(ncols, ncols, type);

    cvGEMM(points2, points, 1, NULL, 0, A, CV_GEMM_A_T);

    cvSVD(A, W, NULL, V, CV_SVD_V_T);

    // Assign plane coefficients by singular vector corresponding to smallest singular value.

    plane[ncols] = 0;

    for (int c = 0; c<ncols; c++){

    plane[c] = V->data.fl[ncols*(ncols - 1) + c];

    plane[ncols] += plane[c] * centroid->data.fl[c];

    }

    // Release allocated resources.

    cvReleaseMat(&centroid);

    cvReleaseMat(&points2);

    cvReleaseMat(&A);

    cvReleaseMat(&W);

    cvReleaseMat(&V);

 }

 

 

 

 void MainWindow::mousePressEvent(QMouseEvent *e)

 {

        static int i=0;

     if(i<2)

     {

          if(i==0){ui->x1->setText(QString::number(0));ui->y1->setText(QString::number(0));

                   ui->x2->setText(QString::number(0));ui->y2->setText(QString::number(0));

                   ui->realdis->setText(QString::number(0));}

         Point2f temp;

         temp.x = e->x();

         temp.y = e->y()-34;//subtract the height of the blank area at the top of the MainWindow

         if(i==0){ui->x1->setText(QString::number(temp.x));ui->y1->setText(QString::number(temp.y));}

         if(i==1){ui->x2->setText(QString::number(temp.x));ui->y2->setText(QString::number(temp.y));}

         mousePoint.push_back(temp);

         ++i;

     }

     else

     {     //after two points have been collected, the next click computes the distance and clears mousePoint

         if(image_points_seq.size()!=0)

         {           calDistance();

             i-=2;mousePoint.pop_back();mousePoint.pop_back();

         }

         else

         QMessageBox::warning( this, tr("warning"),

                           tr("camera and light plane have not been calibrated")

                                 );

 

     }

 }

 

 

 void MainWindow::open()

 {

     QString path = QFileDialog::getOpenFileName(

                         this,

                         "File dialog",

                         "../",//parent directory

                         "Image(*.bmp *.jpg *.png)"

                        );

    QImage* image=new QImage(path);

     if(image->width()>1500)zoom=(double)image->width()/1500;//scale down images wider than 1500 pixels; zoom is the scale factor

      int width = image->width()/zoom;//scaled width

      int height = image->height()/zoom;//scaled height

     //QPixmap fitpixmap = pixmap.scaled(width, height,Qt::KeepAspectRatio Qt::IgnoreAspectRatio, Qt::SmoothTransformation);  // fill the label completely

    QPixmap pixmap = QPixmap::fromImage(*image);

    QPixmap fitpixmap = pixmap.scaled(width, height, Qt::IgnoreAspectRatio, Qt::SmoothTransformation);  // scale proportionally

     //resize the label so the image fills the whole label area

     ui->label->resize(width,height);

     ui->label->setPixmap(fitpixmap);

 

 }

void MainWindow::calDistance()

{

    //gather the required data

    //double fx=7079.108034043226;   double fy=7138.477799905151;

    //double u0=1385.469717666468;  double v0=1009.67646851548;

    //double a=0.999773;double b=-0.0105343;double c= 0.0185257;double d= -19.9609;

    double fx= cameraMatrix.at<double>(0,0);

    double fy= cameraMatrix.at<double>(1,1);

    double u0=cameraMatrix.at<double>(0,2);

    double v0=cameraMatrix.at<double>(1,2);

    double a=(double)plane12[0];double b=(double)plane12[1];double c=(double)plane12[2];double d=(double)plane12[3];

    Mat physic_to_pixel(3,3,CV_32F);//matrix relating normalized image coordinates and pixel coordinates (the intrinsic matrix)

 

    physic_to_pixel.at<float>(0,0)=(double)fx;physic_to_pixel.at<float>(0,1)=0;physic_to_pixel.at<float>(0,2)=u0;

    physic_to_pixel.at<float>(1,0)=0;physic_to_pixel.at<float>(1,1)=(double)fy;physic_to_pixel.at<float>(1,2)=v0;

    physic_to_pixel.at<float>(2,0)=0;physic_to_pixel.at<float>(2,1)=0;physic_to_pixel.at<float>(2,2)=1;

    //pixel, camera and normalized image coordinates of the two points

    Mat pixel1(3,1,CV_32F);Mat pixel2(3,1,CV_32F);

    Mat camera1(3,1,CV_32F);Mat camera2(3,1,CV_32F);

    Mat physic1(3,1,CV_32F);Mat physic2(3,1,CV_32F);

    //fill in the values and compute

    pixel1.at<float>(0)=mousePoint[0].x*zoom;pixel1.at<float>(1)=mousePoint[0].y*zoom;pixel1.at<float>(2)=1;

    pixel2.at<float>(0)=mousePoint[1].x*zoom;pixel2.at<float>(1)=mousePoint[1].y*zoom;pixel2.at<float>(2)=1;

 

    physic1=physic_to_pixel.inv()*pixel1;physic2=physic_to_pixel.inv()*pixel2;

 

    camera1.at<float>(0)=(d/(a*physic1.at<float>(0)+b*physic1.at<float>(1)+c))*physic1.at<float>(0); camera1.at<float>(1)=(d/(a*physic1.at<float>(0)+b*physic1.at<float>(1)+c))*physic1.at<float>(1); camera1.at<float>(2)=(d/(a*physic1.at<float>(0)+b*physic1.at<float>(1)+c))*physic1.at<float>(2);

    camera2.at<float>(0)=(d/(a*physic2.at<float>(0)+b*physic2.at<float>(1)+c))*physic2.at<float>(0);camera2.at<float>(1)=(d/(a*physic2.at<float>(0)+b*physic2.at<float>(1)+c))*physic2.at<float>(1);camera2.at<float>(2)=(d/(a*physic2.at<float>(0)+b*physic2.at<float>(1)+c))*physic2.at<float>(2);

    //distance between the two points in camera coordinates

    double dis=sqrt((camera1.at<float>(0)-camera2.at<float>(0))*(camera1.at<float>(0)-camera2.at<float>(0))+(camera1.at<float>(1)-camera2.at<float>(1))*(camera1.at<float>(1)-camera2.at<float>(1))+(camera1.at<float>(2)-camera2.at<float>(2))*(camera1.at<float>(2)-camera2.at<float>(2)));

     ui->realdis->setText(QString::number(dis));

 

}

 

void MainWindow::pushbutton1()

{

    path=ui->filename->toPlainText();

    board_size.width=ui->cols->value();board_size.height=ui->rows->value();

    square_size.width=square_size.height=ui->dis->value();

    imageCount=ui->pic->value();

    squareCalibrate();

    QString text("%1x+%2y+%3z\n=%4");text=text.arg(plane12[0]).arg(plane12[1]).arg(plane12[2]).arg(plane12[3]);

    ui->surface->setText(text);

}

 

