I won't go into the theory of perspective transformation in detail here; for the underlying math see: https://blog.csdn.net/xiaowei_cqu/article/details/26471527
In OpenCV, only two function calls are needed:
cv::Mat warpMatrix = cv::getPerspectiveTransform(src_pt, dst_pt);
cv::warpPerspective(SrcImg, m_src_correct, warpMatrix, SrcImg.size(), cv::INTER_NEAREST, cv::BORDER_CONSTANT);
src_pt holds the four corner points of the original object; dst_pt is computed from those coordinates, usually as the axis-aligned bounding rectangle of the original points.
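The full routine is listed below. It calls a helper named RoiCorrect whose definition the post does not include; a minimal sketch, assuming the helper only clamps the ROI so that the later crop stays inside the image, might look like this (it would sit above correct_cjh_3 in the same file):
// Assumed helper (not shown in the original post): clamp a ROI so that it
// lies entirely inside the image before it is used to crop.
void RoiCorrect(const cv::Mat &img, cv::Rect &roi)
{
    roi &= cv::Rect(0, 0, img.cols, img.rows); // intersect the ROI with the full image rectangle
}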
#include <iostream>
#include <opencv2/opencv.hpp>

using namespace cv;
using namespace std;

bool correct_cjh_3(cv::Point2f pt_tl, cv::Point2f pt_bl, cv::Point2f pt_tr, cv::Point2f pt_br,
                   cv::Mat &SrcImg, cv::Mat &m_MingP, bool b_debug)
{
    // Corner order of the source quadrilateral:
    // 0 2
    // 1 3
    Mat m_src_correct;
    cv::Point2f src_pt[4];
    cv::Point2f dst_pt[4];
    src_pt[0] = pt_tl;
    src_pt[1] = pt_bl;
    src_pt[2] = pt_tr;
    src_pt[3] = pt_br;

    // Optional extra margins on the left/right of the destination rectangle
    int T_1 = 0;
    int T_2 = 0;
    // Destination points: the axis-aligned bounding rectangle of the source points
    dst_pt[0] = cv::Point(MIN(src_pt[0].x, src_pt[1].x) - T_1, MIN(src_pt[0].y, src_pt[2].y));
    dst_pt[1] = cv::Point(MIN(src_pt[0].x, src_pt[1].x) - T_1, MAX(src_pt[1].y, src_pt[3].y));
    dst_pt[2] = cv::Point(MAX(src_pt[2].x, src_pt[3].x) + T_2, MIN(src_pt[0].y, src_pt[2].y));
    dst_pt[3] = cv::Point(MAX(src_pt[2].x, src_pt[3].x) + T_2, MAX(src_pt[1].y, src_pt[3].y));
    Point ptout_tl(dst_pt[0].x, dst_pt[0].y);
    Point ptout_br(dst_pt[3].x, dst_pt[3].y);
    Rect roi_mingp = Rect(ptout_tl, ptout_br);

    // 3x3 perspective matrix mapping src_pt onto dst_pt
    cv::Mat warpMatrix = cv::getPerspectiveTransform(src_pt, dst_pt);
    cout << "warpMatrix=\n" << warpMatrix << endl;
    cv::warpPerspective(SrcImg, m_src_correct, warpMatrix, SrcImg.size(), cv::INTER_NEAREST, cv::BORDER_CONSTANT);

    // Clamp the ROI to the image bounds before cropping (helper defined above)
    RoiCorrect(m_src_correct, roi_mingp);
    m_MingP = m_src_correct(roi_mingp).clone();

    Point pt_center = Point(SrcImg.cols / 2, SrcImg.rows / 2);
    if (b_debug)
    {
        Mat m_show = m_src_correct.clone();
        rectangle(m_show, roi_mingp, cv::Scalar(0, 255, 255), 1);
        Mat SrcImg_cp = SrcImg.clone();
        circle(SrcImg_cp, pt_center, 9, Scalar(255, 0, 0), 3);
        cv::namedWindow("SrcImg", cv::WINDOW_NORMAL);
        cv::imshow("SrcImg", SrcImg_cp);

        // Center point in homogeneous coordinates: a 3x1 column vector (x, y, 1)^T
        cv::Mat_<double> mat_pt(3, 1);
        mat_pt(0, 0) = pt_center.x;
        mat_pt(1, 0) = pt_center.y;
        mat_pt(2, 0) = 1;
        cout << "mat_pt==\n" << mat_pt << endl;

        Point pt_correct;
        Mat mat_tmp = warpMatrix * mat_pt;
        std::cout << "mat_tmp=\n" << mat_tmp << std::endl;
        double a1 = mat_tmp.at<double>(0, 0);
        double a2 = mat_tmp.at<double>(1, 0);
        double a3 = mat_tmp.at<double>(2, 0);
        cout << "a1=" << a1 << " a2=" << a2 << " a3=" << a3 << endl;
        // First attempt: take (a1, a2) directly without dividing by a3.
        // As explained below, this is wrong; the normalization fix comes later in the post.
        pt_correct = Point(a1, a2);
        circle(m_show, pt_correct, 10, cv::Scalar(0, 255, 255), 8);

        cv::namedWindow("correctPic", cv::WINDOW_NORMAL);
        cv::imshow("correctPic", m_show);
        cv::namedWindow("correctRoi", cv::WINDOW_NORMAL);
        cv::imshow("correctRoi", m_MingP);
        waitKey(0);
    }
    return true;
}

int main() {
    Mat img = imread("/data_2/everyday/0317/bugall_snapshot22.png");
    cv::Point2f pt_tl = Point2f(367, 0);
    cv::Point2f pt_tr = Point2f(605, 58);
    cv::Point2f pt_bl = Point2f(11, 162);
    cv::Point2f pt_br = Point2f(281, 351);
    Mat m_roi;
    bool b_debug = true;
    correct_cjh_3(pt_tl, pt_bl, pt_tr, pt_br, img, m_roi, b_debug);
    return 0;
}
In most cases this is as far as the perspective transform needs to go: take the warped image and keep processing it. But sometimes you need to map individual points, for example: where does the center point of the original image end up after the perspective transform? In principle, you would expect that multiplying the point coordinates by the perspective matrix is enough.
In practice it is not quite that simple. The perspective matrix is 3x3 while a point is only (x, y), so to make the multiplication work the point has to be extended with a 1, giving (x, y, 1); a perspective transform is a transform in 3-D (homogeneous) space and needs that third coordinate. In the code you can see that I printed all the matrices:
warpMatrix=
[2.175396430644075, 4.822550437851066, -786.7332303651573;
-0.5767283475053204, 2.366574943211475, 211.6593035344521;
0.0001578547637398106, 0.004169585228612301, 1]
mat_pt==
[316;
186;
1]
mat_tmp=
[797.6864231586686;
469.5960851601052;
1.825424957863668]
a1=797.686 a2=469.596 a3=1.82542
On the warped image, however, the transformed center point I drew was nowhere to be found; it had apparently landed outside the image. After a lot of flailing around, multiplying on the left and on the right, it still did not work. Looking at the picture again, the warped image seemed to be slightly cropped, and I suspected the crop was why the coordinates did not line up... In the end a colleague set me straight: the result needs to be normalized, i.e. the point has to be scaled so that its third coordinate becomes 1, which brings everything back onto the same plane.
pt_correct = Point(a1*1.0/a3, a2*1.0/a3);
Ha, and sure enough, the point showed up!!!
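With the numbers printed above, the normalized point is roughly (797.686 / 1.825, 469.596 / 1.825) ≈ (437, 257), which does fall inside the warped image. OpenCV can also do the whole multiply-and-normalize step for you: cv::perspectiveTransform applies a 3x3 matrix to a list of points and divides by the third coordinate internally. A short sketch, reusing warpMatrix and pt_center from the listing above:
// Map the original center point with cv::perspectiveTransform, which performs
// the homogeneous multiplication and the division by the third coordinate.
std::vector<cv::Point2f> src_pts{ cv::Point2f((float)pt_center.x, (float)pt_center.y) };
std::vector<cv::Point2f> dst_pts;
cv::perspectiveTransform(src_pts, dst_pts, warpMatrix);
cv::Point pt_correct(cvRound(dst_pts[0].x), cvRound(dst_pts[0].y)); // same result as Point(a1/a3, a2/a3)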
One more question remains: given a point on the image after the perspective transform, how do you recover its coordinates before the transform?
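One way to answer that: apply the inverse of the same 3x3 matrix with the same multiply-then-normalize step. You can get the inverse with warpMatrix.inv(), or by calling cv::getPerspectiveTransform with src_pt and dst_pt swapped. A minimal sketch, again reusing warpMatrix from the listing and the warped center computed above:
// Map a point on the warped image back to the original image by applying
// the inverse perspective matrix (perspectiveTransform normalizes internally).
cv::Mat invMatrix = warpMatrix.inv();                               // 3x3 inverse, still CV_64F
std::vector<cv::Point2f> warped_pts{ cv::Point2f(437.0f, 257.0f) }; // warped center from above
std::vector<cv::Point2f> orig_pts;
cv::perspectiveTransform(warped_pts, orig_pts, invMatrix);
std::cout << "back in the original image: " << orig_pts[0] << std::endl; // roughly (316, 186)
When warping whole images back rather than single points, cv::warpPerspective also accepts the cv::WARP_INVERSE_MAP flag, so you do not even have to invert the matrix yourself.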