LiDAR and Camera Fusion (Part 6): Real-Time Object Detection and Tracking in ROS


In this part, the complete object detection and tracking pipeline from the earlier Udacity course is rewritten for real-time processing in ROS. After the rewrite, however, running real-time detection, tracking, and TTC (time-to-collision) estimation on my existing rosbag data gives disappointing results. The overall robustness of the algorithm is poor: once the scene gets even slightly complex, the computed time-to-collision becomes unreliable and is sometimes even negative. In addition, the node occasionally crashes at runtime with a core dump (some memory problem), and I have not found where the mistake is. If any reader spots the error, please leave a comment; thanks a lot.

The following is a rough walkthrough of the overall algorithm structure.

1. Overall framework
The code for this part is organised as a ROS package named ros_detection_tracking, which contains two nodes: ros_ttc and project. The project node was explained in detail in the previous post. In the package's CMakeLists.txt the project-related lines are commented out; to build and run the project node, find the following lines, uncomment them, and rebuild:

#add_executable(projector src/project/project.cpp)
#target_link_libraries(projector ${catkin_LIBRARIES}
#                                ${OpenCV_LIBS})
#add_dependencies(projector ${${PROJECT_NAME}_EXPORTED_TARGETS} ${catkin_EXPORTED_TARGETS})

  

The main headers live in the include directory, the source files in src/ros_ttc, the configuration files in cfg, and the YOLO configuration and weight files in dat/yolo.
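
For orientation, the package layout referenced in this post looks roughly like this:

ros_detection_tracking/
├── include/          # main headers
├── src/
│   ├── ros_ttc/      # detection / tracking / TTC node
│   └── project/      # projection node from the previous post
├── cfg/              # initial_params.txt, detect_config.txt
└── dat/yolo/         # coco.names, yolov3.cfg, yolov3.weights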

2. Main functions
Everything is implemented in a single node, in a deliberately simple and direct way: besides the constructor, the node's main class has only three functions.
They are the main callback, the parameter-loading function, and the constructor.

 void detectandtrack_callback(const sensor_msgs::Image::ConstPtr &img, 
                                const sensor_msgs::PointCloud2::ConstPtr &pc);
 void initParams();
 DetectandTrack();
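
The callback receives a time-synchronized image/point-cloud pair, so the constructor has to wire the two subscriptions together before the callback can fire. The package's own constructor is not reproduced in this post; the sketch below only illustrates how such a node is typically set up with message_filters approximate-time synchronization. The member names, queue sizes, and the use of message_filters here are illustrative assumptions, not necessarily what the package does; the topic names are the ones from initial_params.txt, and main() with ros::init/ros::spin as well as initParams() are omitted.

#include <ros/ros.h>
#include <sensor_msgs/Image.h>
#include <sensor_msgs/PointCloud2.h>
#include <message_filters/subscriber.h>
#include <message_filters/synchronizer.h>
#include <message_filters/sync_policies/approximate_time.h>
#include <boost/bind.hpp>

class DetectandTrack
{
public:
    // pair up an image and a point cloud whose timestamps are close enough
    typedef message_filters::sync_policies::ApproximateTime<
        sensor_msgs::Image, sensor_msgs::PointCloud2> SyncPolicy;

    DetectandTrack()
        : img_sub_(nh_, "/kitti/camera_color_left/image_raw", 5),
          pc_sub_(nh_, "/kitti/velo/pointcloud", 5),
          sync_(SyncPolicy(10), img_sub_, pc_sub_)
    {
        // fire the callback for every synchronized (image, point cloud) pair
        sync_.registerCallback(
            boost::bind(&DetectandTrack::detectandtrack_callback, this, _1, _2));
    }

    void detectandtrack_callback(const sensor_msgs::Image::ConstPtr &img,
                                 const sensor_msgs::PointCloud2::ConstPtr &pc);

private:
    ros::NodeHandle nh_;
    message_filters::Subscriber<sensor_msgs::Image> img_sub_;
    message_filters::Subscriber<sensor_msgs::PointCloud2> pc_sub_;
    message_filters::Synchronizer<SyncPolicy> sync_;
};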

  

Of course, most of the functions being called are implemented in the other source files.
Below is the main callback function, which contains the whole detection and tracking pipeline: loading the data and cropping the point cloud, object detection with YOLO, clustering the LiDAR points inside each bounding box, image keypoint detection, description, and matching to track objects between frames, and finally computing the time-to-collision from the LiDAR points.
Since my own level is limited, corrections to any mistakes in the rewritten parts are welcome; feel free to discuss.

void DetectandTrack::detectandtrack_callback(const sensor_msgs::Image::ConstPtr &img, 
                                const sensor_msgs::PointCloud2::ConstPtr &pc)
{
    /*1.construct data Frame and crop*/
    cv_bridge::CvImagePtr cv_ptr;
     try {
            cv_ptr = cv_bridge::toCvCopy(img, "bgr8");
        }
    catch (cv_bridge::Exception &e) {
        ROS_ERROR("cv_bridge exception: %s", e.what());
        return;
    }
    cv::Mat raw_img = cv_ptr->image;
    pcl::PointCloud<pcl::PointXYZ>::Ptr cloud(new pcl::PointCloud<pcl::PointXYZ>);
    pcl::fromROSMsg(*pc, *cloud);
    
    cv::Mat visImg = raw_img.clone();
    cv::Mat overlay = visImg.clone();
    pcl::PointCloud<pcl::PointXYZ>::Ptr cloudA(new pcl::PointCloud<pcl::PointXYZ>);
    cloudA = cloud;
    DataFrame frame;
    frame.cameraImg = raw_img;

    cropLidarPoints(cloudA, i_params.crop_minX, i_params.crop_maxX, i_params.crop_Y, i_params.crop_minZ, i_params.crop_maxZ);//crop pointcloud
    
    //frame.pointcloud = *cloudA;
  
    cout<<"/*1.construct data Frame and crop*/ done"<<endl;

   /*2.detection and classification*/
   detectObjects(frame.cameraImg, frame.boundingBoxes, confThreshold, nmsThreshold, i_params.yoloBasePath, i_params.yoloClassesFile,
                    i_params.yoloModelConfiguration, i_params.yoloModelWeights, bVis);

    cout<<"/*2.detection and classification*/ has done"<<endl;

    /*3.cluster the point with the bounding box*/

    clusterLidarWithROI(frame.boundingBoxes, cloudA, shrinkFactor, i_params.cameraIn, i_params.camtocam_mat, i_params.RT);
    cout<<" /*3.cluster the point with the bounding box*/ done" <<endl;

     /*4.detect keypoint*/
     cv::cvtColor(frame.cameraImg, imgGray, cv::COLOR_BGR2GRAY);
     detKeypointsShiTomasi(keypoints, imgGray, false);
     frame.keypoints = keypoints;
    cout<<"/*4.detect keypoint*/ done"<<endl;

    /*5.extract keypoint descriptors*/
    descKeypoints(frame.keypoints, frame.cameraImg, descriptors, "BRISK");
    frame.descriptors = descriptors;
    cout<<"/*5.extract keypoint descriptors*/ done"<<endl;
    
    //cout<<first_frame<<endl;
    if (first_frame){
        databuffer.push_back(frame); // store the very first frame
        first_frame = false;         // clear the flag once the first frame is buffered
        cout<<"first frame done"<<endl;
    }
    else {
        if(databuffer[0].boundingBoxes[0].lidarPoints.size()>0){
            cout<<"databuffer size:"<<databuffer[0].boundingBoxes[0].lidarPoints.size()<<endl;}
       
            /*6.match keypoints*/
            databuffer.push_back(frame);
            matchDescriptors(databuffer[0].keypoints, databuffer[1].keypoints,
                                databuffer[0].descriptors, databuffer[1].descriptors,
                                matches, descriptorType, matcherType, selectorType);
            databuffer[1].kptMatches = matches;
            cout<<"/*6.match keypoints*/ done"<<endl;

            /*7.track object*/
            matchBoundingBoxes(matches, bbBestMatches, databuffer[0], databuffer[1]);
            databuffer[1].bbMatches = bbBestMatches;
            cout<<"/*7.track object*/"<<endl;
            //cout<<int(databuffer.size())<<endl;

            if (databuffer[1].bbMatches.size()>0)
            {   
              
                /*8.compute TTC*/
                for (auto it1 = databuffer[1].bbMatches.begin(); it1 != databuffer[1].bbMatches.end(); ++it1)
                { 
                    if (databuffer[1].boundingBoxes.size()>0 &&databuffer[0].boundingBoxes.size()>0){
                    
                        // find the bounding boxes associated with the current match
                        
                        for (auto it2 = databuffer[1].boundingBoxes.begin(); it2 != databuffer[1].boundingBoxes.end(); ++it2)
                        {
                            if (it1->second == it2->boxID) // check whether the current match partner corresponds to this BB
                            {
                                currBB = & (*it2);
                                
                            }
                        }
                    
                  
                     
                        for (auto it2 = databuffer[0].boundingBoxes.begin(); it2 != databuffer[0].boundingBoxes.end(); ++it2)
                        {
                            if (it1->first == it2->boxID) // check whether the current match partner corresponds to this BB
                            {
                                prevBB = & (*it2);
                            }
                        }
                    }
                      else{
                        // The original code wrote through uninitialised BoundingBox
                        // pointers here, which is undefined behaviour and a likely
                        // cause of the sporadic core dumps. Use static sentinel
                        // boxes with boxID = -1 instead.
                        static BoundingBox invalidCurr, invalidPrev;
                        invalidCurr.boxID = -1;
                        invalidPrev.boxID = -1;
                        currBB = &invalidCurr;
                        prevBB = &invalidPrev;
                    }
                   
                   // cout<<double(currBB->lidarPoints.size())<<"and"<<double(prevBB->lidarPoints.size())<<endl;
                   // if( currBB->lidarPoints.size()>0 && prevBB->lidarPoints.size()>0 ) // only compute TTC if we have Lidar points
                   if(currBB->boxID != -1 && prevBB->boxID != -1 && prevBB->lidarPoints.size()>0 && currBB->lidarPoints.size()>0)
                        {
                            // compute time-to-collision based on Lidar data (course task FP.2 -> computeTTCLidar)
                            double ttcLidar;
                            computeTTCLidar(prevBB->lidarPoints, currBB->lidarPoints, 10, ttcLidar, true); // 10: presumably the sensor frame rate in Hz (KITTI runs at 10 Hz)
                            std_msgs::Float32 ttc_lidar;
                            ttc_lidar.data = ttcLidar;
                            ttc_publisher.publish(ttc_lidar);
                            cout<<"/*8.computer TTC*/ done"<<endl;
                        }
                }
            }

        databuffer[0] = databuffer[1]; // shift the newer frame into slot 0
        databuffer.pop_back();         // drop slot 1 to make room for the next frame
        cout<<databuffer.size()<<endl;
        cout<<"frame pair processed"<<endl;
    }
    
}
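
computeTTCLidar itself lives in one of the other source files and is not shown in this post. For readers who have not seen the course code this was adapted from, a minimal constant-velocity version looks roughly like the sketch below. This is only an illustration under assumptions: LidarPoint is assumed to carry the forward distance in its x field, the package's actual function takes an extra (probably visualization) flag that is omitted here, and the median-based distance estimate is one common way to dampen the outliers that otherwise produce the negative TTC values mentioned at the top of this post.

#include <algorithm>
#include <limits>
#include <vector>

struct LidarPoint { double x, y, z, r; }; // x = forward distance, as in the course data structures

// median of the forward (x) distances; assumes a non-empty input,
// which the callback above checks before calling computeTTCLidar
static double medianX(std::vector<LidarPoint> pts)
{
    std::sort(pts.begin(), pts.end(),
              [](const LidarPoint &a, const LidarPoint &b) { return a.x < b.x; });
    return pts[pts.size() / 2].x;
}

void computeTTCLidar(std::vector<LidarPoint> &lidarPointsPrev,
                     std::vector<LidarPoint> &lidarPointsCurr,
                     double frameRate, double &TTC)
{
    double dT    = 1.0 / frameRate;          // time between the two frames (KITTI: 10 Hz)
    double dPrev = medianX(lidarPointsPrev); // distance to the object in the previous frame
    double dCurr = medianX(lidarPointsCurr); // distance to the object in the current frame

    // constant-velocity model: TTC = dCurr * dT / (dPrev - dCurr).
    // If the object did not get closer, the estimate is undefined (or negative),
    // which is exactly the failure mode seen in cluttered scenes.
    if (dPrev - dCurr <= 0.0)
    {
        TTC = std::numeric_limits<double>::quiet_NaN();
        return;
    }
    TTC = dCurr * dT / (dPrev - dCurr);
}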

  

3. Configuration files

initial_params.txt: sets the image and point cloud topics to subscribe to, the point cloud cropping parameters, and the sensor calibration parameters.

/kitti/camera_color_left/image_raw
/kitti/velo/pointcloud
-1.3 -0.1 2.0 50 1.2 0.1 // point cloud cropping parameters, in metres: crop_minZ, crop_maxZ, crop_minX, crop_maxX, crop_Y, crop_minR
9.998817e-01 1.511453e-02 -2.841595e-03 0.0
-1.511724e-02 9.998853e-01 -9.338510e-04 0.0
2.827154e-03 9.766976e-04 9.999955e-01 0.0
0 0 0 1           // camera-to-camera extrinsic matrix (camera 02 is used here)
7.215377e+02 0.000000e+00 6.095593e+02 4.485728e+01
0.000000e+00 7.215377e+02 1.728540e+02 2.163791e-01
0.000000e+00 0.000000e+00 1.000000e+00 2.745884e-03 // camera intrinsic matrix
7.533745e-03 -9.999714e-01 -6.166020e-04 -4.069766e-03
1.480249e-02 7.280733e-04 -9.998902e-01 -7.631618e-02
9.998621e-01 7.523790e-03 1.480755e-02 -2.717806e-01
0 0 0 1           // LiDAR-to-camera extrinsic matrix
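
initParams() reads this file in exactly the order shown above; its real implementation is in the package sources. Purely to illustrate the expected layout, a parser for this format could look like the sketch below. The InitialParams struct, the use of cv::Mat, and the helper readRow are assumptions made for the sketch; only the field names mirror the i_params members used in the callback.

#include <fstream>
#include <sstream>
#include <string>
#include <opencv2/core.hpp>

// placeholder container mirroring the line order of initial_params.txt
struct InitialParams
{
    std::string image_topic, pointcloud_topic;
    double crop_minZ, crop_maxZ, crop_minX, crop_maxX, crop_Y, crop_minR;
    cv::Mat camtocam_mat; // 4x4 camera-to-camera matrix (camera 02)
    cv::Mat cameraIn;     // 3x4 camera intrinsic / projection matrix
    cv::Mat RT;           // 4x4 LiDAR-to-camera extrinsic matrix
};

// read one row of doubles from the next line, ignoring anything that
// follows the expected values (e.g. trailing // comments)
static bool readRow(std::ifstream &f, cv::Mat &m, int row, int cols)
{
    std::string line;
    if (!std::getline(f, line)) return false;
    std::istringstream ss(line);
    for (int c = 0; c < cols; ++c)
        if (!(ss >> m.at<double>(row, c))) return false;
    return true;
}

bool loadInitialParams(const std::string &path, InitialParams &p)
{
    std::ifstream f(path);
    if (!f.is_open()) return false;

    std::getline(f, p.image_topic);      // line 1: image topic
    std::getline(f, p.pointcloud_topic); // line 2: point cloud topic

    std::string line;
    std::getline(f, line);               // line 3: cropping parameters
    std::istringstream crop(line);
    crop >> p.crop_minZ >> p.crop_maxZ >> p.crop_minX
         >> p.crop_maxX >> p.crop_Y >> p.crop_minR;

    p.camtocam_mat = cv::Mat::zeros(4, 4, CV_64F);
    for (int r = 0; r < 4; ++r) readRow(f, p.camtocam_mat, r, 4);

    p.cameraIn = cv::Mat::zeros(3, 4, CV_64F);
    for (int r = 0; r < 3; ++r) readRow(f, p.cameraIn, r, 4);

    p.RT = cv::Mat::zeros(4, 4, CV_64F);
    for (int r = 0; r < 4; ++r) readRow(f, p.RT, r, 4);

    return true;
}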

  detect_config.txt: sets the paths to the YOLO configuration and weight files; absolute paths are recommended.

/home/base/shared_dir/sensorfusion/catkin_ws/src/ros_detection_tracking/dat/yolo/
/home/base/shared_dir/sensorfusion/catkin_ws/src/ros_detection_tracking/dat/yolo/coco.names
/home/base/shared_dir/sensorfusion/catkin_ws/src/ros_detection_tracking/dat/yolo/yolov3.cfg
/home/base/shared_dir/sensorfusion/catkin_ws/src/ros_detection_tracking/dat/yolo/yolov3.weights

  

4. Runtime environment
OpenCV 3.4.3 or newer is required here, together with cv_bridge. This usually causes a conflict between the older OpenCV that the prebuilt cv_bridge links against and the newer OpenCV installed by yourself. The fix is to download the cv_bridge package, build it separately, and link it against your own OpenCV; see my earlier post for the detailed steps.

Finally, the complete code can be obtained with: git clone https://github.com/jhzhang19/ros_detectandtrack.git
The YOLOv3 weights file must be downloaded separately and placed in the corresponding directory.

Appendix: complete KITTI calibration parameters
calib_cam_to_cam.txt

calib_time: 09-Jan-2012 13:57:47
corner_dist: 9.950000e-02
S_00: 1.392000e+03 5.120000e+02
K_00: 9.842439e+02 0.000000e+00 6.900000e+02 0.000000e+00 9.808141e+02 2.331966e+02 0.000000e+00 0.000000e+00 1.000000e+00
D_00: -3.728755e-01 2.037299e-01 2.219027e-03 1.383707e-03 -7.233722e-02
R_00: 1.000000e+00 0.000000e+00 0.000000e+00 0.000000e+00 1.000000e+00 0.000000e+00 0.000000e+00 0.000000e+00 1.000000e+00
T_00: 2.573699e-16 -1.059758e-16 1.614870e-16
S_rect_00: 1.242000e+03 3.750000e+02
R_rect_00: 9.999239e-01 9.837760e-03 -7.445048e-03 -9.869795e-03 9.999421e-01 -4.278459e-03 7.402527e-03 4.351614e-03 9.999631e-01
P_rect_00: 7.215377e+02 0.000000e+00 6.095593e+02 0.000000e+00 0.000000e+00 7.215377e+02 1.728540e+02 0.000000e+00 0.000000e+00 0.000000e+00 1.000000e+00 0.000000e+00
S_01: 1.392000e+03 5.120000e+02
K_01: 9.895267e+02 0.000000e+00 7.020000e+02 0.000000e+00 9.878386e+02 2.455590e+02 0.000000e+00 0.000000e+00 1.000000e+00
D_01: -3.644661e-01 1.790019e-01 1.148107e-03 -6.298563e-04 -5.314062e-02
R_01: 9.993513e-01 1.860866e-02 -3.083487e-02 -1.887662e-02 9.997863e-01 -8.421873e-03 3.067156e-02 8.998467e-03 9.994890e-01
T_01: -5.370000e-01 4.822061e-03 -1.252488e-02
S_rect_01: 1.242000e+03 3.750000e+02
R_rect_01: 9.996878e-01 -8.976826e-03 2.331651e-02 8.876121e-03 9.999508e-01 4.418952e-03 -2.335503e-02 -4.210612e-03 9.997184e-01
P_rect_01: 7.215377e+02 0.000000e+00 6.095593e+02 -3.875744e+02 0.000000e+00 7.215377e+02 1.728540e+02 0.000000e+00 0.000000e+00 0.000000e+00 1.000000e+00 0.000000e+00
S_02: 1.392000e+03 5.120000e+02
K_02: 9.597910e+02 0.000000e+00 6.960217e+02 0.000000e+00 9.569251e+02 2.241806e+02 0.000000e+00 0.000000e+00 1.000000e+00
D_02: -3.691481e-01 1.968681e-01 1.353473e-03 5.677587e-04 -6.770705e-02
R_02: 9.999758e-01 -5.267463e-03 -4.552439e-03 5.251945e-03 9.999804e-01 -3.413835e-03 4.570332e-03 3.389843e-03 9.999838e-01
T_02: 5.956621e-02 2.900141e-04 2.577209e-03
S_rect_02: 1.242000e+03 3.750000e+02
R_rect_02: 9.998817e-01 1.511453e-02 -2.841595e-03 -1.511724e-02 9.998853e-01 -9.338510e-04 2.827154e-03 9.766976e-04 9.999955e-01
P_rect_02: 7.215377e+02 0.000000e+00 6.095593e+02 4.485728e+01 0.000000e+00 7.215377e+02 1.728540e+02 2.163791e-01 0.000000e+00 0.000000e+00 1.000000e+00 2.745884e-03
S_03: 1.392000e+03 5.120000e+02
K_03: 9.037596e+02 0.000000e+00 6.957519e+02 0.000000e+00 9.019653e+02 2.242509e+02 0.000000e+00 0.000000e+00 1.000000e+00
D_03: -3.639558e-01 1.788651e-01 6.029694e-04 -3.922424e-04 -5.382460e-02
R_03: 9.995599e-01 1.699522e-02 -2.431313e-02 -1.704422e-02 9.998531e-01 -1.809756e-03 2.427880e-02 2.223358e-03 9.997028e-01
T_03: -4.731050e-01 5.551470e-03 -5.250882e-03
S_rect_03: 1.242000e+03 3.750000e+02
R_rect_03: 9.998321e-01 -7.193136e-03 1.685599e-02 7.232804e-03 9.999712e-01 -2.293585e-03 -1.683901e-02 2.415116e-03 9.998553e-01
P_rect_03: 7.215377e+02 0.000000e+00 6.095593e+02 -3.395242e+02 0.000000e+00 7.215377e+02 1.728540e+02 2.199936e+00 0.000000e+00 0.000000e+00 1.000000e+00 2.729905e-03

  calib_imu_to_velo.txt

calib_time: 25-May-2012 16:47:16
R: 9.999976e-01 7.553071e-04 -2.035826e-03 -7.854027e-04 9.998898e-01 -1.482298e-02 2.024406e-03 1.482454e-02 9.998881e-01
T: -8.086759e-01 3.195559e-01 -7.997231e-01

  calib_velo_to_cam.txt

calib_time: 15-Mar-2012 11:37:16
R: 7.533745e-03 -9.999714e-01 -6.166020e-04 1.480249e-02 7.280733e-04 -9.998902e-01 9.998621e-01 7.523790e-03 1.480755e-02
T: -4.069766e-03 -7.631618e-02 -2.717806e-01
delta_f: 0.000000e+00 0.000000e+00
delta_c: 0.000000e+00 0.000000e+00
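
For reference, the three blocks copied into initial_params.txt above come from these files: the 4x4 camera-to-camera block matches R_rect_02 and the 3x4 intrinsic block matches P_rect_02 in calib_cam_to_cam.txt, while the LiDAR-to-camera block is R and T from calib_velo_to_cam.txt. The KITTI raw-data devkit documents the projection of a Velodyne point X into the rectified image of camera xx as Y = P_rect_xx * R_rect_00 * (R|T)_velo_to_cam * X (with R_rect_00 padded to a 4x4 matrix); presumably clusterLidarWithROI, which takes exactly these three matrices, applies the same kind of chain using the camera-02 rectification block stored in the config.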
