raulmur / ORB_SLAM2

Real-Time SLAM for Monocular, Stereo and RGB-D Cameras, with Loop Detection and Relocalization Capabilities

How to get trackedMapPoint 3D coordinate #745

Closed jiqirenzhifu closed 5 years ago

jiqirenzhifu commented 5 years ago

Hello everyone, I am a newcomer to SLAM. I want to know how to get the 3D coordinates of the green dots (the tracked points) shown on the display interface. Can anyone tell me, or suggest an approach? Thank you very much for any reply.

Shubhamvithalani commented 5 years ago

@jiqirenzhifu In your `ros_mono.cc` file, add these lines. They publish the tracked MapPoints as a PointCloud message in ROS:

```cpp
sensor_msgs::PointCloud cloud;
cloud.header.frame_id = "camera_link";
std::vector<geometry_msgs::Point32> geo_points;
std::vector<ORB_SLAM2::MapPoint*> points = mpSLAM->GetTrackedMapPoints();
for (std::vector<ORB_SLAM2::MapPoint*>::size_type i = 0; i != points.size(); i++) {
    if (points[i]) {
        cv::Mat coords = points[i]->GetWorldPos();
        geometry_msgs::Point32 pt;
        // Swap axes from ORB-SLAM2's camera frame to the ROS convention.
        pt.x = coords.at<float>(2);
        pt.y = -coords.at<float>(0);
        pt.z = -coords.at<float>(1);
        geo_points.push_back(pt);
    }
}
cloud.points = geo_points;
cloud_pub.publish(cloud);
```

The published message has this format: http://docs.ros.org/melodic/api/sensor_msgs/html/msg/PointCloud.html. You will get the 3D coordinates of all these points. Also be aware that the monocular scale is different for every initialisation.

jiqirenzhifu commented 5 years ago

@Shubhamvithalani Thank you very much for your reply. Your answer is very helpful. If I don't use ROS, is there any good way to get 3D points? Thank you for any reply.

Shubhamvithalani commented 5 years ago

@jiqirenzhifu Yes, you can, but it depends on how you want the output: as a text file, or do you want to use the points some other way?

jiqirenzhifu commented 5 years ago

@Shubhamvithalani Thank you very much. I want to save those 3D points as a TXT file, in order to render them in the front end.

Shubhamvithalani commented 5 years ago
```cpp
float x, y, z;
std::vector<ORB_SLAM2::MapPoint*> points = mpSLAM->GetTrackedMapPoints();
for (std::vector<ORB_SLAM2::MapPoint*>::size_type i = 0; i != points.size(); i++) {
    if (points[i]) {
        cv::Mat coords = points[i]->GetWorldPos();
        x = coords.at<float>(2);
        y = -coords.at<float>(0);
        z = -coords.at<float>(1);
        // Enter your code here to save the point to a file.
    }
}
```

Try to write your file-saving code using this: https://www.geeksforgeeks.org/readwrite-class-objects-fromto-file-c/

jiqirenzhifu commented 5 years ago

@Shubhamvithalani Thanks a lot. When I used `std::vector<ORB_SLAM2::MapPoint*> points = mpSLAM->GetTrackedMapPoints();`, it compiled without errors, but the following error was thrown during execution: `Segmentation fault (core dumped)`. I tried changing the code to `std::vector<ORB_SLAM2::MapPoint*> points = mTrackedMapPoints;`, but that seems to save the coordinates of only one frame, while I want to save the coordinates of the green tracked points of every frame. I used the following code: `for (int i = 0; i < mCurrentFrame.N; i++) { //**** your code ****// }`, but the output seems to contain some repeated points. I am troubled by this; can you tell me how to solve it? Thank you for any reply.

qiuliangcheng commented 1 year ago

hello, when I use this code:

```cpp
sensor_msgs::PointCloud cloud;
cloud.header.frame_id = "camera_link";
std::vector<geometry_msgs::Point32> geo_points;
std::vector<ORB_SLAM2::MapPoint*> points = mpSLAM->GetTrackedMapPoints();
//std::vector<ORB_SLAM2::MapPoint*> points = mTrackedMapPoints;
for (std::vector<geometry_msgs::Point32>::size_type i = 0; i != points.size(); i++) {
    if (points[i]) {
        cv::Mat coords = points[i]->GetWorldPos();
        geometry_msgs::Point32 pt;
        pt.x = coords.at<float>(2);
        pt.y = -coords.at<float>(0);
        pt.z = -coords.at<float>(1);
        geo_points.push_back(pt);
    }
}
cloud.points = geo_points;
```

I get an error: `Floating point exception (core dumped)`. I don't know why. My complete code is:

```cpp
void ImageGrabber::GrabImage(const sensor_msgs::ImageConstPtr& msg)
{
    // Copy the ros image message to cv::Mat.
    cv_bridge::CvImageConstPtr cv_ptr;
    try
    {
        cv_ptr = cv_bridge::toCvShare(msg);
    }
    catch (cv_bridge::Exception& e)
    {
        ROS_ERROR("cv_bridge exception: %s", e.what());
        return;
    }

    bool test = 0;
    //cv::Mat Tcw = mpSLAM->TrackMonocular(cv_ptr->image, cv_ptr->header.stamp.toSec());
    cv::Mat position = mpSLAM->TrackMonocular(cv_ptr->image, cv_ptr->header.stamp.toSec());
    nav_msgs::Path memo_path;
    //geometry_msgs::PoseStamped pose;
    //pose.header.stamp = ros::Time::now();
    //pose.header.frame_id = "path";
    //test = !Tcw;
    ros::Time current_frame_time_ = ros::Time::now();

    sensor_msgs::PointCloud cloud;
    cloud.header.frame_id = "camera_link";
    std::vector<geometry_msgs::Point32> geo_points;
    std::vector<ORB_SLAM2::MapPoint*> points = mpSLAM->GetTrackedMapPoints();
    //std::vector<ORB_SLAM2::MapPoint*> points = mTrackedMapPoints;
    for (std::vector<geometry_msgs::Point32>::size_type i = 0; i != points.size(); i++) {
        if (points[i]) {
            cv::Mat coords = points[i]->GetWorldPos();
            geometry_msgs::Point32 pt;
            pt.x = coords.at<float>(2);
            pt.y = -coords.at<float>(0);
            pt.z = -coords.at<float>(1);
            geo_points.push_back(pt);
        }
    }
    cloud.points = geo_points;
    //cloud_pub.publish(cloud);

    if (!position.empty()) {
        tf::Transform transform = TransformFromMat(position);
        static tf::TransformBroadcaster tf_broadcaster;
        current_frame_time_ = ros::Time::now();
        geometry_msgs::PoseStamped pose_msg;
        pose_msg.header.frame_id = "map";
        tf_broadcaster.sendTransform(tf::StampedTransform(transform, current_frame_time_, "map", "camera_link"));
        tf::Transform grasp_tf = TransformFromMat(position);
        tf::Stamped<tf::Pose> grasp_tf_pose(grasp_tf, current_frame_time_, "map");
        memo_path.header.frame_id = "map";
        tf::poseStampedTFToMsg(grasp_tf_pose, pose_msg);
        pose_pub.publish(pose_msg);
        memo_path.header.stamp = ros::Time::now();
        memo_path.poses.push_back(pose_msg);
        memo_path_pub.publish(memo_path);
    }

    // test if
    //if (test == 1) {
    //    tf::Transform new_transform;
    //    tf::poseTFToMsg(new_transform, pose.pose);
    //    pose_pub.publish(pose);
    //}
    //else {
    //    cv::Mat Rwc = Tcw.rowRange(0,3).colRange(0,3).t(); // Rotation information
    //    cv::Mat twc = -Rwc*Tcw.rowRange(0,3).col(3); // translation information
    //    vector<float> q = ORB_SLAM2::Converter::toQuaternion(Rwc);
    //    tf::Transform new_transform;
    //    new_transform.setOrigin(tf::Vector3(twc.at<float>(0, 0), twc.at<float>(0, 1), twc.at<float>(0, 2)));
    //    tf::Quaternion quaternion(q[0], q[1], q[2], q[3]);
    //    new_transform.setRotation(quaternion);
    //    tf::poseTFToMsg(new_transform, pose.pose);
    //    pose_pub.publish(pose);
    //    memo_path.header.frame_id = "path";
    //    memo_path.header.stamp = ros::Time::now();
    //    memo_path.poses.push_back(pose);
    //    memo_path_pub.publish(memo_path);
    //}
}
```