wh200720041 / ssl_slam2

SSL_SLAM2: Lightweight 3-D Localization and Mapping for Solid-State LiDAR (mapping and localization separated) ICRA 2021
GNU General Public License v3.0

How to apply the trajectory to the point clouds from the bags? #4

Closed cdb0y511 closed 3 years ago

cdb0y511 commented 3 years ago

Hi @wh200720041, I am trying to apply your work to do some 3D reconstruction, but I am not familiar with ROS, so maybe I am making some basic mistakes. I output the trajectory from /odom as you said in https://github.com/wh200720041/ssl_slam/issues/14#issuecomment-792251212 and I also output the original point clouds from the bag. However, pcl::transformPointCloud with the output poses and the original point clouds goes completely wrong. What am I missing? The poses and PCDs are obtained through odomEstimationLocalizationNode.cpp.

I made a fork; maybe you can point out the problem. Thanks.

https://github.com/cdb0y511/ssl_slam2/blob/fec5de0e21c49801e9a5ccfb6c921929ada04855/src/odomEstimationLocalizationNode.cpp#L127

https://github.com/cdb0y511/ssl_slam2/blob/fec5de0e21c49801e9a5ccfb6c921929ada04855/src/odomEstimationLocalizationNode.cpp#L99
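
For reference, this is roughly how the pose carried by an /odom message can be converted into a transform that pcl::transformPointCloud accepts later; odomToEigen is a hypothetical helper, not part of ssl_slam2:

    #include <nav_msgs/Odometry.h>
    #include <Eigen/Geometry>

    // Hypothetical helper: convert the pose in an /odom message into an
    // Eigen::Isometry3d (rotation from the quaternion, translation from position).
    Eigen::Isometry3d odomToEigen(const nav_msgs::Odometry &msg) {
        Eigen::Quaterniond q(msg.pose.pose.orientation.w,
                             msg.pose.pose.orientation.x,
                             msg.pose.pose.orientation.y,
                             msg.pose.pose.orientation.z);
        Eigen::Isometry3d T = Eigen::Isometry3d::Identity();
        T.linear() = q.normalized().toRotationMatrix();
        T.translation() = Eigen::Vector3d(msg.pose.pose.position.x,
                                          msg.pose.pose.position.y,
                                          msg.pose.pose.position.z);
        return T;
    }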

cdb0y511 commented 3 years ago

(Screenshots: 2021-04-02 16-35-19, 2021-04-02 16-36-50.) I followed the instructions to do mapping and localization. The trajectory from localization seems promising. However, the output poses applied to the saved PCDs go completely wrong.

cdb0y511 commented 3 years ago

Maybe I am confusing SLAM and SfM. For reconstruction, SfM is preferred, namely camera poses. I think the quaternions from /odom are the poses in the world coordinate frame, but perhaps each is the relative pose from the last frame. Am I right?

e04su3184 commented 3 years ago

It seems like you fed in the wrong odometry. While running the sample you should not run other cameras.

cdb0y511 commented 3 years ago

Hi @e04su3184, you mean I saved the wrong pose in https://github.com/cdb0y511/ssl_slam2/blob/fec5de0e21c49801e9a5ccfb6c921929ada04855/src/odomEstimationLocalizationNode.cpp#L127 ? How do I make it right?

e04su3184 commented 3 years ago

As far as I know, the reason your PCD files go completely wrong is that during the construction process you might have fed in redundant features that you don't need. Maybe you can try again and check the rviz result while constructing the map.

wh200720041 commented 3 years ago

Hi

The mapping and localization modules are two independent modules; they should not run together.

SfM and SLAM try to solve the same problem; the term SfM is often used in computer vision, and SLAM in robotics. In robotics, the mapping and localization units are often separated because the robot needs to localize itself over a very long period (the map usually does not change, so it is not necessary to keep updating it). Basically, we need to get a map first and then perform localization on that map.

In your case I can see that you are able to get the mapping result. After getting the mapping result, you need to quit the mapping thread by stopping the mapping node, then launch the localization node. Don't run these two nodes simultaneously.

I looked at your code, it is better to put

    if (isSavePCD) {
        std::string path = savePCD_path + std::to_string(total_frame) + ".pcd";
        pcl::io::savePCDFile<pcl::PointXYZRGB>(path, *pointcloud_in);
    }

outside the mutex lock, as the writing process is very slow (especially when you are not using an SSD). I think that if you enable the save flag, some frames will be discarded by ROS in order to maintain real-time performance, and this will lead to tracking loss.
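
A minimal sketch of that pattern with illustrative names (mutex_lock_ and saveFrontFrame are stand-ins, not ssl_slam2's actual identifiers): take what you need while holding the lock, release it, then do the slow disk write.

    #include <mutex>
    #include <queue>
    #include <string>
    #include <pcl/io/pcd_io.h>
    #include <pcl/point_types.h>

    std::mutex mutex_lock_;  // guards the shared point-cloud buffer
    std::queue<pcl::PointCloud<pcl::PointXYZRGB>::Ptr> pointCloudBuf;

    void saveFrontFrame(const std::string &savePCD_path, int total_frame) {
        pcl::PointCloud<pcl::PointXYZRGB>::Ptr snapshot;
        {
            std::lock_guard<std::mutex> lock(mutex_lock_);
            snapshot = pointCloudBuf.front();  // cheap: copies the shared pointer only
            pointCloudBuf.pop();
        }
        // The slow disk write happens outside the critical section, so the
        // odometry thread is never blocked on file I/O.
        pcl::io::savePCDFile<pcl::PointXYZRGB>(
                savePCD_path + std::to_string(total_frame) + ".pcd", *snapshot);
    }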

cdb0y511 commented 3 years ago

@wh200720041, thanks for the reply. I have followed your instructions: run mapping first, and the surf and edge points are saved. Then the localization node loads the surf and edge point clouds (with the same bag). The images may confuse you; I just want to illustrate the pose and reconstruction. I put the pose-saving step in the localization node (I will try your way later). My questions are:

  1. Is the saved quaternion the camera pose, i.e., the transform to the global coordinate frame, or the transform to the last frame?
  2. Does each saved quaternion correspond to one frame from the input bag (from 0 to the end)? In my experiment I use pcl::transformPointCloud (the point cloud, the same point cloud, and the corresponding saved quaternion matrix), and the result is completely wrong, so I think I am making some mistake with the coordinate systems. Forgive me, I am not familiar with robotics; I work on reconstruction, but I think your work is very useful for me. I look forward to your reply, thanks again.
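
To illustrate the difference the first question is asking about, a sketch assuming hypothetical containers poses and clouds are already loaded: global poses are applied directly, while relative (frame-to-frame) poses would have to be chained first.

    #include <Eigen/Geometry>
    #include <pcl/common/transforms.h>
    #include <pcl/point_types.h>
    #include <vector>

    // Merge per-frame clouds given either global poses or frame-to-frame poses.
    void mergeClouds(const std::vector<Eigen::Isometry3d> &poses,
                     const std::vector<pcl::PointCloud<pcl::PointXYZRGB>::Ptr> &clouds,
                     bool poses_are_relative,
                     pcl::PointCloud<pcl::PointXYZRGB> &merged) {
        Eigen::Isometry3d T_world = Eigen::Isometry3d::Identity();
        for (std::size_t i = 0; i < clouds.size(); ++i) {
            if (poses_are_relative)
                T_world = T_world * poses[i];  // chain incremental motions
            else
                T_world = poses[i];            // pose already maps frame i into the world
            pcl::PointCloud<pcl::PointXYZRGB> transformed;
            pcl::transformPointCloud(*clouds[i], transformed,
                                     Eigen::Matrix4f(T_world.matrix().cast<float>()));
            merged += transformed;
        }
    }
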
cdb0y511 commented 3 years ago

Hi @wh200720041, quite interesting. You mean the saved PCDs may not match the saved poses? But I got the same number of PCDs and poses. Could they be disordered?

wh200720041 commented 3 years ago

@cdb0y511

To your questions:

  1. The odom is in global coordinates

  2. Yes, the poses are generated frame by frame. As for transformPointCloud, I could not find it in your forked code; where did you put the code?

  3. If you have the same number of PCDs and poses, then there is no problem with your code; the PCDs and poses are in order. The code I pointed out is just a suggestion based on my experience. On some computers (mini PCs or Raspberry Pi) the computational power is very limited, which can cause problems.

cdb0y511 commented 3 years ago

Thanks. I have not uploaded the experiment files yet; I will check again and upload them later. In my experience SfM is all about global poses or relative poses; typically COLMAP does global optimization with bundle adjustment. But your mapping and localization processes are quite different: the mapping gets the map's global features (surf and edge points; it still gets the camera poses, otherwise the edge points could not be merged into one complete model) and does loop closure. The localization can take a different input bag, and the camera poses can be obtained through the surf and edge point clouds.

  1. Is no loop closure needed in the localization process? Since the global features are already obtained and tracking is done against the global map, no loop closure is required?
  2. I notice SSL_SLAM2 uses GTSAM as the optimizer and the octave is removed; does that improve the performance (speed or accuracy)?

cdb0y511 commented 3 years ago

    //
    // Created by cd on 2020/2/2.
    // cdb0y511@126.com
    //
    #include <yaml-cpp/yaml.h>
    #include <glog/logging.h>
    #include <sophus/se3.hpp>
    #include <pcl/common/transforms.h>
    #include <pcl/io/pcd_io.h>
    #include <pcl/io/ply_io.h>
    #include <pcl/point_types.h>
    #include <memory>
    #include <string>
    #include <vector>

    using namespace std;

    // Load the saved trajectory (one SE3 pose per frame) from a YAML file.
    bool LoadPoses(const std::shared_ptr<vector<Sophus::SE3d>> &image_tr_pattern, const char *path) {
        YAML::Node file_node;
        try {
            file_node = YAML::LoadFile(path);
        } catch (YAML::BadFile &ex) {
            LOG(ERROR) << "Cannot read file: " << path;
            return false;
        }
        YAML::Node poses_node = file_node["poses"];
        if (!poses_node.IsSequence()) {
            LOG(ERROR) << "Cannot parse file: " << path;
            LOG(ERROR) << "Root node is not a sequence";
            return false;
        }
        image_tr_pattern->resize(poses_node.size());
        for (std::size_t i = 0; i < poses_node.size(); ++i) {
            YAML::Node node = poses_node[i];
            int index = node["index"].as<int>();
            Sophus::SE3d &pose = image_tr_pattern->at(index);
            pose.translation().x() = node["tx"].as<double>();
            pose.translation().y() = node["ty"].as<double>();
            pose.translation().z() = node["tz"].as<double>();
            pose.setQuaternion(Eigen::Quaterniond(
                    node["qw"].as<double>(),
                    node["qx"].as<double>(),
                    node["qy"].as<double>(),
                    node["qz"].as<double>()));
        }
        return true;
    }

    int main(int argc, char **argv) {
        google::InitGoogleLogging(argv[0]);
        string workspace_path = "/home/cd/catkin_ws/src/ssl_slam2/";
        pcl::PointCloud<pcl::PointXYZRGB>::Ptr merged(new pcl::PointCloud<pcl::PointXYZRGB>());
        std::shared_ptr<vector<Sophus::SE3d>> image_tr_pattern(new vector<Sophus::SE3d>());

        LoadPoses(image_tr_pattern, (workspace_path + "poses/pose.yaml").c_str());
        // Transform the first 100 saved frames by their saved poses and merge them.
        for (int i = 0; i < 100; i++) {
            pcl::PointCloud<pcl::PointXYZRGB>::Ptr tmp(new pcl::PointCloud<pcl::PointXYZRGB>());
            string path = workspace_path + "pcd/" + to_string(i) + ".pcd";
            pcl::io::loadPCDFile<pcl::PointXYZRGB>(path, *tmp);
            pcl::transformPointCloud(*tmp, *tmp, image_tr_pattern->at(i).matrix().cast<float>());
            *merged += *tmp;
        }
        // pcl::io::savePLYFile stands in for the original custom PointCloudIO::savePLY helper.
        pcl::io::savePLYFile(workspace_path + "poses/merged.ply", *merged);

        return 0;
    }

cdb0y511 commented 3 years ago

Hi @wh200720041, I put the PCD saving outside the mutex lock as you suggested: https://github.com/cdb0y511/ssl_slam2/blob/8c697fcff816a6e4e5a1ae66af987e6d684b5073/src/odomEstimationLocalizationNode.cpp#L45

However, I still get the same results for the first 100 frames. (Screenshots: 2021-04-10 17-40-40, 2021-04-10 17-40-43.)

So I am attaching the saved YAML (pose.zip); maybe you can check what goes wrong.

wh200720041 commented 3 years ago

Hi @cdb0y511

To your questions

  1. "Is no loop closure needed in the localization process?" Note that the localization only matches the current points into the global map; no loop closure is required.

  2. "SSL_SLAM2 uses GTSAM as the optimizer; does it improve the performance?" Note that unlike ORB-SLAM, our work mainly uses geometric features instead of visual features; the octave is used for ORB features. There is no BA.

  3. If I understand correctly, you are trying to: 1) build a map; 2) localize based on the built map and record the point clouds and trajectory; 3) rebuild the map based on the recorded points and trajectory.

  4. To put the point-cloud saving outside the mutex, you can simply use

    
        ros::Time pointcloud_time = (pointCloudEdgeBuf.front())->header.stamp;
        pointCloudEdgeBuf.pop();
        pointCloudSurfBuf.pop();
        pointCloudBuf.pop();
        mutex_lock.unlock();
        if (isSavePCD) {
            std::string path = savePCD_path + std::to_string(total_frame) + ".pcd";
            pcl::io::savePCDFile<pcl::PointXYZRGB>(path, *pointcloud_in);
        }
Compared to the previous code:
        if (isSavePCD) {
            std::string path = savePCD_path + std::to_string(total_frame) + ".pcd";
            pcl::io::savePCDFile<pcl::PointXYZRGB>(path, *pointcloud_in);
        }

        ros::Time pointcloud_time = (pointCloudEdgeBuf.front())->header.stamp;
        pointCloudEdgeBuf.pop();
        pointCloudSurfBuf.pop();
        pointCloudBuf.pop();
        mutex_lock.unlock();

It is not necessary to use a service call, since you want to record points for every frame.

cdb0y511 commented 3 years ago

Hi @wh200720041, thanks, you have helped a lot.

For question 2, you explained the difference between ORB-SLAM and SSL_SLAM2 (visual odometry vs. LiDAR odometry, visual features vs. geometric features), so GTSAM is preferred (am I right?). But how about GTSAM vs. OctoMap? You replaced OctoMap with GTSAM in SSL_SLAM2; does GTSAM give better performance? Is SSL_SLAM2 faster or more accurate than the original SSL_SLAM?

I think the reconstruction process is the so-called map in the robotics area:
  1. The first map (sparse reconstruction) mainly recovers the trajectory (camera poses).
  2. The localization map could be built simultaneously if the input is the same bag (SLAM), but I follow your process and assume the second input bag may differ from the one used for sparse reconstruction (a rare case in reconstruction; we usually do not scan an object twice, first for the map and second for the reconstruction).
  3. Rebuilding the map is our main reconstruction process: multiple point clouds or depth frames are the input, along with the camera poses. The main purpose is to achieve a dense point cloud with high accuracy and completeness; some algorithms, like SurfelMeshing (https://github.com/puzzlepaint/surfelmeshing), can produce a decent surface at the same time. In the robotics area, ElasticFusion (https://github.com/mp3guy/ElasticFusion) is more famous.

For my experiment, pcl::transformPointCloud cannot bring the saved point clouds into the same coordinate system using the saved poses. I do not know why; it has bugged me for a long time.

wh200720041 commented 3 years ago

Hi,

  1. Compared to the first version, this version adds loop closure and global optimization; it is more accurate than SSL_SLAM. GTSAM is an optimization package and OctoMap is a mapping package; they are different (a concrete sketch follows this list).

  2. Note that

    void pcl::transformPointCloud (const pcl::PointCloud< PointT > &cloud_in, pcl::PointCloud< PointT > &cloud_out, const Eigen::Transform< Scalar, 3, Eigen::Affine > &transform, bool copy_all_fields=true)

    It seems that an Eigen transform is expected, but I checked your code and you used Sophus::SE3d; perhaps you can try Eigen instead.
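
To make the optimization-vs-mapping distinction above concrete, a minimal self-contained GTSAM pose-graph sketch (not ssl_slam2's actual graph): odometry factors plus one loop-closure factor, refined with Levenberg-Marquardt. An optimization package adjusts poses like this; a mapping package such as OctoMap would instead maintain an occupancy representation.

    #include <gtsam/geometry/Pose3.h>
    #include <gtsam/nonlinear/LevenbergMarquardtOptimizer.h>
    #include <gtsam/nonlinear/NonlinearFactorGraph.h>
    #include <gtsam/nonlinear/Values.h>
    #include <gtsam/slam/BetweenFactor.h>
    #include <gtsam/slam/PriorFactor.h>

    int main() {
        gtsam::NonlinearFactorGraph graph;
        auto noise = gtsam::noiseModel::Diagonal::Sigmas(
                (gtsam::Vector(6) << 0.1, 0.1, 0.1, 0.05, 0.05, 0.05).finished());
        // Anchor the first pose at the origin.
        graph.add(gtsam::PriorFactor<gtsam::Pose3>(0, gtsam::Pose3(), noise));
        // Odometry factors between consecutive poses (1 m steps along x).
        gtsam::Pose3 step(gtsam::Rot3(), gtsam::Point3(1.0, 0.0, 0.0));
        graph.add(gtsam::BetweenFactor<gtsam::Pose3>(0, 1, step, noise));
        graph.add(gtsam::BetweenFactor<gtsam::Pose3>(1, 2, step, noise));
        // One loop-closure factor relating pose 2 back to pose 0.
        graph.add(gtsam::BetweenFactor<gtsam::Pose3>(
                2, 0, gtsam::Pose3(gtsam::Rot3(), gtsam::Point3(-2.0, 0.0, 0.0)), noise));

        // Noisy initial guesses; the optimizer pulls them back onto the constraints.
        gtsam::Values initial;
        initial.insert(0, gtsam::Pose3());
        initial.insert(1, gtsam::Pose3(gtsam::Rot3(), gtsam::Point3(0.9, 0.1, 0.0)));
        initial.insert(2, gtsam::Pose3(gtsam::Rot3(), gtsam::Point3(2.1, -0.1, 0.0)));

        gtsam::Values result = gtsam::LevenbergMarquardtOptimizer(graph, initial).optimize();
        result.print("optimized poses:\n");
        return 0;
    }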

cdb0y511 commented 3 years ago

@wh200720041 Unfortunately, that is not the reason: Sophus::SE3d::matrix() returns an Eigen::Matrix<double, 4, 4>. I suspect the saved poses may be wrong. Could you check the pose.zip for me? The poses were saved from MappingTest.bag through ssl_slam2. I also find that different frames triggering loop closure result in different optimization results (poses); I assume this is normal. pose.zip

cdb0y511 commented 3 years ago

The saved poses simply cannot transform two PCDs into a proper common coordinate system, and I do not know why.

cdb0y511 commented 3 years ago

My bad, I figured it out. This line applies a coordinate transform: https://github.com/wh200720041/ssl_slam2/blob/8eeeba72581c4f073033c1583dff335c62a0ab47/src/laserProcessingClass.cpp#L18 Problem solved!
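
For anyone hitting the same issue: assuming that line applies a fixed frame transform to every incoming cloud before odometry estimation, the saved global poses then live in the transformed frame, so the raw saved clouds need the same fixed transform before the pose is applied. A sketch under that assumption (T_fix is a placeholder to be copied from laserProcessingClass.cpp):

    #include <Eigen/Geometry>
    #include <pcl/common/transforms.h>
    #include <pcl/point_types.h>

    // T_fix: the fixed transform from laserProcessingClass.cpp#L18 (placeholder here);
    // T_odom: a saved global pose. The saved poses live in the frame produced by
    // T_fix, not in the raw sensor frame, so T_fix must be applied first.
    void rawCloudToWorld(const pcl::PointCloud<pcl::PointXYZRGB> &raw,
                         const Eigen::Matrix4f &T_fix,
                         const Eigen::Matrix4f &T_odom,
                         pcl::PointCloud<pcl::PointXYZRGB> &world) {
        pcl::transformPointCloud(raw, world, Eigen::Matrix4f(T_odom * T_fix));
    }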