rayvburn / ORB-SLAM2_ROS

A ROS wrapper for ORB-SLAM2

Pointcloud transformation #13

Open MagdaZal opened 1 year ago

MagdaZal commented 1 year ago

Hi! I want to add an INS to tf, which will transform the point cloud from ORB-SLAM2 into real-world coordinates. My idea is to do the transformation so that the INS is at the rotational center of the robot.

My tf_tree looks like this: INS -> map -> base_link -> camera_link

My problem: despite the configured transformation, the point cloud is saved in the ORB-SLAM2 coordinate system, not in the real-world coordinate system given by the INS. For my future work I have to get the point cloud in world (ECEF) coordinates.

Has anyone ever had this problem or can give me some advice?

rayvburn commented 1 year ago

Hello!

Not really sure what you are planning to do with the point cloud - some more details about the application context would be helpful. However, if I understand correctly, you want to express the point cloud (that ORB-SLAM2 produces) in your custom coordinate system. Hence, you should:

Also, it is not clear to me what you mean by INS - is it an Inertial Navigation System? If your IMU sensor is attached to the robot, you should reference the sensor's position to the robot's base_link (or to the rotational center, if the robot is not circular).

MagdaZal commented 1 year ago

Hi @rayvburn! Thank you for your reply. If I see correctly, you have a Polish first and last name. I am from Poland and it would be easier for me to explain this in Polish. If I am wrong, please forgive me and I will translate it into English.

I have a floating drone (a surface vehicle) with a camera, a LiDAR, and an inertial navigation system (GNSS antennas, an IMU sensor). I want to obtain a point cloud separately from the camera component (using ORB-SLAM2) and separately from the LiDAR (using the package dedicated to that device). Both components run under ROS Melodic. I want to transform the coordinates of these points into a real-world frame, e.g. WGS 84, for further analysis later on.

1) Between the INS and "map" I created a broadcaster that subscribes to the /imu/data and /imu/pos_ecef topics. A fragment below:

...
            transform_msg = TransformStamped()
            transform_msg.header.stamp = rospy.Time.now()
            transform_msg.header.frame_id = "INS"
            transform_msg.child_frame_id = "map"
...
    rospy.Subscriber("/imu/data", Imu, sbg_publisher.imu_callback)
    rospy.Subscriber("/imu/pos_ecef", PointStamped, sbg_publisher.pos_callback)
...
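One thing worth checking with a broadcaster like the one above: publishing the raw ECEF position directly as the transform puts the robot millions of metres from the origin, which TF and RViz handle poorly. A common approach is to convert each ECEF fix into a local ENU ("map") frame anchored at a reference (datum) point. Below is a minimal, hedged sketch of that conversion in plain NumPy - it is independent of the ROS callbacks above, and the function names are illustrative, not part of any ROS API:

```python
import numpy as np

# WGS 84 ellipsoid constants
A = 6378137.0                  # semi-major axis [m]
F = 1.0 / 298.257223563        # flattening
E2 = F * (2.0 - F)             # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, h):
    """Convert geodetic coordinates (WGS 84) to an ECEF position [m]."""
    lat, lon = np.radians(lat_deg), np.radians(lon_deg)
    n = A / np.sqrt(1.0 - E2 * np.sin(lat) ** 2)  # prime vertical radius
    x = (n + h) * np.cos(lat) * np.cos(lon)
    y = (n + h) * np.cos(lat) * np.sin(lon)
    z = (n * (1.0 - E2) + h) * np.sin(lat)
    return np.array([x, y, z])

def ecef_to_enu(p_ecef, lat0_deg, lon0_deg, h0):
    """Express an ECEF point in the local East-North-Up frame at the datum."""
    lat0, lon0 = np.radians(lat0_deg), np.radians(lon0_deg)
    # Rotation taking ECEF axes into the local ENU axes at (lat0, lon0)
    r = np.array([
        [-np.sin(lon0),                np.cos(lon0),                 0.0],
        [-np.sin(lat0) * np.cos(lon0), -np.sin(lat0) * np.sin(lon0), np.cos(lat0)],
        [ np.cos(lat0) * np.cos(lon0),  np.cos(lat0) * np.sin(lon0), np.sin(lat0)],
    ])
    return r @ (p_ecef - geodetic_to_ecef(lat0_deg, lon0_deg, h0))
```

For example, a fix taken 100 m directly above the datum point should come out as ENU ≈ (0, 0, 100). The broadcaster's translation would then be the ENU vector rather than the raw ECEF coordinates.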

2) If I understand correctly, ORB-SLAM2 transforms the points between the "map" and "base_link" frames. 3) Between "base_link" and "camera_link" I created a static transform and, for testing, set all values to zero.

<node pkg="tf2_ros" type="static_transform_publisher" name="gopro_offset" args="0 0 0 0 0 0 base_link camera_link" />

However, I do not fully understand how to determine "base_link". I understand that it is the attachment point between the robot's frame and, say, the world, but it is not a physical point. That is why I had the idea of making the INS the base_link, but that does not work either. And no matter how I rearrange the frames, I get the point cloud from ORB-SLAM2 in the camera frame (?), not with real-world coordinates.

rayvburn commented 1 year ago

@MagdaZal that is true; however, I will keep my reply in English so that English speakers can also benefit from this conversation.

So, the main question is: is your camera monocular? My package was designed to help estimate the scale of the point cloud generated by a monocular camera, since this type of camera cannot recover point cloud scale on its own.

Is your LiDAR able to emit multiple vertical lines, like e.g. the VLP-16? If so, I think it could give you a much more accurate point cloud than ORB-SLAM2 with a monocular camera.

Since you have GNSS (I think we should call it that, rather than INS), you will probably be interested in the navsat_transform node (see the docs), which can help you transform from global coordinate systems (referenced to a so-called datum) to the robot's coordinate system (like base_link). IIRC, the navsat_transform node can produce a static transform from a global coordinate system (like WGS 84) to the datum, and it also dynamically transforms from datum = map to the robot's odom. Have a look at this component.
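For orientation, a minimal launch sketch for that node (it lives in the robot_localization package) might look like the following. Note that it consumes a sensor_msgs/NavSatFix (latitude/longitude), not an ECEF point, and the topic names in the remaps are placeholders - point them at whatever your GNSS/IMU driver actually publishes:

```xml
<launch>
  <node pkg="robot_localization" type="navsat_transform_node" name="navsat_transform" output="screen">
    <param name="frequency" value="10.0" />
    <param name="magnetic_declination_radians" value="0.0" />
    <param name="yaw_offset" value="0.0" />
    <param name="zero_altitude" value="false" />
    <param name="broadcast_utm_transform" value="true" />

    <!-- Placeholders: remap to the topics your drivers actually publish -->
    <remap from="gps/fix" to="/your/navsatfix_topic" />
    <remap from="imu/data" to="/imu/data" />
    <remap from="odometry/filtered" to="/odometry/filtered" />
  </node>
</launch>
```

Check the robot_localization documentation for your ROS distribution before relying on the exact parameter names.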

  1. Try to use the navsat_transform node instead of noisy IMU sensors.
  2. ORB-SLAM2 generally produces a point cloud expressed in the frame of the camera sensor. The point cloud can later be transformed into any frame you have defined. Keep in mind that you do not have to transform the point cloud if you only want to visualize it in RViz. If something is broken at this stage, it means your TF tree is messed up.
  3. I understand that your custom robot does not have a URDF description in ROS yet. base_link should be placed somewhere inside the robot; for circular robots it is usually located at the geometric center. Once you have your base_link, you should reference the sensor offsets to it (e.g. using static_transform_publisher). Have a look at how robot links (e.g. base_link) are prepared in some ROS package, e.g. via https://robots.ros.org/category/aerial/. You can also find the ROS frame conventions in REP 103.
    1. Yes, as I tried to explain in GNSS paragraph.
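To make point 2 concrete: "transforming the point cloud into another frame" just means applying the rigid-body transform (rotation plus translation) that TF holds for that frame pair to every point. A standalone sketch of the underlying math in NumPy (in an actual ROS node you would use tf2 and the point-cloud message utilities instead; the function names here are illustrative):

```python
import numpy as np

def quat_to_rot(x, y, z, w):
    """Rotation matrix from a unit quaternion, in TF's (x, y, z, w) order."""
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - z*w),     2*(x*z + y*w)],
        [2*(x*y + z*w),     1 - 2*(x*x + z*z), 2*(y*z - x*w)],
        [2*(x*z - y*w),     2*(y*z + x*w),     1 - 2*(x*x + y*y)],
    ])

def transform_cloud(points, translation, quaternion):
    """Apply one frame-to-frame transform to an (N, 3) array of points."""
    r = quat_to_rot(*quaternion)
    return points @ r.T + np.asarray(translation)
```

As a sanity check, rotating the point (1, 0, 0) by 90 degrees about z (quaternion (0, 0, sqrt(0.5), sqrt(0.5))) and translating by (1, 2, 3) should yield (1, 3, 3).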

Did you try asking a question at https://answers.ros.org/questions/? I think general ROS questions will get much more attention on the forum than on GitHub.

As for the ORB-SLAM2 package from this repository: did you manage to run any of the .bag files from the README? That should give you some hints on how to manage the TF frames.

MagdaZal commented 1 year ago

Yes, I use a monocular camera (a GoPro). I hadn't known about navsat_transform before; I think this node might help. I will try to use this transformation in the next few days. Thank you!

rayvburn commented 1 year ago

So, with a monocular camera, you should also feed my modification of ORB-SLAM2 with odometry data to perform the scale correction. The resulting point cloud should be pretty accurate. Have fun :)