lardemua / atom

Calibration tools for multi-sensor, multi-modal robotic systems
GNU General Public License v3.0

Compute agrob zed-velodyne cost function #149

Closed: aaguiar96 closed this issue 4 years ago

aaguiar96 commented 4 years ago

Hi @miguelriemoliveira I've been busy these days. Trying to start working on this today. :)

miguelriemoliveira commented 4 years ago

Hi André,

we're synced. I also did not have any time to devote to this until tonight or tomorrow. Miguel

aaguiar96 commented 4 years ago

Hi @miguelriemoliveira

Is this output normal for agrob_view_optimization?

[ERROR] [1589210974.404574464]: PluginlibFactory: The plugin for class 'rviz_visual_tools/KeyTool' failed to load.  Error: According to the loaded plugin descriptions the class rviz_visual_tools/KeyTool with base class type rviz::Tool does not exist. Declared types are  rviz/FocusCamera rviz/Interact rviz/Measure rviz/MoveCamera rviz/PublishPoint rviz/Select rviz/SetGoal rviz/SetInitialPose rviz_plugin_tutorials/PlantFlag
[rospack] Error: no package given
[librospack]: error while executing command
[rospack] Error: no package given
[librospack]: error while executing command
[WARN] [1589210974.415142]: The 'use_gui' parameter was specified, which is deprecated.  We'll attempt to find and run the GUI, but if this fails you should install the 'joint_state_publisher_gui' package instead and run that.  This backwards compatibility option will be removed in Noetic.
[INFO] [1589210974.430104]: xacro_file is /home/andre/Source/catkin_ws/src/agrob/agrob_description/urdf/agrob.urdf.xacro
[INFO] [1589210974.433015]: prefix is initial_
[INFO] [1589210974.434224]: Reading xml xacro file ...
Unknown attribute "type" in /robot[@name='agrob']/transmission[@name='front_left_wheel_trans']
Unknown attribute "type" in /robot[@name='agrob']/transmission[@name='front_right_wheel_trans']
Unknown attribute "type" in /robot[@name='agrob']/transmission[@name='rear_left_wheel_trans']
Unknown attribute "type" in /robot[@name='agrob']/transmission[@name='rear_right_wheel_trans']
[INFO] [1589210974.451797]: Changing link names ...
[INFO] [1589210974.452871]: Changing joint names, parents and childs ...
[INFO] [1589210974.453733]: Writting new initial_robot_description parameter
[create_initial_robot_description-2] process has finished cleanly
log file: /home/andre/.ros/log/36517cbc-939c-11ea-921d-a0a4c5480909/create_initial_robot_description-2*.log
/opt/ros/melodic/lib/robot_state_publisher/robot_state_publisher: __name:=robot_state_publisher: command not found
[robot_state_publisher-3] process has died [pid 20426, exit code 127, cmd bash -c sleep 2; $4 $@ /opt/ros/melodic/lib/robot_state_publisher/robot_state_publisher __name:=robot_state_publisher __log:=/home/andre/.ros/log/36517cbc-939c-11ea-921d-a0a4c5480909/robot_state_publisher-3.log].
log file: /home/andre/.ros/log/36517cbc-939c-11ea-921d-a0a4c5480909/robot_state_publisher-3*.log

Also, I'm getting this error when launching test/sensor_pose_json_v2/main.py:

Traceback (most recent call last):
  File "test/sensor_pose_json_v2/main.py", line 423, in <module>
    main()
  File "test/sensor_pose_json_v2/main.py", line 367, in main
    'x_scale': 'jac'})
  File "/home/andre/Source/OptimizationUtils/OptimizationUtils/OptimizationUtils.py", line 367, in startOptimization
    bounds=(bounds_min, bounds_max), method='trf', args=(), **optimization_options)
  File "/home/andre/.local/lib/python2.7/site-packages/scipy/optimize/_lsq/least_squares.py", line 871, in least_squares
    J0 = jac_wrapped(x0, f0)
  File "/home/andre/.local/lib/python2.7/site-packages/scipy/optimize/_lsq/least_squares.py", line 865, in jac_wrapped
    kwargs=kwargs, sparsity=jac_sparsity)
  File "/home/andre/.local/lib/python2.7/site-packages/scipy/optimize/_numdiff.py", line 400, in approx_derivative
    groups, method)
  File "/home/andre/.local/lib/python2.7/site-packages/scipy/optimize/_numdiff.py", line 500, in _sparse_difference
    df = fun(x) - f0
  File "/home/andre/.local/lib/python2.7/site-packages/scipy/optimize/_numdiff.py", line 348, in fun_wrapped
    f = np.atleast_1d(fun(x, *args, **kwargs))
  File "/home/andre/Source/OptimizationUtils/OptimizationUtils/OptimizationUtils.py", line 268, in internalObjectiveFunction
    self.vis_function_handle(self.data_models)  # call visualization function
  File "/home/andre/Source/OptimizationUtils/test/sensor_pose_json_v2/visualization.py", line 293, in visualizationFunction
    dataset_graphics = data['dataset_graphics']
KeyError: 'dataset_graphics'
miguelriemoliveira commented 4 years ago

Hi @aaguiar96 ,

The view optimization is OK.

main.py has a problem. I will fix it as soon as possible (probably tonight).

miguelriemoliveira commented 4 years ago

@aaguiar96 ,

I cannot replicate your error. I launch like this:

clear && test/sensor_pose_json_v2/main.py -json ~/datasets/agrob_first_test/data_collected.json -si -rv -ssf "lambda name: name in ['left_camera', 'vlp16']"

How do you run the optimizer?

aaguiar96 commented 4 years ago

How do you run the optimizer?

roslaunch interactive_calibration agrob_view_optimization.launch

And main.py exactly as you do... Are you using this dataset?

aaguiar96 commented 4 years ago

Hi @miguelriemoliveira

This morning I implemented the first version of the velodyne cost function (you can pull it from OptimizationUtils). I used pretty much the same approach that you used in the planar laser case, for the point-to-plane distance.

I will summarize the approach so that you can tell me if I understood what's going on:

  • Compute the chessboard plane in the sensor reference frame
  • Calculate the intersection of the line that goes from the sensor origin through the measured point with the plane estimated in the previous step
  • The residual is the difference between the distance from the sensor origin to the intersected point and the actual range measure taken by the sensor

I've got some questions now (sorry if they are stupid):

  • The transformation between the chessboard and the sensor is computed locally in objective_function.py. The "global/public" transformations are the ones present in the chain of the JSON file for each sensor. So, my question is, do we optimize the entire chain or only the transformation between the target sensors?
  • Are all the objective functions always computed in relation to the chessboard? Does this mean that the calibration procedure requires at least one camera?
  • How do I test my objective function?

Also, I'm still stuck with that bug ...

Thanks in advance Miguel! :)

miguelriemoliveira commented 4 years ago

Hi André,

sorry for the late reply. I am not working today.

I just pushed a new version. Can you try now?

I will read your email more carefully later and call you, OK?

Miguel

miguelriemoliveira commented 4 years ago

Hi @aaguiar96 ,

  • Compute the chessboard plane in the sensor reference frame
  • Calculate the intersection of the line that goes from the sensor origin through the measured point with the plane estimated in the previous step
  • The residual is the difference between the distance from the sensor origin to the intersected point and the actual range measure taken by the sensor

You are right. We call this the beam distance. It is an improvement over the orthogonal distance, which was our first distance metric and is not used anymore (I forgot about this and sent you hunting for the orthogonal distance, sorry).
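
For reference, a minimal sketch of that beam distance (my notation, not the project's code; the plane is assumed given as a unit normal n and offset d in the sensor frame, i.e. n.x + d = 0):

import numpy as np

def beam_residual(point_sensor, n, d):
    # point_sensor: 3D lidar measurement in the sensor reference frame.
    rng = np.linalg.norm(point_sensor)    # measured range
    direction = point_sensor / rng        # beam direction from the sensor origin
    # Intersection of the ray t * direction with the plane n.x + d = 0:
    t = -d / np.dot(n, direction)
    # Residual: predicted range along the beam minus the measured range.
    return t - rng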

If you want to implement the orthogonal distance, this is how it worked:

  1. The goal is to bring the laser measurements to the chessboard reference frame.
  2. Thus we multiply the measurements (3D points in the lidar reference frame) by the root_T_sensor transformation, and later by the chessboard_T_root transformation.
  3. With this we have transported those 3D points from the laser reference frame to the chessboard reference frame.
  4. Note that, since the points are now in the chessboard reference frame, we just have to look at the z coordinate to get the distance to the chessboard plane (see the sketch after this list).
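
A minimal sketch of those steps, assuming root_T_sensor and chessboard_T_root are available as 4x4 homogeneous numpy arrays (the names follow the convention above; this is not the actual objective_function.py code):

import numpy as np

def orthogonal_distances(points_sensor, root_T_sensor, chessboard_T_root):
    # points_sensor: (N, 3) lidar points in the sensor reference frame.
    n = points_sensor.shape[0]
    pts_h = np.vstack((points_sensor.T, np.ones((1, n))))  # homogeneous, (4, N)
    # Sensor frame -> root frame -> chessboard frame.
    pts_chessboard = np.dot(chessboard_T_root, np.dot(root_T_sensor, pts_h))
    # In the chessboard frame the plane is z = 0, so the signed orthogonal
    # distance is just the z coordinate of each point.
    return pts_chessboard[2, :]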

The transformation between the chessboard and the sensor is computed locally in objective_function.py. The "global/public" transformations are the ones present in the chain of the JSON file for each sensor. So, my question is, do we optimize the entire chain or only the transformation between the target sensors?

We optimize only the atomic transformations that are marked for optimization, not the entire chain. Atomic transformations are the actual transforms that exist in the tf tree, as opposed to aggregate transformations. For example, in your case, from your config.json:

  "sensors": {
        "right_camera": {
            "link": "zed_right_camera_optical_frame",
            "parent_link": "zed_camera_center",
            "child_link" : "zed_right_camera_frame",
            "topic_name" : "/zed/zed_node/right/image"
        },

        "left_camera": {
            "link": "zed_left_camera_optical_frame",
            "parent_link": "zed_camera_center",
            "child_link" : "zed_left_camera_frame",
            "topic_name" : "/zed/zed_node/left/image"
        },

    "vlp16": {
       "link": "velodyne",
       "parent_link": "tower_link",
       "child_link": "vlp16_frame",
       "topic_name": "/velodyne_points"
    }

you can see that we marked 3 atomic transformations to be optimized. All the others are untouched.

zed_camera_center to zed_right_camera_frame, zed_camera_center to zed_left_camera_frame, and tower_link to vlp16_frame.
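
To make the distinction concrete, a small sketch (illustrative, not the actual OptimizationUtils code): an aggregate transform such as root_T_sensor is the product of the atomic tf-tree transforms along the chain, and only the atomic ones marked for optimization change between iterations.

import numpy as np
from functools import reduce

def aggregate(chain):
    # chain: list of 4x4 atomic transforms, ordered root -> ... -> sensor.
    return reduce(np.dot, chain)

# e.g. root_T_vlp16 = base_T_tower . tower_T_vlp16, where only
# tower_T_vlp16 (the parent_link -> child_link pair above) is optimized.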

Are all the objective functions always computed in relation to the chessboard?

Yes, and we also estimate the pose of the chessboard during the optimization (actually, let's start using the term pattern, since it covers both charucos and chessboards). As such, an initial guess for these parameters must also be fed into the system to initialize the optimization.

To get that initial guess we use solvePnP, which needs at least one camera that detects the pattern.
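
A runnable sketch of that initial guess (grid size, square length, and intrinsics below are illustrative, not the agrob values; detections are synthesized so the example is self-contained):

import numpy as np
import cv2

rows, cols, square = 6, 9, 0.1                      # example pattern geometry
objp = np.zeros((rows * cols, 3), np.float32)
objp[:, :2] = square * np.mgrid[0:cols, 0:rows].T.reshape(-1, 2)

K = np.array([[700., 0., 320.], [0., 700., 240.], [0., 0., 1.]])  # example intrinsics
D = np.zeros(5)                                     # example distortion

# Synthesize image detections by projecting the pattern with a known pose ...
rvec_gt = np.array([0.1, -0.2, 0.05])
tvec_gt = np.array([0.0, 0.0, 1.5])
corners, _ = cv2.projectPoints(objp, rvec_gt, tvec_gt, K, D)

# ... and recover camera_T_pattern, the first guess for the pattern pose.
ret, rvec, tvec = cv2.solvePnP(objp, np.float32(corners), K, D)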

Does this mean that the calibration procedure requires at least one camera?

Yes, for now. If/when we need to, we can come up with an alternative way of initializing the pattern poses.

How do I test my objective function?

That's a difficult one. You can print the errors, but that will only take you so far... The best approach is to observe the system during optimization and try to understand whether it is doing what we want. You can also view the graphs and check that the error is decreasing. But this is one of the tricky parts; we are actually never sure... @eupedrosa, can you add something here?
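
A trivial sketch of watching those graphs (illustrative; errors_per_iteration is a hypothetical list the objective function would append to, with example values here):

import matplotlib.pyplot as plt

errors_per_iteration = [1.0, 0.6, 0.35, 0.2, 0.12, 0.09]  # example values
plt.plot(errors_per_iteration)
plt.xlabel('iteration')
plt.ylabel('total residual')
plt.show()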

miguelriemoliveira commented 4 years ago

Hi @aaguiar96 ,

How do you run the optimizer?

roslaunch interactive_calibration agrob_view_optimization.launch
And main.py exactly as you do... Are you using this dataset?

Yes, I am using the same dataset as you.

I ran this test, and it works ...

https://youtu.be/fHILXskESA0

Perhaps it's your PyCharm setup? Check the video ...

Also, about your question of whether the objective function is working: I told you it is hard to know when it's working. Luckily, it is easier to know when it is not. Take a look at this image

image

aaguiar96 commented 4 years ago

Hi @miguelriemoliveira

Thanks for the great replies. You clarified all my questions. :) I'll try again to run the system following your video. This is weird...

Also, about your question of whether the objective function is working: I told you it is hard to know when it's working. Luckily, it is easier to know when it is not. Take a look at this image

I thought I also had to implement the setter and getter functions to see the calibration working. Did you use setter and getter functions that were already implemented? :)

Anyway, we are moving forward. Shall we talk tomorrow?

aaguiar96 commented 4 years ago

Another thing @miguelriemoliveira

I'm returning to my hometown this weekend. So, from Monday onward I'm available to go to the lab to record new rosbags.

miguelriemoliveira commented 4 years ago

Hi @aaguiar96 ,

can we Zoom tomorrow at 14h45?

Good news that you can record a new bagfile. I was planning to give you the charuco we have, but I am not sure I will go to Aveiro in the meantime.

When are you going to pass by Aveiro? Can you tell me a day / time?

Thanks, Miguel

aaguiar96 commented 4 years ago

can we Zoom tomorrow at 14h45?

Yes @miguelriemoliveira, deal!

When are you going to pass by Aveiro? Can you tell me a day / time?

I'm not sure of the day and time yet. I'll let you know soon.

aaguiar96 commented 4 years ago

Hi @miguelriemoliveira and @eupedrosa

I was thinking... Isn't it enough to align the corners of the pattern with the four extrema beam points, instead of using all the extrema points?

miguelriemoliveira commented 4 years ago

Hi André. The problem is the discretization of the point cloud. You can't be sure that you have a measurement of the real corner.

But if you want to start with that, it is a good approximation.
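
A sketch of that corner-based approximation (illustrative; both point sets are assumed expressed in a common frame and matched by index):

import numpy as np

def corner_residuals(pattern_corners, lidar_extrema):
    # pattern_corners, lidar_extrema: (4, 3) arrays, matched by index.
    return np.linalg.norm(pattern_corners - lidar_extrema, axis=1)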

aaguiar96 commented 4 years ago

Ok, I've implemented it with the corners and obtained this result:

optimization_1 optimization_2

It's funny because the solution converges to an inclination on the velodyne, but in reality it is the ZED that has an inclination of roughly 11º, looking down. So it seems the solution is trying to compensate for that. @miguelriemoliveira, why does the ZED not move? Is that because it is anchored?

P.S. This is still with the old dataset, because I'm getting an error when running with the new one...

eupedrosa commented 4 years ago

Do not trust your eyes; you are calibrating the sensor constellation, not the sensors relative to the robot's body.

why does the ZED not moves? Is that because it is anchored?

YES

miguelriemoliveira commented 4 years ago

Looks good to me

aaguiar96 commented 4 years ago

Hmmm... I checked the corner distances after converging, and some corners are still a bit distant. Maybe some transformation is wrong. I'll check it out.

eupedrosa commented 4 years ago

How much is a "bit distant"?

aaguiar96 commented 4 years ago

15cm... Clearly wrong, no?

eupedrosa commented 4 years ago

Yes.. Clearly wrong..

aaguiar96 commented 4 years ago

Thanks, I'll check the math.

aaguiar96 commented 4 years ago

Hi guys.

Have you tried to run the optimization with the new dataset? I'm getting this error while executing the optimization:

clear && test/sensor_pose_json_v2/main.py -json /home/andre/Documents/rosbags/agrob_second_dataset/data_collected.json -si -rv -ssf "lambda name: name in ['left_camera', 'right_camera', 'vlp16']"
Argument list={'skip_vertices': 1, 'use_incomplete_collections': False, 'single_pattern': False, 'json_file': '/home/andre/Documents/rosbags/agrob_second_dataset/data_collected.json', 'collection_selection_function': None, 'view_projected_vertices': False, 'ros_visualization': True, 'z_inconsistency_threshold': 0.05, 'sensor_selection_function': <function <lambda> at 0x7f145f8e2f50>, 'show_images': True, 'view_optimization': False}

Collections studied:
 [u'11', u'10', u'13', u'12', u'15', u'14', u'17', u'16', u'18', u'1', u'0', u'3', u'2', u'5', u'4', u'7', u'6', u'9', u'8']
Traceback (most recent call last):
  File "test/sensor_pose_json_v2/main.py", line 426, in <module>
    main()
  File "test/sensor_pose_json_v2/main.py", line 148, in main
    dataset_sensors['chessboards'], dataset_chessboard_points = createChessBoardData(args, dataset_sensors)
  File "/home/andre/Source/catkin_ws/src/OptimizationUtils/test/sensor_pose_json_v2/chessboard.py", line 278, in createChessBoardData
    ret, rvecs, tvecs = cv2.solvePnP(objp, corners, K, D)
cv2.error: OpenCV(4.2.0) /io/opencv/modules/calib3d/src/solvepnp.cpp:754: error: (-215:Assertion failed) ( (npoints >= 4) || (npoints == 3 && flags == SOLVEPNP_ITERATIVE && useExtrinsicGuess) ) && npoints == std::max(ipoints.checkVector(2, CV_32F), ipoints.checkVector(2, CV_64F)) in function 'solvePnPGeneric'

I checked, and the dataset always has more than 30-40 pattern points... Any idea?

aaguiar96 commented 4 years ago

Also,

Due to the bug, I could not do much debugging, but what I mentioned here:

Hmmm... I checked the corner distances after converging, and some corners are still a bit distant. Maybe some transformation is wrong. I'll check it out.

might be related to wrong chessboard dimensions in the config file for the first dataset. Also (and this is just a question), since in the config file we always specify the number of squares minus 1, shouldn't we add 1 while computing the limit points here (for the cases where the pattern is a charuco)?

https://github.com/miguelriemoliveira/OptimizationUtils/blob/a2055032709a80556670ae989228cdedb7cef87f/test/sensor_pose_json_v2/chessboard.py#L37
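
To illustrate the off-by-one being asked about (example numbers, not the agrob config): the config stores interior-corner counts, i.e. squares minus 1, so spanning the physical board needs one extra square.

n_x, size = 9, 0.1                          # 9 interior corners -> a 10-square-wide board
span_without_plus_one = n_x * size          # 0.9 m: misses the border squares
span_with_plus_one = (n_x + 1) * size       # 1.0 m: the physical board width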

aaguiar96 commented 4 years ago

Another thing (sorry about the spam...)

Since the ZED camera has an inclination in reality that is not represented in the agrob description, wouldn't it be better to anchor the velodyne?

eupedrosa commented 4 years ago

About the error, try replacing

cv2.solvePnP(objp, corners, K, D)

with

cv2.solvePnP(objp, np.array(corners, dtype=np.float32), K, D)

The error you have is usually a symptom of a bad numpy type in one of the arguments.

might be related to wrong chessboard dimensions in the config file for the first dataset.

Unrelated.

Also (and this is just a question), since in the config file we always specify the number of squares minus 1, shouldn't we add 1 while computing the limit points here (for the cases where the pattern is a charuco)?

What limits are you talking about?

Since the ZED camera has an inclination in reality that is not represented in the agrob description, wouldn't it be better to anchor the velodyne?

It is neither better nor worse. But yeah, you can change to the velodyne.

aaguiar96 commented 4 years ago

Thanks @eupedrosa

Unrelated.

Why do you say so? I am using the pattern limit points that were already calculated in the optimization code. However, these calculations use the pattern sizes set in the config, which were wrong in the first dataset since I did not measure the actual pattern square sizes.

What limits are you taking about?

The pattern limits calculated in the optimization framework, in the link I sent before.

It is neither better nor worse. But yeah, you can change to the velodyne.

Visually, the ZED will never become inclined if it is anchored, right? Of course, if the optimization is right, the solution is equivalent in both cases! :)

eupedrosa commented 4 years ago

Unrelated.

Why do you say so?

Because the number of objp points should be equal to the number of corners. Even if the number was wrong for the given pattern, they should still match.

However, with a charuco they may not match. Charuco has partial detections. That can also be the problem. I solved that in general_hand_eye.py.
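
A sketch of handling those partial detections (illustrative, not the actual general_hand_eye.py code; board geometry, dictionary, intrinsics, and the image path are example values): only the 3D corners whose charuco IDs were detected are passed to solvePnP, so the counts always match.

import numpy as np
import cv2

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_5X5_100)
board = cv2.aruco.CharucoBoard_create(9, 6, 0.1, 0.078, dictionary)

gray = cv2.imread('collection.png', cv2.IMREAD_GRAYSCALE)  # hypothetical image
marker_corners, marker_ids, _ = cv2.aruco.detectMarkers(gray, dictionary)
n, ch_corners, ch_ids = cv2.aruco.interpolateCornersCharuco(
    marker_corners, marker_ids, gray, board)

# Keep only the 3D corners that were actually detected.
objp = board.chessboardCorners[ch_ids.flatten()]
K = np.array([[700., 0., 320.], [0., 700., 240.], [0., 0., 1.]])  # example intrinsics
D = np.zeros(5)
ret, rvec, tvec = cv2.solvePnP(objp, np.float32(ch_corners), K, D)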

miguelriemoliveira commented 4 years ago

Hi @aaguiar96 and @eupedrosa ,

sorry, I was with the kids until now. From what I read there is a path to a solution already.

Do you need something from my side?

aaguiar96 commented 4 years ago

About the error, try replacing

cv2.solvePnP(objp, corners, K, D)

with

cv2.solvePnP(objp, np.array(corners, dtype=np.float32), K, D)

@miguelriemoliveira and @eupedrosa, Eurico's suggestion did not work. I still get the same error when running the optimization on the new dataset. Can you try it out and check if you get it as well? :)

aaguiar96 commented 4 years ago

Also,

I have the same question that I had before... I'm using the pattern extrema points calculated in chessboard.py that are also used in the LaserScan case.

I was checking why the corners of the pattern and the corners of the lidar points on the pattern are so distant, and I noticed that:

Of course I may be doing something wrong, but my question is: @miguelriemoliveira, are the pattern limit point calculations general enough to be used for both a chessboard and a charuco? For example, did you see what I mentioned here?

Also (and this is just a question), since in the config file we always specify the number of squares minus 1, shouldn't we add 1 while computing the limit points here (for the cases where the pattern is a charuco)? https://github.com/miguelriemoliveira/OptimizationUtils/blob/a2055032709a80556670ae989228cdedb7cef87f/test/sensor_pose_json_v2/chessboard.py#L37

Thanks guys!

miguelriemoliveira commented 4 years ago

Hi @aaguiar96 ,

I will be teaching tomorrow and Wednesday morning. @eupedrosa, can you try to meet with @aaguiar96 to help him with this?

If you can't solve it by Wednesday afternoon, then we can meet and I will try to help, OK?

eupedrosa commented 4 years ago

I think I can help. @aaguiar96, I am available tomorrow after 17h. If you still need my help, just send me an email to set up a meeting.

aaguiar96 commented 4 years ago

Nice @eupedrosa

Can we meet at 18h00? We can also talk about the collada file generation! :)

eupedrosa commented 4 years ago

Ok, 18h00.

aaguiar96 commented 4 years ago

#159 has to be solved first.

aaguiar96 commented 4 years ago

With the new dataset. Not looking good so far.

bad_res

miguelriemoliveira commented 4 years ago

Did you already implement #159? Are these results for that? Can we talk at 15h?

aaguiar96 commented 4 years ago

Did you already implement #159? Are these results for that? Can we talk at 15h?

I did not, @miguelriemoliveira. These results are still with the problematic approach.

aaguiar96 commented 4 years ago

Hi @miguelriemoliveira

Using the code that you had for visualizing the limit points, I got this:

calib

I have to transform dataset['chessboards']['limit_points'] to the root frame, right?

miguelriemoliveira commented 4 years ago

No, they should be in the local frame. RViz transforms the points using the tfs...
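
A minimal sketch of that point (node name, frame name, and the point list are illustrative): publish the limit points with the pattern's local frame as frame_id and let RViz apply the tf chain.

import rospy
from geometry_msgs.msg import Point
from visualization_msgs.msg import Marker

rospy.init_node('limit_points_viz')              # hypothetical node name
limit_points_xy = [(0.0, 0.0), (0.9, 0.0), (0.9, 0.6), (0.0, 0.6)]  # example corners

marker = Marker()
marker.header.frame_id = 'chessboard_link'       # the pattern's local frame, not root
marker.header.stamp = rospy.Time.now()
marker.type = Marker.SPHERE_LIST
marker.action = Marker.ADD
marker.pose.orientation.w = 1.0
marker.scale.x = marker.scale.y = marker.scale.z = 0.02
marker.color.g = marker.color.a = 1.0
marker.points = [Point(x, y, 0.0) for x, y in limit_points_xy]  # local frame, z = 0
# RViz applies chessboard_link -> root via the tf tree when rendering.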

aaguiar96 commented 4 years ago

Ok, got them.

points

I'll start to work on the LiDAR corners calculation! :)

miguelriemoliveira commented 4 years ago

Great. You should also draw the horizontal lines ...

aaguiar96 commented 4 years ago

You commented out the vertical lines in the calculation of the limit_points. Can I uncomment them?

eupedrosa commented 4 years ago

What's the news????? What did I miss?

miguelriemoliveira commented 4 years ago

Sure

aaguiar96 commented 4 years ago

What's the news????? What did I miss?

We noticed that the corners are not being calculated correctly. Also, we are now drawing the corners of the lidar points and the pattern limits.

I'll start fixing the corner calculation.

eupedrosa commented 4 years ago

Did you discover why it was being miscalculated?