iaslab-unipd / rgbd_calibration

RGBD Calibration

Failed to parse camera calibration #10

Open johaq opened 6 years ago

johaq commented 6 years ago

I used the sensor_data_collection to record some images. The camera info files look like this:

frame_id: xtion_rgb_optical_frame
height: 480
width: 640
distortion_model: plumb_bob
D: [0.0242488607686558, -0.101606204066611, 0.00729738654453138, 0.000871487597046265, 0]
K: [535.044041709227, 0, 319.05230353441, 0, 535.207054100367, 252.902906117044, 0, 0, 1]
R: [1, 0, 0, 0, 1, 0, 0, 0, 1]
P: [532.30517578125, 0, 319.043534864526, 0, 0, 533.854919433594, 254.863566213093, 0, 0, 0, 1, 0]
binning_x: 0
binning_y: 0
roi:
  x_offset: 0
  y_offset: 0
  height: 0
  width: 0
  do_rectify: False
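
For reference, under the standard ROS/OpenCV `plumb_bob` convention the `D` vector above is `[k1, k2, p1, p2, k3]`. A small illustrative Python sketch (not part of this package) of how these `D` and `K` values map a 3D point to a pixel:

```python
# Sketch: project a 3D point with the plumb_bob distortion model,
# using the D and K values from the camera info above.
# Assumes the standard ROS/OpenCV convention D = [k1, k2, p1, p2, k3].

def project_plumb_bob(X, Y, Z, K, D):
    k1, k2, p1, p2, k3 = D
    fx, cx, fy, cy = K[0], K[2], K[4], K[5]
    x, y = X / Z, Y / Z                      # normalized coordinates
    r2 = x * x + y * y
    radial = 1 + k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 3
    xd = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    yd = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return fx * xd + cx, fy * yd + cy        # pixel coordinates

K = [535.044041709227, 0, 319.05230353441,
     0, 535.207054100367, 252.902906117044,
     0, 0, 1]
D = [0.0242488607686558, -0.101606204066611,
     0.00729738654453138, 0.000871487597046265, 0]

u, v = project_plumb_bob(0.1, 0.05, 1.0, K, D)
print(u, v)  # a point slightly off-center lands near (372.6, 279.7)
```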

Trying to run the offline calibration gives the following error:

[ERROR] [1517587105.515380165]: Exception parsing YAML camera calibration:
yaml-cpp: error at line 0, column 0: bad conversion
[ERROR] [1517587105.515464671]: Failed to parse camera calibration from file [/home/johannes/pepper_depth/depth_camera_info.yaml]

Am I giving the wrong yaml file?

johaq commented 6 years ago

Ok, I realized I need to give the calibration files from my camera launch. Doing this, I still get an error:

[ INFO] [1517588955.039495766]: Estimating undistortion map...
================================================================================
REQUIRED process [calibration/rgbd_offline-1] has died!
process has died [pid 9608, exit code -11, cmd /home/johannes/robocup/ws_calib/devel/lib/rgbd_calibration/rgbd_offline_calibration __name:=rgbd_offline __log:=/home/johannes/.ros/log/a22782d6-082c-11e8-935a-0028f84ce000/calibration-rgbd_offline-1.log].
log file: /home/johannes/.ros/log/a22782d6-082c-11e8-935a-0028f84ce000/calibration-rgbd_offline-1*.log
Initiating shutdown!
================================================================================

I'll run it in gdb and report what I find.

johaq commented 6 years ago

So the segmentation fault is happening deep in PCL. I made sure that PCL was built for the same instruction set as the calibration packages, but the error still happens. Is there a specific PCL version that you usually use? Also, in the README you mention using calibration_toolkit version 0.3.2, but the linked repository only has a tag for 0.3.1. My build worked with branch indigo_v0.3. Might that be the problem? If so, where can I get calibration_toolkit v0.3.2?

johaq commented 6 years ago

Manually building pcl 1.7.2 worked for me. No crashes anymore. I now get this:

Error in evaluating the ResidualBlock.

There are two possible reasons. Either the CostFunction did not evaluate and fill all
residual and jacobians that were requested or there was a non-finite value (nan/infinite)
generated during the cost or jacobian computation.

Residual Block size: 4 parameter blocks x 200430 residuals

For each parameter block, the value of the parameters are printed in the first column   
and the value of the jacobian under the corresponding residual. If a ParameterBlock was 
held constant then the corresponding jacobian is printed as 'Not Computed'. If an entry 
of the Jacobian/residual array was requested but was not written to by user code, it is 
indicated by 'Uninitialized'. This is an error. Residuals or Jacobian values evaluating 
to Inf or NaN is also an error.  

This seems to be the same error described in #9

Could this be caused by too few data points? I only recorded 5 images + point clouds, just to see if everything works. Do I need more for a solution to be found? Are there any guidelines on how many images I need, and from which distances? Also, if I record a tilted view of the checkerboard, is the distance I should give the one from the camera to the center of the board?
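A common source of the non-finite residuals reported above is invalid depth readings (zeros, NaNs, or infinities) leaking into the cost function. A generic pre-check one could run on the recorded clouds before optimizing (an illustrative sketch, not part of this package):

```python
import math

def filter_finite_points(points):
    """Drop points with non-finite or non-positive depth before
    feeding them to the optimizer; Ceres aborts on NaN/Inf residuals."""
    good, bad = [], 0
    for (x, y, z) in points:
        if all(math.isfinite(v) for v in (x, y, z)) and z > 0:
            good.append((x, y, z))
        else:
            bad += 1
    return good, bad

cloud = [(0.1, 0.2, 1.5),            # valid
         (float('nan'), 0.0, 1.0),   # NaN coordinate
         (0.0, 0.1, float('inf')),   # infinite depth
         (0.3, -0.1, 0.0)]           # zero depth (no return)
good, bad = filter_finite_points(cloud)
print(len(good), bad)  # 1 3
```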

vladsterz commented 6 years ago

Try it with around 20 pairs; it should work.

johaq commented 6 years ago

Thank you, it worked for me with 25 pairs. How did you test your calibration?

vladsterz commented 6 years ago

Firstly, I am working on Windows 10, but the concept is the same. After calibration you have some files with local and global matrices, the camera pose, and something else that I don't remember. With these files you have to undistort depth samples; the code is in the test node, if I remember correctly. So you have to create a test set (or use the existing training set) and provide the path to the test directory. With the undistorted depth data, you have to find a way to see if it's working correctly. One way is to check the registration of RGB to depth data. Another way is to have a ground-truth distance.
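Since the training scenes are supposed to be planar walls, one simple way to quantify how well the undistorted depth works is to fit a plane to the wall points and look at the residual RMS. A minimal NumPy sketch of such a generic least-squares plane fit (not this package's test node):

```python
import numpy as np

def plane_rms(points):
    """Fit a plane to an (N,3) array via SVD and return the RMS of the
    point-to-plane distances (same units as the input, e.g. metres)."""
    pts = np.asarray(points, dtype=float)
    centered = pts - pts.mean(axis=0)
    # The right singular vector with the smallest singular value
    # is the normal of the best-fit plane through the centroid.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    normal = vt[-1]
    dist = centered @ normal
    return float(np.sqrt((dist ** 2).mean()))

# Synthetic wall at z = 2 m with 5 mm of Gaussian noise: RMS ~ 5 mm.
rng = np.random.default_rng(0)
xy = rng.uniform(-1, 1, size=(1000, 2))
z = 2.0 + rng.normal(0, 0.005, size=1000)
rms = plane_rms(np.column_stack([xy, z]))
print(rms)  # ~0.005
```

After undistortion, a well-calibrated sensor should bring the wall's RMS close to the sensor noise floor; a large residual or a curved pattern suggests the depth correction is off.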

johaq commented 6 years ago

Ok, I think I found the relevant code. Do you know what camera_pose means? It doesn't really make sense to me for it to be a result of the calibration. Did you get good results from 25 pairs, or did you record a larger set? If so, how large?

vladsterz commented 6 years ago

camera_pose is the pose of the RGB camera with respect to the depth camera. My results were not so good with about 80 pairs, but there were some things wrong, and I am going to work on it further.

johaq commented 6 years ago

Ok. Thank you! That hadn't occurred to me, because then the pose I got is way off. The distorted point cloud looks decent, though. I guess I'll have to write a node that converts the point clouds online to really see the results.

vladsterz commented 6 years ago

Yeah, great idea. Try to have only the wall in your dataset (without floor and ceiling). I'll stay tuned.

jotaraul commented 6 years ago

Hi guys!

Did you finally manage to get this working? I'm trying to calibrate an RGB-D camera (Asus Xtion Pro Live) but, as you (@vladsterz @johaq) commented, I'm getting a crazy extrinsic calibration (nonsense translation and rotation values), and the depth calibration seems to make the scenes planar. At the moment I'm using 25 pairs, but I don't know whether it's worth spending more time on this or whether I would just end up with similar results.

Any insight about this would be appreciated!

BTW: I'm using the test_calibration node to take a look at the results.

johaq commented 6 years ago

> Did you finally manage to get this working?

No, unfortunately I had to put this on hold and work on other things.

> and the depth calibration seems to make the scenes planar

What do you mean by that exactly?

vladsterz commented 6 years ago

I haven't done anything since then myself, actually; I've been doing other stuff. That's the point of the whole idea: to make the clouds as planar as possible at the training stage. Your dataset has to be point clouds of a planar object (a wall) with a checkerboard on it. Is it making everything planar on a test set?

jotaraul commented 6 years ago

Hi guys!

Thank you for your reply. I was testing a few things before writing again. I've collected a pair of datasets:

With the first one, the algorithm seems to converge to a valid solution (the z coordinate is a bit crazy, but it seems to work):

[ INFO] [1526455696.788252979]: Optimized transform: position: x: 0.0298691 y: 0.0242659 z: 0.0924567 orientation: x: -0.00753137 y: 0.0127186 z: -0.00362412 w: 0.999884
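A transform like the one above can be sanity-checked by hand: the translation norm should roughly match the physical RGB-depth baseline of the sensor, and the rotation angle should be small. A quick conversion in plain Python (illustrative, not part of the package):

```python
import math

# Optimized transform from the log line above.
tx, ty, tz = 0.0298691, 0.0242659, 0.0924567
qx, qy, qz, qw = -0.00753137, 0.0127186, -0.00362412, 0.999884

baseline = math.sqrt(tx**2 + ty**2 + tz**2)        # metres
angle = 2.0 * math.acos(max(-1.0, min(1.0, qw)))   # rotation angle, radians
print(round(baseline, 3), round(math.degrees(angle), 2))
# ~0.1 m of translation and under 2 degrees of rotation; the ~9 cm z
# offset is indeed suspicious for a sensor whose RGB and depth
# cameras sit only a few centimetres apart.
```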

But with the second one the results are wrong.

I was wondering whether you know how robust the method is against other planes/objects appearing in the scene. I mean, I tried to take the images in such a way that the plane with the checkerboard on it occupied most of the image, but when I moved "far" from the wall, other objects appeared in them.

I have uploaded the RGB images of both datasets here; it would be so nice if you could take a look at them and give me some feedback.

Also, while using the tester, two windows appear showing the following info; how should I interpret them? (I took a look at the papers and the thesis doc, but it is still not clear to me.)

[screenshots: testing_window_1, testing_window_2]

Thank you so much for your time!!

vladsterz commented 6 years ago

Hello there @jotaraul. As I remember the paper, it assumes that everything is on the same plane. That was the whole point: it tries to fit every pixel->point to the correct plane. If other objects appear in there, there is no planarity anymore. I know how hard it is to find such a big wall with nothing else near. :(

Try it with a small test set, even with repeated images if necessary, but with fully planar point clouds. One more point to be careful about: check whether the checkerboards are actually detected. Good luck, vlad

HenryFOCUS commented 5 years ago

Hello @johaq. Excuse me, you said in your second comment above:

> Ok I realized I need to give the calibration files from my camera launch.

referring to this error:

[ERROR] [1517587105.515380165]: Exception parsing YAML camera calibration:
yaml-cpp: error at line 0, column 0: bad conversion
[ERROR] [1517587105.515464671]: Failed to parse camera calibration from file [/home/johannes/pepper_depth/depth_camera_info.yaml]

What exactly should I do about this? I don't quite understand.

johaq commented 5 years ago

I simply used the wrong kind of file. A calibration file like this is required: https://github.com/CentralLabFacilities/tobi_robot/blob/kinetic/tobi_bringup/config/depth.yaml
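For reference, files like the one linked follow the camera_info_manager YAML layout (nested `rows`/`cols`/`data` blocks) rather than a `rostopic echo` dump of `sensor_msgs/CameraInfo`. A sketch of converting the dump at the top of this issue into that layout (values taken from the first comment; field names per camera_info_manager):

```python
# Sketch: rewrite the rostopic-echo style values from the first comment
# into the camera_info_manager YAML layout that the calibrator expects.
K = [535.044041709227, 0, 319.05230353441,
     0, 535.207054100367, 252.902906117044, 0, 0, 1]
D = [0.0242488607686558, -0.101606204066611,
     0.00729738654453138, 0.000871487597046265, 0]
R = [1, 0, 0, 0, 1, 0, 0, 0, 1]
P = [532.30517578125, 0, 319.043534864526, 0,
     0, 533.854919433594, 254.863566213093, 0, 0, 0, 1, 0]

def block(name, rows, cols, data):
    return (f"{name}:\n  rows: {rows}\n  cols: {cols}\n"
            f"  data: [{', '.join(str(v) for v in data)}]\n")

calib = (
    "image_width: 640\n"
    "image_height: 480\n"
    "camera_name: xtion_rgb\n"
    + block("camera_matrix", 3, 3, K)
    + "distortion_model: plumb_bob\n"
    + block("distortion_coefficients", 1, 5, D)
    + block("rectification_matrix", 3, 3, R)
    + block("projection_matrix", 3, 4, P)
)
print(calib)
```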

HenryFOCUS commented 5 years ago

Thank you @johaq. Following your advice, I reset the "camera_calib_url" and "depth_camera_calib_url" in kinect_47A_tro.launch (the offline calibration launch file). The modified launch file worked, but in the end there were some problems.

I saw in your answer above that you fixed the crash by manually compiling PCL 1.7.2:

> Manually building pcl 1.7.2 worked for me. No crashes anymore.

In my case calibration seems to run, but the result does not converge: the optimization reaches 10 iterations and then crashes.

Do you know what caused the error? May be a problem with the data?

[image attachment]

Tiansong97 commented 4 years ago

> Am I giving the wrong yaml file?

How did you solve this problem? I also ran into it. Can you give some more details? Thanks

tiexuedanxin commented 3 years ago

> Thank you @johaq According to your advice, I reset the "camera_calib_url" and "depth_camera_calib_url" in kinect_47A_tro.launch (offline calibration startup file). It's good to know that the modified startup file worked, but in the end there were some problems.
>
> I saw in your answer above that you solved this crash by manually compiling pcl 1.7.2, but in my case calibration seems to run while the result does not converge, because the optimization reached 10 iterations and crashed.
>
> Do you know what caused the error? Maybe a problem with the data?

Hello, could you give me some guidance on how to run the code?

tiexuedanxin commented 3 years ago

> Yeah, great idea. Try for your dataset to have only the wall (without floor and ceiling). I'll stay tuned

Hello, why should the dataset only have the wall (without floor and ceiling)? Also, have you run the code successfully? I downloaded the 0.3 toolkit and the calibration code, but when I run it I get a screen full of errors. Could you share a working setup?

xinyangrobotic commented 2 years ago

Hi, I have collected my own dataset, but it cannot be used with the code. I noticed that you uploaded your dataset, but the link shows it has expired. Could you refresh the link? Thank you so much!

jotaraul commented 2 years ago

Hi @zxy-HIT ,

I'm so sorry, but it looks like I didn't keep those files.

Best regards.