ethz-asl / rovio

Camera calibration problems? #62

Open maciejmatuszak opened 8 years ago

maciejmatuszak commented 8 years ago

Hi, I am evaluating rovio for our mining project. I have spent more than two weeks on it and cannot make it work with my camera. I am an embedded software engineer but quite new to ROS and machine vision. In short, I can play one of the EuRoC test datasets and rovio seems happy with it (Youtube Euroc), but when I run it on my vehicle the odometry that comes back just drifts in some particular direction (tens of meters per minute). I recorded a bag and played it on my laptop; I think this may be the biggest clue, but I do not know what I am looking at (Youtube lab set). My intuition tells me it is either the camera calibration or the IMU:

1) Camera calibration. I tried Kalibr and http://wiki.ros.org/camera_calibration (the videos use the latter). Kalibr only outputs the radial-tangential model; I convert it to the plumb_bob model supported by rovio by appending 0.0 as the last parameter (plumb_bob requires 5 coefficients and Kalibr's radial-tangential only gives 4). Is that a correct assumption?
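For what it's worth, a minimal sketch of that padding step, assuming Kalibr's radtan coefficients come out in the order [k1, k2, p1, p2], which matches the first four plumb_bob entries (the example numbers below are placeholders, not a real calibration):

```python
def radtan_to_plumb_bob(dist_coeffs):
    """Pad Kalibr's 4 radtan coefficients to the 5 expected by plumb_bob (k3 = 0)."""
    if len(dist_coeffs) != 4:
        raise ValueError("expected 4 radtan coefficients [k1, k2, p1, p2]")
    return list(dist_coeffs) + [0.0]

# placeholder numbers, not a real calibration
print(radtan_to_plumb_bob([-0.28, 0.07, 1.8e-4, 5.5e-5]))
```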

2) Failed test during startup. In both cases, EuRoC (rovio_eurac.txt) and my lab set (rovio_lab_set.txt), there is a failed test at startup: "Testing pixelOutputCT (can sometimes exhibit large absolut errors due to the float precision) ==== Model jacInput Test failed: 55.5229 is larger than 1 at row 1(def_0) and col 0() ==== 72399.3 72343.8". What does that signify?

3) IMU. I am using a Pixhawk with the PX4 flight stack and mavros to get the IMU data to the Odroid computer where rovio runs. When running the dynamic calibration, the Kalibr tool reports a ~15 ms delay in the IMU data (result-cam_dynamic.txt). Is that usable, or do I need another IMU? It would be nice to get my hands on a VI-Sensor, but I do not see it available anywhere. It looks like GoPro snatched up the company?
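In case it helps to test whether that ~15 ms offset matters, one crude experiment is to rewrite the bag with shifted IMU stamps before feeding it to rovio. A rough sketch, assuming the IMU topic is /px4/imu/data, the offset is constant, and input/output file names are placeholders; the sign of the shift depends on Kalibr's convention, so try both if unsure:

```python
import rosbag
import rospy

IMU_TOPIC = '/px4/imu/data'   # adjust to your bag
OFFSET = 0.015                # ~15 ms camera/IMU offset reported by Kalibr

with rosbag.Bag('input.bag') as inbag, rosbag.Bag('shifted.bag', 'w') as outbag:
    for topic, msg, t in inbag.read_messages():
        if topic == IMU_TOPIC:
            # shift the header stamp so the camera and IMU clocks line up
            msg.header.stamp -= rospy.Duration.from_sec(OFFSET)
        outbag.write(topic, msg, t)
```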

Going through other issues I noticed that the rotation and translation between the IMU and camera are important. I basically took the original rovio.info file and replaced the quaternions and translations with the output from the Kalibr dynamic calibration. I also calculated them manually and the two align well, so I am confident in this part.

If this helps, here are the launch files: miner_rovio.zip

Can you please point me in the right direction? P.S. If I get this working we may use it in a commercial project. I assume your license statement is current?

andre-nguyen commented 8 years ago

@maciejmatuszak my experience is that, right now, the out-of-the-box parameters aren't quite right, and things diverge even with the EuRoC datasets. Try going back to commit 61cde9dc5845963d7782e0558ed46d1a4dde084d or 02753db02854549c48fc3a5850a60b29feaf3dd1 (around January) and things should work essentially as-is.

maciejmatuszak commented 8 years ago

Thanks @andre-nguyen! Do you mean just the parameter files or the entire code base?

bloesch commented 8 years ago

@maciejmatuszak: in your video rovio drifts right from the start; I doubt that this has to do with the parameters only. Bad message time alignment cannot be the reason for the drift either, since there is no motion at the beginning of your dataset (but as soon as there is motion, a 15 ms misalignment will be too much). Try inverting the quaternion qCM and test again (it might be a convention issue). Also, do you have a 90 degree rotation around z between camera and IMU?
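Inverting a unit quaternion is just taking its conjugate; a minimal sketch (the [w, x, y, z] ordering here is only for illustration, check which ordering your rovio.info uses):

```python
def quat_inverse(q):
    """Inverse (conjugate) of a unit quaternion q = [w, x, y, z]."""
    w, x, y, z = q
    return [w, -x, -y, -z]

# example: flip the convention of a camera-IMU quaternion before retrying
print(quat_inverse([0.5, -0.5, 0.5, -0.5]))  # -> [0.5, 0.5, -0.5, 0.5]
```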

If you want you can also share your dataset and calibration so I can try it on my machine.

A general comment concerning the current setup: reduce 'initCovFeature_0' to 0.5 and increase 'penaltyDistance' to 100, this typically increases robustness.
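For reference, a sketch of where those two values might go in rovio.info, assuming they sit in the ImgUpdate block as in the stock config (check your own file for the exact section):

```
ImgUpdate
{
    initCovFeature_0    0.5     ; initial covariance of the new-feature depth parameter
    penaltyDistance     100     ; distance penalty threshold used during feature handling
}
```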

maciejmatuszak commented 8 years ago

got it working on the laptop :) https://www.youtube.com/watch?v=4aCuXMGDcUs

I went back to the version suggested by @andre-nguyen. @bloesch, are you suggesting that, with initCovFeature_0 and penaltyDistance modified, it should work with the latest version?

The datasets are on my Dropbox: sample_2016-06-08-10-43-24.bag.gz and sample_2016-06-08-16-20-31.bag.gz. The first one does not have any motion at the beginning; is that a problem? The second one I got working.

Time to go to bed, will play with it tomorrow :)

bloesch commented 8 years ago

Thanks for the update. The missing motion can be an issue with the newest version, as the depth prior (initCovFeature_0) is reduced. It would be great if you could briefly try it out with the modified parameters.

maciejmatuszak commented 8 years ago

OK here are two videos: Modified params and Original parameters

I used the dataset with motion from the very start. It does not like it.

bloesch commented 8 years ago

Thanks for the videos. Are the videos recorded right from the start? Also, I see that the extrinsics have rather special values; are those correct?

Edit: I had a look at the dataset and had issues, since the timestamps of the IMU vs the camera differed by more than 60 seconds. How did you align these in order to make it work with the "old" version?

maciejmatuszak commented 8 years ago

Yes, those videos are recorded from the beginning. I started the rosbag record while moving the quad, just to make sure there is movement from the start. The extrinsics were worked out by the Kalibr tool using the dynamic calibration, and rovio prints very close values when it runs correctly. Here is a close-up of the front: 20160608_162701. Basically the camera points forward and a bit down. I worked out the quaternions by hand when I had the camera pointed straight forward. The IMU has z up and x forward; the camera has z forward and y down. To get there I rotated -90 deg about z (that gets x to the right), then 0 deg about y, then -90 deg about x (that gets z facing forward). In the current config the last rotation is a bit more than -90 deg. The January version works well with those extrinsics; in the January version video rovio gets a lock from the start. All those videos use the same extrinsics.
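To make that sequence concrete, a small sketch that composes the rotations as described (-90 deg about z, 0 about y, -90 deg about the rotated x) with scipy; whether this quaternion or its inverse belongs in qCM depends on rovio's (JPL-style) convention, so treat it only as a check of the arithmetic:

```python
from scipy.spatial.transform import Rotation as R

# intrinsic rotations: -90 deg about z, 0 about y, -90 deg about the rotated x
# (IMU: x forward, z up  ->  camera: z forward, y down, as described above)
r = R.from_euler('ZYX', [-90, 0, -90], degrees=True)

print(r.as_quat())    # scipy order [x, y, z, w], here ~[-0.5, 0.5, -0.5, 0.5]
print(r.as_matrix())  # rotation matrix, handy to compare against Kalibr's output
```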

As for the timestamps, 60 seconds seems excessive. I assume you looked at the raw values in rqt_bag? I will check them tomorrow.

bloesch commented 8 years ago

I've looked at the headers in the IMU and image msgs. Basically I started the bag and compared the first incoming header timestamps; this is what I get:

$ rostopic echo /cam0/image_raw/header
seq: 0
stamp:
  secs: 1465346665
  nsecs: 212000000
frame_id: /camera

$ rostopic echo /px4/imu/data/header
seq: 44897
stamp:
  secs: 1465346605
  nsecs: 168767104
frame_id: base_link
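A quick subtraction of those two header stamps shows the size of the offset:

```python
cam_stamp = 1465346665 + 212000000e-9   # first /cam0/image_raw header stamp
imu_stamp = 1465346605 + 168767104e-9   # first /px4/imu/data header stamp

print(cam_stamp - imu_stamp)  # ~60.04 s between the camera and IMU clocks
```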

I think your extrinsics must be right if you managed to make it work with the old version.

maciejmatuszak commented 8 years ago

That would be a good reason why it is not working... the other dataset is better: working_set_begin

I managed to get it working on the Odroid, somehow... it is very unreliable. I am having trouble synchronising the camera and IMU, but even if I get it to work it starts drifting after a while, especially after a "bump": I can carry it around the lab and it will hold position, and if I put it on the table gently it is OK, but after a bump it starts drifting. See here (at 1:55 I give it a couple of bumps). This is still the January version; I cannot get the latest to work at all. Bag file for the video. Two questions please:

1) What is the role of "-DMAKE_SCENE=ON"? Is it about efficiency by running on the GPU, or will it improve robustness as well? I have problems with OpenGL on my Odroid, so I run without it.

2) I do not see a way forward without a hardware-synchronised IMU and camera; what is your take on that? Is there an alternative to the VI-Sensor?

P.S. We have a holiday on Monday, I am back on Tuesday.

bloesch commented 8 years ago

Concerning your questions:

Thank you for providing the dataset, I will have a look at it when I find some time.

andre-nguyen commented 8 years ago

@maciejmatuszak Are you triggering your camera using http://dev.px4.io/advanced-camera-trigger.html ?

mhkabir commented 8 years ago

He's not, I think. (yet)

maciejmatuszak commented 8 years ago

Argh... argh... argh... Thanks Kabir!

maciejmatuszak commented 8 years ago

@andre-nguyen @mhkabir I took Kabir's branch of ueye_cam and merged the latest changes. There were a couple of merge conflicts, but they were fairly trivial. I got the initial sequence running: I can see the signal on the camera trigger line and confirmed that the data is flowing into the 3 buffers (imagebuffer, cinfobuffer, timestampbuffer). The problem is that the publish thread runs only once. I put a ROS_INFO call at the beginning and end of the UEyeCamNodelet::framePublishLoop() function and I see only one trace of each. Any idea what could be happening? I am not a C++ expert, but I checked the thread class interface and cannot see anything wrong. Did you come across something like that? I am now thinking thread starvation may be the problem, but it does not seem right... The code can be found in my fork: https://github.com/maciejmatuszak/ueye_cam/tree/artemis-devel

Maciej

longbowlee commented 8 years ago

@maciejmatuszak, I see you rotate the IMU frame to the camera frame with reference to the IMU frame. My question is: when you rotate the IMU frame, it looks like you refer to the already rotated IMU frame rather than the original one. I am trying to do the same thing as you did, but I have some concern about this.

maciejmatuszak commented 8 years ago

Hi @longbowlee, what do you mean by the original and changed IMU frames?

longbowlee commented 8 years ago

@maciejmatuszak my point is: when you rotate the IMU frame about one of the IMU axes, say X by 90 degrees, and then apply the second rotation about, say, Y, is that Y the original reference axis, or the Y after the first rotation, which has already changed by 90 degrees?

maciejmatuszak commented 8 years ago

Yes, the order is important. In the JPL standard you do them in z-y-x order. If you have access to MATLAB, here is a link to the MATLAB documentation for converting Euler angles to quaternions. The left- or right-handed convention of the coordinate system is also important; I do not remember which one the JPL documentation uses. Google "JPL quaternions" and you should find an old scanned PDF with more info.
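The order dependence is easy to check numerically; a small sketch using scipy (which uses the Hamilton convention, so a JPL-style quaternion may differ by a conjugate or element ordering):

```python
from scipy.spatial.transform import Rotation as R

angles = [90, 0, -90]  # degrees

q_zyx = R.from_euler('ZYX', angles, degrees=True).as_quat()
q_xyz = R.from_euler('XYZ', angles, degrees=True).as_quat()

# the same elementary angles applied in a different order give a different
# overall rotation, hence a different quaternion
print(q_zyx)
print(q_xyz)
```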

longbowlee commented 8 years ago

@maciejmatuszak Thanks for the comments, I will check on that. Have you successfully run the latest rovio on your drone? I see a lot of drift in the odometry, and it is reported that there are some issues with the latest code. We are still trying to debug. If you know a good version, could you point us to the tag or changelist?

maciejmatuszak commented 8 years ago

No, not the latest; I am using this commit: 02753db02854549c48fc3a5850a60b29feaf3dd1. I am fighting drift issues, which I think are problems with the cam/IMU synchronization. I have not had time to work on it recently.

andrejpan commented 8 years ago

@maciejmatuszak from the picture I can see that you are using some UEye camera on the drone? Did you implement a hardware connection for trigger synchronization? I would appreciate it if somebody could give me more information about step 3. Which wires go from the camera to AUX (which connector port?), and which protocol is used? Do the wires go to the Pixhawk board or to the computer where ROS runs?

maciejmatuszak commented 8 years ago

@andrejpan I never got it to work reliably. The documentation of the sync code is weak; I kind of figured out how to use it, but I do not have confidence that I did it right. This is what I know and do not know:

So no it is not working for me.

jasonghhuang commented 8 years ago

Hi @maciejmatuszak, I am confused about setting the IMU intrinsics before the dynamic calibration. How did you get the IMU noise_density and random_walk of your PX4?

maciejmatuszak commented 8 years ago

I think you are talking about the Kalibr procedure. You run kalibr_calibrate_cameras to get the intrinsics and then kalibr_calibrate_imu_camera, passing the intrinsics as input. As for the noise parameters, I took them from the chip specs. This is what I use for the Pixhawk:

Accelerometers:
accelerometer_noise_density: 0.013    #Noise density (continuous-time)
accelerometer_random_walk: 0.000108   #Bias random walk

Gyroscopes:
gyroscope_noise_density: 0.00065      #Noise density (continuous-time)
gyroscope_random_walk: 2.12e-06       #Bias random walk

update_rate: 176.5                    #Hz (for discretization of the values above)
rostopic: /px4/imu/data_raw

I got ROVIO working quite well. I have created a different version of the ueye_cam synchronisation algorithm, based on the assumption that the timestamp from the Pixhawk arrives first. It works well for my setup. I am in the process of tuning the craft to use the visual position info for stability.
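To illustrate that assumption, here is a rough sketch of the pairing logic, i.e. each incoming frame gets stamped with the oldest unconsumed trigger time from the Pixhawk (names are hypothetical, this is not the actual ueye_cam code):

```python
from collections import deque

trigger_stamps = deque()  # hardware trigger times received from the Pixhawk

def on_trigger(stamp):
    trigger_stamps.append(stamp)

def on_frame(image):
    # assumes the trigger stamp for this frame has already arrived;
    # otherwise the frame would need to be buffered until it does
    if not trigger_stamps:
        return None
    return trigger_stamps.popleft(), image
```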

maciejmatuszak commented 8 years ago

These are the two scripts I am using for calibration. Watch out: they delete old files, so make sure you understand them before using them. kalibr_static.sh.txt kalibr_dynamic.sh.txt

Also, kalibr_dynamic.sh uses a converter to create a rovio config; I do not use the whole config, I just copy the extrinsics.

jasonghhuang commented 8 years ago

@maciejmatuszak Thanks for your reply. I have tried the same Kalibr process as yours. However, I want to port the project to an Android phone. The noise parameters could not be found in the chip specs, and I am wondering how to get them; it seems a tough job to dig into the noise model of the IMU. Based on your calibration procedure, have you changed the prediction noise in rovio.info? As @bloesch mentioned in https://github.com/ethz-asl/rovio/issues/38: (Then you should probably adapt the values of the PredictionNoise as well. Since you typically have a cheap IMU on a smartphone you should increase the values of vel, acb, gyb, and att.)

Thanks!

longbowlee commented 7 years ago

The point is that, for a consumer IMU, it is hard for the vendor to provide a unified noise parameter because of unit-to-unit consistency, so we need to calibrate case by case. You can increase the noise level by 10 times over the calibrated value; this only affects the weight given to the IMU measurements, and those parameters will be refined as more IMU data are accumulated and integrated dynamically.

ashish-kb commented 7 years ago

@maciejmatuszak Could you please share your final results? I am trying to calibrate a Flea3 camera with a Pixhawk, but I am getting camera reprojection errors on the order of 0.5. I have seen other threads where people get camera reprojection errors on the order of 0.05 for their smartphones. I am not sure if I can do better with the Pixhawk.

mhkabir commented 7 years ago

The reprojection error for the intrinsics should not be affected in any way by the Pixhawk or the sync mechanism. Make sure you do the calibration in a well-lit place with a good-quality, large target. Make sure your camera exposure is set to properly expose the scene and cause no motion blur. I would also advise using an Aprilgrid target to improve the calibration.

ashish-kb commented 7 years ago

@mhkabir I am talking about the camera reprojection error reported by the toolbox in the text file. Isn't that calculated after taking into account the IMU-to-camera transformation?

maciejmatuszak commented 7 years ago

I think you are talking about the Kalibr toolkit. The camera calibration is used to get the intrinsic values; the camera-IMU calibration is used to figure out the extrinsics, i.e. the relative pose of the camera and IMU. I do not think the IMU plays a role in the reprojection error; it is reported there to make sure the camera images are of good quality.

ashish-kb commented 7 years ago

So in order to know how good the calibration is, I have to check the accelerometer and gyro errors from the file? And since I calibrated the intrinsics separately, the camera reprojection errors are irrelevant to the Kalibr toolkit?