-
Early results from calibrating the zau rgbd hand camera showed that the camera was badly positioned (#970).
I then worked on #971 to be able to correct the initial guess so that it is more o…
-
I am trying to calibrate an OS1-64 and a USB camera. I obtained the camera intrinsic parameters and saved them in a .yaml file, and I set the global variable `Ouster_LIdar = true` in the lidar_camera.py file.
My bag file contai…
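A minimal, hedged sketch of loading such a saved intrinsics file, assuming the usual `camera_matrix` / `distortion_coefficients` layout written by the ROS camera_calibration tool; the file name is hypothetical:

```python
# Sketch: load intrinsics saved by the ROS camera_calibration tool, assuming
# the usual camera_matrix / distortion_coefficients YAML layout.
import numpy as np
import yaml

with open("camera_intrinsics.yaml") as f:   # hypothetical file name
    calib = yaml.safe_load(f)

K = np.array(calib["camera_matrix"]["data"]).reshape(3, 3)          # 3x3 intrinsic matrix
D = np.array(calib["distortion_coefficients"]["data"]).reshape(-1)  # distortion coefficients
print("K =\n", K, "\nD =", D)
```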
-
I am currently trying to use the lidar information together with the cameras, but I am stuck figuring out the right transformations.
@adamek727 mentioned a lidar-to-camera projection example in #1. To y…
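For context, a minimal sketch of such a lidar-to-camera projection, assuming a known extrinsic rotation `R` and translation `t` (lidar frame to camera frame) and an intrinsic matrix `K`; the names are illustrative, not taken from the repository:

```python
# Sketch: project lidar points into the image plane, given extrinsics (R, t)
# that map lidar-frame points into the camera frame and an intrinsic matrix K.
import numpy as np

def project_lidar_to_image(points_lidar, R, t, K):
    """points_lidar: (N, 3) XYZ points in the lidar frame; returns (M, 2) pixel coords."""
    points_cam = points_lidar @ R.T + t             # transform into the camera frame
    points_cam = points_cam[points_cam[:, 2] > 0]   # keep points in front of the camera
    uv = (K @ points_cam.T).T                       # pinhole projection
    return uv[:, :2] / uv[:, 2:3]                   # normalize by depth -> (u, v)
```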
-
What an amazing job! Thank you!
However, I want to know whether a different dataset means a different camera calibration config, and do you think there should be a separate training for each different camera in 3-D d…
-
Is there any easy way to test the resulting transformation matrix?
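One possible sanity check (a sketch, not the project's own test): verify that the rotation block of the transform is orthonormal and measure the reprojection error on a few known 2D–3D correspondences; `T`, `K` and the point arrays are placeholders.

```python
# Sketch: two quick checks on a 4x4 lidar-to-camera transform T (placeholder inputs).
import numpy as np

def check_transform(T, K, pts_lidar, pts_pixel):
    R, t = T[:3, :3], T[:3, 3]
    # 1) R must be a proper rotation: R @ R.T ~ I and det(R) ~ +1.
    assert np.allclose(R @ R.T, np.eye(3), atol=1e-6)
    assert np.isclose(np.linalg.det(R), 1.0, atol=1e-6)
    # 2) Reprojection error on known 2D-3D pairs should be a few pixels at most.
    cam = pts_lidar @ R.T + t
    uv = (K @ cam.T).T
    uv = uv[:, :2] / uv[:, 2:3]
    return float(np.linalg.norm(uv - pts_pixel, axis=1).mean())
```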
-
Hi,
Using the camera_calibration package, we get a 3×3 **camera matrix**:
430.215550 0.000000 306.691343
0.000000 430.531693 227.224800
0.000000 0.000000 1.000000
https://wiki.ros.org/camera_ca…
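For context (not part of the original question): this is the pinhole intrinsic matrix K, with focal lengths fx ≈ 430.2, fy ≈ 430.5 and principal point (cx, cy) ≈ (306.7, 227.2). A minimal sketch of using it to project a camera-frame 3D point; the test point is made up.

```python
# Sketch: project a 3D point given in the camera frame (X right, Y down, Z forward)
# to pixel coordinates with the intrinsic matrix reported above.
import numpy as np

K = np.array([[430.215550, 0.000000, 306.691343],
              [0.000000, 430.531693, 227.224800],
              [0.000000, 0.000000, 1.000000]])

point_cam = np.array([0.5, -0.2, 4.0])   # hypothetical point, metres
u, v, w = K @ point_cam
print(u / w, v / w)                      # pixel coordinates (u, v)
```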
-
Could it work if I only put the `label_2` and `velodyne` folders in it?
-
First of all, thank you for the wonderful work you have done here!
I have a question regarding the mono_detector though.
My calibration results are as follows:
![calibration](https://user-images.g…
-
Hello, I tried to run your excellent work.
I ran
`rosrun lidar_camera_calibration calibrate_camera_lidar.py --calibrate` and picked points for the extrinsic parameters.
However, after picking the points, I…
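In general, the extrinsics can be recovered from such picked 2D–3D correspondences with a PnP solve; a hedged sketch using OpenCV (a generic illustration, not the script's internals):

```python
# Sketch: estimate lidar-to-camera extrinsics from manually picked correspondences
# using OpenCV's PnP solver (generic illustration, placeholder argument names).
import cv2
import numpy as np

def estimate_extrinsics(pts_lidar, pts_image, K, D):
    """pts_lidar: (N, 3) picked lidar points; pts_image: (N, 2) matching pixels; N >= 4."""
    ok, rvec, tvec = cv2.solvePnP(np.asarray(pts_lidar, dtype=np.float64),
                                  np.asarray(pts_image, dtype=np.float64),
                                  K, D)
    if not ok:
        raise RuntimeError("solvePnP failed")
    R, _ = cv2.Rodrigues(rvec)   # 3x3 rotation, lidar frame -> camera frame
    return R, tvec.ravel()       # translation vector
```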
-
Thank you for making the velodyne and SICK timestamps available, but do you think you might make the camera timestamps available too? In the original KITTI Tracking & Object Benchmarks, [I observed t…