Elucidation / StereoColorTracking

3D point tracking using stereo camera with color thresholding and disparity

More details in README #1


zardchim commented 9 years ago

Hi! Lovely to see this project! I am an undergraduate student and currently want to do a similar project to track a badminton shuttlecock :)

To help me get up to speed with your code, would you mind adding more details on how to use it?

Elucidation commented 9 years ago

Sure, I'll write an installation readme at some point. To get you up to speed, you'll need a Linux machine with ROS (ros.org) installed.

Download the code and put it into a new catkin package (I followed this for example: http://wiki.ros.org/catkin/Tutorials/CreatingPackage)

Once you're able to compile the package using `catkin_make`, you can start the launch file by calling `roslaunch StereoColorTracking tracker2.launch`

You'll need two cameras (I used Logitech C270 webcams) that are available on /dev/video#, and you'll need to calibrate them; I used http://wiki.ros.org/camera_calibration for example.
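To summarize the steps above as a sketch (the workspace path and the clone step are my assumptions; the `catkin_make` and `roslaunch` commands come from the comment itself):

```shell
# Assuming a catkin workspace at ~/catkin_ws -- adjust to yours
cd ~/catkin_ws/src
git clone https://github.com/Elucidation/StereoColorTracking.git

# Build the workspace and source it
cd ~/catkin_ws
catkin_make
source devel/setup.bash

# Start the tracker
roslaunch StereoColorTracking tracker2.launch
```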

zardchim commented 9 years ago

Successfully installed ROS Indigo on my machine.

Which USB camera driver are you using? I have tried usb_cam, but was not able to use the image_view package to view the image.

Solved: after updating all the packages I can see the image now.

`sudo apt-get update`
`sudo apt-get dist-upgrade`

zardchim commented 9 years ago

Hey :) I got 2 Logitech C270s too, and am trying to do the stereo calibration according to the tutorial.

But after I run

`rosrun camera_calibration cameracalibrator.py --size 8x6 --square 0.025 right:=/right_camera/image_raw left:=/left_camera/image_raw right_camera:=/right_camera left_camera:=/left_camera`

the terminal only shows

`('Waiting for service', '/left_camera/set_camera_info', '...') OK`
`('Waiting for service', '/right_camera/set_camera_info', '...') OK`

and no window pops up.

However, I am able to do the monocular calibration. Have you faced this problem before?

Found a fix at http://ros-users.122217.n3.nabble.com/stereo-camera-calibration-td2423324.html :

> rosrun camera_calibration cameracalibrator.py [other options] --approximate=0.01
>
> This allows a "slop" of 0.01s between image pairs. If you still don't get a window popping up, you may need to increase the slop.
>
> With the other stereo nodes in image_pipeline, you can enable approximate matching by setting the approximate_sync parameter:
>
> ROS_NAMESPACE=my_stereo rosrun stereo_image_proc stereo_image_proc _approximate_sync:=True

zardchim commented 9 years ago

Finally got my c270 pairs calibrated!

Could you tell me more about how you apply the transformation matrices of the cameras? I found there are two static_transform_publisher entries in tracker2.launch with 9 parameters each; are they the projection matrices?

And how should I go ahead? :+1: Thx!

Elucidation commented 9 years ago

Well done! I used monocular only as well.

The static transform publishers are providing the transformation matrix from the /map (world coordinates) to the individual camera frames.
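For context, tf's static_transform_publisher in its Euler form takes 9 arguments: the x y z offset, yaw pitch roll, the two frame ids, and a publish period in milliseconds. A hypothetical example — the offset and frame names here are placeholders, not the actual values from tracker2.launch:

```shell
# x y z (m)   yaw pitch roll (rad)   parent_frame child_frame   period_ms
rosrun tf static_transform_publisher 0.1 0 0  0 0 0  /map /left_camera 100
```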

When you run the calibration you can save the projection matrix; it writes a yaml file which usb_cam loads automatically. I included the yaml files, which are looked up in the launch script (such as https://github.com/Elucidation/StereoColorTracking/blob/master/launch/tracker2.launch#L26 ).

I had to manually tweak the projection matrices, as you may see in those files.

After that you just launch the tracker2.launch file! :)

Let me know if you succeed, I'll put it on the readme then.


zardchim commented 9 years ago

I have successfully calibrated the cameras with the stereo calibration script, and have changed the corresponding yaml files.

But I ran into a problem when trying to form the depth image. I believe I'm missing the translation & rotation matrices, which are essential to define the stereo camera pair.

How did you tackle this? And did you use the stereo proc pipeline package to apply a layer on top of the raw images from usb_cam?

Elucidation commented 9 years ago

Hey, I haven't used the stereo proc pipeline package; I used just the monocular calibration to generate the yaml files found here: https://github.com/Elucidation/StereoColorTracking/tree/master/camera_info

I determined the translation and rotation of the two cameras manually using the static transform publishers (no rotations, since they're parallel to one another). To determine depth I use the disparity between the two images; the ratio is found mathematically based on the distance between the two cameras, but in practice it needs to be manually tweaked via the GUI at https://github.com/Elucidation/StereoColorTracking/blob/master/scripts/disparity_track.py#L63

zardchim commented 9 years ago

Hey, prompt reply this time ;)

I see! As I'm interested in tracking a ball inside a box, I would like to add the rotation matrix; I believe I should delve into the tf package to see what I can do with it :P

I have my school exam coming up >.< I will continue to work on this code after next week!

Elucidation commented 9 years ago

Good luck! I'm not sure what you mean by adding the rotational matrix here, I'll try to help where I can.


zardchim commented 9 years ago

It just means that my cameras are not parallel, so in my case the rotation matrix should be involved.

BTW, could you tell me why in practice we need to tweak the camera manually? Because after I calibrated the cams through the stereo calibration package, I am able to get the distance between the cams. I am wondering whether manual tuning is unnecessary if I use the stereo calib pkg result.

Elucidation commented 9 years ago

Very possible. I was using the monocular calibration, and the centerpoint of the generated projection matrix was off, so I had to move it manually. I'll solve that puzzle another time.
