hansonrobotics / robo_blender

ROS node for Blender face-head-neck control

Blender can show all detected faces by pi_vision #14

Closed: Gaboose closed this issue 10 years ago

linas commented 10 years ago

Thanks. I added some minor documentation to the code, per the above. I also added a cookbook section to the README, but got stuck in two different places.

-- Who publishes /tracking_event? I looked all over for it, but could not find the publisher.

-- To look at thing x, I guess I should say something like rostopic pub /tracking_action eva_behavior/tracking_action thing_x, but what are the valid values for thing_x? Is there a list published somewhere of what is being tracked, and what these things are called? Are they just roi_1, roi_2, etc., or something else?
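
For reference, the stock ROS introspection tools can answer this kind of question once the nodes are running; this is a generic recipe, not specific to this repo:

rostopic list                              # every topic currently advertised
rostopic type /tracking_action             # the message type carried on the topic
rosmsg show eva_behavior/tracking_action   # the fields that type defines
rostopic echo /tracking_event              # watch live events to see how targets are actually named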

-- I tried a dual-camera setup. Viz., this:

export ROS_NAMESPACE=eye
roslaunch uvc_cam uvc_cam.launch device:=/dev/video1
rosrun image_view image_view image:=/eye/camera/image_raw
# Above works great, below fails
roslaunch pi_face_tracker face_tracker_uvc_cam.launch input_rgb_image:=/eye/camera/image_raw

I thought that I could just override the input_rgb_image parameter at the command line with the correct topic for the second camera, but that didn't work...
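
A generic first check (standard roslaunch behavior, nothing repo-specific): a foo:=bar override on the roslaunch command line only binds to an arg actually declared in the launch file, so it is worth listing what the file exposes, and confirming the second camera really publishes where expected:

roslaunch --ros-args pi_face_tracker face_tracker_uvc_cam.launch   # list the args this launch file declares
rostopic info /eye/camera/image_raw                                # check the topic has a live publisher

If input_rgb_image is not among the declared args, the override has nothing to attach to, and the input topic is presumably hard-coded inside the launch file.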

vytasrgl commented 10 years ago

-- The tracking action is published by eva_behavior.
-- The message and the values it can carry are described in the README here: https://github.com/hansonrobotics/eva_behavior

linas commented 10 years ago

Wait, what? I just found out that /tracking_event is published by pi_vision. Is it also published by eva_behavior? I guess I will have to look at that...
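
A standard way to settle this while the system is running (plain rostopic usage):

rostopic info /tracking_event   # prints the message type plus every publisher and subscriber of the topic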

vytasrgl commented 10 years ago

tracking_event (a new person enters the scene, or leaves it) is published by pi_vision and consumed by eva_behavior. eva_behavior then publishes tracking_action (gaze at the person, or look at the person) based on some fuzzy logic.
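
That flow is easy to watch end to end with two terminals (standard rostopic usage; topic names as described above):

# terminal 1: perception events coming out of pi_vision
rostopic echo /tracking_event
# terminal 2: the actions eva_behavior derives from them
rostopic echo /tracking_action

Stepping in front of the camera should produce a tracking_event, followed shortly by a tracking_action.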