Closed Rafcin closed 5 years ago
Did you find a fix for this?

Unfortunately, I don't have other sensors apart from the Kinect, so I can't test this, but you can try to flip the frame corresponding to your camera, as it seems to be upside down. That should fix everything.
I didn't understand why you made it (z, x, y), but after some time I realized it was (z, x, y), not (x, y, z), and that fixed it.
In my code, an object's tf is actually published as (z, -x, -y). That's because the Kinect's camera_link frame, which is created by another package, is always flipped like that. When I published an object's tf w.r.t. camera_link, which seemed like the right thing to do, objects ended up in the wrong place. Since people are now using other cameras/IR sensors with the package, I'll try to publish frames correctly from now on and fix the Kinect's camera_link orientation on my end.
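For context, the (z, -x, -y) ordering matches the standard remap between a camera optical frame (x right, y down, z forward) and the ROS body-frame convention from REP 103 (x forward, y left, z up). A minimal sketch of that remap (the function name is mine, not from the package):

```python
def optical_to_body(point):
    """Remap a point from a camera optical frame (x right, y down, z forward)
    to the ROS body-frame convention (x forward, y left, z up).

    body_x = optical_z   (forward)
    body_y = -optical_x  (left = negative of optical right)
    body_z = -optical_y  (up = negative of optical down)
    """
    x, y, z = point
    return (z, -x, -y)

# A point 1 m straight ahead of the camera (optical z) becomes
# 1 m forward in the body frame (body x).
print(optical_to_body((0.0, 0.0, 1.0)))  # (1.0, -0.0, -0.0)
```

If the parent frame already follows the body convention, applying this remap anyway would put objects in the wrong place, which may be what was happening here.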
Sounds good. It's odd that only the Kinect does this, but good to know! Also take a look at the other issue I created: from what I understand, if you get too close to the point cloud, the transform gets created incorrectly and RViz can't handle it.
For some reason, with my ZED camera setup, objects appear to be moved behind me and on the wrong side of my robot. If I step in front of the camera on the left, the transform appears on the right, and instead of being in front of the robot it's behind it.
Also, the node has a tendency to crash RViz when it passes an invalid vector.
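The mirrored symptom described above (left appears right, front appears behind) is consistent with the parent frame being flipped 180 degrees about the vertical axis, which is what the maintainer said happens with the Kinect's camera_link. A small sketch of that effect, assuming body-frame coordinates (x forward, y left, z up):

```python
def yaw_flip(point):
    """Rotate a point 180 degrees about the z (vertical) axis, as a flipped
    parent frame would. Front/back and left/right both mirror; height is
    unchanged -- matching the 'behind me and on the wrong side' symptom.
    """
    x, y, z = point
    return (-x, -y, z)

# A person standing 2 m in front and 1 m to the left of the robot
# would be reported 2 m behind and 1 m to the right.
print(yaw_flip((2.0, 1.0, 0.0)))  # (-2.0, -1.0, 0.0)
```

This is only one plausible explanation for the observed placement; a doubly-applied axis remap could produce a different, non-mirrored error.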