DavidB-CMU opened this issue 8 years ago
Here are two screenshots taken ~30 seconds apart, showing the large amount of drift:
It's better now that the stargazer is detecting two landmarks again.
Perhaps this should be a discussion for `herbpy`? Nothing in `prpy` has knowledge about what the preferred base frame should be. `herbpy` makes the decision that this should be a "world" frame and that the robot should move in it. However, it could just as easily leave the robot at a static position in the base frame and move other objects around it as HERB moves.
This discussion definitely belongs here because `prpy` is the home for `prpy.perception`. This code is used on robots other than HERB.
In response to @DavidB-CMU's original question:
@Shushman already added this functionality to `ApriltagsModule` in ?? by adding the `reference_link` parameter to the constructor. We should make the same change to the VNCC, Rock, and SimTrack modules.
However, @cdellin made an argument that we should fix the underlying oscillation, not hack around it by running perception in the robot frame. This work-around will break as soon as we move the base, which, hopefully, will become more common with the upcoming hardware changes. Instead, we can unsubscribe from the Stargazer in OpenRAVE when we know the robot is stationary.
@jeking04 Are we already doing this in the table clearing demo?
I'm confused. Why does `prpy.perception` have any idea whether it is operating in the robot or world base frame? It seems like any notion of that is encoded in the TF transforms that are in use.
I don't think planning in the robot frame is necessarily a bad idea: it means that object persistence is more challenging, but it avoids compounding localization error with perception error.
But I do not understand why anything in `prpy` would be specifically tied to anything other than the OpenRAVE `world` frame, which can be related to the robot base frame in any way that we want. We have consciously chosen to relate HERB -> `world` via his odometry/the stargazer, but you could also teleport the lab kinbodies around and keep HERB fixed to the world if you wanted.
First, I agree that we should fix the underlying oscillation.
Second, the `reference_link` parameter change has been made to VNCC, SimTrack, and Chisel. The VNCC/Chisel changes are working their way through via PR. The SimTrack change will be in a separate PR coming as soon as the Chisel one completes.
@psigen I'm not sure if this answers your question, but I'll give it a shot. `prpy.perception` is where we take data from perception modules and turn it into OpenRAVE `KinBody` objects. Thus `prpy.perception` needs to know where to put these bodies. We wanted the ability to do exactly what @DavidB-CMU suggested: detect in the robot frame rather than the world frame. This lets us hack around the localization error by running `herbpy` with the base simulated and still detect objects and get them into OpenRAVE in the right place relative to HERB.
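The trade-off between the two frames can be written out explicitly with homogeneous transforms. A sketch under assumed numbers (the helper and variable names are illustrative, not prpy code):

```python
import numpy as np


def make_transform(x, y):
    """Homogeneous 4x4 transform with a planar translation."""
    t = np.eye(4)
    t[0, 3], t[1, 3] = x, y
    return t


# The camera reports a mug 0.5 m ahead of the robot.
t_robot_mug = make_transform(0.5, 0.0)

# Localization believes the robot is at x=2.1, but it has drifted
# 0.1 m from its true pose at x=2.0.
t_world_robot_estimated = make_transform(2.1, 0.0)
t_world_robot_true = make_transform(2.0, 0.0)

# Option A: place the mug in the world frame using the drifting
# localization estimate -- the kinbody pose now includes the drift.
t_world_mug_a = t_world_robot_estimated.dot(t_robot_mug)

# Option B: place the mug relative to a reference link on the robot --
# its pose relative to HERB is unaffected by localization error.
t_robot_mug_b = t_robot_mug

# The world-frame placement is off by exactly the localization error.
error = t_world_mug_a - t_world_robot_true.dot(t_robot_mug)
```

This is the sense in which detecting in the robot frame avoids compounding localization error with perception error, at the cost of making object persistence across base motions harder.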
I had some problems with HERB's pose drifting because the stargazer could only see one landmark. I cleaned the sensor lens and the IR stickers and it works better now: it sees a minimum of two markers with the robot in a similar position.
Tom gave us a lot of help in this thread, should we give him a status update? https://github.com/cra-ros-pkg/robot_localization/issues/221
Also, I propose that we do planning and perception activities in the robot's base frame. I think we should leave the map-to-robot transform as something that concerns only the localization and mapping problem.