mayacakmak opened 3 years ago
With the navigation actions still being triggered through overlay regions (for the targeted 2.00 version), this is how I would recommend adding the 90-degree rotation actions to the interface.
Later if/when we switch to a Beam-like driving interface we can re-think.
I've added the buttons and front-end code for rotating 90 degrees in both directions.
For the backend, it looks like we are currently only sending angular velocity goals to the robot. Using Stretch's IMU, I could put together a PID loop that would achieve the 90-degree rotation, but that feels like a pretty clunky solution.
What branch should I be committing this to, @mayacakmak? The camera following and sim code is all set up, so I think the best option would be to merge gazebo_sim into master or latest (looks like you already did that, thanks!) and create a branch for the three tasks in this issue off of that.
I implemented a function that turns the robot by a specific angle offset, using a proportional control loop that generates repeated goals until the robot achieves the desired rotation. I'm getting rotation info from `tf`, which should work both in the sim and on the robot.
https://github.com/hcrlab/stretch_web_interface/blob/%2311-decrease-manipulation-burden/robot/ros_connect.js#L356 (the p value should be tuned on the actual robot)
The loop stops automatically if the robot is translated or rotated via the web interface.
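For reference, here is a minimal sketch of that pattern with roslibjs. The `turnAngleOffset` name matches the linked code, but the frame names, gain, tolerance, and use of `baseTurn` as the goal-sender are illustrative assumptions, not the exact implementation:

```js
// Sketch of a proportional rotation loop over roslibjs.
// Assumed: an existing ROSLIB.Ros connection `ros`, the frame names below,
// and `baseTurn` as the function that sends a single rotation goal.
const tfClient = new ROSLIB.TFClient({
  ros: ros,
  fixedFrame: 'odom',   // root frame published in simulation
  angularThres: 0.001,
  transThres: 0.001,
  rate: 10.0
});

let yaw = 0;
tfClient.subscribe('base_link', (tf) => {
  // Extract yaw from the quaternion; the base only rotates about z.
  const q = tf.rotation;
  yaw = Math.atan2(2 * (q.w * q.z + q.x * q.y),
                   1 - 2 * (q.y * q.y + q.z * q.z));
});

// Repeatedly command a fraction of the remaining error until within tolerance.
function turnAngleOffset(offset, p = 0.5, toleranceRad = 0.02) {
  const target = yaw + offset;
  const timer = setInterval(() => {
    // Wrap the error into [-pi, pi] so the robot takes the short way around.
    const error = Math.atan2(Math.sin(target - yaw), Math.cos(target - yaw));
    if (Math.abs(error) < toleranceRad) {
      clearInterval(timer);
      return;
    }
    baseTurn(p * error);  // send the next incremental rotation goal
  }, 200);
}
```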
Cool! I'm excited to test this on the robot.
@KaviMD I'm testing this merge now. Nothing happens when I trigger the 90-degree rotation, and I'm getting this error that might be relevant. Let me know if you have any thoughts on what this might mean/how to fix it.
I think the issue is with which transform I was getting the robot orientation from. Gazebo publishes `odom` as the root transform, which is what I was using in simulation. (I used `rosrun tf view_frames` to find the root transform.) Updating `odom` on this line: https://github.com/hcrlab/stretch_web_interface/blob/master/robot/ros_connect.js#L119 to whatever the non-sim root transform is should fix the problem. My guess is that it's `world`.
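For context, the relevant knob is the `fixedFrame` option passed to the `TFClient` constructor; switching it per environment might look like this (the `isSimulation` flag is a hypothetical illustration, not code from the repo):

```js
// Hypothetical sketch: pick the root TF frame per environment.
const tfClient = new ROSLIB.TFClient({
  ros: ros,
  fixedFrame: isSimulation ? 'odom' : 'world',  // the non-sim root frame is the open question here
  angularThres: 0.01,
  transThres: 0.01,
  rate: 10.0
});
```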
Yes! I also did `rosrun tf view_frames` and tried to look for something like `robot_base`, but couldn't find it. `world` is typically a fixed frame on the ground where the robot starts; I'm not sure it's equivalent to `odom`.

I think the base of the robot is `base_link`, which is what I'm using as the fixed frame when I get all the transforms. To get an absolute orientation, I think `base_link` would need to be compared to something fixed that is off the robot, like `world`. I may also be misunderstanding something about how transforms work, though.
That is correct! But typically we use `world` as the fixed frame, and that allows the TFs we read to be relative to it. Anyway, I tried that and it didn't work :-) It says "world" passed to `lookupTransform` does not exist. Also, looking at the TF tree, I don't see `world` or anything separate from the robot, so perhaps the estimated pose of the robot is just not included in the TF tree. Let's ask Charlie what to do here.
@elliston-f Quick question for you, since you're a collaborator on the repo ;) Is there an equivalent of the `odom` TF frame for the (real) Stretch robot? Things were working in simulation with `odom` but not on the robot. If not, how can we access the base pose estimated with odometry on the Stretch? Thanks!
Hmm, just noticed this line in the launch file:

```xml
<param name="/stretch_driver/broadcast_odom_tf" type="bool" value="false"/>
```
...will try changing it to true and go back to using `odom`.
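For reference, after that change the line would read:

```xml
<param name="/stretch_driver/broadcast_odom_tf" type="bool" value="true"/>
```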
Okay, that resolved the issue of not having the `odom` frame! The feature isn't working yet, though (multiple issues, will describe soon).
Hi everyone!
I have a couple of suggestions.
@KaviMD, it's great that you've been working on feedback control for the mobile base, since it could help the robot better achieve its goals on some floors. I recently wrote a relevant forum post that I recommend you read. Among other things, my post points to example code that uses a scan matcher with Stretch's laser range finder to provide improved odometry while controlling the mobile base.
That said, for the web interface, I recommend starting with a simple open-loop rotation command to the robot. It won't be perfect, but it will do pretty well on most surfaces, since Stretch's low-level control uses wheel odometry to achieve the commanded angle. Also, the human operator will be able to make corrections after the autonomous attempt.
For the real robot, a mobile base rotation command will rotate the robot by the commanded angle in radians. So, if you want to rotate Stretch by 90 degrees using wheel odometry, you won't need to consider reference frames. The web interface provides an example: the robot's browser executes rotation commands using the `baseTurn` function in `ros_connect.js`, which you could use to send a rotation command for ±π/2 radians.
`baseTurn` uses the `generatePoseGoal` function in `ros_connect.js` to move Stretch via the FollowJointTrajectory action server provided by stretch_ros. If you're ever curious about the details, you can look at `joint_trajectory_server.py`.
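As a rough sketch of what such a goal looks like at the roslibjs level (the action server name and the `rotate_mobile_base` virtual joint follow stretch_ros conventions, but treat the details as assumptions to verify against `ros_connect.js`):

```js
// Sketch: rotate the base by +90 degrees through the FollowJointTrajectory action.
// Server name and virtual joint name are assumptions based on stretch_ros.
const trajectoryClient = new ROSLIB.ActionClient({
  ros: ros,
  serverName: '/stretch_controller/follow_joint_trajectory',
  actionName: 'control_msgs/FollowJointTrajectoryAction'
});

const goal = new ROSLIB.Goal({
  actionClient: trajectoryClient,
  goalMessage: {
    trajectory: {
      joint_names: ['rotate_mobile_base'],  // virtual joint: rotate by this many radians
      points: [{
        positions: [Math.PI / 2],           // +90 degrees; use -Math.PI / 2 for the other direction
        time_from_start: { secs: 1, nsecs: 0 }
      }]
    }
  }
});
goal.send();
```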
Outside of JavaScript, keyboard_teleop provides a Python example that uses the FollowJointTrajectory action server to command the mobile base. It first defines a command based on the user's keyboard input and then sends the command via the FollowJointTrajectory client in the HelloNode class.
I hope this helps!
Best wishes, Charlie
Thank you! The forum post was very helpful in understanding the different ways to get rotation data.
For some reason I thought that `generatePoseGoal` was setting velocity targets for rotation and translation rather than position targets. I can definitely implement sending a command for ±π/2 radians like you suggested; that seems like a much cleaner solution.
That's super useful @hello-ck, thank you!
@KaviMD I was partway through debugging the feedback control with `odom` fixed in the master branch yesterday. I can get back to it, fix this, and test now if you like, but let me know if you're already working on it (just be sure to merge with master first to avoid conflicts); I can look into other stuff in the meantime. Thanks!
Indeed, `generatePoseGoal` already using odometry greatly simplifies this feature.
I'm working on implementing the two necessary functions right now. If it's okay, I'll just push them to master; this should be a pretty small fix.
Sounds good, let me know when it's ready to test.
Sorry for the delay. I just modified `commands.js` to use `baseTurn` instead of `turnAngleOffset`. It doesn't turn exactly 90 degrees, but that may be a simulation artifact or require some on-robot tuning.
I'm in the process of removing all the code attached to `turnAngleOffset` and will push a commit for that in a bit.
I'm glad things are going well and you're close to an initial implementation to test on the robot! Three issues worth considering in the future follow:
Best wishes, Charlie
@KaviMD Let me know if we have something ready to test for tomorrow.
Super important to think about the hazards that Charlie pointed out.
The turn fix should already be committed to master (these two commits)
I can work on adding 2 and 3. I did have something in place to cancel the rotation: previously only base rotations or translations would interrupt the movement, but that is easy to extend to any event, as sketched below. (I temporarily removed the interrupt functionality because it caused some errors with the new movement system, but I will add it back once it's working.)
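One simple way to cover any event is to keep a handle to the in-flight goal and cancel it from a single helper. A hedged sketch (the names here are illustrative, not the repo's actual functions):

```js
// Track the active rotation goal so any UI event can interrupt it.
let activeGoal = null;

function sendRotationGoal(goal) {
  activeGoal = goal;
  goal.on('result', () => { activeGoal = null; });  // clear when the goal finishes
  goal.send();
}

// Call this from every interface event handler, not just base motions.
function interruptMovement() {
  if (activeGoal) {
    activeGoal.cancel();  // ROSLIB.Goal supports cancel()
    activeGoal = null;
  }
}
```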
I just tested this and it works, yay!
For 2, it would be useful to first address Issue #13 (automatic stowing), add to that some way of tracking whether the arm has moved away from its stowed pose, and then implement this warning. Let me know if/when you want to work on that; I have a few minor fixes on other things but can get to it shortly.
Excellent! The following forum post seems relevant to the accuracy of the rotation:
@nickswalker These two files contain most of the 90 degree rotation code: https://github.com/hcrlab/stretch_web_interface/blob/928573a67e58ff9af36f4142845ee18f7b849e43/shared/commands.js#L530-L539

I think that the variable `vel` is poorly named; from my testing, it actually controls the position.
Problem description
The Stretch cannot move in the direction its arm is extended. Because of that, the user needs to drive toward a target and then rotate 90 degrees toward it to do manipulation. When the user fails to accurately position the base before rotating 90 degrees, they need to re-stow the arm, rotate back, drive closer, and rotate again, which is frustrating. This issue was observed in our user study and was mentioned by Henry Evans as a source of inefficiency and frustration (leading him to wish the Stretch had an omnidirectional base).
Possible ways to address
Some ways to reduce the inefficiency and burden on the user:
@KaviMD All three of these are nicely modular functionalities that you can get started on. I'll create two additional issues for the latter two items and reference this issue for the problem description.