Closed: mayacakmak closed this issue 3 years ago.
Upstream changes have been merged; this still needs to be tested on the robot.
This is currently not working, even though it was tested successfully earlier for the Hand mode. Not sure when it stopped working; Vy said it never worked on their end. I will dig in now to debug.
Getting closer... for some reason the `getJointEffort()` function in `ros_connect.js` is not implemented anywhere. I wonder if some merge caused this.
Brought back the `getJointEffort()` function (commit d168a86) and things are back to working.
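For reference, here is a minimal sketch of what `getJointEffort()` could look like. This is an assumption based on roslibjs and `sensor_msgs/JointState`, not the actual code restored in d168a86; the topic name and variable names are placeholders.

```js
// Hypothetical sketch: cache the latest sensor_msgs/JointState message and
// look up the effort value by joint name. Topic name and names are assumptions.
var latestJointState = null;

var jointStateTopic = new ROSLIB.Topic({
    ros: ros,                          // assumes an existing ROSLIB.Ros connection
    name: '/joint_states',             // assumed topic; the robot may remap it
    messageType: 'sensor_msgs/JointState'
});

jointStateTopic.subscribe(function (message) {
    latestJointState = message;        // keep only the most recent joint state
});

function getJointEffort(jointName) {
    if (latestJointState === null)
        return null;                   // no joint state received yet
    var i = latestJointState.name.indexOf(jointName);
    return i >= 0 ? latestJointState.effort[i] : null;
}
```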
This is broken again :) It works for the wrist but not the arm; looking into it now.
Fixed now, working for all joints. I moved the hand actions (open/close) back onto the overlay, which is a bit less intuitive in terms of where to look for the action but more intuitive for the visual feedback.
Charlie has implemented visual feedback about joint effort (torque) for a few critical joints, in the form of transparent color overlays on the screen region corresponding to each joint. The first few are integrated (in the "Hand" tab) and the next few are ready to be merged into the upstream master.
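As a rough illustration of the overlay idea (this is a sketch of the concept, not Charlie's implementation; the element ids, normalization constant, and helper names are assumptions):

```js
// Hypothetical sketch: scale the opacity of a red overlay region with the
// magnitude of the joint's effort, so higher torque reads as a stronger tint.
var MAX_EXPECTED_EFFORT = 50.0;  // assumed normalization constant

function updateEffortOverlay(jointName, overlayElementId) {
    var effort = getJointEffort(jointName);
    if (effort === null)
        return;
    // Map |effort| to [0, 0.6] opacity so the overlay never fully hides the video.
    var alpha = Math.min(Math.abs(effort) / MAX_EXPECTED_EFFORT, 1.0) * 0.6;
    var overlay = document.getElementById(overlayElementId);
    overlay.style.backgroundColor = 'rgba(255, 0, 0, ' + alpha + ')';
}
```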
Integrating this feedback should be done before some of the other major changes. The functionality will transfer to the manipulation part of the interface when we switch to the non-modal layout; we will need to figure out what to do with the hand torque feedback then (make the button red?).
There is an ongoing discussion about using a nonlinear function (perhaps simply some sort of threshold?) to trigger a more serious warning in the interface when effort gets high.
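One possible shape for that nonlinear mapping, purely as a sketch of the idea under discussion (the threshold, exponent, and CSS class are assumptions, not a decision):

```js
// Hypothetical sketch: stay subtle below a threshold, escalate sharply above it.
var WARNING_THRESHOLD = 0.8;   // assumed fraction of MAX_EXPECTED_EFFORT

function effortToWarningLevel(effort) {
    var x = Math.min(Math.abs(effort) / MAX_EXPECTED_EFFORT, 1.0);
    if (x < WARNING_THRESHOLD)
        return Math.pow(x, 2) * 0.3;               // gentle, quadratic growth
    return 0.3 + (x - WARNING_THRESHOLD) * 4.0;    // steep ramp past the threshold
}

function maybeShowSeriousWarning(jointName) {
    var effort = getJointEffort(jointName);
    if (effort !== null && effortToWarningLevel(effort) >= 1.0) {
        document.body.classList.add('effort-warning');     // e.g. flash a red border
    } else {
        document.body.classList.remove('effort-warning');
    }
}
```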