IFL-CAMP / iiwa_stack

ROS integration for the KUKA LBR IIWA R800/R820 (7/14 kg payload).

Not able to make iiwa_moveit work correctly on real Iiwa #99

Open robotgradient opened 6 years ago

robotgradient commented 6 years ago

I am not sure whether the moveit_planning_execution.launch file is only for simulation.

If it also works with the real robot, it would be nice to add a tutorial on how to set that up.

SalvoVirga commented 6 years ago

You can launch it with

roslaunch iiwa_moveit moveit_planning_execution.launch sim:=false

to use the real robot (which has to be connected to the machine running ROS as described in the setup, with the ROSSmartServo application running).

You should find some info here.

robotgradient commented 6 years ago

Thanks for the info, it was really helpful.

Anyway, I have another issue. It seems like the iiwa is doing S-shaped paths even though I am commanding a linear trajectory in MoveIt. This problem only appears on the real robot; applying the same linear trajectory plan in the simulated environment, the robot follows it perfectly. I don't know if you have faced similar problems.

SalvoVirga commented 6 years ago

Hmm, this happens quite often when sending Cartesian commands that lie on a linear path directly to the robot. E.g. (without taking rotations into account), if you send A: X=0.1, Y=0.5, Z=0.4 and then B: X=0.2, Y=0.5, Z=0.4 (so you want to move 10 cm in X),

the KUKA planner will go from A to B without guaranteeing a linear motion, and make that "S" path you are describing.

If you are sending a linear path computed by MoveIt!, you are sending commands in joint space, and that should not happen.
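That difference can be made concrete with a toy calculation. The sketch below (plain Python, using the A/B example positions above; it is an illustration, not iiwa_stack code) densely interpolates the straight line from A to B. Sending many such close-together goals is what keeps a sampled trajectory near the line, whereas a single far-away Cartesian goal leaves the path shape entirely up to the KUKA planner:

```python
# Sketch: split the straight line from A to B into N short segments.
# A single Cartesian goal lets the controller's planner choose the path;
# many close intermediate goals pin the motion near the line.
def interpolate_line(a, b, n):
    """Return n+1 points evenly spaced from a to b (inclusive)."""
    return [
        tuple(ai + (bi - ai) * k / n for ai, bi in zip(a, b))
        for k in range(n + 1)
    ]

A = (0.1, 0.5, 0.4)  # example position from the discussion (metres)
B = (0.2, 0.5, 0.4)  # 10 cm away along X

waypoints = interpolate_line(A, B, 10)
# Largest distance between consecutive waypoints:
step = max(
    sum((p2 - p1) ** 2 for p1, p2 in zip(wa, wb)) ** 0.5
    for wa, wb in zip(waypoints, waypoints[1:])
)
print(len(waypoints), round(step, 3))  # 11 waypoints, ~1 cm apart
```

With 11 waypoints the controller only ever has to bridge ~1 cm gaps, so whatever blending it does stays close to the line.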

Another solution is to directly use the linear motions from KUKA, which are already implemented in the development branch and will hopefully move to master one day, when I find some time to work on iiwa_stack again :( There you have a topic /iiwa/command/CartesianPoseLin to which you can send Cartesian poses (as with /iiwa/command/CartesianPose), but the goal pose will be reached using a linear motion.

robotgradient commented 6 years ago

After some investigation, these S-shaped movements appeared when I pushed new trajectories before the previous one had finished. The problem I am facing with MoveIt is that on the real robot, and only on the real robot (not in simulation), MoveIt tends to publish Execution completed: SUCCEEDED before the trajectory has ended.

Tomorrow I will tackle this problem and try to find a solution.

Best!

SalvoVirga commented 6 years ago

Did you try the TimeToDestination service? It might not work well, since the Sunrise controller might just check whether one single step sent from MoveIt in joint space has been executed, but it is worth checking. Otherwise, you might need to sync the joint speed/acceleration settings used on the real robot and within MoveIt.

SalvoVirga commented 6 years ago

Any updates?

robotgradient commented 6 years ago

None. But it actually seems to be a general problem. A friend was also trying to implement this, and he was facing the same S-shaped movements when trying to do linear Cartesian movements.

It seems like some joints were not moving synchronously. I didn't have time to get into the rosjava packages. I will look into this as soon as I get more free time :P

sven-hoek commented 6 years ago

We are having the same problem. Even when giving a path of e.g. 10 poses along a line (differing only in their position.z coordinate) and planning a trajectory with computeCartesianPath(), the movement shown in RViz looks the way it is supposed to, but the end effector of the real robot moves along a curve. Could it be the ROS controller? I always wondered why the "# VALUES ARE NOT CORRECT !" comment is there, but never tried changing the values.

SalvoVirga commented 6 years ago

Could it be the ROS controller? Always wondered, why the "# VALUES ARE NOT CORRECT !" but never tried changing them.

Are you using a different controller than PositionJoint ones? Those values are only for Effort/Velocity controllers.

If I remember correctly, the output of computeCartesianPath() is a trajectory in joint space. So yes, if you send joint commands the robot should just move using those. When sending a Cartesian command directly to the controller (not using MoveIt!), even if it lies along a line, it goes through the KUKA planner and you have no guarantee that it will generate a linear motion. But for joint-space commands I agree that it doesn't make much sense.

What would be nice to do: generate a joint trajectory for a Cartesian linear motion, write down the joint-space commands that are sent to the robot, and write a tiny RobotApplication that just uses a SmartServo motion with those hard-coded joint-space commands. If the robot doesn't move in a linear motion, it's a problem for KUKA. I won't have much time in the near future to run this test myself, but I could help if someone needs feedback/guidance on writing the RobotApplication. Having the code on this issue would be nice to reproduce the problem. Also, I could directly bug the devs at KUKA if the issue is clearly on their side.

P.S. If that works for your use case, using the development branch you can also send Cartesian commands to /iiwa/command/CartesianPoseLin, and they will be executed with a SmartServoLin motion, which guarantees linear motions. That means not using MoveIt!, but directly the KUKA planner.

sven-hoek commented 6 years ago

Yeah, I'm using the PositionJoint ones. Will try both the RobotApplication and the CartesianPoseLin and report the results.

sven-hoek commented 6 years ago

Okay, so I haven't tried the CartesianPoseLin motion yet, but I put the joint positions of the trajectory from computeCartesianPath() into Sunrise by hand, as you suggested. The end effector's motion was linear, with no curving like with iiwa_stack.

SalvoVirga commented 6 years ago

@sven-hoek Could you share the code of the RobotApplication you used for that test?

sven-hoek commented 6 years ago

Of course. I just moved between them via PTP:

import javax.inject.Inject;

import com.kuka.roboticsAPI.applicationModel.RoboticsAPIApplication;
import com.kuka.roboticsAPI.deviceModel.JointPosition;
import com.kuka.roboticsAPI.deviceModel.LBR;
import static com.kuka.roboticsAPI.motionModel.BasicMotions.ptp;

public class trajectoryTest extends RoboticsAPIApplication {

    @Inject
    private LBR lbr;

    private JointPosition startPosition;
    private JointPosition[] jointPositionsUP = new JointPosition[32];

    @Override
    public void initialize() {
    startPosition = new JointPosition(-0.360541164875, 0.781382322311, -0.119466125965, -0.998517155647, 0.0837756767869, 1.3416301012, -1.26847708225);

    jointPositionsUP[0] = new JointPosition(-0.360513,0.781618, -0.119489, -0.998376, 0.0835517, 1.34141, -1.26836);
    jointPositionsUP[1] = new JointPosition(-0.36047, 0.78132, -0.119466, -0.998453, 0.0835619, 1.34164, -1.26836);
    jointPositionsUP[2] = new JointPosition(-0.360598, 0.777839, -0.119696, -0.991784, 0.08329, 1.35108, -1.26806);
    jointPositionsUP[3] = new JointPosition(-0.360783, 0.774828, -0.119907, -0.984298, 0.083036, 1.36124, -1.26774);
    jointPositionsUP[4] = new JointPosition(-0.361289, 0.77014, -0.120291, -0.967028, 0.082551, 1.38351, -1.26704);
    jointPositionsUP[5] = new JointPosition(-0.361828, 0.765675, -0.12065, -0.949437, 0.0821163, 1.4055, -1.26635);
    jointPositionsUP[6] = new JointPosition(-0.362446, 0.761793, -0.120972, -0.930889, 0.0817427, 1.42786, -1.26565);
    jointPositionsUP[7] = new JointPosition(-0.363149, 0.758523, -0.121257, -0.911332, 0.0814339, 1.45061, -1.26495);
    jointPositionsUP[8] = new JointPosition(-0.363941, 0.755892, -0.121504, -0.890712, 0.0811941, 1.47379, -1.26424);
    jointPositionsUP[9] = new JointPosition(-0.36483, 0.753933, -0.12171, -0.868966, 0.0810277, 1.4974, -1.26352);
    jointPositionsUP[10] = new JointPosition(-0.365228, 0.752572, -0.121821, -0.858773, 0.0809458, 1.50829, -1.26318);
    jointPositionsUP[11] = new JointPosition(-0.365733, 0.752009, -0.121899, -0.847198, 0.0809091, 1.52036, -1.2628);
    jointPositionsUP[12] = new JointPosition(-0.366214, 0.751245, -0.121981, -0.83598, 0.0808758, 1.5319, -1.26244);
    jointPositionsUP[13] = new JointPosition(-0.366819, 0.751388, -0.122024, -0.823187, 0.0808959, 1.54479, -1.26204);
    jointPositionsUP[14] = new JointPosition(-0.367356, 0.750994, -0.122083, -0.811332, 0.0809043, 1.55658, -1.26167);
    jointPositionsUP[15] = new JointPosition(-0.368164, 0.752594, -0.122061, -0.796009, 0.0810215, 1.5715, -1.2612);
    jointPositionsUP[16] = new JointPosition(-0.368702, 0.752154, -0.122114, -0.784237, 0.0810535, 1.58283, -1.26084);
    jointPositionsUP[17] = new JointPosition(-0.369524, 0.753787, -0.122081, -0.76885, 0.0812025, 1.59737, -1.26037);
    jointPositionsUP[18] = new JointPosition(-0.370248, 0.754657, -0.122073, -0.754786, 0.0813276, 1.61044, -1.25994);
    jointPositionsUP[19] = new JointPosition(-0.371031, 0.755931, -0.122045, -0.74001, 0.0814881, 1.62396, -1.25949);
    jointPositionsUP[20] = new JointPosition(-0.37183, 0.75728, -0.122008, -0.725093, 0.0816672, 1.63739, -1.25903);
    jointPositionsUP[21] = new JointPosition(-0.372697, 0.759084, -0.121948, -0.709364, 0.0818861, 1.65133, -1.25855);
    jointPositionsUP[22] = new JointPosition(-0.37358, 0.760968, -0.121879, -0.693477, 0.0821248, 1.66517, -1.25806);
    jointPositionsUP[23] = new JointPosition(-0.374542, 0.763374, -0.121782, -0.67665, 0.0824088, 1.67961, -1.25754);
    jointPositionsUP[24] = new JointPosition(-0.375521, 0.765854, -0.121676, -0.659667, 0.0827131, 1.69393, -1.25701);
    jointPositionsUP[25] = new JointPosition(-0.37659, 0.768944, -0.121537, -0.641576, 0.0830697, 1.70895, -1.25643);
    jointPositionsUP[26] = new JointPosition(-0.377677, 0.772097, -0.121388, -0.623337, 0.0834471, 1.72381, -1.25585);
    jointPositionsUP[27] = new JointPosition(-0.378872, 0.775977, -0.1212, -0.603764, 0.083886, 1.73953, -1.25521);
    jointPositionsUP[28] = new JointPosition(-0.380082, 0.779902, -0.121, -0.584058, 0.0843459, 1.75504, -1.25455);
    jointPositionsUP[29] = new JointPosition(-0.381425, 0.784718, -0.120752, -0.562704, 0.08488, 1.77161, -1.25383);
    jointPositionsUP[30] = new JointPosition(-0.38278, 0.789549, -0.120491, -0.541249, 0.0854347, 1.7879, -1.25309);
    jointPositionsUP[31] = new JointPosition(-0.384302, 0.795505, -0.120167, -0.517692, 0.0860814, 1.80553, -1.25225);
}

    @Override
    public void run() {
    lbr.move(ptp(startPosition));

    for (int i = 0; i < jointPositionsUP.length; ++i) {
        lbr.move(ptp(jointPositionsUP[i]));
    }
}

}

SalvoVirga commented 6 years ago

Thanks a lot! From a very quick look this makes sense, but you are not using SmartServo! Your run() should look something like:

@Override
public void run() {
    lbr.move(ptp(startPosition));

    SmartServo motion = new SmartServo(lbr.getCurrentJointPosition());
    lbr.moveAsync(motion);

    for (int i = 0; i < jointPositionsUP.length; ++i) {
        if (lbr.isReadyToMove()) {
            motion.getRuntime().setDestination(jointPositionsUP[i]);
        }
    }
}

I just wrote that off the top of my head, so don't take this code as working. But it should be something similar.

sven-hoek commented 6 years ago

Ok, so I haven't worked with SmartServo motions in Sunrise yet, so my code is a little dirty; it is the first version where the robot moved smoothly. Here the end effector actually followed a curve again. This depended on the delay between the setDestination commands. With enough time in between, the end effector moved almost on a line, but of course with pauses between the points. With a 40 ms delay, the curve got stronger.

public void run() {
    lbr.move(ptp(startPosition));
    SmartServo motion = new SmartServo(lbr.getCurrentJointPosition());
    lbr.moveAsync(motion);

    for (int i = 0; i < jointPositionsUP.length; ++i) {
        while (!lbr.isReadyToMove()) {}
        motion.getRuntime().setDestination(jointPositionsUP[i]);

        try {
            Thread.sleep(80);
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }
}

SalvoVirga commented 6 years ago

I see, weird indeed. It might help to play around with the settings of the SmartServo motion (you can read about them in the KUKA manual): setMinimumTrajectoryExecutionTime(), setSpeedTimeoutAfterGoalReach(), etc.

I will try to find some time to prepare those two examples with the joint positions you posted and ask some people at KUKA; they might have a quick answer on this.

Thanks a lot for your efforts!

Oh, are you using the master branch? I would suggest using development in any case.

sven-hoek commented 6 years ago

Okay, maybe I'll try that out. Apparently we are working with a version of iiwa_stack that was forked a while ago (I dared to write here, as it seemed that the problem persisted). We have some minor modifications; I'll see if I can find the time to merge the newer versions of this repo into ours somehow, or figure out how to do that somewhat continuously. We are somewhat new to git/GitHub. I might even open a pull request: we added functionality that publishes the state of a MediaFlange user button to ROS.

SalvoVirga commented 6 years ago

No worries! Pull requests are always welcome!

I would just love to merge development into master at some point (a lot of things changed there); I've been too busy to work actively on iiwa_stack for too long.

sven-hoek commented 6 years ago

Well, it's awesome to see that it is still being worked on and I really appreciate all the work that went into it! Thanks for that and for the quick answers!

sven-hoek commented 6 years ago

Okay, so I tried setMinimumTrajectoryExecutionTime and set it to the same delay I sleep between the setDestination commands, or a little less. This of course made the motion pretty smooth when using a larger delay. With a big enough delay (>150 ms), the motion is almost linear (still wiggling around the actual line a bit); with a delay <100 ms, the motion is less curvy but still not really linear.

robotgradient commented 6 years ago

Hi, I am back working on iiwa_stack.

I have tried setting setMinimumTrajectoryExecutionTime to 200 ms, but not a lot has changed. Any other suggestions?

I am going to test the development branch in the meantime.

sven-hoek commented 6 years ago

200 ms seems like a lot; I was using times in that range in my testing application. It might make sense to match this to the frequency at which ROSSmartServo receives its goals, but I don't know that value. With a large minimumTrajectoryExecutionTime I also used a long delay between sending the trajectory points, so the resulting motion was really slow. With SmartServo as it is, it might not even be possible to create a fast, linear motion, but @SalvoVirga might know a little more about that.
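Why a large per-waypoint execution time makes the motion slow can be seen with a back-of-the-envelope bound: if each waypoint is given t seconds, the Cartesian speed cannot exceed the waypoint spacing divided by t. A short Python sketch (the spacing and timing values are illustrative, not measured on a robot):

```python
# Sketch: upper bound on end-effector speed when each of N evenly
# spaced waypoints is allotted a minimum execution time.
def max_cartesian_speed(spacing_m, min_exec_time_s):
    """Waypoint spacing in metres / time per waypoint in seconds."""
    return spacing_m / min_exec_time_s

# A 10 cm path split into 32 waypoints (31 segments), as in the test
# application earlier in this thread:
spacing = 0.10 / 31  # ~3.2 mm between consecutive waypoints

print(max_cartesian_speed(spacing, 0.200))  # 200 ms per point -> ~16 mm/s
print(max_cartesian_speed(spacing, 0.080))  # 80 ms per point  -> ~40 mm/s
```

So with 200 ms per point the line is traced at roughly 16 mm/s at best; making the motion both fast and linear requires either denser, faster updates or a linear motion type on the controller side.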

robotgradient commented 6 years ago

As far as I looked into the ROSSmartServo file, I didn't find any sleep between the setDestination calls. Did you add one yourself, and which value did you give it?

I will try reducing my minimumTrajectoryExecutionTime and let you know.

sven-hoek commented 6 years ago

There probably won't be a sleep; setDestination is probably called whenever a new trajectory point is received, and I don't know whether ROSSmartServo receives its goals synchronously or asynchronously.

I used a sleep (Thread.sleep(80);) in my test example above to check whether the erroneous motion comes from MoveIt or from ROSSmartServo, but it seems to come from how KUKA's SmartServo works.

robotgradient commented 6 years ago

I am with you on that, @sven-hoek.

As far as I understand, whenever SmartServo receives a new jointPosition, it discards the previous one and sets the new one as the destination. Having control over the timing seems quite complicated, so probably the only way to get linear movements is with SmartServoLIN.

Any suggestions?

sven-hoek commented 6 years ago

Yeah, @SalvoVirga might know more about the timing. In the corresponding lines of the Java code there are comments noting that minimumTrajectoryExecutionTime still needs to be parameterized.

I recently started using the SmartServoLIN topic. It sometimes works pretty well, but there are issues when switching back and forth between SmartServoLIN and control via MoveIt!, which is executed as "normal" SmartServo motions. The robot randomly performed very weird (and dangerous, as they were sometimes quite fast), uncontrollable motions when executing a linear motion or shortly before/after it. This happened at different points while running the same program without any changes. I suspect there are conflicts between SmartServo and SmartServoLIN on the Sunrise Cabinet side, but I have not inspected it further yet.

robotgradient commented 6 years ago

As far as I can tell, the only way I have found to get linear movements while controlling in joint space (SmartServoLIN only allows Cartesian control) is to control the robot with DirectServo. The drawback is that DirectServo is quite restrictive about how far the destination jointPosition can be from the current one.

I am working on obtaining this control right now. If you get any relevant news, just let me know :)

robotgradient commented 6 years ago

After a long journey, I finally found the problem. It was purely delay.

Due to the delay, the robot was not able to follow the commanded path closely, so it was receiving commands that were too far away and executing them as PTP movements.

I set the MoveIt joint limits to 0.1 to slow the path down, and the robot now moves along a linear path, but sooo slowly.

@SalvoVirga Are you able to move the robot fast in linear movements?

So, SmartServo is fine; it is just about setting joint positions close enough to the current position.
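That "close enough" condition can be written down as a simple check: a waypoint is trackable only if its largest per-joint step fits within what the joints can traverse in one command period at the velocity limit; otherwise the controller falls behind and blends across points. A Python sketch (the period and velocity-limit numbers are hypothetical, not values from iiwa_stack or the Sunrise controller):

```python
# Sketch: can the robot reach the next joint-space waypoint within one
# command period without exceeding the per-joint velocity limit?
def trackable(q_prev, q_next, period_s, vel_limit_rad_s):
    """True if the largest joint step fits in one command period."""
    max_step = max(abs(b - a) for a, b in zip(q_prev, q_next))
    return max_step <= vel_limit_rad_s * period_s

q0 = (0.0, 0.5, 0.0)
q1 = (0.02, 0.52, 0.0)  # largest step: ~0.02 rad

# 100 Hz command period with a 0.1 rad/s limit allows only 0.001 rad
# per cycle, so this waypoint is too far away:
print(trackable(q0, q1, 0.01, 0.1))  # False
# Sending four times per second leaves a 0.025 rad budget:
print(trackable(q0, q1, 0.25, 0.1))  # True
```

This is the trade-off described above: lowering the MoveIt joint limits shrinks the step between consecutive waypoints until the check passes, at the cost of a much slower motion.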

robinedwards commented 4 years ago

I was just wondering if anyone has found a solution to this issue? I've tried a number of things; reducing the velocity via joint limits still resulted in a jerky motion.

robotgradient commented 4 years ago

The problem is related to the frequency at which you control the robot. Try controlling the robot at a real 500 Hz at least.

robinedwards commented 4 years ago

Thank you. May I ask what may seem a silly question: where do you set this? I can't see anything within iiwa_hw or iiwa_control that takes a rate parameter. Is the ROS Java node able to operate at such a high frequency?

robotgradient commented 4 years ago

In my case, in order to achieve high-frequency control, I isolated the low-level control PC. Try having a dedicated PC, used only for low-level command sending, running a real-time kernel. Also check the frequency at which you are currently sending commands; visualizing the communication pipeline between your robot controller and the PC will help you understand the performance.
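As a starting point for checking that frequency, the achieved command rate can be estimated from timestamps logged around each send. A minimal Python sketch (the timestamps below are fabricated; in practice you would log them around each publish or setDestination call):

```python
# Sketch: estimate the mean command rate from logged send timestamps.
def achieved_rate_hz(timestamps):
    """Mean rate (Hz) from a list of monotonically increasing times (s)."""
    if len(timestamps) < 2:
        raise ValueError("need at least two timestamps")
    span = timestamps[-1] - timestamps[0]
    return (len(timestamps) - 1) / span

# e.g. stamps logged every 2 ms correspond to a 500 Hz target:
stamps = [i * 0.002 for i in range(1001)]
print(round(achieved_rate_hz(stamps)))  # ~500
```

Comparing this number against the intended control rate (and watching for gaps in the timestamp series) quickly shows whether the ROS side, the network, or the controller is the bottleneck.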

JimmyDaSilva commented 4 years ago

@TheCamusean Approved !

@robinedwards I ran into a similar issue with the iiwa_ros package. I guess some of the links in this comment https://github.com/epfl-lasa/iiwa_ros/issues/36#issuecomment-662991825 should help.

Good luck!