stack-of-tasks / pinocchio

A fast and flexible implementation of Rigid Body Dynamics algorithms and their analytical derivatives
http://stack-of-tasks.github.io/pinocchio/
BSD 2-Clause "Simplified" License

Calculating the xyz coordinates of the joints #1018

Closed dadouz closed 4 years ago

dadouz commented 4 years ago

Attachments: cr35ia.urdf.txt, fanuc_cr35ia.pdf

Hello together, I have the following issue and don't know how to solve it. We have a Fanuc CR35iA robot which provides its joint values through OPC.

What we need is to know, at all times, the xyz coordinates of all the joints. The model of the robot is available as a URDF file. We looked into your examples and did not find Python or C++ code solving this issue. Please help me by providing some code.

Regards. (Attached images: joints, TCP)

gabrielebndn commented 4 years ago

Actually, there is an example. You can find it at https://gepettoweb.laas.fr/doc/stack-of-tasks/pinocchio/master/doxygen-html/index.html#OverviewIntro under the section "More complex example with C++ and Python". The examples are overview-urdf.py and overview-urdf.cpp (these files are slightly more up to date than what you find in the online doc, but nothing important).
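
For reference, a minimal Python sketch along the lines of overview-urdf.py; the URDF path is an assumption to adapt to your setup:

import pinocchio

urdf_path = "cr35ia.urdf"  # hypothetical path to your robot description
model = pinocchio.buildModelFromUrdf(urdf_path)
data = model.createData()

q = pinocchio.randomConfiguration(model)  # replace with your real joint values
pinocchio.forwardKinematics(model, data, q)

# data.oMi[i] is the placement of joint i in the world frame;
# its translation is the xyz position of that joint.
for name, oMi in zip(model.names, data.oMi):
    print(name, oMi.translation.T)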

dadouz commented 4 years ago

Thank you very much. Can you tell me how to pass real-time data to the variable q? How do I build this q? Is there a command for initializing it with the joint values (see picture)? In the code you provided, q is created randomly: q = pinocchio.randomConfiguration(model)

I am using Python.

Thanks in advance.

nim65s commented 4 years ago

q is the configuration of your robot. In your case, this is just the list of your joint angles (for all 6 joints).

dadouz commented 4 years ago

But if I want to initialize q with specific, fixed joint angle values, what does that look like in Python? How can I declare it? Thanks in advance.

nim65s commented 4 years ago

To create a list in Python, see the documentation: https://docs.python.org/3/tutorial/introduction.html#lists; in your case, replace those numbers with the values of your angles.

dadouz commented 4 years ago

The function

void forwardKinematics(const ModelTpl<Scalar,Options,JointCollectionTpl> & model, DataTpl<Scalar,Options,JointCollectionTpl> & data, const Eigen::MatrixBase<ConfigVectorType> & q)

is expecting q with the following type: const Eigen::MatrixBase<ConfigVectorType> &

After declaring a list, the error looks like this:

Traceback (most recent call last):
  File "/home/autark/robot/python/overview-urdf.py", line 10, in <module>
    pinocchio.forwardKinematics(model,data,q)
Boost.Python.ArgumentError: Python argument types in
    pinocchio.libpinocchio_pywrap.forwardKinematics(Model, Data, list)
did not match C++ signature:
    forwardKinematics(pinocchio::ModelTpl<double, 0, pinocchio::JointCollectionDefaultTpl> Model, pinocchio::DataTpl<double, 0, pinocchio::JointCollectionDefaultTpl> {lvalue} Data, Eigen::MatrixBase<Eigen::Matrix<double, -1, 1, 0, -1, 1> > Configuration q (size Model::nq), Eigen::MatrixBase<Eigen::Matrix<double, -1, 1, 0, -1, 1> > Velocity v (size Model::nv), Eigen::MatrixBase<Eigen::Matrix<double, -1, 1, 0, -1, 1> > Acceleration a (size Model::nv))
    forwardKinematics(pinocchio::ModelTpl<double, 0, pinocchio::JointCollectionDefaultTpl> Model, pinocchio::DataTpl<double, 0, pinocchio::JointCollectionDefaultTpl> {lvalue} Data, Eigen::MatrixBase<Eigen::Matrix<double, -1, 1, 0, -1, 1> > Configuration q (size Model::nq), Eigen::MatrixBase<Eigen::Matrix<double, -1, 1, 0, -1, 1> > Velocity v (size Model::nv))
    forwardKinematics(pinocchio::ModelTpl<double, 0, pinocchio::JointCollectionDefaultTpl> Model, pinocchio::DataTpl<double, 0, pinocchio::JointCollectionDefaultTpl> {lvalue} Data, Eigen::MatrixBase<Eigen::Matrix<double, -1, 1, 0, -1, 1> > Configuration q (size Model::nq))

gabrielebndn commented 4 years ago

In Python, q is a numpy matrix or array, depending on the eigenpy settings. In the provided example, we use matrices.

nim65s commented 4 years ago

My bad, that would be:

import numpy as np
q = np.matrix([4.27492493, -1.32722328, 1.77876488, 3.75030807, 5.17291326, -3.80068159]).T

dadouz commented 4 years ago

Thanks, it works. BUT what about w, p and r of the TCP? Are these not important for the calculation? Where do they have to be passed?

florent-lamiraux commented 4 years ago

If you mean the orientation of the joint, you can get it by data.oMi[k].rotation as a rotation matrix. If you mean something else, please explain what w, p and r are.
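
As a minimal sketch, the rotation matrix can also be converted to roll/pitch/yaw angles with pinocchio.rpy.matrixToRpy; the joint index below is hypothetical, and whether this rpy convention matches your robot's W/P/R convention is something you should check:

import pinocchio

# assumes model, data and q exist and pinocchio.forwardKinematics(model, data, q) was called,
# as in the earlier snippet
k = model.njoints - 1               # hypothetical index: the last joint of the arm
R = data.oMi[k].rotation            # 3x3 rotation matrix of joint k in the world frame
rpy = pinocchio.rpy.matrixToRpy(R)  # roll, pitch, yaw angles in radians
print(R)
print(rpy)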

jcarpent commented 4 years ago

@dadouz Is your issue now solved?

dadouz commented 4 years ago

Hi again, based on the URDF file and your function, the result is different from the output of the PLC. I don't know where the problem is. Here is one example: why is the xyz of the TCP totally different from the J6 xyz provided by your function? (Attached images: experiment4_python output, joints from robot, robot position, TCP position)

dadouz commented 4 years ago

Hi florent-lamiraux, by w, p and r I mean the yaw, pitch and roll values of the TCP, as you can see in the last figure.

dadouz commented 4 years ago

Do you have an alternative way to extract the xyz of all the joints starting from the xyz, yaw, pitch and roll of the TCP? I mean inverse kinematics.

jcarpent commented 4 years ago

It seems that you're providing values for the configuration vector q which are in degrees, while Pinocchio uses radians. Please consider using the formula:

q_rad = np.pi / 180. * q_deg 
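
For example, building the whole configuration vector from degree readings (the numbers below are hypothetical placeholders):

import numpy as np

q_deg = np.array([10.0, -45.0, 30.0, 0.0, 90.0, 0.0])  # hypothetical joint readings in degrees
q_rad = np.pi / 180. * q_deg                           # same values in radians, as expected by Pinocchio
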
dadouz commented 4 years ago

Thank you very much for this information. I will try it and keep you informed of the result.

dadouz commented 4 years ago

We changed the values to radians. We got correct values in specific positions, but when moving the robot to other positions, the values of joint 5 and joint 6 are not compatible with reality. There is a big difference, mainly in the X and Z coordinates. I don't know where the problem is. Can you help me?

dadouz commented 4 years ago

Here are some data regarding the problem with joints 5 and 6. As you can see, joints 3, 4, 5 and 6 are aligned at the same level. Normally the Z values of joints 5 and 6 should be close to the Z value of joint 4.

(Attached images: WhatsApp Image 2020-01-24 at 13 37 08, problem)

gabrielebndn commented 4 years ago

@dadouz, are you sure the value for joint 3 is correct? Based on the pictures you sent, it seems you set it to 1 degree, but it looks closer to

dadouz commented 4 years ago

Here are screenshots from the OPC server and the PLC itself with the data we are getting. (Attached images: WhatsApp Image 2020-01-24 at 14 07 54, WhatsApp Image 2020-01-24 at 14 07 32)

dadouz commented 4 years ago

What about this J2/J3 interaction value mentioned in the second picture? Could it be related to the problem? This value changes from movement to movement.

gabrielebndn commented 4 years ago

It seems to me that the orientation of the joint frames on the real robot is unusual. There is some freedom in the placement of the joint frames, and this may affect the position of the joint zero in particular. You should check whether the URDF frame placements are coherent with what you have on the robot; there may be some offsets to take into account. This is a job only you can do, as it requires many checks on the real robot. To do this, I encourage you to rely on the visualization functionalities provided by Pinocchio (GepettoViewer or Meshcat, install either one depending on your choice), as shown in pinocchio/examples/gepetto-viewer.py, pinocchio/examples/meshcat-viewer.py, or pinocchio/examples/robot-wrapper-viewer.py (the last example uses RobotWrapper).
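
If it helps, here is a minimal Meshcat sketch along the lines of meshcat-viewer.py; the URDF path and mesh directory are assumptions, and the exact viewer API may vary slightly between Pinocchio versions:

import pinocchio
from pinocchio.visualize import MeshcatVisualizer

urdf_path = "cr35ia.urdf"  # hypothetical path to your URDF
mesh_dir = "meshes/"       # hypothetical directory containing the robot meshes

model, collision_model, visual_model = pinocchio.buildModelsFromUrdf(urdf_path, mesh_dir)
viz = MeshcatVisualizer(model, collision_model, visual_model)
viz.initViewer()           # starts/connects to a Meshcat server
viz.loadViewerModel()
viz.display(pinocchio.neutral(model))  # show the neutral (zero) configuration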

dadouz commented 4 years ago

Is it possible to use these examples with only the URDF file I already have, or do I also need the same directory structure, including files like those for Romeo and UR5?

gabrielebndn commented 4 years ago

Is it possible to use these examples with only the URDF file I already have, or do I also need the same directory structure, including files like those for Romeo and UR5?

You will need the meshes. The directory structure need not be identical, but it will be similar (you need to pass the location of the meshes as an argument).

dadouz commented 4 years ago

How can I get them? Are they specific to our robot? I have no idea about that. I am only applying your approach and need to show the results in a journal paper related to my research.

Neotriple commented 4 years ago

@dadouz The meshes you need are specific to your robot: either something made by you, or something provided by the robot manufacturer.

dadouz commented 4 years ago

The problem has been solved. The issue was the URDF file, which needed to be adjusted. Thank you for your support. GREAT support.

gabrielebndn commented 4 years ago

Excellent, happy to have been able to help. Thanks for the feedback!