Phylliade / ikpy

An Inverse Kinematics library aiming at performance and modularity
http://phylliade.github.io/ikpy
Apache License 2.0

Not position but Orientation for IK solver #61

Closed AhmetMericOzcan closed 4 years ago

AhmetMericOzcan commented 4 years ago

I tested your library; it is pretty decent once the active joints are selected. However, your library minimizes its IK solver error by position, not by orientation. In other words, it solves for the position part of the homogeneous transformation matrix and only afterwards checks the orientation.

How can I solve for orientation as the priority?

As I see in the code:

    def optimize_target(x):
        # y = np.append(starting_nodes_angles[:chain.first_active_joint], x)
        y = chain.active_to_full(x, starting_nodes_angles)
        # [:3, -1] is the translation column of the 4x4 frame, so only the position is optimized
        squared_distance = np.linalg.norm(chain.forward_kinematics(y)[:3, -1] - target)
        return squared_distance

This section is written for a position-only target. Any suggestion for orientation?

AhmetMericOzcan commented 4 years ago

Hello to everyone facing the same problem as me: I found a solution by changing the optimization function to give orientation priority. I changed the code as follows:

    # Only get the orientation
    target = target_frame[:-1, 0:3]

    if starting_nodes_angles is None:
        raise ValueError("starting_nodes_angles must be specified")

    # Compute orientation difference
    def optimize_target(x):
        # y = np.append(starting_nodes_angles[:chain.first_active_joint], x)
        y = chain.active_to_full(x, starting_nodes_angles)
        squared_distance = np.linalg.norm(chain.forward_kinematics(y)[:-1, 0:3] - target)
        return squared_distance

With these lines you can use it for orientation too. In this code I cut the rotation matrix out of the homogeneous matrix and optimize over the difference between the target rotation and the current rotation. I believe this is how I did it, at least :)

It works: my local copy of the library now uses the orientation matrix but not the position. Perfect! Cheers
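
For anyone who would rather not patch the installed package, the same idea can be tried as a standalone function. This is only a sketch of the approach described above, assuming an already-built `chain`, a 4x4 `target_frame`, and an initial joint vector; the function name and the direct call into `scipy.optimize` are mine, not part of ikpy's API:

    import numpy as np
    import scipy.optimize

    def inverse_kinematics_orientation_only(chain, target_frame, starting_nodes_angles, max_iter=100):
        # Keep only the 3x3 rotation block of the homogeneous matrix
        target_orientation = target_frame[:3, :3]

        def optimize_target(x):
            # Expand the active joint values into a full joint vector
            y = chain.active_to_full(x, starting_nodes_angles)
            # Frobenius norm between the current and the target rotation matrices
            return np.linalg.norm(chain.forward_kinematics(y)[:3, :3] - target_orientation)

        x0 = chain.active_from_full(starting_nodes_angles)
        res = scipy.optimize.minimize(optimize_target, x0, method="L-BFGS-B",
                                      options={"maxiter": max_iter})
        return chain.active_to_full(res.x, starting_nodes_angles)

Note that this sketch ignores the joint bounds stored in the chain, so patching the library as described above remains the more faithful route.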

MrDadaGuy commented 4 years ago

Hi, I'm trying to do the same, but I can't get ikpy to pay any attention to my orientation commands. I've made your changes above and they still have zero effect. Here is my chain, any ideas?

    Kinematic chain name=chain links=[ Link name=Base link bounds=(None, None), Link name=Joint_1 bounds=(-2.0944, 2.0944), Link name=Joint_2 bounds=(-2.2, 2.2), Link name=Joint_3 bounds=(-2.0944, 2.0944), Link name=Joint_4 bounds=(-3.14159, 3.14159), Link name=Joint_5 bounds=(-2.00713, 2.00713), Link name=Joint_6 bounds=(-3.14159, 3.14159), Link name=Joint_Grippper bounds=(None, None), Link name=Joint_Grip_Servo bounds=(-1.57, 0.0), Link name=Joint_Tip_Servo bounds=(None, None) ] active_links=[ True True True True True True True True False False]

AhmetMericOzcan commented 4 years ago

Are you sure? I git-cloned the library into a local directory, made the change in the Python file, and installed that local directory with pip.

Two places should be changed:

**1.** `target = target_frame[:-1, 0:3]`

**2.** `squared_distance = np.linalg.norm(chain.forward_kinematics(y)[:-1, 0:3] - target)`

And I am using a URDF file for the robot. When I enter a 4x4 homogeneous transformation matrix, it ignores the position but tries to decrease the orientation error. What is your problem?

EDIT: Lastly, use the optimized version of the IK solver function. I believe both of them use the same function in the end, but this is how I did it. You even made me doubt myself. Let me check my code and copy-paste it here.

AhmetMericOzcan commented 4 years ago

    #!/usr/bin/env python

    import matplotlib.pyplot
    from mpl_toolkits.mplot3d import Axes3D

    from ikpy.chain import Chain
    from ikpy.link import OriginLink, URDFLink
    from ikpy.inverse_kinematics import inverse_kinematic_optimization

    import math
    import numpy

    my_chain = Chain.from_urdf_file("$PATH/ur5_robot_generated.urdf",
                                    active_links_mask=[False, False, False, False, True, True, True, True])

    position = [10, 10, 10]

    ax = matplotlib.pyplot.figure().add_subplot(111, projection='3d')

    my_h_m = [[-0.25,       -0.8660254, -0.4330127, position[0]],
              [ 0.4330127,  -0.5,        0.75,      position[1]],
              [-0.8660254,   0.,         0.5,       position[2]],
              [ 0.,          0.,         0.,        1.]]

    my_h_m = numpy.array(my_h_m)
    my_chain.plot(my_chain.inverse_kinematics(my_h_m), ax)

    print("Input Hm:", my_h_m)
    return_list2 = inverse_kinematic_optimization(my_chain, my_h_m, [0, 0, 0, 0, 0, 0, 0, 0], max_iter=100)

    # Forward kinematics of the full IK solution
    my_hm = my_chain.forward_kinematics(return_list2)
    test_forward_kin_joints_2 = my_chain.forward_kinematics([0, 0, 0, 0, return_list2[4], return_list2[5], return_list2[6], 0])
    print("F.K for found I.K solution:\n", test_forward_kin_joints_2)

    matplotlib.pyplot.show()

Here is my code; this is how I did it. With 3 active joints it solves for the orientation. This is the output:

    Input Hm: 
    [[ -0.25       -0.8660254  -0.4330127  10.       ]
     [  0.4330127  -0.5         0.75       10.       ]
     [ -0.8660254   0.          0.5        10.       ]
     [  0.          0.          0.          1.       ]]
    F.K for found I.K solution: 
    [[ -2.49999991e-01  -8.66025404e-01  -4.33012707e-01   7.05738222e-01]
     [  4.33012714e-01  -5.00000000e-01   7.49999993e-01   1.44786946e-01]
     [ -8.66025400e-01  -1.55578426e-08   5.00000006e-01   4.41362956e-02]
     [  0.00000000e+00   0.00000000e+00   0.00000000e+00   1.00000000e+00]]

As you can see, I set the position to 10, 10, 10, something very large. Normally the ikpy code finds the closest reachable point for the input position, but here that doesn't matter; you can set it to something smaller. In the end it solves for orientation. I still believe I did it, if I am not missing something.
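
For anyone checking the numbers above, comparing the two printed matrices makes the behaviour explicit (plain numpy, values copied from the output):

    import numpy as np

    input_hm = np.array([[-0.25,       -0.8660254, -0.4330127, 10.0],
                         [ 0.4330127,  -0.5,        0.75,      10.0],
                         [-0.8660254,   0.0,        0.5,       10.0],
                         [ 0.0,         0.0,        0.0,        1.0]])

    fk_hm = np.array([[-2.49999991e-01, -8.66025404e-01, -4.33012707e-01, 7.05738222e-01],
                      [ 4.33012714e-01, -5.00000000e-01,  7.49999993e-01, 1.44786946e-01],
                      [-8.66025400e-01, -1.55578426e-08,  5.00000006e-01, 4.41362956e-02],
                      [ 0.00000000e+00,  0.00000000e+00,  0.00000000e+00, 1.00000000e+00]])

    # Orientation error is tiny (~3e-8), position error is huge (~17 units)
    print(np.linalg.norm(fk_hm[:3, :3] - input_hm[:3, :3]))
    print(np.linalg.norm(fk_hm[:3, 3] - input_hm[:3, 3]))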

MrDadaGuy commented 4 years ago

Hi, thanks for the quick response!

To clarify, what I'm trying to do is to provide a position & orientation for my end effector, and have the IK provide the joint states to get there.

When I use this library as-is, it does a fantastic job of getting my EE to the correct position, with complete disregard for any orientation information I provided.

With your changes, it seems to do a great job of matching the EE orientation, but with complete disregard for the position information. In other words, if I provide the same X, Y, Z position but 4 different EE orientations, I get (with your changes) 4 dramatically different robot poses, most or all of which do not respect the EE position I requested.

What would be amazing is if I could provide the transformation matrix and the IK solver did its best to reach the position & orientation I requested. Unfortunately, as good as this library is at position, I can't seem to get it to work with orientation.

Based on the author's answer to another issue (#48), I conclude that orientation is not implemented.

Thanks

MrDadaGuy commented 4 years ago

The URDF for RBX1 robot is here: https://drive.google.com/file/d/1sA0nC40Fz5wT5WE6xazwUNtX-nEtlkOs/view?usp=sharing

MrDadaGuy commented 4 years ago

    #!/usr/bin/env python

    import time, math
    import numpy as np

    import rospy
    from sensor_msgs.msg import JointState
    from std_msgs.msg import Header

    import ikpy
    from ikpy import geometry_utils
    from ikpy.chain import Chain

    from scipy.spatial.transform import Rotation as R

    class Rbx1_Kinematics(object):

        def __init__(self):
            rospy.init_node('kinematics')
            rospy.loginfo("Kinematic thingy initializin' here...")
            self.chain = Chain.from_urdf_file("urdf/rbx1_urdf.urdf", active_links_mask=[False, True, True, True, True, True, True, True, False, False])
            self.publisher = rospy.Publisher('/joint_states', JointState, queue_size=10)
            self.seq = 0

            print(self.chain)

        def move(self, x, y, z, orientation_matrix=np.eye(3)):

            joint_states = self.chain.inverse_kinematics(geometry_utils.to_transformation_matrix(
                [x, y, z],
                orientation_matrix))

            print(joint_states)

            hdr = Header()
            hdr.seq = self.seq = self.seq + 1
            hdr.stamp = rospy.Time.now()
            hdr.frame_id = "My-Kinematic-Thingy"
            js = JointState()
            js.header = hdr
            js.name = ["Joint_1", "Joint_2", "Joint_3", "Joint_4", "Joint_5", "Joint_6"]  # , "Joint_Gripper"
            js.position = joint_states[1:7]
            js.velocity = []
            js.effort = []
            self.publisher.publish(js)

    if __name__ == "__main__":
        kin = Rbx1_Kinematics()
        x = 0.5
        y = 0
        z = 0.1
        kin.move(x, y, z)                                                     # identity matrix
        time.sleep(2)
        kin.move(x, y, z, np.array([[1, 0, 0], [0, -1, 0], [0, 0, -1]]))      # 180 around X
        time.sleep(2)
        kin.move(x, y, z, np.array([[-1, 0, 0], [0, 1, 0], [0, 0, -1]]))      # 180 around Y
        time.sleep(2)
        kin.move(x, y, z, np.array([[-1, 0, 0], [0, -1, 0], [0, 0, 1]]))      # 180 around Z

AhmetMericOzcan commented 4 years ago

OK mate, did you understand the logic behind my code, i.e. how I try to decrease the error for orientation? You should do the same for position and orientation together! Check how I cut the homogeneous transformation matrix: I kept only the rotation part, the 3×3 matrix; the remaining 3×1 column is the position. If you instead cut the target as the 3×4 matrix (everything except the last row), you can build an error function over your orientation and position together. It is simple, try it yourself; maybe it can solve your problem.

Moreover, even if it doesn't, try to make the error function depend on the position and orientation together. I believe it can solve your problem. Just try a little.
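
A sketch of such a combined objective, written in the same style as the patched fragment earlier in the thread (`chain`, `target_frame` and `starting_nodes_angles` as before; `orientation_weight` is a made-up tuning knob, not an ikpy parameter):

    import numpy as np

    # Keep the full 3x4 top block of the frame: rotation (columns 0-2) and translation (column 3)
    target = target_frame[:3, :4]
    orientation_weight = 1.0  # made-up knob: trades position accuracy against orientation accuracy

    def optimize_target(x):
        y = chain.active_to_full(x, starting_nodes_angles)
        fk = chain.forward_kinematics(y)[:3, :4]
        position_error = np.linalg.norm(fk[:, 3] - target[:, 3])
        orientation_error = np.linalg.norm(fk[:, :3] - target[:, :3])
        return position_error + orientation_weight * orientation_error

Because the position error is measured in length units while the orientation error is a dimensionless matrix norm, the weight usually needs some per-robot tuning before both terms are respected.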

Phylliade commented 4 years ago

Hey,

Thanks for your contribution @AhmetMericOzcan, this is a nice approach indeed.

To answer the original issue, support for orientation is coming in the next release of ikpy (one of the solutions involves something very similar to your solution :) )

AhmetMericOzcan commented 4 years ago

@Phylliade when will the next release come? Do you plan to work with the ROS framework?

Phylliade commented 4 years ago

Do you have any particular ROS integrations in mind?


AhmetMericOzcan commented 4 years ago

Mate, the MoveIt package in ROS cannot find inverse kinematics when you lock certain joints. But your library has a boolean mask that can make certain joints static. That is a good aspect of your library.

Phylliade commented 4 years ago

Hey @AhmetMericOzcan, orientation has landed in master!
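
For later readers: the releases that followed expose orientation directly through `Chain.inverse_kinematics`, roughly along these lines (only a sketch; check the current ikpy documentation for the exact parameter names):

    import numpy as np
    from ikpy.chain import Chain

    my_chain = Chain.from_urdf_file("my_robot.urdf")  # hypothetical URDF path

    target_position = [0.1, 0.2, 0.3]
    target_orientation = np.eye(3)  # desired 3x3 rotation matrix of the end effector

    joints = my_chain.inverse_kinematics(target_position,
                                         target_orientation,
                                         orientation_mode="all")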