roboticslab-uc3m / teo-bimanipulation

Demonstration of Teo manipulating objects with two arms
GNU Lesser General Public License v2.1

Main ideas for this demonstration and steps to follow #1

rsantos88 opened this issue 6 years ago

rsantos88 commented 6 years ago

We are interested in creating a demonstration in which we can see Teo manipulating an object with two arms. It could be something with handles, or something that must be grasped with both hands at the same time. The objective is to check the correct functioning of the robot while manipulating with two arms. It is not necessary to use path planning. We can record waypoints (see this old issue for that), as in teo-self-presentation, and make something simple and nice to watch.
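A waypoint-based motion like the one described could be sketched roughly as below. This is a hypothetical illustration in plain C++ (function names are made up, no YARP calls): playback would step `t` from 0 to 1 between consecutive recorded joint-space waypoints and send each interpolated vector to the arms in position mode.

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Linearly interpolate between two recorded joint-space waypoints.
// t = 0 returns `from`, t = 1 returns `to`.
std::vector<double> interpolateWaypoints(const std::vector<double>& from,
                                         const std::vector<double>& to,
                                         double t)
{
    std::vector<double> q(from.size());

    for (std::size_t i = 0; i < from.size(); ++i)
        q[i] = from[i] + t * (to[i] - from[i]); // straight-line blend per joint

    return q;
}
```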

UPDATED:

Next to do:

rsantos88 commented 6 years ago

One possible idea of a demo is:

  1. Teo grasps an object that requires both hands to hold, for example a tray. The object will be known and cannot be changed. It must always be positioned at the same distance and location with respect to the robot.
  2. The robot will lift the tray. Once the robot holds it in the air, you will be able to place an object on the tray.
  3. At this point, I thought Teo could do something like:
    • Say the weight of the object using the JR3 sensors
    • Try to center the object, balancing the tray and using the JR3 sensors to know the force on each hand. In this case, the object should be able to slide.
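The weighing and balancing ideas above both reduce to simple statics on the vertical forces read at the two wrists. A minimal sketch, assuming idealized sensor readings (real JR3 data would first need gravity compensation for the hand/tray and filtering; names are hypothetical):

```cpp
#include <cassert>
#include <cmath>

struct TrayLoad
{
    double weight; // total supported weight [N]
    double offset; // load position measured from the left handle [m]
};

// Estimate the total load on a tray held at two handles and where that
// load sits, from the vertical force at each wrist sensor.
TrayLoad estimateLoad(double fLeft, double fRight, double handleSpan)
{
    TrayLoad load;
    load.weight = fLeft + fRight;

    // Moment balance about the left handle: fRight * span = weight * offset.
    load.offset = (load.weight > 0.0) ? fRight * handleSpan / load.weight : 0.0;

    return load;
}
```

An `offset` of half the handle span means the load is centered; the balancing behavior would tilt the tray until the offset converges to that value.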
rsantos88 commented 6 years ago

Talking with @smcdiaz, we have reached a conclusion about a possible demo. A first approximation is:

  1. Teo takes a tray with two hands and holds it. Once it is in the air, Teo moves the tray, making movements with both arms. He will perform some displacements of the tray to verify proper operation of the arms. This movement will be recorded using waypoints.
  2. The second goal is to use the JR3 sensors. Once the arms are raised, apply a force to one hand and the hand will move in the direction of the applied force. This force and its direction will be measured with the JR3 sensor. The same can be done with the other hand. For example, you move the right hand, and the left hand moves in the same direction with the same velocity.
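The force-following behavior in step 2 can be sketched as a simple admittance law (hypothetical helper, not the project's actual code): the force measured along one axis of the wrist JR3 sensor is mapped to a Cartesian velocity command, with a dead band so sensor noise does not move the arm. Sending the same command to the other arm gives the mirroring behavior described above.

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

// Map a measured wrist force [N] into a Cartesian velocity command [m/s].
// Forces inside the dead band are ignored; the output is saturated at vMax.
double forceToVelocity(double forceN, double deadbandN, double gain, double vMax)
{
    if (std::fabs(forceN) < deadbandN)
        return 0.0; // treat small readings as sensor noise

    // Shift by the dead band so the command starts from zero smoothly.
    double v = gain * (forceN - std::copysign(deadbandN, forceN));

    return std::clamp(v, -vMax, vMax);
}
```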
rsantos88 commented 6 years ago

Pending the purchase of a tray to start working on this demo.

rsantos88 commented 6 years ago

The tray has been bought. Some handles adapted to Teo's current hands have been added to it.

[photo: 20180514_110820]

rsantos88 commented 6 years ago

Blocked by https://github.com/roboticslab-uc3m/teo-hardware-issues/issues/22

rsantos88 commented 6 years ago

Unblocked thanks to https://github.com/roboticslab-uc3m/teo-hardware-issues/issues/22

rsantos88 commented 6 years ago

Blocked again by https://github.com/roboticslab-uc3m/teo-hardware-issues/issues/14#issuecomment-382696473

rsantos88 commented 6 years ago

Unblocked thanks to https://github.com/roboticslab-uc3m/teo-hardware-issues/issues/14#issuecomment-392781382 but blocked anyway by https://github.com/roboticslab-uc3m/teo-hardware-issues/issues/26

rsantos88 commented 6 years ago

Next to do:

  • [x] close the control loop using the torque force sensors

rsantos88 commented 5 years ago

In order to close the control loop, I have made some progress. I now understand how the JR3 sensor behaves. I have linked the output of the sensor to the movement of the arm in Cartesian space, so when we push the hand along one of the three axes, the arm follows that direction in Cartesian coordinates, using position mode.
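One cycle of that position-mode loop could look like the sketch below (a hypothetical illustration under my reading of the comment, not the actual implementation): each control period, the Cartesian target along an axis is nudged by an offset proportional to the measured force.

```cpp
#include <cassert>
#include <cmath>

// Advance the Cartesian target [m] along one axis by an offset proportional
// to the measured force [N], once per control period [s]. Forces inside the
// dead band leave the target unchanged.
double nextTarget(double currentTarget, double forceN, double deadbandN,
                  double gain, double periodS)
{
    if (std::fabs(forceN) < deadbandN)
        return currentTarget; // ignore sensor noise

    return currentTarget + gain * forceN * periodS;
}
```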

--> See the video! :smiley:

rsantos88 commented 5 years ago

I've edited a small video that shows the advances that have been made for this demo so far :smiley:


rsantos88 commented 5 years ago

I have already made several advances, which were on standby due to various changes that had to be made to the robot. At the software level these included reviewing and unifying the joint limits, changing the direction of rotation of several joints so that all of them follow the same sense with respect to their coordinate axes, changing and unifying the TEO kinematic model, and testing different position operating modes (Make position direct mode usable) with the aim of achieving the best movement in Cartesian coordinates. There have also been several problems at the hardware level that stopped the tests, such as the breakage of the right forearm and problems with the values reported by some absolute encoders. Finally, after having solved these problems, here is a video of the application working:


Some notes: