enyen / NewStableTactileGrasp

Implementation of **Learning of Efficient Stable Robot Grasping Approach Using Transformer-based Control Policy**
https://stable-tactile-grasp.github.io/

Problem with the tactile images in the simulation environment #1

Closed GAOYUAN-robot closed 5 months ago

GAOYUAN-robot commented 9 months ago

Dear enyen puang

How can I adjust the tactile image in the simulation environment so that it matches the result shown in the picture? [image]

I referred to README.md and used env.data_stat() to update the values of self.tactile_means and self.tactile_stds, but it seems that I cannot achieve the result shown in the picture above. [image]

Therefore, the tactile images in the simulation environment cannot perfectly match those in the real environment.
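
For reference, my understanding is that these statistics are only used to standardize the marker flow so that the real and simulated tactile images end up on a comparable scale, roughly like this (just a sketch with hypothetical names; the actual normalization is in marker_flow.py):

    import numpy as np

    def normalize_flow(flow, tactile_means, tactile_stds):
        # standardize a raw marker-flow observation with the collected
        # per-channel statistics: the (flow - mean) / std step
        return (np.asarray(flow) - tactile_means) / tactile_stds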

enyen commented 9 months ago

After getting the means and stds of the real sensor, you just need to match the arrow direction between the sim and real sensors. This is to make sure the real sensor is installed on the correct side and in the correct orientation. Refer to examples/UnstableGraspExp/experiment/ug_real.gif for the correct left side.

For more detailed sim-and-real matching, we might have to change the tactile parameters in simulation, but that would be the last resort.

GAOYUAN-robot commented 9 months ago

If I don't change the tactile parameters in gelslim_left.obj and gelslim_right.obj, how can I match the arrow direction between the sim and real sensors more accurately?
I used env.data_stat() to update the values of self.tactile_means and self.tactile_stds, but the arrow direction of the sim sensor seems very confusing. With the same parameters, the arrow direction of the sim sensor is sometimes the same as the real sensor and sometimes opposite.

GAOYUAN-robot commented 9 months ago

Sometimes tactile images like this appear in the simulation environment. [image]

enyen commented 9 months ago

    How can I adjust the tactile image in the simulation environment so that it matches the result shown in the picture? [image]

Hi Gao Yuan, can you make the same picture as above but with the real robot and sensor?

Simulations sometimes go wild in extreme cases; we just need to focus on the normal cases. Please help investigate under what circumstances you observe the opposite arrow on the real sensor.

Thanks.

GAOYUAN-robot commented 9 months ago

[image] After I run "python train_sb3.py ./storage/ug_01-15_13-16 show", the arrow direction in the simulation environment seems to be opposite to the arrow direction in the paper. As shown in the picture, assuming the GelSight sensor is on the side close to the person, the direction should be clockwise instead of counter-clockwise. How can I change the arrow direction of the sensor in the simulation environment? Swapping the sensor locations in the real environment seems to have little effect, because if we assume that the GelSight sensor on the side close to the person will be used, the situation seems the same for the left and right sensors.

The arrow direction of the sensor in the paper is the same as the arrow direction of the real sensor: [image]

GAOYUAN-robot commented 9 months ago

[image] After I run "python train_sb3.py ./storage/ug_01-15_13-16", the arrow direction in the simulation environment seems to be correct. But when I run "python train_sb3.py ./storage/ug_01-15_13-16 show", the arrow direction in the simulation environment seems to be opposite.

As shown in the picture below, is the left sensor (the blue one) used in the simulation environment? [image]

GAOYUAN-robot commented 9 months ago

May I ask if I can have the complete code for this project via e-mail? I remember that I may not have downloaded the latest version of the open-source tactile simulator, which may have an impact on the simulation results.

GAOYUAN-robot commented 9 months ago

[image] [image]

After I run "python train_sb3.py ./storage/ug_01-15_13-16", the arrow direction in the simulation environment seems to be correct. But when I run "python train_sb3.py ./storage/ug_01-15_13-16 show", the arrow direction in the simulation environment seems to be opposite.

enyen commented 9 months ago
  1. In simulation, the left sensor is the blue one: https://github.com/enyen/TactileSimulation/blob/02884291b42af636eae25f003357855002840c9a/envs/assets/unstable_grasp/unstable_grasp.xml#L35

  2. On the real robot, the left sensor is the lit-up one: examples/UnstableGraspExp/experiment/ug_real.gif

  3. It seems that you are not using the latest version; try git pull and refer to the README on how to run inference.

    After I run "python train_sb3.py ./storage/ug_01-15_13-16", the arrow direction in the simulation environment seems to be correct. But when I run "python train_sb3.py ./storage/ug_01-15_13-16 show", the arrow direction in the simulation environment seems to be opposite.

  4. If you have not set up SSH for GitHub, you can download the zip via the green Code button at the top of the repo page.

GAOYUAN-robot commented 9 months ago

[image] I have downloaded the latest version and I use the correct left sensor. After running "python train_sb3.py ./storage/ug_01-15_13-16 show", is the arrow direction in the simulation environment correct? (In the picture, the side closer to the person is the blue sensor.) In the real environment, the arrow direction of the blue sensor seems to be counter-clockwise.

I used env.render_tactile() to draw the picture. [image]

enyen commented 9 months ago

Yes, it is correct. You have to look at it from the front, not from the back of the sensor.

GAOYUAN-robot commented 9 months ago

[image] For the simplest task related to center-of-mass movement, how many observations do I need to take and average?

enyen commented 9 months ago

30~50 with variable weight

GAOYUAN-robot commented 9 months ago

Thank you for your response!

When obtaining the observation values, is the action taken each time random?

Should I only modify the weight of the container, or is it also necessary to adjust the center of mass of the object?
[image]

enyen commented 9 months ago

The random action is supposed to change the center of mass, but you could adjust it after every grasp as you like.
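
The collection loop is roughly like the sketch below (written against a gym-style env API purely as an illustration; adapt the reset/step calls to the actual env in this repo). Afterwards you can save the two arrays with np.save and paste the numbers into your sensor config.

    import numpy as np

    def collect_tactile_stats(env, n_grasps=40):
        # grasp repeatedly with random actions while the load keeps changing,
        # log every tactile observation, and return per-channel mean/std
        obs_log = []
        for _ in range(n_grasps):                    # ~30-50 grasps
            obs = env.reset()                        # load weight / centre of mass re-randomized
            done = False
            while not done:
                action = env.action_space.sample()   # random action shifts the grasp position
                obs, reward, done, info = env.step(action)
                obs_log.append(obs)
        obs_log = np.stack(obs_log)
        return obs_log.mean(axis=0), obs_log.std(axis=0)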

GAOYUAN-robot commented 9 months ago

Are there any files named "means.npy" and "stds.npy" that I can use as a reference? I collected many groups of means and stds of the observations from the real sensors, but the same RL policy seems to have difficulty achieving the same results on the real sensors. There seems to be nothing wrong with the RL policy itself.

enyen commented 9 months ago

I guess the system is not very particular about the exact means and stds. What do you mean by 'difficult'? Do you have any numbers or videos?

GAOYUAN-robot commented 9 months ago

It seems that the means and stds of the real-world observations influence how well the RL policy performs in the real world.

  1. When I only modify the weight of the container to obtain the observations in the real world, the RL policy does not seem to work very well in the real world (the next line is a video; the GS2 side is the left sensor).

https://github.com/enyen/TactileSimulation/assets/148218659/dc3a0101-6cea-428a-b5c2-2dd059e30aa8

  2. When I also adjust the center of mass of the container in the real world, the RL policy seems to work a bit better in the real world, but it still does not achieve the desired results (the next line is a video; the GS2 side is the left sensor).

https://github.com/enyen/TactileSimulation/assets/148218659/1251ac94-7ff4-4dc3-bf40-b04a469c98bd

So I was wondering if you have any files with real-world observation statistics that I can use as a reference.

GAOYUAN-robot commented 9 months ago

There may be other problems that affect the performance of the RL policy in the real world.

enyen commented 9 months ago

Have you visualized the real sensor before?

cd examples/UnstableGraspExp/marker_flow
python marker_flow.py

We are not using the right sensor, so it is better to unplug it to prevent confusion.

GAOYUAN-robot commented 8 months ago

After I run "python marker_flow.py", the tactile image of the sensor on the left will be displayed(GS2).

GAOYUAN-robot commented 8 months ago

Before running "python test_ur5.py", I confirmed that the left sensor (GS2) was used. Did the data from the left and right sensors interfere with each other during the operation?

enyen commented 8 months ago

No, the left and right sensors are totally independent, but there is no reason to keep the right sensor plugged in. Have you tried to match the real sensor image with the simulated one? Are the arrows pointing in the same direction under the same situation?
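
To compare them, you can put the two images side by side for the same bar-tilt condition, e.g. (a plain matplotlib sketch; sim_img and real_img stand for whatever arrays you already render from the simulation and from marker_flow):

    import matplotlib.pyplot as plt

    def compare_tactile(sim_img, real_img):
        # show the simulated and the real tactile image next to each other
        # so the arrow directions can be compared for the same bar tilt
        fig, (ax_sim, ax_real) = plt.subplots(1, 2, figsize=(8, 4))
        ax_sim.imshow(sim_img)
        ax_sim.set_title('sim (blue / left sensor)')
        ax_real.imshow(real_img)
        ax_real.set_title('real (left sensor, GS2)')
        for ax in (ax_sim, ax_real):
            ax.axis('off')
        plt.show()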

GAOYUAN-robot commented 8 months ago

Thank you for your response! Here are pictures of the real and simulated sensors; they seem to match well. Is this correct?

  1. In the simulation environment: after running "python train_sb3.py ./storage/ug_01-15_13-16 show", the arrow direction of the blue sensor is counter-clockwise. (In the picture, the left sensor is the blue sensor, which is on the side away from the person.) [image] [image]

  2. In the real environment: when we use the left sensor (which is closer to the person), the arrow direction of the left sensor is counter-clockwise. [image]

[image]

enyen commented 8 months ago
  1. No: in the simulated image (right sensor nearer) the bar is tilted to the left, while in the real image (left sensor nearer) the bar is tilted to the right.
  2. The marker displacement is not very obvious; try reducing the gripping force or increasing the weight of the load.
GAOYUAN-robot commented 8 months ago

Thank you for your response! 1. We discussed this problem last week; the conclusion was that the rotation direction of the blue sensor in the simulation environment is correct, and that the blue sensor in the simulation environment corresponds to the left sensor in reality. Is that correct? [image]

[image] [image]

2. Do you mean the tactile sensors in the two pictures below don't match? In the simulation environment, the arrow direction of the blue sensor is counter-clockwise. [image] In the real environment, the arrow direction of the left sensor is also counter-clockwise. [image]

3. Are there any files named "means.npy" and "stds.npy" that I can use as a reference? The observation statistics of the real sensor may affect the performance of the RL policy in the real world.

I have disconnected the right sensor; only the left sensor is kept (the reference is examples/UnstableGraspExp/experiment/ug_real.gif):
[image]

enyen commented 8 months ago
  1. No: in the simulated image (right sensor nearer) the bar is tilted to the left, while in the real image (left sensor nearer) the bar is tilted to the right.

  2. The sensor is assigned correctly, but based on your images the bars are tilted in opposite directions between sim and real, so you might need to test again with the same bar-tilt setting.

  3. The marker displacement is not very obvious; try reducing the gripping force or increasing the weight of the load.

  4. You could also reduce the stds temporarily, just for better visualization, and change it back later: https://github.com/enyen/TactileSimulation/blob/8014df67a736d30bae45675d60c5c54afbbaee23/examples/UnstableGraspExp/marker_flow/marker_flow.py#L161

    flow = (flow - self.tactile_norm[0]) / (self.tactile_norm[1] / 3)

  5. No, I do not have a copy of the means and stds. I got them using the same method.
  6. Make sure you have updated the cam_idx accordingly: https://github.com/enyen/TactileSimulation/blob/8014df67a736d30bae45675d60c5c54afbbaee23/examples/UnstableGraspExp/test_ur5.py#L155 https://github.com/enyen/TactileSimulation/blob/8014df67a736d30bae45675d60c5c54afbbaee23/examples/UnstableGraspExp/test_ur5.py#L176
GAOYUAN-robot commented 8 months ago

Thank you for your response!

  1. I think your idea is correct; the bars are tilted in opposite directions between sim and real. Yesterday I fixed a hidden hardware problem with the GS2 sensor and reran the experiment with newly extracted observations, and I found that in the real world the RL policy makes the robot arm move away from the center of mass.

In the real environment, when action[0] (representing the gripper position) is inverted, the gripper of the manipulator moves closer to the center of mass.

  2. Can you tell me how to invert the rotation direction of the blue sensor in the simulation environment? As shown in the picture below, the blue arrow should rotate clockwise so that it matches the real sensor.

[image]

Best regards!

GAOYUAN-robot commented 8 months ago

As shown in the picture below, I modified "unstable_grasp_env.py" to invert the rotation direction of the blue sensor in the simulation environment. Is this modification correct? [image]

After this modification, in the real environment, the gripper of the manipulator moves closer to the center of mass.

enyen commented 8 months ago
  1. What is causing the inversion? You might want to investigate it by comparing the sim and real tactile images again.
  2. If you really have to hack the arrow direction due to a hardware installation issue, you should modify the real data in https://github.com/enyen/TactileSimulation/blob/8014df67a736d30bae45675d60c5c54afbbaee23/examples/UnstableGraspExp/marker_flow/marker_flow.py#L147
    flows = self.normalize_flow(flows)
    flows[:, :, 1, :, :] *= -1  # flip the sign of one flow component so the rendered arrows are mirrored
    return flows

    so that the real data matches the sim, rather than changing the (correct) sim env in unstable_grasp_env.py.

GAOYUAN-robot commented 7 months ago

Dear Dr enyen

I encountered some new difficulties while attempting to modify the "unstable_grasp.xml" file:

  1. I imported the file "qumian2.obj" into "unstable_grasp.xml" and created an object named "box_mesh". But "box_mesh" does not appear to behave as a solid body, because the object named "load" passes through "box_mesh" and they never come into contact. [image]
  2. I noticed that I can import an obj file into the xml file using the method shown in the picture below (adopted from a task named "dclaw_rotate"). How are the collision files in this code generated, such as contacts/base_link.txt, contacts/one0_link.txt, contacts/one1_link.txt ...? (I generated the .obj files from SolidWorks and 3ds Max.) [image]

  3. If I want to insert an obj file into the xml file and create an object from it, how can I define its collision parameters so that it can come into contact with other objects?

Could you please provide guidance on how to address these issues? Your assistance would be greatly appreciated.

Thank you kindly.

enyen commented 7 months ago

Hi,

You can learn more about the XML modelling from the examples in the original repo this work was adapted from: https://github.com/enyen/TactileSimulation/blob/8014df67a736d30bae45675d60c5c54afbbaee23/README.md?plain=1#L2

As well as the one that repo was adapted from: https://github.com/eanswer/DiffHand