rlee3359 / folding-by-hand


about calib.reg #2

Closed zcswdt closed 1 month ago

zcswdt commented 1 month ago

Hello, I encountered an issue while reviewing the code in panda_fold_env.py. It appears that the file calib.reg is required for the line `self.calib_reg = torch.load("calib.reg")`. However, I was unable to locate this file in the repository. Could you please provide some guidance on how to obtain or generate this file? Is it supposed to be included in the repository, or is there a specific procedure to create it?

Thank you for your assistance.

rlee3359 commented 1 month ago

Hi, "calib.reg" was a calibration file I used to store the camera-to-workspace calibration. This will depend on your robot and camera setup. You will need to calibrate your camera to the workspace so that you can transform pick and place actions from image coordinates to points on the work surface. Since the mapping here is from 2D pixel coordinates to 2D points on the table at a fixed height, I just used a simple linear regression to map the points, but the more standard approach would be a proper checkerboard camera calibration.
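A minimal sketch of what such a linear-regression calibration could look like. The repo's actual calib.reg format is not documented here, so the point correspondences, the affine parameterization, and the `pixel_to_table` helper below are all illustrative assumptions; you would collect your own (pixel, table) pairs, e.g. by jogging the robot to known table points and clicking them in the image.

```python
# Hypothetical pixel-to-workspace calibration via least-squares affine fit.
# The correspondences below are placeholder values; replace with measurements
# from your own robot and camera setup.
import torch

# Paired samples: pixel coords (u, v) and table coords (x, y) in meters.
pixels = torch.tensor([[100., 120.], [400., 118.], [105., 380.], [398., 382.]])
table  = torch.tensor([[0.30, -0.20], [0.30, 0.20], [0.70, -0.20], [0.70, 0.20]])

# Fit an affine map [x y] = [u v 1] @ W by linear least squares.
A = torch.cat([pixels, torch.ones(len(pixels), 1)], dim=1)  # (N, 3)
W = torch.linalg.lstsq(A, table).solution                    # (3, 2)

torch.save(W, "calib.reg")  # later: calib = torch.load("calib.reg")

def pixel_to_table(u, v):
    # Apply the fitted affine map to a single pixel coordinate.
    return (torch.tensor([u, v, 1.]) @ W).tolist()
```

A plain affine fit like this only holds for a fixed-height, roughly fronto-parallel camera; for anything else, a proper checkerboard calibration is the safer route.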

Please note there are other hard-coded aspects of the robot control code that you will need to change as well, such as homing joint configurations etc. I recommend creating a new environment and implementing the rough structure of the env I provided (mainly the reset and step functions) with your own robot control code.
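The suggested reset/step structure could be sketched roughly as follows. Everything here is hypothetical: the class name, the `robot`/`camera` interfaces, and the method names are stand-ins for your own driver code, not the repo's actual API.

```python
# Rough skeleton of a custom environment mirroring the suggested
# reset/step structure. All interfaces are placeholders for your
# own robot control and camera capture code.
class MyFoldEnv:
    def __init__(self, robot, camera):
        self.robot = robot    # your robot control interface
        self.camera = camera  # your overhead camera interface

    def reset(self):
        # Move the arm to your own homing joint configuration,
        # then return an observation of the workspace.
        self.robot.move_to_home()
        return self.camera.get_image()

    def step(self, action):
        # action: (pick, place) points in image coordinates.
        pick_px, place_px = action
        pick_xy = self.pixel_to_table(pick_px)
        place_xy = self.pixel_to_table(place_px)
        self.robot.pick_and_place(pick_xy, place_xy)
        return self.camera.get_image()

    def pixel_to_table(self, px):
        # Plug in your camera-to-workspace calibration here.
        raise NotImplementedError
```

The key point from the comment above is that only the image-space interface (observations in, pick/place pixels out) needs to match; everything below that line is robot-specific.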

zcswdt commented 1 month ago

> Please note there are other hard-coded aspects of the robot control code that you will need to change as well, such as homing joint configurations etc. I recommend creating a new environment and implementing the rough structure of the env I provided (mainly the reset and step functions) with your own robot control code.

Thank you for your response, I understand now. This is akin to a camera calibration file. Since I'm new to operating robotic arms, I have a few more questions:

1. During the data collection phase and the fabric manipulation phase with the robotic arm, is the camera position the same and unchanged?
2. Are rgb_keypoints and ir_keypoints used to align the fabric work areas in different spaces? I did not understand the crop_keypoints parameter; could you explain it?

rlee3359 commented 1 month ago

Yes, the data collection was performed with the same camera setup. The camera was mounted above the workspace pointing down at the table. In principle, however, I don't think it needs to be collected with the exact same setup, as long as the camera is pointing downward, because all the data collection and training is performed in image space, and this is only translated to real-world points when the robot is involved.

The keypoints are hard-coded points at the corners of the square workspace. They are used to crop the image to the workspace and warp it to a perfect square.

zcswdt commented 1 month ago

> Yes, the data collection was performed with the same camera setup. The camera was mounted above the workspace pointing down at the table. In principle, however, I don't think it needs to be collected with the exact same setup, as long as the camera is pointing downward, because all the data collection and training is performed in image space, and this is only translated to real-world points when the robot is involved.
>
> The keypoints are hard-coded points at the corners of the square workspace. They are used to crop the image to the workspace and warp it to a perfect square.

Thank you very much for your guidance, I have roughly understood. Your work is really great and it plays a key role in my learning. Thank you!