Controlling the arm joints with keyboard keys requires keeping the terminal window focused and is not convenient. I changed this to use PyUserInput to grab mouse and keyboard input globally, so we no longer need to focus on the terminal window, and the arm joints are easier to control.
I've also added an optional action wrapper. Human demonstrations are not in the same space as the raw joint actions; with this wrapper, we can transform actions into the same space as the demonstrations. Also, the primitive joints are sometimes not what we want, e.g., they have redundant dimensions or offsets, such as extra wheel joints, or the two finger joints of a gripper, which can be combined into a single action dimension. In these cases, the wrapper can also be used to simplify the actions.
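As a rough sketch of what such a wrapper can do (the class name, method, and dimension counts below are hypothetical illustrations, not the PR's actual implementation), combining the two gripper finger joints into one action dimension and dropping the uncontrolled wheel joints might look like:

```python
import numpy as np

class ActionWrapper:
    """Hypothetical sketch: maps a simplified policy/demo action space
    back to the raw joint command space of the robot."""

    def __init__(self, num_wheel_joints=4):
        # Wheel joints are redundant for arm tasks; the policy never sees them.
        self.num_wheel_joints = num_wheel_joints

    def to_raw(self, action):
        """action = [arm_0 .. arm_4, gripper] -> raw joint command."""
        arm, gripper = action[:-1], action[-1]
        # One gripper dimension drives both finger joints symmetrically.
        fingers = np.array([gripper, gripper])
        # Wheels are held at zero rather than exposed as action dimensions.
        wheels = np.zeros(self.num_wheel_joints)
        return np.concatenate([wheels, arm, fingers])

wrapper = ActionWrapper()
raw = wrapper.to_raw(np.array([0.1, 0.2, 0.3, 0.4, 0.5, 0.03]))
# 6-dim simplified action expands to 4 wheels + 5 arm joints + 2 fingers = 11 dims
print(raw.shape)
```

The same idea works in reverse for demonstrations: raw recorded joints can be projected down into the simplified space before training.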
This PR also includes some changes to the youbot model, such as adding a camera and improving the joint ranges and initial positions.
Changes:
Arm joint demo generation with the mouse
Use PyUserInput to grab mouse and keyboard input so that we don't need to keep the terminal window focused
Add an optional action wrapper
Youbot model adjustments:
Add a camera;
Move the base of the robot arm back a little to make space for the camera;
Better joint ranges and initial positions: the original model's joint ranges always start from 0, e.g., (0, 2*pi); but the commonly used range (where the gripper is in front of the robot) varies, and might be something like 150-210 degrees, or 0-30 and 330-360 degrees. That is inconvenient for both human control and the RL agent: the policy network has to learn the offset, and sometimes has to overcome the discontinuous jump between 6.28 and 0, which caused some weird behavior when training the agent. I've centered the commonly used range at 0 and made the joint range continuous at the center using the action wrapper.
Camera position:
Now we can easily generate demonstrations using the keyboard keys "W", "A", "S", "D", "E" and the mouse while watching the 128x128 image. This can be used to demonstrate many versatile teacher tasks.
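The joint-range centering described above can be sketched as a simple angle remap (a minimal sketch with a hypothetical `center_joint_angle` helper, not the wrapper's actual code): shift the raw angle so the commonly used pose maps to 0, then wrap into (-pi, pi] so there is no discontinuous jump at the center of the useful range.

```python
import numpy as np

def center_joint_angle(raw_angle, center):
    """Shift a raw joint angle (in [0, 2*pi)) so that `center` maps to 0,
    then wrap into (-pi, pi] so the useful range is continuous around 0."""
    return (raw_angle - center + np.pi) % (2 * np.pi) - np.pi

# Example: a joint whose useful range is 0-30 and 330-360 degrees (center = 0).
# In the raw parameterization, 350 deg and 10 deg sit on opposite sides of the
# 6.28/0 jump; after centering they become -10 deg and +10 deg, continuous.
print(np.rad2deg(center_joint_angle(np.deg2rad(350), 0.0)))  # about -10
print(np.rad2deg(center_joint_angle(np.deg2rad(10), 0.0)))   # about +10
```

The inverse map (adding `center` back and wrapping into [0, 2*pi)) recovers the raw joint command when sending actions to the model.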