erwincoumans / motion_imitation

Code accompanying the paper "Learning Agile Robotic Locomotion Skills by Imitating Animals"
Apache License 2.0

New robot URDF and robot orientation #23

Closed: edgarcamilocamacho closed this issue 4 years ago

edgarcamilocamacho commented 4 years ago

Hello.

I'm trying to swap in the URDF of another robot, and my idea is to reuse the Laikago code and change the relevant files and descriptions.

I changed the URDF location here, and I made sure all joint names match the original Laikago's (the new robot has 12 DOF too).

The new robot's URDF has a different orientation than the Laikago URDF, so I need to change the initial orientation; I have done that here and here.

I changed this:

[math.pi / 2.0, 0, math.pi / 2.0]

to this:

[0.0, 0.0, 0.0]
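
For reference, here is a minimal sketch of how such an RPY list typically becomes the spawn pose in PyBullet; the variable names, the standing height, and the use of the stock Laikago URDF are illustrative, not the exact code from laikago.py:

```python
import math
import pybullet
import pybullet_data

# Illustrative constants; the real ones live in the robot class (e.g. laikago.py).
INIT_POSITION = [0, 0, 0.48]              # assumed standing height
INIT_ORIENTATION_RPY = [0.0, 0.0, 0.0]    # was [math.pi / 2.0, 0, math.pi / 2.0] for Laikago

pybullet.connect(pybullet.DIRECT)
pybullet.setAdditionalSearchPath(pybullet_data.getDataPath())

# The RPY list is turned into a quaternion and applied at load time.
init_quat = pybullet.getQuaternionFromEuler(INIT_ORIENTATION_RPY)
robot_id = pybullet.loadURDF("laikago/laikago.urdf",  # or the new robot's URDF
                             INIT_POSITION, init_quat)

# Any later reset has to reuse the same quaternion, otherwise the base pose
# silently falls back to whatever the reset code applies:
pybullet.resetBasePositionAndOrientation(robot_id, INIT_POSITION, init_quat)
```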

When PyBullet starts, the robot spawns with the correct orientation (facing the positive X axis):

image

But when training or testing starts, the orientations of the reference and simulated robots change (after this line):

image

I haven't figured out why the initial orientation isn't kept.
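
The only workaround I can think of so far is to re-apply the desired base pose right after the reset. A minimal sketch of that final PyBullet call; how to obtain the client and the robot's body id from the env/robot wrappers is left open here:

```python
def force_base_pose(pybullet_client, body_id, position, rpy):
    """Re-apply a desired base pose to a body right after a reset.

    `pybullet_client` and `body_id` must come from the robot/env wrappers;
    this is only the final PyBullet call, not the place in the code where
    the reset actually overrides the orientation.
    """
    quat = pybullet_client.getQuaternionFromEuler(rpy)
    pybullet_client.resetBasePositionAndOrientation(body_id, position, quat)
```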

nicrusso7 commented 4 years ago

Hi @edgarcamilocamacho, I'm experiencing the same problem using my own URDF. I've created my own robot model (looking at laikago.py) and edited the _reset_ref_motion method in imitation_task.py: there I just set a fixed position/orientation for the reference model.

However, this doesn't work properly: in some steps the reference hovers slightly above the plane, while in others it is totally wrong: 1 2

I guess the reference transformations are done specifically for the Laikago URDF, which comes with a specific initial position as described in the comments.
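
One sanity check that helped me reason about this (plain PyBullet math, nothing taken from imitation_task.py) is printing the rotation matrix of Laikago's initial orientation to see how far its URDF frame is from a plain Z-up, X-forward one:

```python
import math
import numpy as np
import pybullet

pybullet.connect(pybullet.DIRECT)

# Laikago's initial orientation, as quoted above.
quat = pybullet.getQuaternionFromEuler([math.pi / 2.0, 0, math.pi / 2.0])
rot = np.array(pybullet.getMatrixFromQuaternion(quat)).reshape(3, 3)
print(rot.round(3))
# Each column shows where one of the URDF's local axes ends up in the world,
# i.e. the extra rotation the reference transformations seem to assume.
```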

BTW, thanks @erwincoumans for sharing this amazing work; there is always something new to learn following your GitHub/YouTube channels :)

erwincoumans commented 4 years ago

There is indeed some reference-frame change happening for Laikago. We are adding the A1 robot, which doesn't have this issue; once I get around to training that one, you'll have some demo code.

edgarcamilocamacho commented 4 years ago

@erwincoumans thanks very much! I will be waiting for it.

RobertSWagoner commented 4 years ago

The addition of the A1 robot will be welcome. We received our physical A1 robot and have begun setting up ROS with the A1 SDK on the NVIDIA AGX Xavier for new gait training. Is there a collaboration group, list, or forum where we can discuss and share setup and operation information?

Likewise, thank you to @erwincoumans and team for sharing your research and contributions.

erwincoumans commented 4 years ago

The A1 simulation and the interface for the physical robot are in this repository now, thanks to Yuxiang. See this pull request: https://github.com/google-research/motion_imitation/pull/29

I'm not aware of other forums/lists. There is a PyBullet forum, but I'm often too busy to keep up to date with everything online. Feel free to share info in this thread, or start a discussion on Twitter.

RobertSWagoner commented 4 years ago

Thank you Yuxiang for the A1 simulation and physical robot contributions.

Erwin, thank you for the suggestions. I will confer with colleagues on the next steps for starting public collaborations.

erwincoumans commented 4 years ago

Sounds good. Actually, I created the A1 sim; Yuxiang added the A1 interface for the real robot. We have some A1s as well to work with.

RobertSWagoner commented 4 years ago

My apologies, Erwin; your substantial amount of work is greatly appreciated. Do you have any tips on coding with the physical A1 and its SDK?

erwincoumans commented 4 years ago

The third_party/unitree_legged_sdk has C++ and Python bindings for use on the real A1, so with this repo you can drive both the simulated A1 and the real A1 through the same interface: for simulation use a1.py, and for the real A1 use a1_robot.py.

Simulation example: motion_imitation/examples/whole_body_controller_example.py
Real A1 example: motion_imitation/examples/whole_body_controller_robot_example.py
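
Roughly, the switch looks like this; constructor arguments are omitted since they differ per setup, and the module path assumes the robots/ package layout of this repo:

```python
USE_REAL_ROBOT = False  # flip to True on the physical A1 (needs unitree_legged_sdk)

if USE_REAL_ROBOT:
    from motion_imitation.robots import a1_robot as robot_module  # real A1
else:
    from motion_imitation.robots import a1 as robot_module        # PyBullet simulation

# Both modules expose the same robot interface, so controller code written
# against the simulated A1 can be pointed at the hardware with this switch
# plus the robot-specific construction arguments.
```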

Since the A1 uses the regular Z-axis-up convention, you can use it as an example, and we can close this issue.