Hi! Could you elaborate on the issue at hand?
When you train with GT data and test on GT data, is it still jittery?
It's much less jittery when trained with GT and tested on GT data (I got a stable result after about 600 training epochs). However, when I trained with noisy mocap data (with foot sliding), the output slides less but is more jittery. I supposed the jitter might be caused by the noisy reference motion, so I tried using the GT data as the reference motion for the reward calculation. Unfortunately, the jitter still exists. Could you please give some suggestions about this problem? Is there any effective way to reduce the jitter when working with noisy mocap data?
I see. If the problem is just jitter, maybe apply some filtering to the data? You can also try some projection layer to project the jittery motion onto a more plausible manifold (like a denoising autoencoder).
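For the filtering route, here is a minimal sketch (not part of UHC) assuming the mocap sequence is stored as a `(frames, dims)` array of pose parameters; the function name, window size, and polynomial order are hypothetical starting points to tune:

```python
# Minimal temporal-smoothing sketch: low-pass filter each pose dimension
# over time with a Savitzky-Golay filter to knock out frame-to-frame jitter.
import numpy as np
from scipy.signal import savgol_filter

def smooth_pose_sequence(pose_seq: np.ndarray,
                         window: int = 11,
                         polyorder: int = 3) -> np.ndarray:
    """Smooth a (num_frames, num_dims) pose sequence along the time axis."""
    assert pose_seq.ndim == 2, "expected (num_frames, num_dims)"
    # axis=0 filters along time; mode='interp' avoids edge artifacts.
    # Note: naively filtering axis-angle rotations is only safe for small
    # jitter; for large noise, filter in a continuous rotation representation.
    return savgol_filter(pose_seq, window_length=window,
                         polyorder=polyorder, axis=0, mode='interp')

# Example: 300 frames of a 72-dim (flattened SMPL-style) body pose.
noisy = np.random.randn(300, 72) * 0.05
smoothed = smooth_pose_sequence(noisy)
```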
I think the denoising autoencoder may be a possible solution. By the way, I read your latest work PHC, in which you took some approaches to improve robustness to noisy input, such as adding an energy penalty and removing the reference pose and meta-PD control from the action computation. To what extent could these methods solve the jitter problem? Is it worth trying these tricks in UHC?
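In case it's useful, this is roughly what I had in mind for the denoising direction: a small sketch (hypothetical, not from UHC or PHC) of an MLP trained to map a noisy pose frame back to its aligned GT frame.

```python
# Denoising-autoencoder sketch: learn a correction from noisy mocap poses
# to the aligned clean (GT) poses, then apply it before feeding UHC.
import torch
import torch.nn as nn

class PoseDenoiser(nn.Module):
    def __init__(self, pose_dim: int = 72, hidden: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(pose_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, pose_dim),
        )

    def forward(self, noisy_pose: torch.Tensor) -> torch.Tensor:
        # Predict a residual correction rather than the full pose.
        return noisy_pose + self.net(noisy_pose)

# One training step on a placeholder batch of (noisy, clean) pose pairs.
model = PoseDenoiser()
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
noisy, clean = torch.randn(32, 72), torch.randn(32, 72)
loss = nn.functional.mse_loss(model(noisy), clean)
opt.zero_grad(); loss.backward(); opt.step()
```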
Thanks for taking an interest in PHC!
Adding the energy penalty is to resolve the problem where the humanoid resorts to high-frequency jitter to stay balanced; this happens even when using GT poses as training sequences and is pretty apparent when not using RFC. PHC also does not use the residual action (the action is not computed with respect to the reference pose). This is for better fail-state recovery, since once the humanoid has failed, the reference motion no longer provides any useful information.
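As a rough illustration of the energy-penalty idea (the exact form and weight used in PHC may differ from this sketch), one common variant penalizes the mechanical power of the actuators each simulation step:

```python
# Hedged sketch of an energy-style penalty: discourage high-frequency,
# high-torque corrections by penalizing |torque * joint velocity|.
import numpy as np

def energy_penalty(joint_torques: np.ndarray,
                   joint_velocities: np.ndarray,
                   weight: float = 5e-4) -> float:
    """Return a negative reward term proportional to actuation power."""
    power = np.abs(joint_torques * joint_velocities).sum()
    return -weight * power

# Added to the per-step imitation reward, e.g. (names are placeholders):
# reward = w_im * imitation_reward + energy_penalty(tau, qvel_actuated)
```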
That being said, none of them really aims at solving the problem of noisy input. The fail-state recovery part focuses on that, but it tries to recover from noisy input rather than denoise it. I think the energy penalty is probably worth trying, to encourage the humanoid not to follow the jittery motion too closely.
Hope it helps!
I see, thanks a lot for your detailed answer!
Hi Zhengyi, thanks for your excellent work. I'm trying to employ UHC to remove the foot sliding and floating in mocap sequence data. I noticed your answer https://github.com/ZhengyiLuo/UniversalHumanoidControl/issues/6#issuecomment-1503432638, which gives some insightful analysis of a similar issue. In my opinion, it may not be appropriate to use the noisy mocap sequence as the reference motion, since it is not the motion we want the robot to imitate. Therefore, I used the GT poses of the H36M dataset as the reference and the noisy mocap poses as input (the poses were aligned), and trained a policy based on 'uhc_implicit_shape' with the default settings. However, the movement is still jittery, particularly in the walking sequences. I'm wondering whether this is a reasonable approach; do you have any idea how to make the robot move smoothly? Thanks a lot.