Open yuanyaaa opened 1 year ago
Find `reacher_7dof-v0.yml`, then copy the `base_action : 'null'` entry into the `mppi` section, like:

```yaml
mppi:
  horizon           : 16
  init_cov          : 1.0
  filter_coeffs     : [0.25, 0.8, 0.0]
  gamma             : 1.0
  n_iters           : 1
  step_size         : 1.0
  lam               : 0.2
  alpha             : 1
  num_cpu           : 8
  particles_per_cpu : 4
  base_action       : 'null'
```

Then it will work.
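To see why the missing key produces exactly this traceback, here is a minimal sketch. The `MPPI` class and its parameter list below are assumptions modeled on the config above, not the library's actual code: when a controller's `__init__` declares `base_action` as a required argument and the config dict loaded from the YAML lacks it, unpacking the dict with `**` raises the reported `TypeError`.

```python
# Hypothetical sketch (class name and signature assumed, not the
# library's real API) showing why a config without base_action fails.

class MPPI:
    def __init__(self, horizon, init_cov, filter_coeffs, gamma, n_iters,
                 step_size, lam, alpha, num_cpu, particles_per_cpu,
                 base_action):
        # base_action has no default, so it must come from the config.
        self.base_action = base_action

# Config as parsed from the YAML, without the base_action entry.
config = {
    'horizon': 16, 'init_cov': 1.0, 'filter_coeffs': [0.25, 0.8, 0.0],
    'gamma': 1.0, 'n_iters': 1, 'step_size': 1.0, 'lam': 0.2,
    'alpha': 1, 'num_cpu': 8, 'particles_per_cpu': 4,
}

try:
    MPPI(**config)  # no base_action -> reproduces the reported TypeError
except TypeError as e:
    print(e)

config['base_action'] = 'null'
controller = MPPI(**config)  # succeeds once the key is added
```

Adding `base_action : 'null'` to the `mppi` section of the YAML is the config-side version of the last two lines.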
Hello. I have tried all branches, but none of them run correctly from the tutorials. How can I run it? I get `__init__() missing 1 required positional argument: 'base_action'` for random_shooting with reacher_7dof, and `ValueError: zero-dimensional arrays cannot be concatenated` for mppi. The errors differ for each MPC. How can I fix this? Thank you very much.