resibots / blackdrops

Code for the Black-DROPS algorithm: "Black-Box Data-efficient Policy Search for Robotics", IROS 2017/ICRA 2018

What optimizer should be used for the SPGP model? #14

Closed urnotmeeto closed 4 years ago

urnotmeeto commented 4 years ago

I am trying to use the SPGP model, which is implemented in limbo/experimental/model, on the cartpole simulation, but I cannot get the hyper-parameter optimizer to work.

This is how I defined the GP template in cartpole.cpp:

```cpp
using GP_t = limbo::model::MultiGP<Params, limbo::model::SPGP, kernel_t, mean_t,
    limbo::model::multi_gp::ParallelLFOpt<Params,
        blackdrops::model::gp::KernelLFOpt<Params,
            limbo::opt::NLOptGrad<Params, nlopt::LD_LBFGS>>>>;
```

And I got errors like this:

```
../exp/blackdrops/src/classic_control/cartpole.cpp:444:26:   required from here
/home/jingyi/Desktop/blackdrops/deps/limbo/exp/blackdrops/include/blackdrops/model/gp/kernel_lf_opt.hpp:110:48: error: ‘class limbo::model::SPGP<Params, limbo::kernel::SquaredExpARD<Params>, limbo::mean::NullFunction<Params>, limbo::model::gp::NoLFOpt<Params> >’ has no member named ‘matrixL’; did you mean ‘_matrixL’?
                         Eigen::MatrixXd l = gp.matrixL();
                                              ~~~^~~~~~~
                                              _matrixL
/home/jingyi/Desktop/blackdrops/deps/limbo/exp/blackdrops/include/blackdrops/model/gp/kernel_lf_opt.hpp:113:68: error: ‘class limbo::model::SPGP<Params, limbo::kernel::SquaredExpARD<Params>, limbo::mean::NullFunction<Params>, limbo::model::gp::NoLFOpt<Params> >’ has no member named ‘alpha’
                         double a = (gp.obs_mean().transpose() * gp.alpha())
                                                                 ~~~^~~~~
/home/jingyi/Desktop/blackdrops/deps/limbo/exp/blackdrops/include/blackdrops/model/gp/kernel_lf_opt.hpp:143:28: error: ‘class limbo::model::SPGP<Params, limbo::kernel::SquaredExpARD<Params>, limbo::mean::NullFunction<Params>, limbo::model::gp::NoLFOpt<Params> >’ has no member named ‘matrixL’; did you mean ‘_matrixL’?
                         gp.matrixL().template triangularView<Eigen::Lower>().solveInPlace(w);
                         ~~~^~~~~~~
                         _matrixL
/home/jingyi/Desktop/blackdrops/deps/limbo/exp/blackdrops/include/blackdrops/model/gp/kernel_lf_opt.hpp:144:28: error: ‘class limbo::model::SPGP<Params, limbo::kernel::SquaredExpARD<Params>, limbo::mean::NullFunction<Params>, limbo::model::gp::NoLFOpt<Params> >’ has no member named ‘matrixL’; did you mean ‘_matrixL’?
                         gp.matrixL().template triangularView<Eigen::Lower>().transpose().solveInPlace(w);
                         ~~~^~~~~~~
                         _matrixL
/home/jingyi/Desktop/blackdrops/deps/limbo/exp/blackdrops/include/blackdrops/model/gp/kernel_lf_opt.hpp:147:32: error: ‘class limbo::model::SPGP<Params, limbo::kernel::SquaredExpARD<Params>, limbo::mean::NullFunction<Params>, limbo::model::gp::NoLFOpt<Params> >’ has no member named ‘alpha’
                         w = gp.alpha() * gp.alpha().transpose() - w;
                             ~~~^~~~~
/home/jingyi/Desktop/blackdrops/deps/limbo/exp/blackdrops/include/blackdrops/model/gp/kernel_lf_opt.hpp:147:45: error: ‘class limbo::model::SPGP<Params, limbo::kernel::SquaredExpARD<Params>, limbo::mean::NullFunction<Params>, limbo::model::gp::NoLFOpt<Params> >’ has no member named ‘alpha’
                         w = gp.alpha() * gp.alpha().transpose() - w;
```

Could you give me some advice on this issue?
costashatz commented 4 years ago

@urnotmeeto a quick reply to let you know that I have seen your issue; I will reply in detail later this week. Thanks for using our code!

costashatz commented 4 years ago

@urnotmeeto thanks for using the Black-DROPS code. The SPGP model does not have all the fields required by the hyper-parameter optimization pipeline; this is why it still lives in the experimental folder. SPGP automatically optimizes its own hyper-parameters (see here), so just pass a limbo::model::gp::NoLFOpt when creating the SPGP. Moreover, the SPGP model will not work out of the box with MultiGP; as this makes apparent, SPGP is still highly experimental, and I would not use it as is. Also, we have not extensively tested the SPGP model, so it might have bugs and may be suboptimal in computation time (we did observe it taking longer than it should in some cases). You can try the SparsifiedGP instead (see here): this one has been extensively tested and works quite well, although it is a much simpler technique (it just removes points in an "intelligent" way).
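For anyone reading this later, here is a minimal sketch of what the suggested change could look like in cartpole.cpp. It assumes the same `Params`, `kernel_t` and `mean_t` aliases as in the original post; it is untested, and the exact `MultiGP`/`SPGP` template signatures may differ between limbo versions:

```cpp
// Sketch only (untested): drop the external ParallelLFOpt/KernelLFOpt
// hyper-parameter optimizer that calls gp.matrixL()/gp.alpha(), and let SPGP
// run its own internal hyper-parameter optimization by passing NoLFOpt
// (a no-op optimizer) as the MultiGP hyper-parameter optimizer.
using GP_t = limbo::model::MultiGP<Params, limbo::model::SPGP, kernel_t, mean_t,
                                   limbo::model::gp::NoLFOpt<Params>>;
```

This only addresses the compile errors coming from `KernelLFOpt`; as noted above, SPGP is not expected to work out of the box with `MultiGP`, so the SparsifiedGP suggested in the reply is probably the safer route.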

costashatz commented 4 years ago

@urnotmeeto I am closing this issue. Feel free to open another issue if anything else comes up.