facebookresearch / nevergrad

A Python toolbox for performing gradient-free optimization
https://facebookresearch.github.io/nevergrad/
MIT License

Optimize all parameters in a PyTorch module that require gradients #1602

Open · JackCaster opened this issue 6 months ago

JackCaster commented 6 months ago

I would like to optimize a PyTorch neural network with Nevergrad. How can I tell Nevergrad to optimize all parameters in the module that require gradients (e.g., params_to_optimize = filter(lambda p: p.requires_grad, model.parameters()))?

changdaeoh commented 4 months ago

@JackCaster Hi

It's been a while, so you've probably resolved it on your own, but I found the tutorial below useful.

https://medium.com/@joragasy/optimize-neural-network-with-gradient-free-methods-using-pytorch-and-nevergrad-399a9f4a5c21
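For anyone landing here later, below is a minimal sketch of the general approach that kind of tutorial takes: collect the trainable parameters, mirror them with one flat ng.p.Array, copy each candidate vector into the module, and let a Nevergrad optimizer minimize the resulting loss. The toy model, data, and the set_weights helper are illustrative assumptions, not part of Nevergrad's API:

```python
import nevergrad as ng
import torch
import torch.nn as nn

# Toy model and fixed data -- purely illustrative.
model = nn.Sequential(nn.Linear(2, 8), nn.Tanh(), nn.Linear(8, 1))
x = torch.randn(64, 2)
y_true = x.pow(2).sum(dim=1, keepdim=True)

# Only the parameters that require gradients, as in the question.
params = [p for p in model.parameters() if p.requires_grad]
num_weights = sum(p.numel() for p in params)

def set_weights(flat):
    """Copy a flat candidate vector into the module's trainable tensors."""
    flat = torch.as_tensor(flat, dtype=torch.float32)
    offset = 0
    with torch.no_grad():
        for p in params:
            n = p.numel()
            p.copy_(flat[offset:offset + n].view_as(p))
            offset += n

def loss(flat):
    set_weights(flat)
    with torch.no_grad():  # no backprop needed: the optimizer is gradient-free
        y_pred = model(x)
    return float(((y_pred - y_true) ** 2).mean())

# One flat Array parameter covering every trainable weight.
parametrization = ng.p.Array(shape=(num_weights,))
optimizer = ng.optimizers.NGOpt(parametrization=parametrization, budget=300)
recommendation = optimizer.minimize(loss)
set_weights(recommendation.value)  # load the best weights found back into the model
```

Flattening everything into a single Array keeps the parametrization simple; you could instead build an ng.p.Dict keyed by parameter name if you want per-tensor bounds or initializations.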