leopard-ai / betty

Betty: an automatic differentiation library for generalized meta-learning and multilevel optimization
https://leopard-ai.github.io/betty/
Apache License 2.0
329 stars · 27 forks

How to control the times of optimizer.step() for different level? #14

Closed 1292224662 closed 1 week ago

1292224662 commented 1 year ago

Hello, I'm not an expert on MLO. If I understand correctly, in one iteration the level-2 module and the level-1 module are each updated once (that is, optimizer.step() is called once for each)?

My question is how I can control this. For example, I want level-2 to call optimizer.step() 100 times and then have level-1 call optimizer.step() once. Thanks!

sangkeun00 commented 8 months ago

Sorry for the late response! Unfortunately, we only support running *more* optimizer.step() calls at the lower level (not the upper level). You can control this with unroll_steps in Config. For example, if you set unroll_steps=100 for level 1, level 1 will perform optimizer.step() 100 times before level 2 performs optimizer.step() once. Sorry for the inconvenience. Let me know if you have further questions!
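To make the schedule concrete, here is a plain-Python sketch of the execution order implied by unroll_steps=100 for level 1. This is not Betty's engine code, just an illustration of when each level's optimizer.step() fires; the function name and event strings are hypothetical.

```python
# Illustrative sketch of the unrolled update schedule: with
# unroll_steps=100 on level 1, level 1 steps 100 times before
# level 2 steps once. NOT Betty's actual engine implementation.

def run_schedule(total_level1_steps, unroll_steps):
    """Return the ordered list of step events across both levels."""
    events = []
    level1_steps = 0
    while level1_steps < total_level1_steps:
        for _ in range(unroll_steps):    # inner (level-1) updates
            events.append("level1.step")
            level1_steps += 1
        events.append("level2.step")     # one outer (level-2) update
    return events

events = run_schedule(total_level1_steps=300, unroll_steps=100)
print(events.count("level1.step"))  # 300
print(events.count("level2.step"))  # 3
```

So over 300 level-1 steps with unroll_steps=100, level 2 is updated only 3 times, matching the behavior described above.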

Best, Sang