leopard-ai / betty

Betty: an automatic differentiation library for generalized meta-learning and multilevel optimization
https://leopard-ai.github.io/betty/
Apache License 2.0

Any plan to support SAMA? #20

Closed 1Konny closed 10 months ago

1Konny commented 10 months ago

It would be really nice if Betty supported SAMA as well! Thanks!

sangkeun00 commented 10 months ago

@1Konny Thanks for your interest. SAMA is already supported in Betty.

If your goal is to reproduce results in Making Scalable Meta Learning Practical, you may use the following Config setup.

from betty.configs import Config

# If your base-level optimizer is Adam or one of its variants:
myconfig = Config(type="sama", sama_multitask=True)

# If your base-level optimizer is SGD. Even though the type name is "darts",
# our implementation follows SAMA rather than the original DARTS; we will
# fix this naming in the future.
myconfig = Config(type="darts", darts_multitask=True)
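For context, here is a minimal end-to-end sketch of where that Config plugs in, following the usual Betty two-problem (bilevel) setup with a toy loss-reweighting objective. The problem classes, module sizes, data, and hyperparameters below are illustrative placeholders, not the setup used in the paper.

import torch
import torch.nn.functional as F
from torch.utils.data import DataLoader, TensorDataset

from betty.engine import Engine
from betty.problems import ImplicitProblem
from betty.configs import Config, EngineConfig

# Toy regression data standing in for real train/valid loaders.
x, y = torch.randn(256, 10), torch.randn(256, 1)
train_loader = DataLoader(TensorDataset(x, y), batch_size=32, shuffle=True)
valid_loader = DataLoader(TensorDataset(x, y), batch_size=32, shuffle=True)

class Inner(ImplicitProblem):
    # Base-level problem: trains the model with per-example weights
    # produced by the meta-level module (accessed by its problem name).
    def training_step(self, batch):
        inputs, targets = batch
        per_example = F.mse_loss(self.module(inputs), targets, reduction="none")
        weights = torch.sigmoid(self.outer.module(inputs))
        return (weights * per_example).mean()

class Outer(ImplicitProblem):
    # Meta-level problem: evaluates the trained base model on held-out data.
    def training_step(self, batch):
        inputs, targets = batch
        return F.mse_loss(self.inner.module(inputs), targets)

inner_module = torch.nn.Linear(10, 1)
outer_module = torch.nn.Linear(10, 1)

# The base-level optimizer here is SGD, so SAMA is enabled via type="darts"
# as noted above; with Adam you would use type="sama" instead.
inner_config = Config(type="darts", darts_multitask=True)
outer_config = Config()  # defaults are enough for this sketch

inner = Inner(
    name="inner",
    module=inner_module,
    optimizer=torch.optim.SGD(inner_module.parameters(), lr=0.1),
    train_data_loader=train_loader,
    config=inner_config,
)
outer = Outer(
    name="outer",
    module=outer_module,
    optimizer=torch.optim.Adam(outer_module.parameters(), lr=1e-3),
    train_data_loader=valid_loader,
    config=outer_config,
)

engine = Engine(
    config=EngineConfig(train_iters=200),
    problems=[outer, inner],
    dependencies={"u2l": {outer: [inner]}, "l2u": {inner: [outer]}},
)
engine.run()

The key point is simply that the SAMA-related Config goes on the base-level (inner) problem, while the engine and dependency setup stay the same as in any other Betty program.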

Let me know if you have any additional questions.

Best, Sang

1Konny commented 10 months ago

I appreciate your quick reply.

Oh, I should have double-checked that. I'll give it a try. Thanks!