Closed: 1Konny closed this issue 10 months ago
@1Konny Thanks for your interest. SAMA is already supported in Betty.
If your goal is to reproduce the results in Making Scalable Meta Learning Practical, you can use the following Config setup:
from betty.configs import Config
# if your base-level optimizer is Adam or its variants
myconfig = Config(type="sama", sama_multitask=True)
# if your base-level optimizer is SGD. Even though the name is 'darts',
# our implementation follows SAMA rather than the original DARTS. We will
# fix this naming in the future.
myconfig = Config(type="darts", darts_multitask=True)
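To make the rule above explicit, here is a small sketch of the decision logic as a standalone helper. The helper name `sama_config_kwargs` is hypothetical (not part of Betty's API); it only encodes the choice described in this comment, returning the keyword arguments you would pass to `betty.configs.Config` depending on your base-level optimizer family.

```python
def sama_config_kwargs(optimizer_name: str) -> dict:
    """Hypothetical helper: pick SAMA-related Config kwargs for Betty.

    Per the maintainer's comment: Adam-family optimizers use
    type="sama", while SGD uses type="darts" (whose implementation
    actually follows SAMA, despite the name).
    """
    if optimizer_name.lower() in ("adam", "adamw"):
        return {"type": "sama", "sama_multitask": True}
    # SGD (and other non-Adam optimizers, by assumption) fall through here
    return {"type": "darts", "darts_multitask": True}


# Example usage (the kwargs would be unpacked into Config(**kwargs)):
print(sama_config_kwargs("AdamW"))  # → {'type': 'sama', 'sama_multitask': True}
print(sama_config_kwargs("SGD"))    # → {'type': 'darts', 'darts_multitask': True}
```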
Let me know if you have any additional questions.
Best, Sang
I appreciate your quick reply.
Oh, I should have double-checked that. I'll give it a try. Thanks!
It would be really nice if Betty supported SAMA as well! Thanks!