dbaranchuk / memory-efficient-maml

Memory efficient MAML using gradient checkpointing
MIT License

Clip gradient for the inner loop of MAML? #3

Closed · liutianlin0121 closed 3 years ago

liutianlin0121 commented 3 years ago

Hi!

Thanks for this awesome resource! Is there a neat way to clip the gradient norm of the inner loop (the fast-adaptation steps) of MAML? Currently we can pass max_grad_grad_norm to the meta-model to control the gradient norm of the outer loop. I am wondering whether there is a similarly neat way to clip the gradient norm in the inner loop, i.e., the gradients used during fast adaptation. In particular, can this be applied to IngraphRMSProp?
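
For reference, here is the kind of clipping I have in mind for the outer loop, sketched in plain PyTorch rather than through this repo's max_grad_grad_norm option. The standard utility rescales the .grad buffers in place after backward, which works for the meta-update but cannot be differentiated through, and that is exactly the problem for the fast-adaptation steps.

```python
# Ordinary (outer-loop style) gradient clipping in plain PyTorch.
# This is NOT this repo's max_grad_grad_norm mechanism; it is shown only
# to contrast with what the fast-adaptation steps would need.
import torch
import torch.nn as nn
from torch.nn.utils import clip_grad_norm_

model = nn.Linear(5, 1)                        # stand-in for the meta-learner
loss = model(torch.randn(8, 5)).pow(2).mean()
loss.backward()                                # materializes p.grad

# Rescales every p.grad in place so the global norm is at most 1.0 and
# returns the pre-clipping norm. Because .grad is edited in place, the
# clipping is invisible to autograd and cannot be backpropagated through.
total_norm = clip_grad_norm_(model.parameters(), max_norm=1.0)
```

The inner loop is different: its gradients are taken with create_graph=True so that the meta-gradient can flow through them, so any clipping there would have to be expressed as ordinary tensor operations inside the graph.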

Thanks so much! ;-)

Cheers, Tianlin

liutianlin0121 commented 3 years ago

I figured out that we can do this type of clipping in get_updated_model. Thanks anyway :-)
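
In case it helps anyone who lands here later, this is a minimal sketch of the idea, written against plain torch.autograd.grad rather than get_updated_model itself; the SGD-style update, the learning rate, and the helper name below are my own illustrative assumptions, not this repo's API. The point is that the norm is rescaled with ordinary tensor operations, so the clipped update stays inside the autograd graph and the meta-gradient can still flow through it.

```python
# Minimal sketch of in-graph gradient-norm clipping for a MAML-style inner
# step. clipped_inner_step is a hypothetical helper, not part of this repo.
import torch

def clipped_inner_step(loss, params, lr=0.01, max_norm=1.0):
    # create_graph=True keeps the gradient computation differentiable,
    # which the outer loop needs for its second-order terms.
    grads = torch.autograd.grad(loss, params, create_graph=True)
    # Global L2 norm over all fast-weight gradients.
    total_norm = torch.sqrt(sum((g ** 2).sum() for g in grads))
    # Scale factor capped at 1, built from tensor ops rather than in-place edits.
    scale = (max_norm / (total_norm + 1e-6)).clamp(max=1.0)
    # One SGD-style fast-adaptation step using the clipped gradients.
    return [p - lr * scale * g for p, g in zip(params, grads)]

# Toy usage with a plain tensor standing in for the fast weights.
w = torch.randn(5, 1, requires_grad=True)
x, y = torch.randn(8, 5), torch.randn(8, 1)
loss = ((x @ w - y) ** 2).mean()
(w_fast,) = clipped_inner_step(loss, [w], lr=0.1, max_norm=1.0)
```

The same rescaling of grads should in principle slot in before handing them to an in-graph optimizer such as IngraphRMSProp, since it only touches the gradient tensors themselves.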