Closed · luyao777 closed this issue 6 years ago
@luyao777 `meta_lr` is the meta learning rate for the meta update (outer update). `update_lr` is the task learning rate used for each task (inner update), and it is itself updated by the meta update (outer update); this is the key idea of Meta-SGD.
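To make the distinction concrete, here is a minimal, dependency-light NumPy sketch of that idea; it is not the repo's TensorFlow code. The names `inner_update` and `meta_step`, the quadratic toy tasks, and the finite-difference outer gradient (used so the sketch needs no autodiff) are all illustrative assumptions:

```python
import numpy as np

def inner_update(theta, alpha, grad):
    # Inner (task) update: alpha plays the role of update_lr and is a
    # learnable per-parameter vector, applied element-wise to the gradient.
    return theta - alpha * grad

def meta_step(theta, alpha, tasks, meta_lr=0.01, eps=1e-4):
    # Outer (meta) update: meta_lr is a plain scalar hyperparameter that
    # adjusts BOTH theta and alpha. Gradients of the meta-objective are
    # approximated here by central finite differences to keep the sketch
    # self-contained; a real implementation would backprop through
    # inner_update instead.
    def meta_loss(th, al):
        total = 0.0
        for loss, grad_fn in tasks:          # each task: (loss fn, grad fn)
            adapted = inner_update(th, al, grad_fn(th))
            total += loss(adapted)           # loss AFTER one inner step
        return total / len(tasks)

    g_theta = np.zeros_like(theta)
    g_alpha = np.zeros_like(alpha)
    for i in range(theta.size):
        e = np.zeros_like(theta)
        e[i] = eps
        g_theta[i] = (meta_loss(theta + e, alpha) - meta_loss(theta - e, alpha)) / (2 * eps)
        g_alpha[i] = (meta_loss(theta, alpha + e) - meta_loss(theta, alpha - e)) / (2 * eps)
    # meta_lr updates the weights AND the per-parameter learning rates.
    return theta - meta_lr * g_theta, alpha - meta_lr * g_alpha
```

The point of the sketch is the asymmetry: `alpha` (i.e. `update_lr`) lives inside the computation graph and is trained, while `meta_lr` never appears in the inner loop and is just the step size of the outer optimizer, so the two should not be the same variable.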
Hi foolyc, I have read your Meta-SGD code recently; it's a good project. But I have some questions about the code and the paper. In the paper, `update_lr` and `meta_lr` seem to be the same item, used in an element-wise product with the network parameters. The line here does not seem to conform to the paper. Should `self.meta_lr` and `self.update_lr` be set to the same `tf.Variable`? https://github.com/foolyc/Meta-SGD/blob/4922a8dab9bf6368654f174b9d3976dc77627012/maml.py#L64