Closed: dingli-dean closed this issue 6 years ago
In the meta-update step, the gradient of the loss, evaluated at the adapted parameters θ', is taken with respect to the original parameters θ. So the meta-update adjusts θ such that, after θ is adapted to a task, the task loss is minimized.
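A minimal scalar sketch of this may help (everything here is illustrative, not the paper's code: the quadratic task loss, the learning rates `alpha`/`beta`, and the task targets `c` are all assumptions chosen so the chain-rule term can be written out by hand):

```python
def loss(theta, c):
    # Hypothetical per-task loss: L_i(theta) = (theta - c_i)^2
    return (theta - c) ** 2

def grad(theta, c):
    # dL_i/dtheta = 2 * (theta - c)
    return 2.0 * (theta - c)

alpha, beta = 0.1, 0.05      # inner-loop and meta learning rates (assumed values)
theta = 0.0                  # meta-parameters (the "original" theta)
tasks = [1.0, -2.0, 3.0]     # toy task targets c_i

for _ in range(200):
    meta_grad = 0.0
    for c in tasks:
        # Step 7: adapt to the task with one inner gradient step
        theta_prime = theta - alpha * grad(theta, c)
        # Step 8: gradient of the loss at theta' w.r.t. the ORIGINAL theta.
        # By the chain rule: dL(theta')/dtheta = L'(theta') * dtheta'/dtheta,
        # and here dtheta'/dtheta = 1 - alpha * L''(theta) = 1 - 2*alpha.
        meta_grad += grad(theta_prime, c) * (1.0 - 2.0 * alpha)
    # Meta-update: move the original theta, averaged over tasks
    theta -= beta * meta_grad / len(tasks)
```

Note that step 7 never overwrites θ; θ' is a temporary, task-specific adaptation, and only the meta-update in step 8 actually changes θ. That is why the meta-gradient differentiates *through* the inner update (the `1 - 2*alpha` factor above), which in the general case requires second derivatives.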
Please reserve GitHub issues for bugs, feature requests, etc. You can ask questions about machine learning on sites such as Cross Validated or Reddit. Questions about PyTorch should be asked on the PyTorch Forum.
Thank you very much.
In Algorithm 2 of this paper, we compute the adapted parameters θ' in step 7, which I believe already updates the model for task T_i. What does the meta-update in step 8 mean?