SingleZombie / DL-Demos

Demos for deep learning

Optimizer invocation question #18

Closed — fanfanimage closed this issue 1 year ago

fanfanimage commented 1 year ago

```python
for mini_batch_X, mini_batch_Y in mini_batch_XYs:
    mini_batch_Y_hat = model.forward(mini_batch_X)
    model.backward(mini_batch_Y)
    optimizer.zero_grad()
    optimizer.add_grad(model.get_grad_dict())
    optimizer.step()
```

Hello,

As I understand it, the parameters (w, b) used by `model.forward()` are stored in a list, while the parameters updated by `optimizer.step()` are stored in a dictionary.

My question: inside the loop, how is `model.forward()` able to use the parameters that are stored in the dictionary?

Thank you!

SingleZombie commented 1 year ago

The issue has been resolved on Zhihu. Thanks for sharing your questions.
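For readers who land here without the Zhihu discussion: a likely explanation is reference sharing. In Python, a list and a dict can both hold references to the same underlying arrays, so an in-place update made through the dict is visible through the list. The sketch below illustrates this mechanism with hypothetical names (`param_list`, `param_dict`); it is an assumption about why the code works, not the repository's actual implementation.

```python
import numpy as np

# Hypothetical minimal setup: the same ndarray objects are stored
# both in a list (as forward() might use) and in a dict (as the
# optimizer might use).
w = np.array([1.0, 2.0])
b = np.array([0.5])

param_list = [w, b]             # container the forward pass reads from
param_dict = {'w': w, 'b': b}   # container the optimizer updates

# An in-place update through the dict (as an optimizer step would do)...
param_dict['w'] -= 0.1 * np.array([1.0, 1.0])

# ...is visible through the list, because both containers hold
# references to the same ndarray objects.
print(param_list[0])  # [0.9 1.9]
```

The key detail is that the update must be in place (`-=` on the ndarray); rebinding the dict entry to a new array (`param_dict['w'] = new_w`) would break the link between the two containers.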