Closed: Zehui-Lin closed this issue 5 years ago
The Chapter 6 code

    loss.backward()
    optimizer.step()
    running_loss += loss.data[0]
    running_correct += torch.sum(pred == y_train.data)

raises an error. It should be changed to:

    loss.backward()
    optimizer.step()
    running_loss += loss.data
    running_correct += torch.sum(pred == y_train.data)
Hello, thank you for pointing out this problem; the correction has been made.
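For background on why the original line fails: in PyTorch 0.3 and earlier a loss was a 1-element tensor, so `loss.data[0]` extracted the scalar; since PyTorch 0.4 the loss is a 0-dimensional tensor, indexing it with `[0]` raises an `IndexError`, and `loss.item()` is the recommended way to get the Python number (accumulating `loss.data` keeps a tensor instead of a float). A minimal sketch of the accumulation step, assuming PyTorch >= 0.4 is installed; the tensors here are stand-in values, not data from the book:

```python
import torch
import torch.nn.functional as F

# Stand-in prediction and target; in the book these come from the model
# and the training batch.
pred = torch.tensor([0.2, 0.8], requires_grad=True)
target = torch.tensor([0.0, 1.0])

loss = F.mse_loss(pred, target)  # a 0-dim tensor in PyTorch >= 0.4

running_loss = 0.0
# loss.data[0] would raise IndexError here; .item() returns a Python float.
running_loss += loss.item()
```

Using `loss.item()` instead of `loss.data` also avoids silently accumulating a tensor (and, in old autograd-history-tracking code paths, a growing graph) in `running_loss`.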