Closed Yang-L1 closed 4 years ago
Hi, at line 119 in train.py:

train_op = optimizer.minimize(loss, global_step=batch)

Should this `loss` be `loss[0]`? The value returned by `get_loss()` is a tuple. By the way, was the pretrained model trained with a Huber loss or an L2 loss?
`loss` here is a scalar tensor. The pretrained model was trained with L2 loss.
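For illustration, here is a minimal, framework-free sketch of the pitfall the question raises. The return shape of `get_loss()` below is hypothetical (not the repo's actual signature): if a loss function returned a tuple such as `(total_loss, aux)`, only the scalar first element should be handed to the optimizer.

```python
def get_loss():
    # Hypothetical return shape: (scalar total loss, auxiliary terms).
    return 0.5, {"aux_term": 0.1}

result = get_loss()

# Unpack the scalar loss if the function returns a tuple;
# if it already returns a scalar, use it directly.
loss = result[0] if isinstance(result, tuple) else result

print(loss)  # 0.5
```

In this repo the question does not arise because `loss` is already a scalar tensor, so passing it straight to `optimizer.minimize` is correct.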