I keep getting this error every time I try to train the model. The loss values abruptly become NaN at a seemingly random iteration, and training stops with:
FloatingPointError: Loss became infinite or NaN at iteration=166!
loss_dict = {'loss_cls_ce': nan, 'loss_box_reg': nan, 'loss_ins_con': 0.0, 'loss_cls_up': nan}
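To narrow it down, I was thinking of adding a guard around the loss dict so I can see which terms go non-finite before the trainer aborts. This is just a minimal standalone sketch, not my actual training code; `check_losses` is a hypothetical helper and the sample values simply mirror the traceback above:

```python
import math

def check_losses(loss_dict, iteration):
    """Return the names of loss terms that are NaN or infinite."""
    bad = [name for name, value in loss_dict.items()
           if not math.isfinite(float(value))]
    if bad:
        print(f"Non-finite losses at iteration={iteration}: {bad}")
    return bad

# Values shaped like the ones in the error message:
loss_dict = {"loss_cls_ce": float("nan"),
             "loss_box_reg": float("nan"),
             "loss_ins_con": 0.0,
             "loss_cls_up": float("nan")}
check_losses(loss_dict, iteration=166)

# To find the op that first produced the NaN, anomaly detection could
# also be enabled (at a training-speed cost):
# torch.autograd.set_detect_anomaly(True)
```

Would this kind of check help pinpoint the cause, or is there a more standard way to debug this?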