What causes train/PackedNLLLoss=-1.640?
Describe the bug
What causes train/PackedNLLLoss=-1.640? At the beginning the loss was positive, so why did it turn negative after a number of iterations? As far as I know, the negative log-likelihood is essentially the same as cross-entropy, which is non-negative. What causes this? Is it a data problem?
Epoch 2:  | | 40/? [01:06<00:00, 0.60it/s, v_num=2, train/PackedNLLLoss=0.790]
Epoch 48: | | 1/?  [01:06<00:00, 0.60it/s, v_num=2, train/PackedNLLLoss=-1.640]
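For reference, this is the non-negativity I have in mind. It is only a minimal sketch of my understanding using plain PyTorch, not the actual PackedNLLLoss implementation, and the logits/target values are made up:

```python
import torch
import torch.nn.functional as F

# My understanding: for a discrete target, the NLL is the same as
# cross-entropy, and since every predicted probability p is at most 1,
# -log(p) can never drop below zero.
logits = torch.tensor([[2.0, 0.5, -1.0]])  # hypothetical model output for 3 classes
target = torch.tensor([0])                 # hypothetical ground-truth class index
nll = F.cross_entropy(logits, target)      # equals -log softmax(logits)[target]
print(nll.item())                          # >= 0 for any logits/target
```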