-
When running the label smoothing section, I found that the code `crit(x=predict, target=torch.LongTensor([2, 1, 0, 3, 3]))` returns inf.
I think the variable `predict` shouldn't have log applied, because log(0) is -inf, and the …
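For what it's worth, the inf is easy to reproduce in a small standalone sketch (illustrative tensors, not the notebook's actual values): `nn.KLDivLoss` expects log-probabilities as input, and taking `.log()` of a distribution containing exact zeros produces -inf entries, which blow up the loss:

```python
import torch
import torch.nn as nn

# nn.KLDivLoss expects log-probabilities as input. If the predicted
# distribution contains exact zeros, .log() yields -inf at those
# positions and the summed loss becomes inf.
crit = nn.KLDivLoss(reduction="sum")

predict = torch.tensor([[0.0, 0.2, 0.7, 0.1, 0.0]])  # contains zeros
target = torch.tensor([[0.6, 0.1, 0.1, 0.1, 0.1]])   # smoothed, no zeros

loss = crit(predict.log(), target)  # predict.log() contains -inf
print(loss)  # inf
```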
-
**Original article:** Rafael Müller, Simon Kornblith, and Geoffrey E. Hinton. "When does label smoothing help?" Advances in Neural Information Processing Systems 32 (2019). (https://arxiv.org/pdf/190…
-
Some use underscores:
```
[--participant_label PARTICIPANT_LABEL [PARTICIPANT_LABEL ...]]
[--smoothing_fwhm SMOOTHING_FWHM]
```
others use hyphens:
```
[--denoise-strategy DENOISE_STRATEGY]
…
```
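If these parsers are built on argparse (an assumption on my part), it's worth noting that argparse maps hyphens in a long option name to underscores in the resulting attribute, so both spellings end up as the same kind of Python identifier. A minimal sketch, not either tool's actual parser:

```python
import argparse

# Hypothetical parser illustrating argparse's naming behaviour:
# --denoise-strategy is read back as args.denoise_strategy, so
# hyphenated and underscored flag names behave the same internally.
parser = argparse.ArgumentParser()
parser.add_argument("--participant_label", nargs="+")
parser.add_argument("--smoothing_fwhm", type=float)
parser.add_argument("--denoise-strategy")

args = parser.parse_args(
    ["--participant_label", "01",
     "--smoothing_fwhm", "6",
     "--denoise-strategy", "simple"]
)
print(args.participant_label, args.smoothing_fwhm, args.denoise_strategy)
```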
-
I'm trying to replicate the results of the ICASSP 2022 paper "A Lightweight Instrument-Agnostic Model for Polyphonic Note Transcription and Multipitch Estimation".
I'm having some trouble getting the model to…
-
### 🐛 Describe the bug
Setting the `label_smoothing` argument of [nn.CrossEntropyLoss()](https://pytorch.org/docs/stable/generated/torch.nn.CrossEntropyLoss.html) to the positive value `3.` gets the er…
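For context, the linked docs specify `label_smoothing` as a float in [0.0, 1.0] (the fraction of probability mass moved to the uniform distribution), so `3.` lies outside the documented range. A quick sketch of the in-range case with illustrative tensors:

```python
import torch
import torch.nn as nn

# Illustrative logits/targets. label_smoothing is documented as a
# float in [0.0, 1.0]; 0.1 is a typical value.
logits = torch.tensor([[2.0, 0.5, 0.1],
                       [0.2, 1.5, 0.3]])
targets = torch.tensor([0, 1])

plain = nn.CrossEntropyLoss()(logits, targets)
smoothed = nn.CrossEntropyLoss(label_smoothing=0.1)(logits, targets)
print(plain.item(), smoothed.item())
```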
-
I read line 141 in main.py about label smoothing:
```
e2_multi = ((1.0-args.label_smoothing)*e2_multi) + (1.0/e2_multi.size(1))
```
Shouldn't it be the following instead?
```
e2_multi …
```
-
When computing the label smoothing loss, the logit_loss is only multiplied by weight_t and misses the 1/(t+1) factor.
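Both of the snippets above concern the standard smoothed-target formula q = (1 − ε)·one_hot + ε/K, in which the uniform term is scaled by ε so that the targets still sum to 1. A minimal sketch with illustrative values:

```python
import torch

# Standard label-smoothing target (illustrative values):
#   q = (1 - eps) * one_hot + eps / K
# The uniform term is eps/K, not 1/K, so q remains a valid distribution.
eps, K = 0.1, 5
one_hot = torch.zeros(K)
one_hot[2] = 1.0  # true class

q = (1.0 - eps) * one_hot + eps / K
print(q, q.sum())  # q sums to 1.0
```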
-
# Support label smoothing regularization
**Motivation**
Label smoothing has been shown to boost performance in Inception-v2, as well as in the Revisiting ResNets paper.
**Related resources**
PyTo…
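As a starting point, here is a minimal sketch of what such a loss could look like (a hypothetical module, not an existing API in this project); for ε in [0, 1] it should agree with PyTorch's built-in `label_smoothing` option:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LabelSmoothingCrossEntropy(nn.Module):
    """Hypothetical sketch of label-smoothed cross-entropy:
    mixes the NLL of the true class with a uniform term over classes."""

    def __init__(self, eps: float = 0.1):
        super().__init__()
        self.eps = eps

    def forward(self, logits: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
        log_probs = F.log_softmax(logits, dim=-1)
        # NLL of the true class
        nll = -log_probs.gather(dim=-1, index=target.unsqueeze(-1)).squeeze(-1)
        # Uniform term: average negative log-probability over all classes
        uniform = -log_probs.mean(dim=-1)
        return ((1.0 - self.eps) * nll + self.eps * uniform).mean()
```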