luyiyun / NormAE

Batch effects removal method based on deep autoencoder and adversarial learning
MIT License

RuntimeError: Assertion `cur_target >= 0 && cur_target < n_classes' #6

Open · oligegilo opened 1 year ago

oligegilo commented 1 year ago

Hi, I wanted to try your tool to normalize a large set of HPLC data acquired over multiple batches, but I ran into the error message shown below. I also tested it on a small subset, and the error consistently occurs right when training reaches iteration 1000/1710. Do you have an idea what the issue could be?

Traceback (most recent call last):
  File "/PATHTO/software/NormAE-release/main.py", line 83, in <module>
    main()
  File "/PATHTO/software/NormAE-release/main.py", line 49, in main
    best_models, hist, early_stop_objs = trainer.fit(datas)
  File "/PATHTO/software/NormAE-release/train.py", line 111, in fit
    self._forward_discriminate(batch_x, batch_y)
  File "/PATHTO/software/NormAE-release/train.py", line 380, in _forward_discriminate
    batch_y[:, 1].long())
  File "/PATHTOCONDA/anaconda3/envs/NormAE/lib/python3.6/site-packages/torch/nn/modules/module.py", line 547, in __call__
    result = self.forward(*input, **kwargs)
  File "/PATHTOCONDA/anaconda3/envs/NormAE/lib/python3.6/site-packages/torch/nn/modules/loss.py", line 916, in forward
    ignore_index=self.ignore_index, reduction=self.reduction)
  File "/PATHTOCONDA/anaconda3/envs/NormAE/lib/python3.6/site-packages/torch/nn/functional.py", line 1995, in cross_entropy
    return nll_loss(log_softmax(input, 1), target, weight, None, ignore_index, None, reduction)
  File "/PATHTOCONDA/anaconda3/envs/NormAE/lib/python3.6/site-packages/torch/nn/functional.py", line 1824, in nll_loss
    ret = torch._C._nn.nll_loss(input, target, weight, _Reduction.get_enum(reduction), ignore_index)
RuntimeError: Assertion `cur_target >= 0 && cur_target < n_classes' failed. at /tmp/pip-req-build-p5q91txh/aten/src/THNN/generic/ClassNLLCriterion.c:94
disc_pretrain: 58%|███████████████████████████████████████████████▎ | 1000/1710 [7:50:53<5:34:19, 28.25s/it]
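For context (not from the NormAE authors): this assertion comes from CrossEntropyLoss/NLLLoss and means that one of the target labels handed to the loss is negative or greater than or equal to the number of output classes of the discriminator. In this traceback the target is batch_y[:, 1].long(), which appears to carry the batch labels, so a likely cause is batch labels that are not encoded as consecutive integers starting at 0 (e.g. batches numbered 1..k, or a batch id missing from the training split). Below is a minimal sketch of the failure mode and a possible check of the sample-information table; the file name and the "batch" column name are assumptions for illustration, not NormAE's actual input format.

```python
import torch
import torch.nn as nn
import pandas as pd

# --- How the assertion is triggered (illustration only) ---------------------
n_classes = 3                             # discriminator output size, e.g. number of batches
logits = torch.randn(4, n_classes)        # fake discriminator outputs
ok_targets = torch.tensor([0, 1, 2, 1])   # valid: all targets in [0, n_classes)
bad_targets = torch.tensor([1, 2, 3, 3])  # 3 >= n_classes -> out-of-range target error
criterion = nn.CrossEntropyLoss()
criterion(logits, ok_targets)             # fine
# criterion(logits, bad_targets)          # raises (the assertion above on older PyTorch builds)

# --- Possible check of the input labels (names are hypothetical) ------------
info = pd.read_csv("sample_information.csv")   # table with one row per injection
print(sorted(info["batch"].unique()))          # labels like [1, 2, ..., k] would overflow a k-class head

# Remap arbitrary batch labels to consecutive codes 0..k-1 before training
info["batch"] = pd.factorize(info["batch"])[0]
info.to_csv("sample_information_fixed.csv", index=False)
```

On recent PyTorch versions the same condition surfaces as "IndexError: Target N is out of bounds", which makes it easier to see which label value is the offending one.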