open-mmlab / mmsegmentation

OpenMMLab Semantic Segmentation Toolbox and Benchmark.
https://mmsegmentation.readthedocs.io/en/main/
Apache License 2.0

I can't train on my Custom dataset #3218

Open CEwoudi opened 1 year ago

CEwoudi commented 1 year ago

My custom dataset is composed of:

CEwoudi commented 1 year ago

Traceback (most recent call last):
  File "C:\Users\ewoud\Documents\Semester_11\SemEIT\mmsegmentation\tools\train.py", line 104, in <module>
    main()
  File "C:\Users\ewoud\Documents\Semester_11\SemEIT\mmsegmentation\tools\train.py", line 100, in main
    runner.train()
  File "C:\Users\ewoud\Documents\Semester_11\SemEIT\mmsegmentation\myenv\lib\site-packages\mmengine\runner\runner.py", line 1735, in train
    model = self.train_loop.run()  # type: ignore
  File "C:\Users\ewoud\Documents\Semester_11\SemEIT\mmsegmentation\myenv\lib\site-packages\mmengine\runner\loops.py", line 278, in run
    self.run_iter(data_batch)
  File "C:\Users\ewoud\Documents\Semester_11\SemEIT\mmsegmentation\myenv\lib\site-packages\mmengine\runner\loops.py", line 301, in run_iter
    outputs = self.runner.model.train_step(
  File "C:\Users\ewoud\Documents\Semester_11\SemEIT\mmsegmentation\myenv\lib\site-packages\mmengine\model\base_model\base_model.py", line 114, in train_step
    losses = self._run_forward(data, mode='loss')  # type: ignore
  File "C:\Users\ewoud\Documents\Semester_11\SemEIT\mmsegmentation\myenv\lib\site-packages\mmengine\model\base_model\base_model.py", line 340, in _run_forward
    results = self(data, mode=mode)
  File "C:\Users\ewoud\Documents\Semester_11\SemEIT\mmsegmentation\myenv\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "c:\users\ewoud\documents\semester_11\semeit\mmsegmentation\mmseg\models\segmentors\base.py", line 94, in forward
    return self.loss(inputs, data_samples)
  File "c:\users\ewoud\documents\semester_11\semeit\mmsegmentation\mmseg\models\segmentors\encoder_decoder.py", line 176, in loss
    loss_decode = self._decode_head_forward_train(x, data_samples)
  File "c:\users\ewoud\documents\semester_11\semeit\mmsegmentation\mmseg\models\segmentors\encoder_decoder.py", line 137, in _decode_head_forward_train
    loss_decode = self.decode_head.loss(inputs, data_samples,
  File "c:\users\ewoud\documents\semester_11\semeit\mmsegmentation\mmseg\models\decode_heads\decode_head.py", line 262, in loss
    losses = self.loss_by_feat(seg_logits, batch_data_samples)
  File "c:\users\ewoud\documents\semester_11\semeit\mmsegmentation\mmseg\models\decode_heads\decode_head.py", line 324, in loss_by_feat
    loss[loss_decode.loss_name] = loss_decode(
  File "C:\Users\ewoud\Documents\Semester_11\SemEIT\mmsegmentation\myenv\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "c:\users\ewoud\documents\semester_11\semeit\mmsegmentation\mmseg\models\losses\cross_entropy_loss.py", line 271, in forward
    loss_cls = self.loss_weight * self.cls_criterion(
  File "c:\users\ewoud\documents\semester_11\semeit\mmsegmentation\mmseg\models\losses\cross_entropy_loss.py", line 45, in cross_entropy
    loss = F.cross_entropy(
  File "C:\Users\ewoud\Documents\Semester_11\SemEIT\mmsegmentation\myenv\lib\site-packages\torch\nn\functional.py", line 3029, in cross_entropy
    return torch._C._nn.cross_entropy_loss(input, target, weight, _Reduction.get_enum(reduction), ignore_index, label_smoothing)
IndexError: Target 127 is out of bounds.
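The IndexError means F.cross_entropy received a label value (127) that is greater than or equal to the number of classes configured on the decode head, so the annotation masks most likely store raw grayscale intensities (e.g. 0/127/255) instead of class indices 0..num_classes-1. Below is a minimal sketch for inspecting the masks and remapping them; the annotation directory, class count, and value-to-index mapping are assumptions and need to be adapted to the actual dataset.

    import numpy as np
    from PIL import Image
    from pathlib import Path

    # Hypothetical values -- adjust to the real dataset and config.
    ANN_DIR = Path('data/my_dataset/ann_dir/train')   # assumed annotation directory
    NUM_CLASSES = 3                                    # assumed; must match decode_head.num_classes
    VALUE_TO_INDEX = {0: 0, 127: 1, 255: 2}            # assumed raw grayscale value -> class index

    # Assumes single-channel (grayscale) PNG masks.
    for png in sorted(ANN_DIR.glob('*.png')):
        mask = np.array(Image.open(png))
        values = np.unique(mask)
        print(png.name, 'unique label values:', values)
        if values.max() >= NUM_CLASSES:
            # Remap raw grayscale values to contiguous class indices and overwrite the mask.
            remapped = np.zeros_like(mask, dtype=np.uint8)
            for raw, idx in VALUE_TO_INDEX.items():
                remapped[mask == raw] = idx
            Image.fromarray(remapped).save(png)

After remapping, every pixel value should lie in [0, num_classes - 1]; if some value (commonly 255) is meant to be ignored rather than remapped, leaving it as-is and relying on the loss's ignore_index setting is an alternative.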