vqdang / hover_net

Simultaneous Nuclear Instance Segmentation and Classification in H&E Histology Images.

Hi, I am getting a list index out of range error during training #264

Open Noirombre opened 1 year ago

Noirombre commented 1 year ago

Hi, I am getting a list index out of range error during training. (Attached screenshot: 微信图片_20231010174430)

Noirombre commented 1 year ago

----------------EPOCH 1
Processing: |          | 0/49 [00:00<?, ?it/s] Batch = nan | EMA = nan
Traceback (most recent call last):
  File "run_train.py", line 305, in <module>
    trainer.run()
  File "run_train.py", line 288, in run
    self.run_once(
  File "run_train.py", line 265, in run_once
    main_runner.run(opt["nr_epochs"])
  File "E:\hover\hover_net-master\run_utils\engine.py", line 172, in run
    for data_batch in self.dataloader:
  File "E:\Anaconda\lib\site-packages\torch\utils\data\dataloader.py", line 633, in __next__
    data = self._next_data()
  File "E:\Anaconda\lib\site-packages\torch\utils\data\dataloader.py", line 1345, in _next_data
    return self._process_data(data)
  File "E:\Anaconda\lib\site-packages\torch\utils\data\dataloader.py", line 1371, in _process_data
    data.reraise()
  File "E:\Anaconda\lib\site-packages\torch\_utils.py", line 644, in reraise
    raise exception
IndexError: Caught IndexError in DataLoader worker process 0.
Original Traceback (most recent call last):
  File "E:\Anaconda\lib\site-packages\torch\utils\data\_utils\worker.py", line 308, in _worker_loop
    data = fetcher.fetch(index)
  File "E:\Anaconda\lib\site-packages\torch\utils\data\_utils\fetch.py", line 51, in fetch
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "E:\Anaconda\lib\site-packages\torch\utils\data\_utils\fetch.py", line 51, in <listcomp>
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "E:\hover\hover_net-master\dataloader\train_loader.py", line 98, in __getitem__
    type_map = (ann[..., 1]).copy()
IndexError: index 1 is out of bounds for axis 2 with size 1

simongraham commented 5 months ago

Looks like your data has not been prepared correctly. The loader is trying to grab the second annotation channel for the type map, but your annotation array only has one channel, so the type map apparently isn't there.
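
For reference, the failing line (`type_map = ann[..., 1]`) means the annotation part of your training patches contains only one channel (the instance map) where two (instance map + type map) are expected when training with nuclear type classification. A minimal sanity check you could run on the extracted .npy patches (a sketch only; the glob pattern below is a placeholder for your own extract_patches.py output directory):

```python
import glob

import numpy as np

# Sketch of a sanity check for prepared training patches.
# Replace the glob pattern with the directory extract_patches.py wrote to.
patch_paths = glob.glob("dataset/training_data/*/540x540_164x164/*.npy")

for path in patch_paths[:5]:
    patch = np.load(path)
    # Expected layout when training with type classification:
    #   channels 0-2 : RGB image
    #   channel  3   : instance map
    #   channel  4   : type map (the channel the loader failed to find)
    print(path, patch.shape, "annotation channels:", patch.shape[-1] - 3)
```

If the last dimension prints as 4 rather than 5, the type map was never written into the patches. Regenerate them so the annotation stacks both maps (e.g. something along the lines of `np.dstack([inst_map, type_map])` in your dataset's annotation loading code) before training with classification enabled.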