I encountered a ValueError when using the transmil MIL method together with the histossl feature extractor. The error is raised when calling the train_mil function.
The relevant code snippet is:
train_mil(
# ...
)
The error traceback is as follows:
Traceback (most recent call last):
File "myhome/itona/sf-test/mil.py", line 59, in <module>
train_mil(
File "myhome/micromamba/envs/slideflow/lib/python3.9/site-packages/slideflow/mil/train/__init__.py", line 80, in train_mil
return _train_mil(config, **mil_kwargs)
File "myhome/micromamba/envs/slideflow/lib/python3.9/site-packages/slideflow/mil/train/__init__.py", line 157, in _train_mil
return train_fn(
File "myhome/micromamba/envs/slideflow/lib/python3.9/site-packages/slideflow/mil/train/__init__.py", line 728, in train_fastai
learner, (n_in, n_out) = build_fastai_learner(
File "myhome/micromamba/envs/slideflow/lib/python3.9/site-packages/slideflow/mil/train/__init__.py", line 550, in build_fastai_learner
learner, (n_in, n_out) = _fastai.build_learner(
File "myhome/micromamba/envs/slideflow/lib/python3.9/site-packages/slideflow/mil/train/_fastai.py", line 80, in build_learner
return _build_fastai_learner(config, *args, **kwargs)
File "myhome/micromamba/envs/slideflow/lib/python3.9/site-packages/slideflow/mil/train/_fastai.py", line 261, in _build_fastai_learner
batch = train_dl.one_batch()
File "myhome/micromamba/envs/slideflow/lib/python3.9/site-packages/fastai/data/load.py", line 171, in one_batch
if self.n is not None and len(self)==0: raise ValueError(f'This DataLoader does not contain any batches')
ValueError: This DataLoader does not contain any batches
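For context on the error itself: fastai raises this ValueError in DataLoader.one_batch whenever the loader's computed length is zero. With drop_last=True (the usual setting for a training loader), a split that holds fewer samples (bags) than the batch size produces zero batches. The helper below is a minimal, dependency-free sketch of that arithmetic; the function name and numbers are mine for illustration, not fastai's or slideflow's:

```python
def n_batches(n_samples: int, batch_size: int, drop_last: bool) -> int:
    """Number of batches a loader yields for a dataset of n_samples."""
    if drop_last:
        # The partial final batch is discarded, so a dataset smaller
        # than batch_size yields zero batches.
        return n_samples // batch_size
    # The partial final batch is kept (ceiling division).
    return -(-n_samples // batch_size)

# E.g. a batch size of 64 with only 30 training bags:
print(n_batches(30, 64, drop_last=True))   # -> 0: "does not contain any batches"
print(n_batches(30, 64, drop_last=False))  # -> 1
```

If this is the cause, lowering the batch size in the MIL configuration (or enlarging the training split) should avoid the empty loader.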
The same code worked correctly with the clam-sb and clam-sm MIL methods; the error occurs specifically with the transmil (attention-based) MIL method.
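Since the CLAM methods train through a different code path than the fastai-based transmil trainer, a first sanity check is whether the exported feature bags are actually being found for the training split. This is a generic, hedged check, not slideflow API: the "bags" directory name below is a placeholder for whatever directory you pass to train_mil, and bag files are assumed to be .pt tensors:

```python
from pathlib import Path

bags_dir = Path("bags")  # placeholder: the bag directory given to train_mil
pt_files = sorted(bags_dir.glob("*.pt"))
print(f"Found {len(pt_files)} bag files in {bags_dir}/")
for f in pt_files[:5]:
    print(" ", f.name)
```

If the count is zero, or smaller than the training batch size, the fastai DataLoader will have no batches to yield even though the CLAM trainers may still run.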