Can you post the full output of your terminal (including the command you used) as text (not an image)?
Thanks for your reply.
The command I used is "python validation.py dataset=gen1 dataset.path=validata/gen1 checkpoint=ckp/rvt-s.ckpt use_test_set=1 hardware.gpus=0 +experiment/gen1=small.yaml batch_size.eval=8 model.postprocess.confidence_threshold=0.001"
The output of this command is:
"Using 16bit native Automatic Mixed Precision (AMP)
Trainer already configured with model summary callbacks: [<class 'pytorch_lightning.callbacks.model_summary.ModelSummary'>]. Skipping setting a default ModelSummary callback.
GPU available: True (cuda), used: True
TPU available: False, using: 0 TPU cores
IPU available: False, using: 0 IPUs
HPU available: False, using: 0 HPUs
Error executing job with overrides: ['dataset=gen1', 'dataset.path=validata/gen1', 'checkpoint=ckp/rvt-s.ckpt', 'use_test_set=1', 'hardware.gpus=0', '+experiment/gen1=small.yaml', 'batch_size.eval=8', 'model.postprocess.confidence_threshold=0.001']
Traceback (most recent call last):
File "/root/RVT/validation.py", line 84, in main
trainer.test(model=module, datamodule=data_module, ckpt_path=str(ckpt_path))
File "/root/miniconda3/envs/rvt/lib/python3.9/site-packages/pytorch_lightning/trainer/trainer.py", line 780, in test
return call._call_and_handle_interrupt(
File "/root/miniconda3/envs/rvt/lib/python3.9/site-packages/pytorch_lightning/trainer/call.py", line 38, in _call_and_handle_interrupt
return trainer_fn(*args, **kwargs)
File "/root/miniconda3/envs/rvt/lib/python3.9/site-packages/pytorch_lightning/trainer/trainer.py", line 829, in _test_impl
results = self._run(model, ckpt_path=self.ckpt_path)
File "/root/miniconda3/envs/rvt/lib/python3.9/site-packages/pytorch_lightning/trainer/trainer.py", line 1037, in _run
self._call_setup_hook() # allow user to setup lightning_module in accelerator environment
File "/root/miniconda3/envs/rvt/lib/python3.9/site-packages/pytorch_lightning/trainer/trainer.py", line 1284, in _call_setup_hook
self._call_lightning_datamodule_hook("setup", stage=fn)
File "/root/miniconda3/envs/rvt/lib/python3.9/site-packages/pytorch_lightning/trainer/trainer.py", line 1361, in _call_lightning_datamodule_hook
return fn(*args, **kwargs)
File "/root/RVT/modules/data/genx.py", line 167, in setup
self.test_dataset = self.build_eval_dataset(dataset_mode=DatasetMode.TESTING,
File "/root/RVT/data/genx_utils/dataset_streaming.py", line 25, in build_streaming_dataset
assert split_path.is_dir()
AssertionError
Set the environment variable HYDRA_FULL_ERROR=1 for a complete stack trace.
== Timing statistics =="
This line is raising an exception: https://github.com/uzh-rpg/RVT/blob/b80f5683a6e2d5de65d4bde8105d796ccb50dbb1/data/genx_utils/dataset_streaming.py#L25
This means that the dataset split was not found: most likely you have either set the wrong path to the dataset, or your dataset is missing a split. Can you check what split_path is? Can you also make sure you have the full dataset at the right location?
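For reference, here is a minimal sketch for checking this, assuming the preprocessed dataset root passed via dataset.path is expected to contain one subdirectory per split (the names train, val, and test used below are an assumption; adjust them to your layout):

```python
# Minimal sketch (assumed train/val/test layout): check that the directory
# passed via dataset.path contains the per-split folders that
# build_streaming_dataset asserts on with split_path.is_dir().
from pathlib import Path
import sys

dataset_root = Path(sys.argv[1] if len(sys.argv) > 1 else "validata/gen1")

if not dataset_root.is_dir():
    print(f"Dataset root does not exist: {dataset_root.resolve()}")
else:
    for split in ("train", "val", "test"):  # assumed split names
        split_path = dataset_root / split
        print(f"{split_path}: {'ok' if split_path.is_dir() else 'MISSING'}")
```

If any split prints MISSING, either the dataset.path override points at the wrong directory or the preprocessed dataset at that location is incomplete.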
Thanks for your reply. I changed the path as you said, and validation.py now runs successfully. Thanks very much!
Hello @magehrig, sorry to bother you. When I use the command "python validation.py dataset=gen1 ...", it returns the error "Error executing job with overrides: ['dataset=gen1', ...]". How should I solve this problem? Thanks for looking at my issue.