Setup

Directory Structure

This is the directory structure of my project after cloning the repo and following the instructions in the README on downloading the zip files and pre-trained model weights and where to put them.

Environment

I have set up the virtual environment with the packages specified in the README.

Error

Running the Code

I changed the configuration file as given below.
Then I ran the training code and got the following error:
$ python train.py
INFO - PANet - Running command 'main'
INFO - PANet - Started run with ID "5"
INFO - main - ###### Create model ######
/home/moonlab/sattwik/code/PANet/models/vgg.py:66: FutureWarning: You are using `torch.load` with `weights_only=False` (the current default value), which uses the default pickle module implicitly. It is possible to construct malicious pickle data which will execute arbitrary code during unpickling (See https://github.com/pytorch/pytorch/blob/main/SECURITY.md#untrusted-models for more details). In a future release, the default value for `weights_only` will be flipped to `True`. This limits the functions that could be executed during unpickling. Arbitrary objects will no longer be allowed to be loaded via this mode unless they are explicitly allowlisted by the user via `torch.serialization.add_safe_globals`. We recommend you start setting `weights_only=True` for any use case where you don't have full control of the loaded file. Please open an issue on GitHub for any issues related to this experimental feature.
dic = torch.load(self.pretrained_path, map_location='cpu')
INFO - main - ###### Load data ######
INFO - main - ###### Set optimizer ######
INFO - main - ###### Training ######
ERROR - PANet - Failed after 0:00:01!
Traceback (most recent calls WITHOUT Sacred internals):
File "/home/moonlab/sattwik/code/PANet/train.py", line 89, in main
for i_iter, sample_batched in enumerate(trainloader):
File "/home/moonlab/sattwik/code/PANet/.venv/lib/python3.12/site-packages/torch/utils/data/dataloader.py", line 630, in __next__
data = self._next_data()
^^^^^^^^^^^^^^^^^
File "/home/moonlab/sattwik/code/PANet/.venv/lib/python3.12/site-packages/torch/utils/data/dataloader.py", line 1344, in _next_data
return self._process_data(data)
^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/moonlab/sattwik/code/PANet/.venv/lib/python3.12/site-packages/torch/utils/data/dataloader.py", line 1370, in _process_data
data.reraise()
File "/home/moonlab/sattwik/code/PANet/.venv/lib/python3.12/site-packages/torch/_utils.py", line 706, in reraise
raise exception
FileNotFoundError: Caught FileNotFoundError in DataLoader worker process 0.
Original Traceback (most recent call last):
File "/home/moonlab/sattwik/code/PANet/.venv/lib/python3.12/site-packages/torch/utils/data/_utils/worker.py", line 309, in _worker_loop
data = fetcher.fetch(index) # type: ignore[possibly-undefined]
^^^^^^^^^^^^^^^^^^^^
File "/home/moonlab/sattwik/code/PANet/.venv/lib/python3.12/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch
data = [self.dataset[idx] for idx in possibly_batched_index]
~~~~~~~~~~~~^^^^^
File "/home/moonlab/sattwik/code/PANet/dataloaders/common.py", line 165, in __getitem__
sample = [self.datasets[dataset_idx][data_idx]
~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^
File "/home/moonlab/sattwik/code/PANet/dataloaders/common.py", line 197, in __getitem__
return self.dataset[self.indices[idx]]
~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^
File "/home/moonlab/sattwik/code/PANet/dataloaders/pascal.py", line 48, in __getitem__
image = Image.open(os.path.join(self._image_dir, f'{id_}.jpg'))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/moonlab/sattwik/code/PANet/.venv/lib/python3.12/site-packages/PIL/Image.py", line 3431, in open
fp = builtins.open(filename, "rb")
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
FileNotFoundError: [Errno 2] No such file or directory: '/home/moonlab/sattwik/code/PANet/VOCdevkit/VOC2012/JPEGImages/2008_008522.jpg'
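
As a sanity check, the lookup that fails in dataloaders/pascal.py (Image.open on self._image_dir joined with the sample id) can be reproduced outside the DataLoader worker. This is just a quick sketch with the paths copied from the error message above, not code from the repo:

```python
import os

# Path the dataloader tried to open, copied from the error message above
image_dir = '/home/moonlab/sattwik/code/PANet/VOCdevkit/VOC2012/JPEGImages'
missing_id = '2008_008522'
image_path = os.path.join(image_dir, f'{missing_id}.jpg')

print('JPEGImages directory exists:', os.path.isdir(image_dir))
print('Missing image exists:', os.path.isfile(image_path))

# If the directory exists, peek at what is actually inside it
if os.path.isdir(image_dir):
    print('First few entries:', sorted(os.listdir(image_dir))[:5])
```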
Guess
My guess is that the dataset in the zip file is not in the format expected by the dataloader: the failure happens in dataloaders/pascal.py, which looks for images under VOCdevkit/VOC2012/JPEGImages/ and cannot find them.
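
For comparison, the path in the error implies the standard PASCAL VOC 2012 layout under the project root. This is my assumption of what the dataloader expects (the README may also require additional folders beyond these), so a quick check would be:

```python
import os

# Layout implied by the path in the error; my assumption of what the
# dataloader expects (the README may require additional folders as well)
voc_root = '/home/moonlab/sattwik/code/PANet/VOCdevkit/VOC2012'
expected_subdirs = [
    'JPEGImages',
    'Annotations',
    'ImageSets/Segmentation',
    'SegmentationClass',
]

for sub in expected_subdirs:
    path = os.path.join(voc_root, sub)
    print(f'{sub}: {"present" if os.path.isdir(path) else "missing"}')
```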
Please help. Thank you.