iurada / px-ntk-pruning

Official repository of our work "Finding Lottery Tickets in Vision Models via Data-driven Spectral Foresight Pruning" accepted at CVPR 2024
https://iurada.github.io/PX

KeyError: 'root' #2

Closed EnergeticChubby closed 4 months ago

EnergeticChubby commented 4 months ago

[screenshot of the KeyError traceback]

I could not find the CONFIG.dataset_args['root'].

Additionally, I tried to run this project in a new environment, but after installing the requirements it failed with the error below.

[screenshot of the error]

Thanks!

iurada commented 4 months ago


Hi!

Did you run the code by editing the example launch scripts provided in the ./launch_scripts/ folder of the repository?

CONFIG.dataset_args['root'] should point to the folder where you downloaded the datasets, or where you want them to be downloaded and stored automatically.
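As a rough sketch of what goes wrong (the key name 'root' comes from the error itself; everything else here is illustrative, not the repository's actual code), the failing lookup is an ordinary dict access, so the fix is simply making sure the launch script passes the path:

```python
import os

# Illustrative stand-in for CONFIG.dataset_args; in the repo the real
# object is built from the launch-script arguments.
dataset_args = {}                    # launch script not edited: 'root' is missing

try:
    root = dataset_args['root']
except KeyError as e:
    print('missing key:', e)         # missing key: 'root'

dataset_args['root'] = './data'      # what editing the launch script provides
root = dataset_args['root']
os.makedirs(root, exist_ok=True)     # torchvision-style datasets can then download here
print('datasets stored under', root)
```

In other words, the KeyError is not a bug in the code, just a sign that the dataset root was never supplied.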

EnergeticChubby commented 4 months ago


Thanks for your help! I forgot to modify the file in the ./launch_scripts folder. Best wishes!

EnergeticChubby commented 4 months ago

    PS C:*\px-ntk-pruning-main> ./launch_scripts/cifar10.bat
    Downloading https://www.cs.toronto.edu/~kriz/cifar-10-python.tar.gz to data/CIFAR10\cifar-10-python.tar.gz
    100%|██████████| 170498071/170498071 [05:27<00:00, 520776.98it/s]
    Extracting data/CIFAR10\cifar-10-python.tar.gz to data/CIFAR10
    Files already downloaded and verified
    Traceback (most recent call last):
      File "C:*\px-ntk-pruning-main\main.py", line 58, in <module>
        main()
      File "C:*\px-ntk-pruning-main\main.py", line 28, in main
        experiment = Experiment()
      File "C:*\px-ntk-pruning-main\train_classification.py", line 77, in __init__
        self.pruner.score(self.model, self.loss_fn, self.data['train'], CONFIG.device)
      File "C:*\px-ntk-pruning-main\px-ntk-pruning-main\lib\pruners.py", line 454, in score
        for batch_idx, data_tuple in enumerate(dataloader):
      File "C:\Users*\AppData\Roaming\Python\Python311\site-packages\torch\utils\data\dataloader.py", line 433, in __iter__
        self._iterator = self._get_iterator()
      File "C:\Users*\AppData\Roaming\Python\Python311\site-packages\torch\utils\data\dataloader.py", line 386, in _get_iterator
        return _MultiProcessingDataLoaderIter(self)
      File "C:\Users*\AppData\Roaming\Python\Python311\site-packages\torch\utils\data\dataloader.py", line 1039, in __init__
        w.start()
      File "C:\ProgramData\anaconda3\Lib\multiprocessing\process.py", line 121, in start
        self._popen = self._Popen(self)
      File "C:\ProgramData\anaconda3\Lib\multiprocessing\context.py", line 224, in _Popen
        return _default_context.get_context().Process._Popen(process_obj)
      File "C:\ProgramData\anaconda3\Lib\multiprocessing\context.py", line 336, in _Popen
        return Popen(process_obj)
      File "C:\ProgramData\anaconda3\Lib\multiprocessing\popen_spawn_win32.py", line 94, in __init__
        reduction.dump(process_obj, to_child)
      File "C:\ProgramData\anaconda3\Lib\multiprocessing\reduction.py", line 60, in dump
        ForkingPickler(file, protocol).dump(obj)
    AttributeError: Can't pickle local object 'SeededDataLoader.__init__.<locals>.seed_worker'
    Traceback (most recent call last):
      File "<string>", line 1, in <module>
      File "C:\ProgramData\anaconda3\Lib\multiprocessing\spawn.py", line 122, in spawn_main
        exitcode = _main(fd, parent_sentinel)
      File "C:\ProgramData\anaconda3\Lib\multiprocessing\spawn.py", line 132, in _main
        self = reduction.pickle.load(from_parent)
    EOFError: Ran out of input


I tried to run it on both my Windows and Linux servers. Unfortunately, it failed with the same error on both. I deeply hope you can help me solve the problem.

iurada commented 4 months ago

Oh, I see. It's a known issue in PyTorch, and it has to do with how the operating system implements inter-process communication. A short-term fix is to either set --num_workers=0 in the config, or modify line 45 of the datasets/utils.py file as:

         worker_init_fn = None     # previously it was set to seed_worker
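For context, here is a minimal sketch of why the error occurs (illustrative only, not the repository's actual code): when DataLoader workers are started with the spawn method, the worker_init_fn is pickled and sent to each worker process, and a function defined locally inside a method cannot be pickled, while a module-level one can:

```python
import pickle

class SeededDataLoaderSketch:
    """Hypothetical stand-in for the repo's SeededDataLoader."""
    def __init__(self):
        def seed_worker(worker_id):      # defined inside __init__: a "local object"
            pass
        self.worker_init_fn = seed_worker

loader = SeededDataLoaderSketch()
try:
    # This pickling step is what spawn-based multiprocessing does under the hood.
    pickle.dumps(loader.worker_init_fn)
except Exception:
    print('cannot pickle local seed_worker')

# Module-level functions are pickled by qualified name, so this succeeds:
def seed_worker(worker_id):
    pass

pickle.dumps(seed_worker)
print('module-level seed_worker pickles fine')
```

So besides the two workarounds above, moving seed_worker to module level would also resolve the pickling failure while keeping multiple workers.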
EnergeticChubby commented 4 months ago


It works! I appreciate your patience! :)