Xilinx / logicnets


unable to open input dataset file #26

Closed · tsp6 closed this 1 year ago

tsp6 commented 1 year ago

When I try to train the jet_substructure example, I run into an error reading the dataset.

tsp@06835eb13924:/workspace/logicnets/examples/jet_substructure$ python train.py --arch jsc-s --log-dir ./jsc_s/
Traceback (most recent call last):
  File "train.py", line 292, in <module>
    dataset['train'] = JetSubstructureDataset(dataset_cfg['dataset_file'], dataset_cfg['dataset_config'], split="train")
  File "/workspace/logicnets/examples/jet_substructure/dataset.py", line 35, in __init__
    with h5py.File(input_file, 'r') as h5py_file:
  File "/home/tsp/.local/miniconda3/lib/python3.8/site-packages/h5py/_hl/files.py", line 533, in __init__
    fid = make_fid(name, mode, userblock_size, fapl, fcpl, swmr=swmr)
  File "/home/tsp/.local/miniconda3/lib/python3.8/site-packages/h5py/_hl/files.py", line 226, in make_fid
    fid = h5f.open(name, flags, fapl=fapl)
  File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
  File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
  File "h5py/h5f.pyx", line 106, in h5py.h5f.open
OSError: Unable to open file (file signature not found)
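For context, "file signature not found" is h5py reporting that the bytes at that path are not a valid HDF5 file: every HDF5 file begins with the 8-byte signature `\x89HDF\r\n\x1a\n`. A minimal check, assuming the dataset path from the traceback above:

```python
import h5py

# Path taken from the traceback; run from examples/jet_substructure.
path = "data/processed-pythia82-lhc13-all-pt1-50k-r1_h022_e0175_t220_nonu_truth.z"

# A valid HDF5 file starts with b'\x89HDF\r\n\x1a\n'.
with open(path, "rb") as f:
    print(f.read(8))

# h5py also ships a helper that performs the same signature check.
print(h5py.is_hdf5(path))
```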

Can you help me solve this issue?

tsp6 commented 1 year ago

Here is a more detailed version of the error:

FileNotFoundError: [Errno 2] Unable to open file (unable to open file: name = 'data/processed-pythia82-lhc13-all-pt1-50k-r1_h022_e0175_t220_nonu_truth.z', errno = 2, error message = 'No such file or directory', flags = 0, o_flags = 0)

nickfraser commented 1 year ago

It sounds like you haven't downloaded the dataset. Or, if you have, can you tell me the output of these commands?
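For example, a minimal Python sketch to report what is actually at the dataset path (path and working directory are taken from the traceback above):

```python
import os

# Dataset path from the traceback; run this from the
# examples/jet_substructure directory.
path = "data/processed-pythia82-lhc13-all-pt1-50k-r1_h022_e0175_t220_nonu_truth.z"

print("exists:", os.path.exists(path))
if os.path.exists(path):
    print("size in bytes:", os.stat(path).st_size)
```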

tsp6 commented 1 year ago

Hello @nickfraser,

I had downloaded the dataset; here is the output of the download commands:

tsp@06835eb13924:/workspace/logicnets/examples/jet_substructure$ mkdir -p data
tsp@06835eb13924:/workspace/logicnets/examples/jet_substructure$ wget https://cernbox.cern.ch/index.php/s/jvFd5MoWhGs1l5v/download -O data/processed-pythia82-lhc13-all-pt1-50k-r1_h022_e0175_t220_nonu_truth.z
--2022-11-11 08:59:31--  https://cernbox.cern.ch/index.php/s/jvFd5MoWhGs1l5v/download
Resolving cernbox.cern.ch (cernbox.cern.ch)... 128.142.53.35, 137.138.120.151, 128.142.53.28, ...
Connecting to cernbox.cern.ch (cernbox.cern.ch)|128.142.53.35|:443... connected.
HTTP request sent, awaiting response... 301 Moved Permanently
Location: https://cernbox.cern.ch/s/jvFd5MoWhGs1l5v/download [following]
--2022-11-11 08:59:31--  https://cernbox.cern.ch/s/jvFd5MoWhGs1l5v/download
Reusing existing connection to cernbox.cern.ch:443.
HTTP request sent, awaiting response... 200 OK
Length: 3648 (3.6K) [text/html]
Saving to: ‘data/processed-pythia82-lhc13-all-pt1-50k-r1_h022_e0175_t220_nonu_truth.z’

data/processed-pythia82-lh 100%[=====================================>] 3.56K --.-KB/s in 0s

2022-11-11 08:59:31 (12.9 MB/s) - ‘data/processed-pythia82-lhc13-all-pt1-50k-r1_h022_e0175_t220_nonu_truth.z’ saved [3648/3648]
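Note what the transcript above shows: wget saved only 3,648 bytes with a text/html content type. That looks like an HTML page returned by CERNBox rather than the HDF5 dataset itself, which would also explain h5py's "file signature not found" error. A minimal sketch to confirm what was actually saved:

```python
# If the saved file begins with HTML markup instead of the HDF5
# signature b'\x89HDF\r\n\x1a\n', the URL returned a web page,
# not the dataset.
path = "data/processed-pythia82-lhc13-all-pt1-50k-r1_h022_e0175_t220_nonu_truth.z"
with open(path, "rb") as f:
    print(f.read(64))
```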

tsp6 commented 1 year ago

Here is what I got when I tried to debug the error:

Exception has occurred: FileNotFoundError
[Errno 2] Unable to open file (unable to open file: name = 'data/processed-pythia82-lhc13-all-pt1-50k-r1_h022_e0175_t220_nonu_truth.z', errno = 2, error message = 'No such file or directory', flags = 0, o_flags = 0)
  File "/workspace/logicnets/examples/jet_substructure/dataset.py", line 35, in __init__
    with h5py.File(input_file, 'r') as h5py_file:
  File "/workspace/logicnets/examples/jet_substructure/train.py", line 292, in <module>
    dataset['train'] = JetSubstructureDataset(dataset_cfg['dataset_file'], dataset_cfg['dataset_config'], split="train")
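One possible reason for the errno 2 here (an assumption about the debug setup): the dataset path in the error is relative, so it resolves against the current working directory; a debugger launched from a different directory would fail to find the file even if it exists next to train.py. A minimal sketch to check:

```python
import os

# The relative path resolves against the current working directory,
# so print both to see where Python is actually looking.
path = "data/processed-pythia82-lhc13-all-pt1-50k-r1_h022_e0175_t220_nonu_truth.z"
print(os.getcwd())
print(os.path.abspath(path))
```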

nickfraser commented 1 year ago

Based on your new issue, I assume you were able to resolve this.

tsp6 commented 1 year ago

Hello @nickfraser,

I got this error with the jet_substructure example and it has not been solved yet. My new issue is about the cybersecurity example; neither of them is working. Can you help me solve this unable-to-open-dataset error in the jet_substructure example?

Thank you.