Closed NaNBridge closed 11 months ago
This is most likely caused by a wrong dataset link. You could check the dataset path. To ensure the images are loaded correctly, you can print the size of input_data at line 50 in dataset_loader.py to see whether it is a null tensor.
I created a soft link to the PHOENIX14-T dataset as the README said. I don't know if it is correct; in the file system I can access the dataset normally through the soft link. But after printing, I find that the length of input_data is 0 at line 50 in dataset_loader.py.
This means you haven't correctly created the link to the dataset and are reading null data. You may check the path to the dataset.
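To narrow this down, here is a minimal sketch of the kind of check described above. load_frames and the *.png pattern are assumptions for illustration, not the actual code in dataset_loader.py:

```python
import glob
import os

def load_frames(folder):
    # Hypothetical stand-in for the frame-listing step around line 50 of
    # dataset_loader.py: collect all frame images in the folder.
    # os.path.realpath follows the soft link, so a broken link resolves to a
    # path that does not exist and therefore yields zero frames.
    img_list = sorted(glob.glob(os.path.join(os.path.realpath(folder), "*.png")))
    print(f"{folder}: {len(img_list)} frames")  # 0 means the link/path is wrong
    return img_list
```

If this prints 0 for a folder you believe is populated, the soft link does not point where you think it does.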
Sorry, but I don't understand where the soft link went wrong; I followed the README to create it. Under my dataset folder there is a phoenix2014-T folder. Is the soft link wrong here?
You should create the soft link to the downloaded phoenix2014-T dataset. Under this folder there should be four subfolders: "annotations", "evaluation", "features" and "models". You can check your path.
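As a quick sanity check, one could verify that the link resolves and that those four subfolders are present. The path in the example is illustrative; adjust it to your own layout:

```python
import os

EXPECTED = ("annotations", "evaluation", "features", "models")

def missing_subfolders(root):
    """Return the expected PHOENIX14-T subfolders not found under root."""
    real = os.path.realpath(root)  # follow the soft link, if any
    if not os.path.isdir(real):
        return list(EXPECTED)  # broken link: everything is missing
    present = set(os.listdir(real))
    return [d for d in EXPECTED if d not in present]

# Example (adjust the path to your own dataset location):
print(missing_subfolders("./dataset/phoenix2014-T") or "dataset link looks fine")
```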
Hi, I had the exact same error. Through debugging I found that the video paths in /annotations/manual/*.csv did not match the existing paths in /features/fullFrame-210x260px/, because the directory '1' was missing.
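The mismatch can be confirmed programmatically. The sketch below assumes the '|'-delimited corpus CSVs with the sequence name in the first column and frames expected directly under <split>/<name>/; adjust it to your actual layout:

```python
import csv
import glob
import os

def missing_sequences(csv_path, frames_root):
    """List sequence names from an annotation CSV that have no frames on disk.

    Assumes a '|'-delimited PHOENIX14-T corpus CSV whose first column is the
    sequence name, with frames expected under frames_root/<name>/.
    """
    missing = []
    with open(csv_path, newline="") as f:
        reader = csv.reader(f, delimiter="|")
        next(reader)  # skip the header row
        for row in reader:
            name = row[0]
            # No files at all under the sequence folder counts as missing.
            if not glob.glob(os.path.join(frames_root, name, "*")):
                missing.append(name)
    return missing
```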
I moved the files in the features folders into a new "1" subdirectory by executing the following script for the dev, train and test folders:
import os
import shutil

dir_path = "/home/user/Documents/CorrNet/dataset/phoenix2014-T/features/fullFrame-210x260px/dev"
subfolders = [x for x in os.listdir(dir_path) if os.path.isdir(os.path.join(dir_path, x))]
for sub in subfolders:
    if "1" in os.listdir(os.path.join(dir_path, sub)):
        continue  # "1" dir already exists; assume files are in the correct place
    files = [x for x in os.listdir(os.path.join(dir_path, sub)) if os.path.isfile(os.path.join(dir_path, sub, x))]
    goal_dir = os.path.join(dir_path, sub, "1")
    os.makedirs(goal_dir)
    for f in files:
        shutil.move(os.path.join(dir_path, sub, f), goal_dir)
Then I ran /preprocessing/dataset_preprocess-T.py --process-image --multiprocessing again, and afterwards python main.py --device 0 --load-weights /weights/dev_18.90_PHOENIX14-T.pt --phase test worked without exceptions.
Thank you, I solved the problem.
Hi! I tried to run python main.py --device 0 --load-weights /weights/dev_18.90_PHOENIX14-T.pt --phase test, but I got an IndexError. I'm sure my dataset path is correct. The complete error message is as follows:
All errors occur after the dataset has finished loading.