l3das / L3DAS22


preprocessing got num_samples=0 #3

Open · yang-china opened this issue 2 years ago

yang-china commented 2 years ago

Hi l3das, I am reproducing the code, but the number of samples obtained during preprocessing is 0. May I ask whether this result is normal? Here are my logs:

(DeepLearning36) D:\sound source localization\L3DAS22-main>python preprocessing.py --task 2 --input_path E://DATASETS//Task2 --num_mics 1 --frame_len 100
Processing E://DATASETS//Task2\L3DAS22_Task2_train folder...
Processing E://DATASETS//Task2\L3DAS22_Task2_dev folder...
Saving files
Matrices successfully saved
Training set shape: (0,) (0,)
Validation set shape: (0,) (0,)
Test set shape: (0,) (0,)

l3das commented 2 years ago

Hi yang-china, have you also tried on a Mac/Linux machine? Unfortunately, our code has not been tested on Windows.

wang-siwen02 commented 2 years ago

Hi l3das, I tried on a Linux machine and the problem still occurs. Here are my logs:

(torchEnv) j@j-System-Product-Name:/media/j/Elements/L3DAS22-main$ python preprocessing.py --task 2 --input_path /media/j/Elements/DATASETS/l3das22/Task2 --num_mics 1 --frame_len 100
Processing /media/j/Elements/DATASETS/l3das22/Task2/L3DAS22_Task2_train folder...
Processing /media/j/Elements/DATASETS/l3das22/Task2/L3DAS22_Task2_dev folder...
Saving files
Matrices successfully saved
Training set shape: (0,) (0,)
Validation set shape: (0,) (0,)
Test set shape: (0,) (0,)

l3das commented 2 years ago

Hi @wang-siwen02, thanks for trying that. This usually happens when the wrong folder is selected as --input_path, or when that folder does not contain the expected files. Could you please try using a relative path for --input_path? That could be a solution. If that does not help, could you try downloading and preprocessing the dataset with the commands we provide (rather than manually)?

Best,
L3DAS team
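For anyone hitting the same empty-matrices result, a quick way to confirm that --input_path points at the right place is to count the files the preprocessing script should find before running it. This is only a minimal sketch: the L3DAS22_Task2_train / L3DAS22_Task2_dev folder names come from the logs above, but the "data"/"labels" subfolder names and file extensions are assumptions about the dataset layout, so adjust them to match your local copy.

```python
import glob
import os

# Same value you pass to --input_path
input_path = "/media/j/Elements/DATASETS/l3das22/Task2"

for split in ("L3DAS22_Task2_train", "L3DAS22_Task2_dev"):
    # Assumed layout: <split>/data/*.wav and <split>/labels/*.csv
    wavs = glob.glob(os.path.join(input_path, split, "data", "*.wav"))
    csvs = glob.glob(os.path.join(input_path, split, "labels", "*.csv"))
    print(f"{split}: {len(wavs)} wav files, {len(csvs)} label files")

# If these counts are 0, preprocessing will also produce empty (0,) matrices,
# which usually means a wrong --input_path or an extraction that nested the
# dataset one level deeper than expected.
```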

yang-china commented 2 years ago

Hi l3das, thank you for your help. I used the lovemefan/L3DAS22_Challenge preprocessing and it runs well! But we still can't figure out what was wrong with the original. Best, Yang