tsurumeso / vocal-remover

Vocal Remover using Deep Neural Networks
MIT License
1.47k stars 215 forks

Num Samples #102

Open ripnonfegames opened 2 years ago

ripnonfegames commented 2 years ago

Whenever I run training, I keep getting an error even though I'm following the exact instructions from the README: `python train.py --dataset path/to/dataset --reduction_rate 0.5 --mixup_rate 0.5 --gpu 0`. In my case, `path/to/dataset` is `C:\Users\hinds\Downloads\Compressed\vocal-remover\practice\`, yet I keep getting this specific error: `num_samples should be a positive integer value, but got num_samples=0`

Here's my log: train_20220319195133.log

tsurumeso commented 2 years ago

The error occurs when the number of instrument files differs from the number of mixture files. Please check your dataset directory and make sure the counts are the same.
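A quick way to verify this is to count the files in the two dataset subdirectories and list any names that lack a counterpart. This is a hedged sketch, not part of the project: the `mixtures`/`instruments` subdirectory names are assumed from the project's README, and `check_pairs` is a hypothetical helper.

```python
import os

def check_pairs(dataset_dir):
    """Compare the file names under the mixtures and instruments
    subdirectories (names assumed from the README) and report any
    file that has no counterpart on the other side."""
    mixes = set(os.listdir(os.path.join(dataset_dir, 'mixtures')))
    insts = set(os.listdir(os.path.join(dataset_dir, 'instruments')))
    print(f'mixtures: {len(mixes)}, instruments: {len(insts)}')
    # Symmetric difference: files present on one side only.
    for name in sorted(mixes ^ insts):
        print('unpaired:', name)
    return mixes == insts
```

Running `check_pairs('path/to/dataset')` should print matching counts and no `unpaired:` lines for a well-formed dataset.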

ripnonfegames commented 2 years ago

Number of instruments as in?

tsurumeso commented 2 years ago

dataset

Nekitt1 commented 2 years ago

@tsurumeso I have the same error, even though the number of instrument and mixture files matches.


AronYstad commented 2 years ago

I get the same error. I have the same number of each, and the output even shows that it paired them up. But it still says that the number of samples is 0.

tsurumeso commented 2 years ago

If you use the default parameters, the size of the dataset should be at least 5 pairs.

AronYstad commented 2 years ago

That fixed the problem, but now it seems like it's only using one pair for the training. At least in the output, it only lists one pair. Will it switch to another one after a while or do I need to do something? I had to change the batch size to not run out of memory, if that is part of the problem.

tsurumeso commented 2 years ago

> At least in the output, it only lists one pair. Will it switch to another one after a while or do I need to do something?

The pair is a validation dataset. With default parameters, 20% of the dataset is used for validation, and 80% for training. You don't need to do anything.
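The 80/20 split also explains the earlier "at least 5 pairs" requirement: with a validation rate of 0.2, five pairs is the smallest dataset that yields a whole validation pair. A minimal sketch of such a split, assuming a simple proportional holdout (`split_dataset` and its exact arithmetic are illustrative, not the project's actual code):

```python
def split_dataset(filelist, val_rate=0.2):
    """Hold out the last val_rate fraction of the pairs for
    validation; the rest are used for training (illustrative
    sketch, not the project's actual implementation)."""
    val_size = int(len(filelist) * val_rate)
    split = len(filelist) - val_size
    return filelist[:split], filelist[split:]

pairs = ['p1', 'p2', 'p3', 'p4', 'p5']
train, val = split_dataset(pairs)
# With the default rate of 0.2, five pairs give four training
# pairs and one validation pair; with fewer than five pairs,
# val_size rounds down to 0 and no validation pair is held out.
```

Note that only the validation pairs are listed individually in the log, which is why a single pair appearing there does not mean only one pair is being trained on.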