HarryVolek / PyTorch_Speaker_Verification

PyTorch implementation of "Generalized End-to-End Loss for Speaker Verification" by Wan, Li et al.
BSD 3-Clause "New" or "Revised" License

error in training #72

Closed Jason-1998 closed 3 years ago

Jason-1998 commented 3 years ago

```
(torch) [gjj@localhost PyTorch_Speaker_Verification]$ ./train_speech_embedder.py
/data/gjj/PyTorch_Speaker_Verification/hparam.py:11: YAMLLoadWarning: calling yaml.load_all() without Loader=... is deprecated, as the default Loader is unsafe. Please read https://msg.pyyaml.org/load for full details.
  for doc in docs:
Traceback (most recent call last):
  File "./train_speech_embedder.py", line 158, in <module>
    train(hp.model.model_path)
  File "./train_speech_embedder.py", line 44, in train
    for batch_id, mel_db_batch in enumerate(train_loader):
  File "/home/gjj/anaconda3/envs/torch/lib/python3.6/site-packages/torch/utils/data/dataloader.py", line 346, in __next__
    data = self.dataset_fetcher.fetch(index)  # may raise StopIteration
  File "/home/gjj/anaconda3/envs/torch/lib/python3.6/site-packages/torch/utils/data/_utils/fetch.py", line 44, in fetch
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/home/gjj/anaconda3/envs/torch/lib/python3.6/site-packages/torch/utils/data/_utils/fetch.py", line 44, in <listcomp>
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/data/gjj/PyTorch_Speaker_Verification/data_load.py", line 77, in __getitem__
    utter_index = np.random.randint(0, utters.shape[0], self.utter_num)  # select M utterances per speaker
  File "mtrand.pyx", line 992, in mtrand.RandomState.randint
ValueError: Range cannot be empty (low >= high) unless no samples are taken
```

I got this error during the training phase, but I don't know why.
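For context, the `ValueError` comes from `np.random.randint` being asked to sample from an empty range, which happens whenever a loaded speaker array has zero utterances. A minimal sketch reproducing the condition (the array shape here is only an illustration, not the repo's actual feature shape):

```python
import numpy as np

# An empty .npy file loads as a zero-length array along the utterance axis.
utters = np.empty((0, 40, 180))

# shape[0] == 0, so the range [0, 0) is empty and randint raises the same
# "Range cannot be empty (low >= high)" ValueError seen in the traceback.
utter_index = np.random.randint(0, utters.shape[0], 4)
```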

kouohhashi commented 3 years ago

@Jason-1998 Hi, I have the same issue. Have you solved the problem?

Jason-1998 commented 3 years ago

> @Jason-1998 Hi, I have the same issue. Have you solved the problem?

I think it is caused by the data preparation step. If preprocessing fails or is interrupted, some speakers end up with .npy files containing no utterances, and sampling M utterances from an empty array raises the error we both hit.
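If that is the cause, a quick way to confirm it is to scan the preprocessed training folder for empty speaker files before retraining. A minimal sketch, assuming the default `./train_tisv` output directory (adjust the path to whatever `hp.data.train_path` points to in your hparam.yaml):

```python
import glob
import os

import numpy as np

train_path = "./train_tisv"  # assumed preprocessed output dir; check hp.data.train_path

# List speaker files whose utterance axis is empty; these break the DataLoader.
for f in sorted(glob.glob(os.path.join(train_path, "*.npy"))):
    utters = np.load(f)
    if utters.shape[0] == 0:
        print("empty speaker file:", f)
```

Deleting the empty files and rerunning the preprocessing for those speakers (or the whole dataset) should make training start normally.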