philipperemy / deep-speaker

Deep Speaker: an End-to-End Neural Speaker Embedding System.
MIT License

Wrong number of layers while doing triplet training after softmax pre-training #82

Closed Szyzapor closed 3 years ago

Szyzapor commented 3 years ago

Hi, I've got a question about triplet training with softmax pre-training. When I use only softmax or only triplet training, everything works perfectly, but when I try to do softmax pre-training followed by triplet training, I get this error:

ValueError: You are trying to load a weight file containing 58 layers into a model with 57 layers.

How should I do pre-training to avoid this error? Or should I remove one of the layers from the saved model? This is not clear to me.

Could you please explain how it should be done properly?

philipperemy commented 3 years ago

@Szyzapor that looks weird to me; it should have worked. The reason it expects 58 layers instead of 57 is probably that we remove the softmax layer before training with the triplets. Maybe:

model.load_weights(filepath, by_name=True)

will work, if that's not already the case.
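A minimal sketch of the suggested fix, assuming a `tf.keras` (TF 2.x) model: the layer names (`embed_1`, `embed_2`, `softmax_head`) and shapes below are hypothetical stand-ins, not the actual deep-speaker architecture. Loading with `by_name=True` matches layers by name and silently skips the extra softmax head, which avoids the "58 layers into a model with 57 layers" `ValueError` raised by positional loading.

```python
import numpy as np
from tensorflow import keras

# Stand-in for the softmax pre-training model: embedding layers + a softmax head.
pretrain = keras.Sequential([
    keras.layers.Input(shape=(4,)),
    keras.layers.Dense(8, activation="relu", name="embed_1"),
    keras.layers.Dense(8, activation="relu", name="embed_2"),
    keras.layers.Dense(3, activation="softmax", name="softmax_head"),
])
pretrain.save_weights("pretrain.h5")

# Stand-in for the triplet model: same embedding layers, no softmax head.
triplet = keras.Sequential([
    keras.layers.Input(shape=(4,)),
    keras.layers.Dense(8, activation="relu", name="embed_1"),
    keras.layers.Dense(8, activation="relu", name="embed_2"),
])

# Positional loading would fail here (layer-count mismatch); name-based
# loading copies only the layers present in both models.
triplet.load_weights("pretrain.h5", by_name=True)

# The embedding weights now match the pre-trained ones.
w_pre = pretrain.get_layer("embed_1").get_weights()[0]
w_tri = triplet.get_layer("embed_1").get_weights()[0]
print(np.allclose(w_pre, w_tri))
```

Note that `by_name=True` applies to the HDF5 weight format; if `by_name` is unavailable in your Keras version, the alternative is to rebuild the pre-training model, call `load_weights` on it positionally, and then copy the shared layers' weights over with `set_weights`.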

philipperemy commented 3 years ago

@Szyzapor when you have time, please provide more details. I'll close this for now.