Mte90 opened 4 years ago
What should be done here? Just remove every TRANSFER_LEARNING in the DeepSpeech folder and add a comment explaining that DROP_SOURCE_LAYER > 0 means transfer learning? Or should something else be done as well?
Basically yes. As written in the DeepSpeech docs, when the option DROP_SOURCE_LAYER is > 0 you drop the specified number of layers and then start training from a previous checkpoint from another training run (e.g. the English checkpoint). Right now the script checks if TRANSFER_LEARNING is true and then starts downloading the ENG DeepSpeech checkpoints.
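For context, a minimal sketch of how those two options might map onto the actual training call. The env-var names follow this thread, but the checkpoint directory, the download step, and the exact upstream flag spelling are assumptions (recent DeepSpeech releases use `--drop_source_layers` and `--load_checkpoint_dir`; older versions differ), not necessarily what the script does today:

```shell
#!/bin/sh
# Hedged sketch: TRANSFER_LEARNING gates the ENG checkpoint download,
# DROP_SOURCE_LAYER is forwarded to DeepSpeech's --drop_source_layers.
# ENG_CKPT_DIR and the command layout are assumptions for illustration.
TRANSFER_LEARNING="${TRANSFER_LEARNING:-true}"
DROP_SOURCE_LAYER="${DROP_SOURCE_LAYER:-1}"
ENG_CKPT_DIR="${ENG_CKPT_DIR:-eng_checkpoints}"

maybe_download_eng_checkpoints() {
    # Placeholder for the real download step.
    if [ "$TRANSFER_LEARNING" = "true" ]; then
        echo "downloading ENG checkpoints into $ENG_CKPT_DIR"
    fi
}

build_train_cmd() {
    # --drop_source_layers N tells DeepSpeech to drop the last N layers
    # of the loaded checkpoint before training starts.
    echo "python3 DeepSpeech.py" \
         "--load_checkpoint_dir $ENG_CKPT_DIR" \
         "--drop_source_layers $DROP_SOURCE_LAYER"
}

maybe_download_eng_checkpoints
build_train_cmd
```

With this shape, removing TRANSFER_LEARNING would mean gating `maybe_download_eng_checkpoints` on `DROP_SOURCE_LAYER > 0` instead.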
There is a recurring case that needs to be handled: how to continue a previous training run. If I start again with the same flags (DROP_SOURCE_LAYER and TRANSFER_LEARNING), the script (if I remember correctly) will skip the ENG checkpoint download, but it will still DROP the N layers from the previous iteration. And that could be a problem :)
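One possible shape for a guard against that (a sketch only; the variable names and checkpoint layout are assumptions, not what the script actually uses): drop the source layers only when no local checkpoint exists yet, i.e. only on the very first transfer run, and pass 0 when resuming.

```shell
#!/bin/sh
# Sketch of a resume guard. DROP_SOURCE_LAYER and CKPT_DIR are assumed
# names; the real script may differ.
DROP_SOURCE_LAYER="${DROP_SOURCE_LAYER:-0}"
CKPT_DIR="${CKPT_DIR:-checkpoints}"

effective_drop_layers() {
    # If our own checkpoint index already exists we are resuming a run:
    # the layers were already dropped on the first iteration, so dropping
    # them again would cut N more layers from our own checkpoint.
    if [ -f "$CKPT_DIR/checkpoint" ]; then
        echo 0
    else
        echo "$DROP_SOURCE_LAYER"
    fi
}

echo "drop_source_layers=$(effective_drop_layers)"
```

The same check could also decide whether the ENG checkpoint download is needed at all.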
In the second bullet point I added a rather extreme case: loading a previous IT checkpoint and dropping some layers (if, for example, we want to start a new training run from the IT release checkpoint while adding new characters to the alphabet, this could be an option).