Open srdfjy opened 2 months ago
Freeze all modules except the output layers (the CTC output and the attention decoder output), add the new words to your unit.txt, modify the output size accordingly, and then fine-tune the model.
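A minimal PyTorch sketch of the suggestion above: freeze the pre-trained parameters, then replace the output layer with a larger one that keeps the weights for the original units (the model and layer names here are hypothetical stand-ins, not WeNet's actual module names):

```python
import torch
import torch.nn as nn

def extend_output_layer(old_layer: nn.Linear, new_vocab_size: int) -> nn.Linear:
    """Build a larger output layer, copying weights for the original units.

    Rows for the newly added words keep nn.Linear's default random init.
    """
    old_vocab_size, hidden = old_layer.weight.shape
    assert new_vocab_size >= old_vocab_size
    new_layer = nn.Linear(hidden, new_vocab_size, bias=old_layer.bias is not None)
    with torch.no_grad():
        new_layer.weight[:old_vocab_size] = old_layer.weight
        if old_layer.bias is not None:
            new_layer.bias[:old_vocab_size] = old_layer.bias
    return new_layer

# Hypothetical tiny model standing in for the pre-trained ASR model
# (1000 original units, 10 new words -> 1010 outputs).
model = nn.ModuleDict({
    "encoder": nn.Linear(80, 256),
    "ctc_out": nn.Linear(256, 1000),
})

# Freeze everything first...
for p in model.parameters():
    p.requires_grad = False

# ...then swap in the extended output layer and leave only it trainable.
model["ctc_out"] = extend_output_layer(model["ctc_out"], 1010)
```

In a real WeNet model the same treatment would apply to both the CTC projection and the attention decoder's output projection.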
Thanks @fclearner, I will try what you suggested later.
Hi, a pre-trained model's unit.txt contains 1000 words. When fine-tuning based on this pre-trained model, there are 10 new words that are not in unit.txt. Is it feasible to append these 10 new words to the end of unit.txt and assign them new indices?
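Appending the new words with consecutive indices can be sketched as below; this is a hypothetical helper assuming each unit.txt line has the form `<symbol> <id>`:

```python
def extend_units(lines, new_words):
    """Append new symbols to a unit list, continuing from the highest id.

    `lines` are the existing "symbol id" entries of unit.txt;
    words already present are skipped so ids stay unique.
    """
    units = [ln.split() for ln in lines]
    known = {sym for sym, _ in units}
    next_id = max(int(i) for _, i in units) + 1
    for word in new_words:
        if word not in known:
            units.append([word, str(next_id)])
            known.add(word)
            next_id += 1
    return [" ".join(u) for u in units]
```

Since the original 1000 entries keep their ids, the pre-trained output weights for those units remain valid; only the rows for the 10 appended ids need to be trained from scratch.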