zsogitbe closed this issue 1 year ago
Hi @zsogitbe
Here is the way to convert your existing trained model to the newer format:
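As a rough, self-contained illustration of the idea (load the old model, copy its ClsVocabs entry into TgtVocab, and save it back), here is a minimal sketch. OldModel, NewModel, their property names, and the JSON round-trip are hypothetical stand-ins for this example only, not Seq2SeqSharp's actual model classes or its binary model format:

```csharp
// Minimal sketch of the conversion idea only. All types, properties and the
// JSON round-trip below are hypothetical stand-ins; Seq2SeqSharp's real model
// classes and serialization format are different.
using System;
using System.Collections.Generic;
using System.IO;
using System.Text.Json;

public class OldModel
{
    public List<string> SrcVocab { get; set; } = new();
    // Old format: per-task classification vocabularies.
    public List<List<string>> ClsVocabs { get; set; } = new();
}

public class NewModel
{
    public List<string> SrcVocab { get; set; } = new();
    // New format: a single target vocabulary replaces ClsVocabs.
    public List<string> TgtVocab { get; set; } = new();
}

public static class ModelConverter
{
    public static void Convert(string oldPath, string newPath)
    {
        var old = JsonSerializer.Deserialize<OldModel>(File.ReadAllText(oldPath))
                  ?? throw new InvalidDataException($"Cannot read model: {oldPath}");

        var converted = new NewModel
        {
            SrcVocab = old.SrcVocab,
            // Move the first classification vocabulary into the target
            // vocabulary slot (assumes a single-task model).
            TgtVocab = old.ClsVocabs.Count > 0 ? old.ClsVocabs[0] : new List<string>()
        };

        File.WriteAllText(newPath, JsonSerializer.Serialize(converted));
    }
}
```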
Then you should be able to load the updated model with the new code. Let me know if it works.
Thanks, Zhongkai Fu
Yes, thank you, I was able to convert my model (I also deleted clsVocabs from the model).
I opened this issue for other users who may not know how to modify the code themselves and who need another, long-term solution for model backward compatibility.
The code has been updated to support old models automatically.
Description: The latest update, which removed ClsVocabs everywhere (for example, in the SeqLabelModel code), prevents previously trained models from loading. The problem is that the target vocabulary is no longer set by reading ClsVocabs.
Model backward compatibility is extremely important. The models we train nowadays take several days and a lot of resources to train. We cannot afford to lose all of that work just because the library is improving.
Expected behavior: Removing clsVocabs to normalize the code and using tgtVocab everywhere is a good improvement, but it should be done in a way that previously trained models can still be converted or loaded. Solution: add back the clsVocabs serialization and convert old models automatically, or provide a console application to convert old models to the new format.