pharo-ai / NgramModel

Ngram language model implemented in Pharo
MIT License

Infer the order of n-grams when reading count dictionaries from files #18

Closed olekscode closed 4 years ago

olekscode commented 4 years ago

Improved the code and fixed #8

olekscode commented 4 years ago

The NoSmoothing technique is selected by default, so everything now works fine. However, if a user chooses a different smoothing and saves the model, the loaded model will be re-initialised with the default smoothing. This makes sense because smoothing is not part of training. But we will probably change it in the future, since I'm considering saving everything to a single JSON file.
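The behavior described above can be sketched in a minimal model. This is an illustration in Python, not the actual Pharo API: the class and method names (`NgramModel`, `save`, `load`, the smoothing classes) are assumptions for the sake of the example. It shows why smoothing resets on load: only the trained counts are serialized, the order is inferred from the saved keys (as per #8), and the smoothing is set anew at initialization.

```python
import json

class NoSmoothing:
    """Default smoothing (hypothetical stand-in for the Pharo class)."""
    name = 'NoSmoothing'

class LaplaceSmoothing:
    """An alternative smoothing a user might select (hypothetical)."""
    name = 'LaplaceSmoothing'

class NgramModel:
    def __init__(self, order=2):
        self.order = order
        self.counts = {}
        # Smoothing is set at initialization time, not learned from data.
        self.smoothing = NoSmoothing()

    def train(self, tokens):
        for i in range(len(tokens) - self.order + 1):
            ngram = tuple(tokens[i:i + self.order])
            self.counts[ngram] = self.counts.get(ngram, 0) + 1

    def save(self, path):
        # Only the trained counts are serialized; smoothing is not part
        # of the saved state, mirroring the behavior discussed above.
        data = {' '.join(ngram): count for ngram, count in self.counts.items()}
        with open(path, 'w') as f:
            json.dump(data, f)

    @classmethod
    def load(cls, path):
        with open(path) as f:
            data = json.load(f)
        counts = {tuple(key.split(' ')): count for key, count in data.items()}
        # The n-gram order is inferred from the length of the saved keys,
        # but smoothing falls back to the default.
        order = len(next(iter(counts)))
        model = cls(order=order)
        model.counts = counts
        return model

model = NgramModel(order=2)
model.smoothing = LaplaceSmoothing()   # user picks a non-default smoothing
model.train('the cat sat on the mat'.split())
model.save('/tmp/counts.json')

loaded = NgramModel.load('/tmp/counts.json')
print(loaded.order)            # → 2 (inferred from the saved keys)
print(loaded.smoothing.name)   # → NoSmoothing (reset to the default)
```

Serializing the counts alone keeps the file format simple, at the cost of the reset shown on the last line; bundling the smoothing choice into a single JSON file, as mentioned above, would avoid it.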