This is a major refactoring that doesn't actually change the workings of the algorithm. The primary purpose is to make the parser reusable beyond the training session: it can now load a pre-trained model and run that model on sentences without oracle transitions. This refactoring also removes the requirement that the training data be present at test time, stores all relevant parameters in the saved model, and allows the saved model to be compressed. Finally, the parser should now be usable via its API from other applications.