Closed DeNeutoy closed 4 years ago
We would write an archive_model_from_memory function.
Any updates on this thread? I'd like to save a trained model. From the examples in the "train" command, it looks like if we don't use the "allennlp train" command, we have to save the vocabulary, model parameters, and configuration separately. Am I right?
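For reference, the "save the pieces separately" pattern looks roughly like the sketch below. This is a framework-agnostic illustration, not AllenNLP's actual API: the directory layout (config.json, weights.th, a vocabulary/ directory) mirrors what AllenNLP writes, but the toy vocab/weights/config objects and the use of pickle in place of torch.save are stand-ins for illustration.

```python
import json
import pickle
import tempfile
from pathlib import Path

# Hypothetical stand-ins for a trained model's pieces.
vocab = {"the": 0, "cat": 1, "sat": 2}                 # vocabulary: token -> index
weights = {"embedding.weight": [[0.1, 0.2]]}           # parameter tensors (plain lists here)
config = {"model": {"type": "lstm", "hidden_size": 128}}

serialization_dir = Path(tempfile.mkdtemp())

# 1. Save the vocabulary (AllenNLP writes one text file per namespace).
(serialization_dir / "vocabulary").mkdir()
with open(serialization_dir / "vocabulary" / "tokens.txt", "w") as f:
    for token in vocab:
        f.write(token + "\n")

# 2. Save the model parameters (torch.save(model.state_dict(), ...) in practice).
with open(serialization_dir / "weights.th", "wb") as f:
    pickle.dump(weights, f)

# 3. Save the configuration.
with open(serialization_dir / "config.json", "w") as f:
    json.dump(config, f)

print(sorted(p.name for p in serialization_dir.iterdir()))
# -> ['config.json', 'vocabulary', 'weights.th']
```

Loading then means reversing each step and re-wiring the pieces together by hand, which is exactly the boilerplate an archive_model_from_memory function would remove.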
This function would be really useful. I hope it becomes available soon.
The tricky part right now is providing a config when saving a custom model. I can't even find an example.
Do we have anything better to say here? I know we've talked about it a couple of times since this issue was opened, but I don't remember the current state of things.
This is a necessary precondition to better supporting the use of AllenNLP via Python code rather than configuration files. Can we leverage PyTorch's save functionality? Can we just use Python's dill?
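On the torch.save-vs-dill question: torch.save uses pickle under the hood, and the robust pattern is to serialize only the state dict, not the whole model object, since a pickled object is tied to its class's import path (that's the gap dill tries to close for things plain pickle rejects, like lambdas). A minimal sketch of the distinction, using a toy class and plain pickle rather than real PyTorch:

```python
import io
import pickle

# A toy "model" standing in for a torch.nn.Module (assumption: not a real
# AllenNLP/PyTorch class). Pickling the whole object ties the file to this
# class's definition; pickling just its parameters does not.
class ToyModel:
    def __init__(self):
        self.weights = {"layer.weight": [1.0, 2.0], "layer.bias": [0.5]}

    def state_dict(self):
        return self.weights

model = ToyModel()

# torch.save / torch.load are pickle-based; saving only the state dict is the
# portable pattern, because loading needs no access to the original class.
buffer = io.BytesIO()
pickle.dump(model.state_dict(), buffer)
buffer.seek(0)
restored = pickle.load(buffer)

print(restored == model.state_dict())  # -> True
```

The catch for archiving is that a state dict alone doesn't tell you how to rebuild the model, which is why the config has to travel with the weights.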
We have a section on this in the upcoming course, showing that it's now pretty easy, with or without config files. It's got example code. I'm closing this issue as finished.
Here is a script I just wrote to load a model with the custom LSTM kernel and re-pack it into a new model so we can put it in the demo and distribute it. It turned out that we don't have any functionality for creating an archive from an in-memory Model and Config, and doing so was pretty messy. Hacking on stuff like this is pretty key for e.g. transfer learning and research in general: "what happens if I take part of this model and put it in this other one?"
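The re-packing step itself is just "write the pieces to a temp directory, then tar them up". A stdlib-only sketch of that shape, with placeholder config/weights contents standing in for a real in-memory Model and Config (the config.json / weights.th names mirror AllenNLP's archive layout):

```python
import json
import tarfile
import tempfile
from pathlib import Path

# Hypothetical in-memory pieces; in AllenNLP these would come from a loaded
# Model and its config.
config = {"model": {"type": "my_lstm"}}
weights_bytes = b"\x00\x01"  # placeholder for serialized parameters

workdir = Path(tempfile.mkdtemp())

# Write the pieces to disk first, then pack them into a single model.tar.gz.
(workdir / "config.json").write_text(json.dumps(config))
(workdir / "weights.th").write_bytes(weights_bytes)

archive_path = workdir / "model.tar.gz"
with tarfile.open(archive_path, "w:gz") as tar:
    tar.add(workdir / "config.json", arcname="config.json")
    tar.add(workdir / "weights.th", arcname="weights.th")

with tarfile.open(archive_path, "r:gz") as tar:
    print(sorted(tar.getnames()))  # -> ['config.json', 'weights.th']
```

The messy part in practice is the first half: getting a faithful config back out of an in-memory model that was never built from one.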