Closed: cy-eom closed this issue 4 years ago
I'm not super familiar with how torch runs on mobile, but my impression is that you have to get your whole model to run under TorchScript. In your example, you're calling torch.jit.trace, but tracing only works in a subset of cases. There is no guarantee that any of our models work in this case.
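To illustrate the kind of thing that goes wrong, here is a minimal sketch (the module name is made up): trace records the single execution path taken by the example input, while script compiles the control flow, so anything data-dependent (CRF decoding, for instance) can silently get frozen to one branch under tracing.

```python
import torch

class Gate(torch.nn.Module):
    def forward(self, x):
        if x.sum() > 0:      # branch depends on the input values
            return x * 2
        return x + 10

# trace records only the branch taken for this example input (and warns about it)
traced = torch.jit.trace(Gate(), torch.ones(3))
# script compiles both branches
scripted = torch.jit.script(Gate())

print(traced(-torch.ones(3)))    # still follows the traced "* 2" branch: tensor([-2., -2., -2.])
print(scripted(-torch.ones(3)))  # takes the correct "+ 10" branch: tensor([9., 9., 9.])
```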
I know some people are looking into making the whole thing compatible with TorchScript, but I can't give you a timeline for that.
Closing, as this isn't a priority for us at present. If someone is interested in working on this and contributing TorchScript support, please follow up in a separate issue and we can discuss.
Hi all,
I'm looking for a way to convert an AllenNLP model with the PyTorch Mobile library, so that the model can run on an Android device.
We implemented our model with a BiLSTM-CNN-CRF architecture, and after training, the results are archived as a model.tar.gz file containing weights.th, config.json, and a set of vocabulary text files.
Below is the structure of our model object.
BTW, I found a few hints in the PyTorch Mobile Android docs and tutorial, which explain that torch.jit.save creates a .pt file, i.e. a converted PyTorch model.
For example, I'm looking at the HelloWorld app, where a serialized model is used, and it provides a Python script in the root folder of the app:
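From what I can see, that script does roughly the following (reproduced as a sketch, the exact file may differ):

```python
import torch
import torchvision

# Take a stock torchvision model and put it in inference mode.
model = torchvision.models.resnet18(pretrained=True)
model.eval()

# Trace it with a dummy image-shaped input and save the TorchScript module
# into the Android app's assets folder, where the app loads it from.
example = torch.rand(1, 3, 224, 224)
traced_script_module = torch.jit.trace(model, example)
traced_script_module.save("app/src/main/assets/model.pt")
```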
However, it is still not clear to me whether this approach also works for an AllenNLP model, because the structure of our model is not the same as the general model used in the example above.
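To make the question concrete, here is roughly what I have in mind, but only as a sketch: the archive path, the tensor shapes, the `{"tokens": ...}` structure, and the `strict=False` flag are guesses on my part, and I am not sure tracing can handle an AllenNLP model's dict inputs at all.

```python
import torch
from allennlp.models.archival import load_archive

# Load the trained model from the archive produced by `allennlp train`.
archive = load_archive("model.tar.gz")
model = archive.model
model.eval()

# Placeholder input: a batch of one sentence with 20 token ids. The exact
# structure of this dict depends on the token indexers in config.json
# (newer AllenNLP versions nest it one level deeper).
token_ids = torch.randint(0, model.vocab.get_vocab_size("tokens"), (1, 20))
example_input = {"tokens": token_ids}

# Attempt to trace. AllenNLP models take dict inputs and the CRF layer has
# data-dependent decoding, so this may fail or silently freeze one path;
# a thin wrapper module with a tensor-only forward() is probably needed.
traced = torch.jit.trace(model, (example_input,), strict=False)
traced.save("model.pt")
```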
Could anyone suggest a useful approach or a code snippet that would work here?
I look forward to your responses. Thanks.