Closed by rangerbottle 4 years ago
At the moment, GPyTorch does not support exporting models to C++ (or at least this has never been tested before). We can add this functionality to a future release, but it is not on our radar right now.
Minor note on what Geoff said: I have actually been working quite a lot on exporting GPyTorch models to TorchScript. At the moment, exact GPs are fully supported if you use fast predictive variances and precompute the prediction caches, and variational inference should work now as well.
Would this sort of conversion be sufficient? I'll put up an example notebook under basic usage demonstrating how to do this.
I'll actually just go ahead and add this to my to-do for 1.0. I'll have an example notebook up for both exact and variational GPs either this week or (more likely due to the holiday) early next.
Thanks, that will be great!
Closed via #1026
🚀 Feature Request
Can I use LibTorch to load a GPyTorch model?
Motivation
I want to deploy a GPyTorch model using the C++ frontend. After following the LibTorch tutorial to export the trained model from GPyTorch with `traced_script_module = torch.jit.trace(model, example)`, I got the error "TracedModules don't support parameter sharing between modules".
Is there any way to export a GPyTorch model to C++? Thanks a lot.