Closed: marvision-ai closed this issue 2 years ago
Thanks for spotting this @marvision-ai! There have been some changes in the OpenVinoInferencer. These changes should also be applied to the TorchInferencer's post_process method. Any thoughts, @djdameln, @ashwinvaidya17?
Thanks for the great repo. Reporting the same issue.
Thank you @samet-akcay and @ashwinvaidya17 for the very fast turnaround time!
Is there a reason why loading and running inference from a ckpt file still needs to import openvino? The documentation states that this path runs purely in PyTorch:
If the specified weight path points to a PyTorch Lightning checkpoint file (.ckpt), inference will run in PyTorch. If the path points to an ONNX graph (.onnx) or OpenVINO IR (.bin or .xml), inference will run in OpenVINO.
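The dispatch rule quoted above can be sketched as a small helper. This is an illustrative sketch only; `select_backend` is a hypothetical function, not part of the anomalib API:

```python
from pathlib import Path


def select_backend(weight_path: str) -> str:
    """Pick an inference backend from the weight file's extension.

    Hypothetical helper mirroring the documented rule:
    .ckpt -> PyTorch, .onnx/.bin/.xml -> OpenVINO.
    """
    suffix = Path(weight_path).suffix.lower()
    if suffix == ".ckpt":
        return "torch"      # PyTorch Lightning checkpoint
    if suffix in (".onnx", ".bin", ".xml"):
        return "openvino"   # ONNX graph or OpenVINO IR
    raise ValueError(f"unsupported weight format: {suffix!r}")
```

With that rule, `weights/model.ckpt` would select the PyTorch path and `weights/model.xml` the OpenVINO path.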
Hello, thank you for the great repo!
Describe the bug
I cannot run inference.

To Reproduce
Steps to reproduce the behavior:
conda install openvino-ie4py-ubuntu20 -c intel
conda install pytorch==1.8.0 torchvision==0.9.0 torchaudio==0.8.0 cudatoolkit=11.1 -c pytorch -c conda-forge
Hardware and Software Configuration
Additional context
I also noticed that running inference from a ckpt file still imports openvino; I expected it not to, since that path should run purely in PyTorch.
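One way to avoid the unconditional `import openvino` would be to defer the backend import until the weight type is known. A minimal sketch of that pattern follows; the module and class names in the mapping are placeholders for illustration, not the real anomalib layout:

```python
import importlib
from pathlib import Path

# Suffix -> (module, class) mapping. These names are placeholders,
# not the actual anomalib module paths.
DEFAULT_BACKENDS = {
    ".ckpt": ("torch_inferencer", "TorchInferencer"),
    ".onnx": ("openvino_inferencer", "OpenVinoInferencer"),
    ".xml": ("openvino_inferencer", "OpenVinoInferencer"),
    ".bin": ("openvino_inferencer", "OpenVinoInferencer"),
}


def load_inferencer_class(weight_path: str, backends=None):
    """Import the backend module only when it is actually needed.

    Because the import happens inside this function, a .ckpt path
    never triggers the OpenVINO import at all.
    """
    backends = backends or DEFAULT_BACKENDS
    suffix = Path(weight_path).suffix.lower()
    module_name, class_name = backends[suffix]
    module = importlib.import_module(module_name)  # deferred import
    return getattr(module, class_name)
```

The design choice here is simply lazy importing: the heavyweight optional dependency is only pulled in on the code path that requires it, so a pure-PyTorch environment can still run ckpt inference.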