Closed dianyo closed 1 year ago
You are welcome to submit a PR directly to this repo; we will review and test it.
Hi @dianyo , thank you for your suggestions.
Regarding `neural_compressor/adaptor/ox_utils/calibration.py`, that is indeed an oversight on our part. You can create a branch on this repo, push your local branch to it, and then open a PR.
As for `neural_compressor/adaptor/onnxrt.py`, users need to create a dataloader or an evaluation function to get accuracy, so we assume users can convert torch.Tensor to a numpy array themselves. Could you please provide an example showing why we should do the conversion internally?
Hi @mengniwang95 ,
I assume the dataloader argument in the evaluation function is meant to be compatible with a PyTorch dataloader, as shown in your example. I therefore suggest that you either document that using an ONNX model requires a custom dataloader implementation (rather than a PyTorch dataloader directly), or implement the torch.Tensor-to-numpy conversion in `onnxrt.py`. What do you think?
Hi @dianyo , thank you for your suggestions. We plan to add torch.Tensor-to-numpy conversion to the onnxrt adaptor 😄
Implemented in commit a2931eaa4052eec195be3c79a13f7bfa23e54473
Hi, since onnxruntime cannot take `torch.Tensor` as input, we should add a type check before feeding the dataloader's output to the onnxruntime inference session. I also found another bug in how the providers are assigned. I'm not sure how to create a pull request, as I don't have permission to push my fixed branch, so I'm sharing the files I modified: `neural_compressor/adaptor/ox_utils/calibration.py` and `neural_compressor/adaptor/onnxrt.py`.
If there's any way for me to create a PR, I'd be glad to do so. Thank you!
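For reference, the type check discussed in this thread can be sketched roughly as below. This is an illustrative sketch, not the actual neural-compressor code: `to_numpy` and `run_session` are hypothetical names, and the real fix landed in the commit referenced above.

```python
import numpy as np


def to_numpy(value):
    """Convert a torch.Tensor to a numpy array; pass other inputs through.

    Hypothetical helper: detach() drops the autograd graph and cpu()
    moves CUDA tensors to host memory before the numpy conversion.
    """
    try:
        import torch
        if isinstance(value, torch.Tensor):
            return value.detach().cpu().numpy()
    except ImportError:
        # torch is not installed, so there is nothing to convert
        pass
    return value


def run_session(session, input_name, dataloader):
    """Feed each batch to an onnxruntime-style session after the type check."""
    results = []
    for inputs, _label in dataloader:
        results.append(session.run(None, {input_name: to_numpy(inputs)}))
    return results
```

With a check like this in the adaptor, a plain PyTorch DataLoader (whose default collate_fn yields torch.Tensor batches) could be passed in without the user converting tensors manually.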