chainyo / transformers-pipeline-onnx

How to export Hugging Face's 🤗 NLP Transformers models to ONNX and use the exported model with the appropriate Transformers pipeline.

How to avoid loading of the PyTorch model alongside ONNX model for pipeline #5

Open janmejay03 opened 1 year ago

janmejay03 commented 1 year ago

Hi,

Nice notebook and blog, thanks for those. Are there any updates on how to avoid loading the PyTorch model alongside the ONNX model to get the pipeline working? Any pointers would help too.

chainyo commented 1 year ago

Hi @janmejay03

Thanks for your message. You should consider using the huggingface/optimum package. It lets you convert any model on the Hub to ONNX and run it directly, without keeping the PyTorch model around. It's a lot easier!