microsoft / onnxruntime

ONNX Runtime: cross-platform, high performance ML inferencing and training accelerator
https://onnxruntime.ai
MIT License
13.99k stars 2.83k forks

Add onnxruntime_extensions in onnxruntime docker file #8639

Open Sriharan-Manogaran opened 3 years ago

Sriharan-Manogaran commented 3 years ago

Is your feature request related to a problem? Please describe. I was using Universal Sentence Encoder Multilingual with the onnxruntime and onnxruntime_extensions Python SDKs, which works fine. But when I moved to the onnxruntime docker image, I received an error for SentencepieceTokenizer. So can we have this as part of the docker file and the onnxruntime session?

Similarly, TF requires tensorflow-text in the SDK, but in the docker image it is supported by default.

System information

Describe the solution you'd like We can include the onnxruntime_extensions module in the docker file and in the ORT session.
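For context, the SDK-side pattern the requester describes (which works outside docker) can be sketched roughly as follows. This is a minimal, hedged example assuming both `onnxruntime` and `onnxruntime-extensions` are pip-installed; the model filename is illustrative, not from this issue.

```python
# Hedged sketch: register onnxruntime-extensions custom ops
# (e.g. SentencepieceTokenizer) with an ONNX Runtime session.
# The ImportError branch mirrors the failure reported in this issue:
# a docker image that ships onnxruntime without the extensions package.
try:
    import onnxruntime as ort
    from onnxruntime_extensions import get_library_path

    so = ort.SessionOptions()
    # Point the session at the shared library implementing the custom ops.
    so.register_custom_ops_library(get_library_path())

    # sess = ort.InferenceSession("use_multilingual.onnx", so)  # illustrative path
    extensions_available = True
except ImportError:
    # Without onnxruntime-extensions installed, models using custom ops
    # such as SentencepieceTokenizer fail to load.
    extensions_available = False

print("onnxruntime-extensions registered:", extensions_available)
```

Baking the `pip install onnxruntime-extensions` step into the docker images would make the `try` branch succeed out of the box, which is what this feature request asks for.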

pranavsharma commented 3 years ago

cc @wenbingl

wenbingl commented 3 years ago

Check this dockerfile, https://github.com/microsoft/onnxruntime/pull/8690, to see if it meets your requirements.

Sriharan-Manogaran commented 3 years ago

@wenbingl This is good, but this extension should be available in every Dockerfile in which the onnxruntime library exists. At least in the CPU, GPU, and server Dockerfiles, we should have it to support NLP-based models.

wenbingl commented 3 years ago

Agree, but updating the CPU/GPU/TensorRT and other dockerfiles takes a fair amount of time, so let's put it on the backlog. The dockerfile in the PR provides an example of how to build them together; anyone who needs it immediately can make the change accordingly.

Sriharan-Manogaran commented 3 years ago

Sure, that's perfect. Kindly add this to the backlog, or let me know how to do it.

wenbingl commented 3 years ago

@faxu can you help put this on the backlog?

wenbingl commented 3 years ago

@snnn, can we just merge this extension build step into the CPU docker image? GPU and the others could be done later.

snnn commented 3 years ago

I don't know who is using these docker images.

faxu commented 3 years ago

ONNX Runtime Server is deprecated. Because onnxruntime-extensions is still experimental at this time, we'd like to keep it out of the CPU/GPU dockerfiles in the main onnxruntime repo. That said, if you have feedback on onnxruntime-extensions, we'd love to hear it, to help us understand usability and future additions to consider for moving it out of the experimental stage. You can send me an email at the address shown in the attached image.

stale[bot] commented 2 years ago

This issue has been automatically marked as stale due to inactivity and will be closed in 7 days if no further activity occurs. If further support is needed, please provide an update and/or more details.