As a customer, I want to launch an app implementing Triton Inference Server in order to deploy my models in production with optimisation and high availability.
Acceptance criteria
Availability of the Triton Inference Server
Ability to deploy with CPU or GPU resources
Ability to deploy via UI, CLI or API
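For API-based deployments, a running Triton Inference Server exposes its standard HTTP/REST endpoints (the KServe v2 inference protocol), so clients can send JSON inference requests once the app is up. As a minimal sketch, the snippet below builds such a request payload for a hypothetical model input named `input__0`; the model name, input name, and shape are placeholder assumptions, not part of this app's specification.

```python
import json

# Sketch of a KServe v2 inference request payload, as accepted by
# Triton's HTTP endpoint POST /v2/models/<model_name>/infer.
# "input__0" and the shape below are illustrative placeholders.
def build_infer_request(input_name, data):
    return {
        "inputs": [
            {
                "name": input_name,
                "shape": [1, len(data)],   # batch of 1, len(data) features
                "datatype": "FP32",
                "data": data,
            }
        ]
    }

payload = build_infer_request("input__0", [0.1, 0.2, 0.3])
print(json.dumps(payload, indent=2))
```

The same payload works whether the server was deployed with CPU or GPU resources; the transport (UI, CLI, or API) only affects how the app is launched, not how inference requests are made.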
Follow, vote and give your feedback
You can follow this task via the notifications tab on the right.
Ask us anything in the comments below, and vote with emojis for the most requested items!
👍 to vote for this issue
Discuss on Discord
Feel free to discuss with us on Discord: https://discord.com/invite/PwPqWUpN8G