kserve / modelmesh-serving

Controller for ModelMesh
Apache License 2.0

[Triton] Inference Service with multiple models #514

Open haiminh2001 opened 3 months ago

haiminh2001 commented 3 months ago

Is your feature request related to a problem? If so, please describe.

Context:

Describe your proposed solution

First of all, excuse me if this issue is filed against the wrong project; I think it may belong on the adapter project, but I would also like to know whether there is an alternative to ModelMesh that solves my problem. I am new to KServe. My proposed solution is to make the InferenceService accept multiple models. The benefits of this approach are: