iterative / mlem

🐶 A tool to package, serve, and deploy any ML model on any platform. Archived to be resurrected one day🤞
https://mlem.ai
Apache License 2.0
717 stars 44 forks

`serve`: more ways to serve the model #431

Open aguschin opened 2 years ago

aguschin commented 2 years ago

With MLEM, you can serve your model like this:

```shell
$ mlem serve $FRAMEWORK models/mymodel
```

`$FRAMEWORK` values we want to support:

This list will be updated. Please feel free to post a comment, or upvote an existing one if you need something we don't support yet :)
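Conceptually, each serve backend wraps the model's predict function in an HTTP endpoint. The following is a minimal stdlib-only sketch of that idea, not MLEM's actual implementation: the `predict` stand-in, the `/predict` route, and the JSON payload shape are all hypothetical.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer
from threading import Thread


def predict(xs):
    # Stand-in for a real model's predict(); here it just doubles each input.
    return [2 * x for x in xs]


class PredictHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/predict":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length))
        body = json.dumps({"prediction": predict(payload["data"])}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, fmt, *args):
        # Silence per-request logging for the sketch.
        pass


def serve(port=0):
    # port=0 lets the OS pick a free port; the chosen port is in server.server_address.
    server = HTTPServer(("127.0.0.1", port), PredictHandler)
    Thread(target=server.serve_forever, daemon=True).start()
    return server
```

A real backend (FastAPI, RabbitMQ, Triton, ...) would replace this loop with the framework's own routing, batching, and serialization, which is exactly what each `$FRAMEWORK` value selects.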

wiswisbus commented 2 years ago

It would be great if MLEM supported the Triton Inference Server.
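For context, Triton serves models from a model repository where each model carries a `config.pbtxt` describing its inputs and outputs. A minimal hypothetical example (model name, backend, and tensor shapes are illustrative only) of what MLEM would need to generate for such a backend:

```
name: "mymodel"
platform: "onnxruntime_onnx"
max_batch_size: 8
input [
  { name: "input", data_type: TYPE_FP32, dims: [ 4 ] }
]
output [
  { name: "output", data_type: TYPE_FP32, dims: [ 3 ] }
]
```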