iterative / mlem

🐶 A tool to package, serve, and deploy any ML model on any platform. Archived to be resurrected one day🤞
https://mlem.ai
Apache License 2.0

`build`: more formats to export your model #208

Open · aguschin opened this issue 2 years ago

aguschin commented 2 years ago

MLEM could be a powerful tool if you need to distribute your model through different channels and use it in different circumstances (or easily switch between them). This could be part of our Value Props. So I think MLEM should be able to convert/build/export a MLEM model to any widely used format and back, including:

This list will be updated. Please feel free to post a comment, or upvote existing suggestions, if you need something we don't support yet :)
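As a rough illustration of the starting point, here is a minimal sketch (assuming scikit-learn and the `mlem.api.save` signature shown in the MLEM docs) of saving a model with MLEM; the captured sample data and metadata are what any build/export target would work from:

```python
# A minimal sketch, assuming mlem.api.save as documented; paths and the model are illustrative.
from mlem.api import save
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

X, y = load_iris(return_X_y=True)
clf = RandomForestClassifier(n_estimators=10).fit(X, y)

# sample_data lets MLEM capture the model's input/output schema,
# which any export target (ONNX, TensorRT, BentoML, ...) would need.
save(clf, "models/rf", sample_data=X)
```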

daavoo commented 2 years ago

Would love https://onnx.ai/
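For context, a minimal sketch of what an ONNX export of a scikit-learn model looks like today with `skl2onnx` (not MLEM's API; the model and input shape here are illustrative):

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from skl2onnx import convert_sklearn
from skl2onnx.common.data_types import FloatTensorType

# Train a small model to convert.
X, y = load_iris(return_X_y=True)
clf = RandomForestClassifier(n_estimators=10).fit(X, y)

# Declare the input signature: a float tensor with 4 features and a dynamic batch dimension.
onnx_model = convert_sklearn(clf, initial_types=[("input", FloatTensorType([None, 4]))])

with open("model.onnx", "wb") as f:
    f.write(onnx_model.SerializeToString())
```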

daavoo commented 2 years ago

And https://developer.nvidia.com/tensorrt :)
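TensorRT support would most likely go through ONNX first. A rough sketch, assuming TensorRT 8.x's Python API and an existing `model.onnx` (not something MLEM does today):

```python
import tensorrt as trt

logger = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)
# The ONNX parser requires an explicit-batch network definition.
network = builder.create_network(1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, logger)

with open("model.onnx", "rb") as f:
    if not parser.parse(f.read()):
        raise RuntimeError("failed to parse ONNX model")

config = builder.create_builder_config()
config.set_memory_pool_limit(trt.MemoryPoolType.WORKSPACE, 1 << 30)  # 1 GiB workspace

# Build and serialize the optimized engine to disk.
serialized_engine = builder.build_serialized_network(network, config)
with open("model.engine", "wb") as f:
    f.write(serialized_engine)
```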

judahrand commented 1 year ago

A BentoML-compatible format? Alternatively, it might be good to contribute a MLEM runner to BentoML?
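For reference, this is roughly what the BentoML side looks like with BentoML 1.x's own API (the model name `iris_clf` and the service are made up here); a MLEM integration would presumably produce something equivalent from a saved MLEM model:

```python
import bentoml
import numpy as np
from bentoml.io import NumpyNdarray
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

# Register a trained model in BentoML's local model store.
X, y = load_iris(return_X_y=True)
clf = RandomForestClassifier(n_estimators=10).fit(X, y)
bentoml.sklearn.save_model("iris_clf", clf)

# Wrap it in a runner and expose it as a BentoML service.
runner = bentoml.sklearn.get("iris_clf:latest").to_runner()
svc = bentoml.Service("iris_classifier", runners=[runner])

@svc.api(input=NumpyNdarray(), output=NumpyNdarray())
def classify(features: np.ndarray) -> np.ndarray:
    return runner.predict.run(features)
```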

aguschin commented 1 year ago

The last comment is related to https://github.com/iterative/mlem/issues/265