bentoml / BentoML

The easiest way to serve AI apps and models - Build Model Inference APIs, Job queues, LLM apps, Multi-model pipelines, and more!
https://bentoml.com
Apache License 2.0

Encryption Support #1824

Closed yaysummeriscoming closed 1 year ago

yaysummeriscoming commented 3 years ago

We're having a look at using BentoML. So far it looks to be a really good fit, but the problem we have is that we're deploying onto customer premises, so we need some protection against reverse engineering.

Doesn't need to be anything fancy, but the minimum requirements would be:

- Model encryption at rest: is it possible to implement custom model load logic, similar to pre/post processing?
- Encrypted requests: seems quite easy to do in the pre-processing code
- Cython compilation of the BentoService definition: not sure if something similar is already done?

We'd do authentication and pre/post processing outside BentoML, so no worries there.

Interested to get your thoughts!

parano commented 2 years ago

Thank you for the feedback @yaysummeriscoming! BentoML 1.0 is around the corner and a lot of related things have changed, so I want to give an update:

> Model encryption at rest: Is it possible to implement custom model load logic, similar to pre/post processing?

Yes, it is possible to implement custom model load logic. In BentoML 0.13, you can create a custom Artifact class where you implement encryption: https://docs.bentoml.org/en/v0.13.1/guides/custom_artifact.html
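To illustrate the encrypt-at-rest idea behind such a custom Artifact class, here is a minimal, self-contained sketch. The helper names (`save_encrypted`, `load_decrypted`) are hypothetical, not BentoML APIs, and the XOR keystream is a toy stand-in; a real deployment should use a vetted cipher such as AES-GCM from the `cryptography` library.

```python
import hashlib
from pathlib import Path


def _keystream(key: bytes, length: int) -> bytes:
    # Derive a pseudo-random byte stream from the key (toy construction,
    # NOT cryptographically vetted; for illustration only).
    out = bytearray()
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:length])


def encrypt(data: bytes, key: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, _keystream(key, len(data))))


decrypt = encrypt  # XOR with the same keystream is its own inverse


def save_encrypted(model_bytes: bytes, path: Path, key: bytes) -> None:
    # "save" step of a custom artifact: only ciphertext touches disk.
    path.write_bytes(encrypt(model_bytes, key))


def load_decrypted(path: Path, key: bytes) -> bytes:
    # "load" step: the plaintext model exists only in process memory.
    return decrypt(path.read_bytes(), key)
```

The key would typically be injected at runtime (environment variable, vault, or license server) rather than shipped alongside the model files.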

In BentoML 1.0, this can also be done by creating a custom model; related documentation is coming soon.

As part of the 1.0 release, we also rebuilt the model management features in Yatai, which supports using AWS S3 or MinIO as the model storage backend. Both support encryption themselves.

> Encrypted requests: Seems quite easy to do in the pre-processing code

That's true, it is possible to do this in the pre-processing code.
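As a sketch of that pre-processing approach, the function below decrypts a base64-encoded ciphertext payload before it reaches the model. The name `preprocess` and the callback-style `decrypt` parameter are illustrative assumptions, not a BentoML API; the actual hook depends on how your service is defined.

```python
import base64
import json
from typing import Callable


def preprocess(request_body: str, decrypt: Callable[[bytes], bytes]) -> dict:
    # The client sends base64-encoded ciphertext; decode and decrypt it
    # in-process so plaintext never appears on the wire or on disk.
    ciphertext = base64.b64decode(request_body)
    plaintext = decrypt(ciphertext)
    return json.loads(plaintext)
```

Any symmetric cipher shared with the client would slot in as the `decrypt` callable; the response path would apply the matching `encrypt` in post-processing.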

> Cython compilation of the BentoService definition: Not sure if something similar is already done?

Could you share more about your requirements and use case for this one? Is your concern that BentoML ships a source distribution of your Python code in the Docker container?
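For reference, compiling a service module with Cython so the shipped artifact is a binary extension rather than readable `.py` source is typically done with a small build script like the sketch below. The module name `service.py` is a placeholder for your own file; this is a generic Cython build configuration, not something BentoML provides.

```python
# setup.py -- build sketch: compile service.py into a C extension.
from setuptools import setup
from Cython.Build import cythonize

setup(
    name="my_service_compiled",
    ext_modules=cythonize(
        "service.py",
        compiler_directives={"language_level": "3"},
    ),
)
```

Running `python setup.py build_ext --inplace` produces a platform-specific `.so`/`.pyd` that can be shipped in place of the source file. Note that compiled extensions raise the bar for casual inspection but are not a substitute for real encryption, since bytecode and symbols can still be analyzed.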

dgriff67 commented 1 year ago

@parano When might the above documentation be published? Many Thanks

ssheng commented 1 year ago

Summarizing the discussion in the community Slack here. The documentation was unavailable because it was outdated. Model encryption is not currently supported out of the box, so we would have to decrypt the model files in the BentoML process. That would require updating the save_model and load_model APIs to encrypt and decrypt the model files in memory. I don't think it would be too hard to add a callback function to these APIs.
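A rough sketch of what such callback hooks could look like is below. To be clear, these hooks do not exist in BentoML today; the function names and callback signatures are invented purely to illustrate the shape of the proposal.

```python
from pathlib import Path
from typing import Callable


def save_model_encrypted(
    model_bytes: bytes,
    path: Path,
    encrypt_cb: Callable[[bytes], bytes],
) -> None:
    # The user-supplied callback transforms the serialized model
    # before it is written, so only ciphertext reaches disk.
    path.write_bytes(encrypt_cb(model_bytes))


def load_model_decrypted(
    path: Path,
    decrypt_cb: Callable[[bytes], bytes],
) -> bytes:
    # The inverse callback runs in-process at load time, so the
    # plaintext model only ever exists in memory.
    return decrypt_cb(path.read_bytes())
```

The appeal of the callback design is that BentoML would stay cipher-agnostic: users bring their own key management and encryption scheme, and the framework only guarantees the hooks are applied at the save/load boundary.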