bentoml / BentoML

The easiest way to serve AI apps and models - Build Model Inference APIs, Job queues, LLM apps, Multi-model pipelines, and more!
https://bentoml.com
Apache License 2.0
7.18k stars · 792 forks

feat: Bento build from a container #2495

Open parano opened 2 years ago

parano commented 2 years ago

The Bento build process requires loading the bentoml.Service object to validate the definition and retrieve the models that need to be packaged. Currently, running bentoml build requires all Service dependencies to be installed in order to import the Service object. However, this poses challenges in build environments other than the data scientist's development machine, e.g. a CI pipeline, where users must install Python and all required dependencies just to build a Bento.

Since the bentofile.yaml already defines all required dependencies, we propose adding a --docker option to bentoml build. When the --docker flag is supplied, BentoML will first build a docker image from the bentofile.yaml specification, start a container from that image with the build_ctx directory and the BENTOML_HOME directory mounted, and then run bentoml build inside the container to generate the new Bento.
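Mechanically, the proposal could amount to something like the sketch below, which builds the commands the --docker flag might run. This is only an illustration: the image name, mount targets, and the in-container BENTOML_HOME path are assumptions, not BentoML's actual implementation.

```python
def docker_build_cmd(image_tag, build_ctx):
    # Hypothetical: build an image that provides the dependencies
    # declared in bentofile.yaml (image tag is illustrative).
    return ["docker", "build", "-t", image_tag, build_ctx]

def docker_run_build_cmd(image_tag, build_ctx, bentoml_home):
    # Mount the build context and BENTOML_HOME so the Bento generated
    # inside the container lands in the host's local Bento store.
    return [
        "docker", "run", "--rm",
        "-v", f"{build_ctx}:/workspace",
        "-v", f"{bentoml_home}:/root/bentoml",
        "-w", "/workspace",
        image_tag,
        "bentoml", "build",
    ]
```

Either command list could then be handed to subprocess.run; the key point is that the host only needs Docker and the bentoml CLI, not the project's Python dependencies.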

This simplifies building CI pipelines: users can simply run the following, without specifying the project's dependencies again for the CI environment:

pip install bentoml
git clone my_project_repo_url
cd my_project
bentoml build --docker
bentoml push my_bento:latest
bentoml containerize my_bento:latest

Challenges:

virgile-blg commented 1 year ago

Hello! Very much looking forward to this feature. Right now it is very complicated / hacky to build and containerize Bentos in a CI pipeline, for example.

I somehow managed to write a CI that builds and pushes a Bento to a container registry. Here's one solution:

avinash-penti commented 10 months ago

> Hello! Very much looking forward to this feature. Right now it is very complicated / hacky to build and containerize Bentos in a CI pipeline, for example.
>
> I somehow managed to write a CI that builds and pushes a Bento to a container registry. Here's one solution:
>
>   • Start by building a custom Docker image that has the necessary "base" packages: Docker, Python, BentoML.
>   • As mentioned in the feature request, bentoml build requires having all the Service dependencies installed. For that, one can provide a script that inspects the bentofile.yml and installs all the dependencies in the running container before running bentoml build. This script was added to the custom Docker image so the CI can run it.
>   • For the containerization, one last issue is that the Bento tag comes from the Service's name in the Python code, so we need to parse the output of bentoml list for the right tag in order to containerize. It would be more convenient if the Bento tag (service name) could be set directly in the bentofile.yml.
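The second and third bullets above could be sketched as two small helpers: one that pulls the pip package list out of bentofile.yaml, and one that picks the right tag out of JSON-formatted bentoml list output. Both are assumptions for illustration: the parser only handles the common `python: packages:` layout (a real script would use PyYAML), and the JSON shape and ordering of the CLI output are not guaranteed here.

```python
import json
import re

def extract_packages(bentofile_text):
    """Pull the pip package list out of a bentofile.yaml.

    Minimal hand-rolled parser for the common layout:
        python:
          packages:
            - pandas
            - scikit-learn
    A production script should use a real YAML parser instead.
    """
    packages = []
    in_packages = False
    for line in bentofile_text.splitlines():
        if re.match(r"\s*packages:\s*$", line):
            in_packages = True
            continue
        if in_packages:
            m = re.match(r"\s*-\s*(\S+)", line)
            if m:
                packages.append(m.group(1))
            else:
                in_packages = False  # end of the packages list
    return packages

def latest_tag(bentoml_list_json, service_name):
    """Pick the newest tag for a given Bento name from JSON listing output.

    Assumes each entry has a "tag" field like "my_bento:abc123" and that
    entries are ordered newest first (an assumption about the CLI output).
    """
    for entry in json.loads(bentoml_list_json):
        name, _, _version = entry["tag"].partition(":")
        if name == service_name:
            return entry["tag"]
    return None
```

The extracted packages could then be fed to pip before the build step, e.g. `subprocess.run([sys.executable, "-m", "pip", "install", *packages], check=True)`, and the resolved tag passed to bentoml containerize.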

@virgile-blg If you could paste the code snippet for the CI that pushes to a registry, that'd be helpful. I'm trying to achieve something similar and am kind of stuck.