bentoml / BentoML

The easiest way to serve AI apps and models - Build reliable Inference APIs, LLM apps, Multi-model chains, RAG service, and much more!
https://bentoml.com
Apache License 2.0

Deploying Yatai with a remote db and a local path reports "can't locate config file 'bentoml.yml'" #1796

Closed wuxutao closed 2 years ago

wuxutao commented 3 years ago

Describe the bug
I deployed the Yatai service with a remote PostgreSQL database and a local repo base.
When I run the command bentoml containerize IrisClassifier:latest -t iris-classifier --yatai-url=127.0.0.1:50051 it reports this error: Error: bentoml-cli containerize failed: BentoML can't locate config file 'bentoml.yml' in saved bundle in path: /bentoml/repository/IrisClassifier/20210803122208_2E7460

But the file "bentoml.yml" does exist in the Docker volume.

To Reproduce

  1. Start the Yatai service: "docker run -p 3000:3000 -p 50051:50051 -v ~/bentoml:/bentoml bentoml/yatai-service:latest --db-url postgresql://postgres:postgres@192.168.0.47:15433/wt --repo-base-url /bentoml/repository"
  2. Save the Bento bundle to the Yatai server (see the service sketch at the end of this comment)

    bento_svc.save(yatai_url="127.0.0.1:50051")

  3. bentoml containerize IrisClassifier:latest --yatai-url=127.0.0.1:50051

It will report the error shown in the screenshot below.

[screenshot of the bentoml containerize error output]

Also, what is "api.amplitude.com"?
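
For completeness, the service saved in step 2 follows the standard BentoML 0.13 sklearn quickstart; here is a minimal sketch for reference (only the save(yatai_url=...) call above is taken from my setup, the rest is the quickstart example):

    # Minimal sketch of the IrisClassifier service, assuming the
    # BentoML 0.13 sklearn quickstart layout.
    from sklearn import svm, datasets

    from bentoml import BentoService, api, artifacts, env
    from bentoml.adapters import DataframeInput
    from bentoml.frameworks.sklearn import SklearnModelArtifact

    @env(infer_pip_packages=True)
    @artifacts([SklearnModelArtifact('model')])
    class IrisClassifier(BentoService):
        @api(input=DataframeInput(), batch=True)
        def predict(self, df):
            return self.artifacts.model.predict(df)

    # Train a toy model and pack it into the service
    iris = datasets.load_iris()
    clf = svm.SVC(gamma='scale')
    clf.fit(iris.data, iris.target)

    bento_svc = IrisClassifier()
    bento_svc.pack('model', clf)

    # Push the saved bundle to the remote Yatai server (step 2 above)
    bento_svc.save(yatai_url="127.0.0.1:50051")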

mqk commented 3 years ago

I'm seeing the same error. In my setup I deployed the Yatai service on Kubernetes (minikube).

$ bentoml containerize --yatai-url http://127.0.0.1:62838 IrisClassifier:latest
Found Bento: /root/bentoml/repository/IrisClassifier/20210823162836_FDA855
Error: bentoml-cli containerize failed: BentoML can't locate config file 'bentoml.yml' in saved bundle in path: /root/bentoml/repository/IrisClassifier/20210823162836_FDA855
parano commented 2 years ago

Hi @wuxutao @mqk - sorry about the delay in getting back to you. The cause of this issue is that in BentoML 0.13, Yatai requires a storage backend for storing models and Bentos, and it supports both the local file system and blob storage (AWS S3, MinIO, Azure Blob Storage). However, when Yatai is deployed in production mode and connected to remotely, only blob storage is supported; local storage is only meant for a Yatai running on the same machine. This is definitely something we should document better, and we should also show a clearer error message.
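
For example, with the docker command from the repro above, --repo-base-url would point at a bucket instead of a local path. A rough sketch, not verified against your environment (the s3:// URI and bucket name here are assumptions, and the container needs AWS credentials, e.g. via the standard environment variables):

    docker run -p 3000:3000 -p 50051:50051 \
      -e AWS_ACCESS_KEY_ID=<key> -e AWS_SECRET_ACCESS_KEY=<secret> \
      bentoml/yatai-service:latest \
      --db-url postgresql://postgres:postgres@192.168.0.47:15433/wt \
      --repo-base-url s3://<your-bucket>/bentos

With a blob storage backend, a client connecting over --yatai-url can fetch the saved bundle from the bucket, whereas with local storage the path Yatai returns (e.g. /bentoml/repository/...) only exists inside the Yatai container's volume, which is presumably why the 'bentoml.yml' lookup fails on the client side.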

Note that BentoML 1.0 and the new Yatai project are around the corner. In the new release, local model/bento management no longer relies on Yatai, and Yatai will only support blob storage as the model/bento storage backend. Setting up Yatai on Kubernetes will also be a lot easier via our official Helm chart, which will automatically create a MinIO instance in the cluster for storage by default.