bentoml / BentoML

The easiest way to serve AI apps and models - Build Model Inference APIs, Job queues, LLM apps, Multi-model pipelines, and more!
https://bentoml.com
Apache License 2.0

Deployment on remote Yatai server fails due to injection issue #1622

Closed — nicjac closed this issue 3 years ago

nicjac commented 3 years ago

Describe the bug

Attempting to deploy to SageMaker or Lambda fails with this error:

Error: sagemaker deploy failed: INTERNAL:<dependency_injector.wiring.Provide object at 0x11f748be0> has type Provide, but expected one of: bytes, unicode

To Reproduce

This is based on the latest version of the code as of this writing.

Expected behavior

Deployment should proceed normally, without the error message above.

Environment:

Additional context

After some initial debugging, the error appears to originate from this line: https://github.com/bentoml/BentoML/blob/4019bac4af320bad73bf960f6bd2d617f3fd4a52/bentoml/yatai/yatai_service_impl.py#L106

self.default_namespace is not wired/injected properly and instead ends up as a Provide object. This causes issues downstream, where a string is expected. A workaround is to specify the namespace explicitly when deploying via the CLI (--namespace).
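To make the failure mode concrete, here is a minimal stdlib-only sketch (not the actual dependency_injector internals — the `Provide` class, `build_deployment_pb`, and `YataiServiceImpl` signature below are simplified stand-ins) of how an unwired injection marker leaks through as a default value and then fails a string type check downstream:

```python
class Provide:
    """Stand-in for dependency_injector's injection marker (hypothetical sketch)."""
    def __init__(self, key):
        self.key = key


class YataiServiceImpl:
    def __init__(self, default_namespace=Provide("default_namespace")):
        # Without wiring, the marker object is stored verbatim instead of a string.
        self.default_namespace = default_namespace


def build_deployment_pb(namespace):
    # Mimics the protobuf setter, which only accepts str/bytes.
    if not isinstance(namespace, (str, bytes)):
        raise TypeError(
            f"{namespace!r} has type {type(namespace).__name__}, "
            "but expected one of: bytes, unicode"
        )
    return {"namespace": namespace}


svc = YataiServiceImpl()  # wiring never happened
try:
    build_deployment_pb(svc.default_namespace)
except TypeError as e:
    print(e)  # reproduces the shape of the reported error

# The CLI workaround: passing --namespace supplies a real string.
build_deployment_pb("dev")
```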

My hunch is that YataiServiceImpl does not get properly wired/injected because it is wrapped in the get_yatai_service_impl method here: https://github.com/bentoml/BentoML/blob/4019bac4af320bad73bf960f6bd2d617f3fd4a52/bentoml/yatai/yatai_service_impl.py#L74

I have little experience with dependency injection so couldn't figure out why it wouldn't get wired properly.
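The hunch can be illustrated roughly. The toy `wire()` below is an assumption, a deliberately simplified version of how a wiring pass might walk a module's top-level callables; the point is only that a function created inside a factory does not exist at wiring time, so its `Provide` defaults are never replaced:

```python
import types


class Provide:
    """Stand-in injection marker (hypothetical sketch)."""
    def __init__(self, key):
        self.key = key


CONFIG = {"default_namespace": "dev"}


def wire(namespace_dict):
    """Toy wiring: replace Provide defaults on callables visible at 'module' scope."""
    for obj in namespace_dict.values():
        if isinstance(obj, types.FunctionType) and obj.__defaults__:
            obj.__defaults__ = tuple(
                CONFIG[d.key] if isinstance(d, Provide) else d
                for d in obj.__defaults__
            )


# Visible at wiring time -> gets a real string injected.
def top_level(ns=Provide("default_namespace")):
    return ns


def get_service_impl():
    # Defined only when the factory runs -> invisible to wire().
    def inner(ns=Provide("default_namespace")):
        return ns
    return inner


wire({"top_level": top_level, "get_service_impl": get_service_impl})

top_level()            # returns the injected string "dev"
get_service_impl()()   # still returns a bare Provide marker
```

Under this reading, the fix is to make sure the module containing the factory (and thus the markers it creates) is included in the wiring configuration, which matches the workaround of passing the value explicitly.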

yubozhao commented 3 years ago

Thank you for digging into this issue. I will take a look with the latest master.

yubozhao commented 3 years ago

@nicjac I included the proper module; it works with the latest master now. Feel free to reopen if it still happens.