Closed: alexdivet closed this issue 2 years ago.
Hi @alexdivet, what is your aiohttp version?
My bad, it's aiohttp==4.0.0a1, not the stable aiohttp==3.8.1.
I've just realised I was using pip install bentoml --pre to install the preview version, but that also installs pre-release/development versions of all dependent libraries. It's safer to go with pip install bentoml==1.0.0a2.
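For context, pip excludes pre-release versions when resolving dependencies unless --pre is passed, and --pre applies to every package in the resolve, not just bentoml. A small sketch using the third-party packaging library (the same version logic pip uses internally) shows why aiohttp==4.0.0a1 got pulled in:

```python
from packaging.specifiers import SpecifierSet
from packaging.version import Version

# A typical open-ended dependency pin, e.g. "aiohttp>=3.0".
spec = SpecifierSet(">=3.0")
stable = Version("3.8.1")
prerelease = Version("4.0.0a1")

# By default, pre-releases are excluded even if they satisfy the range:
print(stable in spec)      # True
print(prerelease in spec)  # False

# `pip install --pre` flips this switch for every dependency:
print(spec.contains(prerelease, prereleases=True))  # True
```

So with --pre, any dependency pinned only with a lower bound is free to resolve to the newest alpha, which is how aiohttp jumped to 4.0.0a1.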
@alexdivet that's indeed an important issue for preview-release users; we will fix it in the next preview release by setting a max version for those dependencies.
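A sketch of what such a cap could look like in a library's dependency list (hypothetical excerpt, not BentoML's actual setup.py): adding an upper bound keeps a --pre install from resolving a dependency to the next major version's pre-releases.

```python
# Hypothetical setup.py excerpt: the "<4.0" upper bound means even
# `pip install --pre` cannot select aiohttp 4.0.0a1, because a
# pre-release of 4.0 does not satisfy "<4.0" under PEP 440 rules.
install_requires = [
    "aiohttp>=3.8,<4.0",
]
```

Per PEP 440, an exclusive bound like <4.0 does not admit pre-releases of 4.0, so the resolver stays on the stable 3.x line.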
Describe the bug
I'm testing bentoml 1.0.0a2 following your quickstart, but I'm hitting an issue with production mode (it works fine without --production).

To Reproduce
Follow the same steps as in the quickstart and run:
bentoml serve iris_classifier:latest --production
That's the definition of the service from the quickstart:
Test a request
Screenshots/Logs
Environment:
OS: macOS 11.6
Python Version: 3.8.12
BentoML Version: 1.0.0a2