Closed notniknot closed 3 years ago
Hi @notniknot. Thanks for reporting! Could you please give it a try on my branch, https://github.com/bentoml/BentoML/pull/1705?
pip install git+https://github.com/bojiang/BentoML.git@di2
Hi @bojiang. Sorry, but I still get a presigned amazonaws URL. The _get_yataiservice function is missing the proposed changes.
@notniknot Oh, I see. Thanks.
Describe the bug When using a self-hosted MinIO as S3 storage, the generated presigned URL contains the amazonaws domain, e.g. 'https://yatai-bucket.s3.amazonaws.com/IrisClassifier/26.tar.gz?...'. As a result, storing and loading models from S3 storage fails. This bug is new in version 0.13.0.
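To make the failure mode concrete, here is a minimal self-contained sketch (not BentoML code; the function and endpoint names are hypothetical) of how an S3 client falls back to the AWS domain when no custom endpoint is forwarded:

```python
def presigned_host(bucket, endpoint_url=None):
    # S3 clients that receive no explicit endpoint default to the
    # public AWS domain, which is wrong for a self-hosted MinIO.
    if endpoint_url is None:
        return f"https://{bucket}.s3.amazonaws.com"
    return f"{endpoint_url.rstrip('/')}/{bucket}"

# Endpoint dropped: URL points at AWS even though the objects live in MinIO.
print(presigned_host("yatai-bucket"))
# Endpoint forwarded: URL points at the MinIO server as expected.
print(presigned_host("yatai-bucket", "http://minio.local:9000"))
```

This is exactly the difference the reported presigned URL exhibits: the endpoint is dropped on the way to the repository, so the AWS default wins.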
The function _get_yataiservice lacks the s3_endpoint_url parameter. Additionally, if no channel_address is given to _get_yataiservice, the call to _createrepository that builds the repository argument of LocalYataiService lacks the s3_endpoint_url.
This can be fixed by first adding the following parameter to the _get_yataiservice function (yatai/yatai_services.py):
s3_endpoint_url: str = Provide[BentoMLContainer.config.yatai.repository.s3.endpoint_url],
And second, passing s3_endpoint_url as a parameter to the _createrepository call inside the _get_yataiservice function (yatai/yatai_services.py).
To Reproduce
Expected behavior The model is saved to the MinIO storage.
Environment: