Hi team, I hit an exception when importing an exported model into a target MLflow deployment that uses S3 to store artifacts. The model was uploaded to S3 successfully, but the import failed at the path check in the "_import_version" function:
def _import_version(self, model_name, src_vr, dst_run_id, dst_source, sleep_time):
    """
    :param model_name: Model name.
    :param src_vr: Source model version.
    :param dst_run_id: Destination run ID.
    :param dst_source: Destination version 'source' field.
    :param sleep_time: Seconds to wait for model version creation.
    """
    dst_source = dst_source.replace("file://", "")  # OSS MLflow
    if not dst_source.startswith("dbfs:") and not os.path.exists(dst_source):
        raise MlflowExportImportException(f"'source' argument for MLflowClient.create_model_version does not exist: {dst_source}")
It raises: 'source' argument for MLflowClient.create_model_version does not exist: s3:...... I commented out the check and the model version was imported successfully. Shall we add S3 paths to the supported dst_source schemes? Thanks all