adiun opened this issue 4 years ago (status: Open)
@adiun: I ran into the same error. As an interim fix, use the YAML format for the inference config; it worked for me. Template for inference_config.yml: https://pastebin.com/CzX5pViy
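For reference, here is a minimal sketch of what such an inference_config.yml could contain, assuming the YAML file mirrors the documented JSON inference-config fields; the script and conda file names below are placeholders, and the actual pastebin template may differ:

```yaml
# Hypothetical inference_config.yml: same fields as the JSON inference config,
# just expressed in YAML. Replace the paths with your own scoring script and conda spec.
entryScript: score.py
runtime: python
condaFile: environment.yaml
```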
We have located the root cause and are working on releasing a fix.
A related note: the bug only applies to deployments that do not use an Azure Machine Learning Environment. We encourage Environment-based deployment, which is faster and more reliable.
@adiun, in your case, using extraDockerfileSteps in the InferenceConfig file sends the deployment down the non-Environment route. Consider switching to Environment-based deployment, documented at: https://docs.microsoft.com/en-us/azure/machine-learning/how-to-use-environments#enable-docker
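For anyone hitting this, here is a rough sketch of the Environment-based route with the azureml-core (v1) Python SDK, reusing the model, compute, and workspace names from the deployment command reported in this issue; the entry script name, environment name, and the specific SDK calls are assumptions on my part, not something confirmed in this thread:

```python
# Rough sketch of an Environment-based deployment with the azureml-core (v1) SDK.
# "score.py" and "quote-env" are placeholders; the model, compute, and workspace
# names are taken from the deployment command in the original report.
from azureml.core import Workspace, Environment, Model
from azureml.core.compute import AksCompute
from azureml.core.model import InferenceConfig
from azureml.core.webservice import AksWebservice

ws = Workspace.get(name="aml-stab", resource_group="aml-stab")

# Define the software stack as an Environment instead of extraDockerfileSteps.
env = Environment.from_conda_specification(name="quote-env", file_path="src/environment.yaml")
env.docker.enabled = True  # per the linked "Enable Docker" doc section; newer SDK releases use DockerConfiguration instead

inference_config = InferenceConfig(
    entry_script="score.py",   # placeholder scoring script
    source_directory="src",
    environment=env,
)

deployment_config = AksWebservice.deploy_configuration(auth_enabled=True)
aks_target = AksCompute(ws, "aks-aml-stab")

service = Model.deploy(
    workspace=ws,
    name="quote-clf",
    models=[Model(ws, name="quote_clf", version=3), Model(ws, name="quote_tfidf", version=3)],
    inference_config=inference_config,
    deployment_config=deployment_config,
    deployment_target=aks_target,
    overwrite=True,
)
service.wait_for_deployment(show_output=True)
```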
@rsethur Thanks! @gogowings Thanks for the updates and the tip about extraDockerfileSteps. I received this guidance separately as well regarding deployment performance, so I will definitely look into switching to that.
Extension name (the extension in question)
azure-cli-ml

Description of issue (in as much detail as possible)
When running the following model deployment command with azure-cli-ml 1.0.81, it now throws an error. It did not throw an error with azure-cli-ml 1.0.79.

az ml model deploy -n quote-clf --inference-config-file src/inferconfig.json --auth-enabled true --deploy-config-file src/deployconfig_stab.json --compute-target aks-aml-stab -m quote_clf:3 -m quote_tfidf:3 --workspace-name aml-stab --resource-group aml-stab --overwrite -v
Error:

src/inferconfig.json:

src/deployconfig_stab.json:

src/environment.yaml: