Closed abhijelly closed 1 year ago
In approach 1, you don't have a config.json file that specifies which subscription, resource group, and workspace you are using. The config.json should look something like this:
{ "subscription_id": "", "resource_group": "", "workspace_name": "" }
You can add the config.json file to the source_directory which is configured in ParallelRunConfig.
Or you can hard-code it like this: ws = Workspace(subscription_id="", resource_group="", workspace_name="") and then model_path = Model.get_model_path("forecast_model/model.cb", version=1, _workspace=ws)
Thank you for answering!
The hardcoding approach worked for me because, during batch processing, the parallel worker was not recognizing the workspace config file. One correction to your answer: Model.get_model_path()
should be given only the model folder name, not the complete path to the model file. After getting the model folder path, append "model.cb" to it so the model can be loaded.
I have modified the demand forecasting template for my use case. I'm unable to load my model in the score script. The model I'm using is a CatBoost model, which has its own load_model() method. I've tried the following approaches, which are failing:

Approach 1: Loading the model registered in the workspace.

Approach 2: Passing the model_path as an argument in the ParallelRunStep; AZUREML_MODEL_DIR is None for some reason.

Approach 3: Keeping model.cb in the source directory and simply loading the model using CatBoost's load_model() method.

Any thoughts on what I might be doing wrong in these approaches? It would be highly appreciated! Thank you!
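For the approaches that rely on AZUREML_MODEL_DIR, it can help to read the variable defensively; a minimal stdlib sketch, where the model.cb file name comes from this thread but the fallback-to-working-directory logic is an assumption about how the source directory is laid out:

```python
import os

# Azure ML normally sets AZUREML_MODEL_DIR for runs with attached models,
# but this thread observed it coming back None/unset in the parallel worker,
# so fall back to the directory the score script runs from.
model_dir = os.environ.get("AZUREML_MODEL_DIR") or os.getcwd()

# model.cb placed in the ParallelRunConfig source_directory (approach 3).
model_file = os.path.join(model_dir, "model.cb")
print(model_file)

# model = CatBoostRegressor(); model.load_model(model_file)  # as in the thread
```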