What I'm trying to do
For an AutoML Forecasting experiment, I'd like to compare the performance of the best model with the performance of another model from the same experiment.
For an AutoML run, I understand how to get the best performing model and its metrics like this:
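For reference, this is roughly what that looks like on my side (a sketch using the v2 SDK together with the MLflow tracking client; the job name is just the example parent run name used further down):

```python
from azure.ai.ml import MLClient
from azure.identity import DefaultAzureCredential
import mlflow
from mlflow.tracking.client import MlflowClient

# Connect to the workspace and point MLflow at its tracking store
ml_client = MLClient.from_config(credential=DefaultAzureCredential())
mlflow.set_tracking_uri(
    ml_client.workspaces.get(ml_client.workspace_name).mlflow_tracking_uri
)
mlflow_client = MlflowClient()

# "upbeat_square_abs3942" stands in for my AutoML parent job name
parent_run = mlflow_client.get_run("upbeat_square_abs3942")

# AutoML stores the id of the best child run as a tag on the parent run
best_child_run_id = parent_run.data.tags["automl_best_child_run_id"]
best_run = mlflow_client.get_run(best_child_run_id)

print(best_run.data.metrics)
```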
But how can I fetch the job for any model based on the algorithm name? Something like:
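Conceptually I'm after something like this (pure pseudocode - get_job_for_algorithm is a made-up helper that only illustrates the intent):

```python
# Pseudocode: this helper does not exist, it just shows what I'd like to do
job = get_job_for_algorithm(
    parent_job_name="upbeat_square_abs3942",
    algorithm_name="ExponentialSmoothing",  # hypothetical algorithm name
)
metrics = mlflow_client.get_run(job.name).data.metrics
```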
So far, I managed to figure out the following:
list of algorithms used in the AutoML experiment
However, this list seems to be in an arbitrary order and I struggle to get the corresponding job names for the algorithms.
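For what it's worth, I can at least enumerate the child jobs like this (a sketch; I'm not sure which property or tag, if any, reliably carries the algorithm name):

```python
# List the child jobs of the AutoML parent job
for child in ml_client.jobs.list(parent_job_name="upbeat_square_abs3942"):
    # child.name is the "internal" name (e.g. upbeat_square_abs3942_2);
    # I'd expect the algorithm to show up somewhere in properties/tags,
    # but I can't find an obvious key for it.
    print(child.name, child.display_name, child.properties, child.tags)
```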
"internal" job names for the child runs
The child runs seem to have different names than the names shown in Azure ML Studio. They are named, for instance, upbeat_square_abs3942_2 - i.e. the name of the parent run upbeat_square_abs3942 followed by an underscore plus a number (_2 in this example).

But Azure ML Studio displays names like the following (no upbeat_square_abs3942_2 to be found):

[screenshot of the child run names as shown in Azure ML Studio]

So this code works:
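(sketched from memory, using the MLflow client set up above and the internal child run name as the run id)

```python
# Fetching a child run by its "internal" name (parent name + suffix) works
child_run = mlflow_client.get_run("upbeat_square_abs3942_2")
print(child_run.data.metrics)
```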
but using a name shown in the screenshot above throws an exception, e.g.
Question
How can I obtain the model and metrics for any algorithm used in the experiment?
Thanks!