aws / amazon-sagemaker-feedback

Amazon SageMaker Public Feedback Dashboard

Unable to see training job information in model registry #90

Open venkysriram23 opened 4 months ago

venkysriram23 commented 4 months ago

Product Version

Issue Description

I created a SageMaker Pipeline with a train step, an eval step, and a model-registry step. After the pipeline completes successfully, I can see that the model is approved and registered in the model package group, but the model version does not show the training job information. The metrics saved in the registry step also do not show up, and the model artifact S3 URI is not visible either.
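
For context, the pipeline is structured roughly like the sketch below. This is a simplified stand-in rather than the actual pipeline definition; the XGBoost estimator, evaluate.py script, S3 URIs, instance types, and model package group name are all placeholders:

import sagemaker
from sagemaker.estimator import Estimator
from sagemaker.model import Model
from sagemaker.model_metrics import MetricsSource, ModelMetrics
from sagemaker.processing import ProcessingInput, ProcessingOutput
from sagemaker.sklearn.processing import SKLearnProcessor
from sagemaker.workflow.functions import Join
from sagemaker.workflow.model_step import ModelStep
from sagemaker.workflow.pipeline import Pipeline
from sagemaker.workflow.pipeline_context import PipelineSession
from sagemaker.workflow.properties import PropertyFile
from sagemaker.workflow.steps import ProcessingStep, TrainingStep

session = PipelineSession()
role = sagemaker.get_execution_role()  # assumes this runs inside SageMaker
region = session.boto_region_name

# Train step: the built-in XGBoost container stands in for the real estimator.
image_uri = sagemaker.image_uris.retrieve("xgboost", region=region, version="1.5-1")
estimator = Estimator(
    image_uri=image_uri, instance_type="ml.m5.xlarge", instance_count=1,
    role=role, sagemaker_session=session,
)
train_step = TrainingStep(
    name="TrainModel",
    step_args=estimator.fit({"train": "s3://my-bucket/train"}),  # placeholder URI
)

# Eval step: evaluate.py is assumed to write evaluation.json to /opt/ml/processing/evaluation.
evaluation_report = PropertyFile(
    name="EvaluationReport", output_name="evaluation", path="evaluation.json",
)
processor = SKLearnProcessor(
    framework_version="1.2-1", instance_type="ml.m5.xlarge", instance_count=1,
    role=role, sagemaker_session=session,
)
eval_step = ProcessingStep(
    name="EvaluateModel",
    step_args=processor.run(
        code="evaluate.py",
        inputs=[ProcessingInput(
            source=train_step.properties.ModelArtifacts.S3ModelArtifacts,
            destination="/opt/ml/processing/model")],
        outputs=[ProcessingOutput(
            output_name="evaluation", source="/opt/ml/processing/evaluation")],
    ),
    property_files=[evaluation_report],
)

# Register step: model_data points at the training step output and ModelMetrics points
# at the evaluation.json produced above, which is what should link the training job
# and the metrics to the model package version.
model = Model(
    image_uri=image_uri, role=role, sagemaker_session=session,
    model_data=train_step.properties.ModelArtifacts.S3ModelArtifacts,
)
model_metrics = ModelMetrics(
    model_statistics=MetricsSource(
        content_type="application/json",
        s3_uri=Join(on="/", values=[
            eval_step.properties.ProcessingOutputConfig.Outputs["evaluation"].S3Output.S3Uri,
            "evaluation.json"]),
    )
)
register_step = ModelStep(
    name="RegisterModel",
    step_args=model.register(
        content_types=["text/csv"], response_types=["text/csv"],
        inference_instances=["ml.m5.xlarge"], transform_instances=["ml.m5.xlarge"],
        model_package_group_name="my-model-group",  # placeholder group name
        approval_status="Approved", model_metrics=model_metrics,
    ),
)

pipeline = Pipeline(
    name="train-eval-register", steps=[train_step, eval_step, register_step],
    sagemaker_session=session,
)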

Expected Behavior

The metrics appear in the pipeline graph (screenshot below), so I expected the training job details and metrics to also appear on the registered model version.

[screenshot]

Observed Behavior

The following is the training job associated with the pipeline run:

[screenshot]

But it doesn't show up in the registry

[screenshot]

Product Category

Models, Pipelines

Feedback Category

No response

Other Details

No response

ntw-au commented 3 months ago

We are also experiencing the same issue. It even affects models that previously displayed metrics in Studio as intended, and we have made no relevant changes to our pipeline in that time.

venkysriram23 commented 3 months ago

Hello, I created a few more pipelines and this problem still persists in Studio. Training jobs are not being correctly attached to the model package version.

[screenshots]

Any suggestions on how to resolve this issue?

AbdelrahmanRagab38 commented 3 months ago

We are facing the same issue on our project, and I am trying to debug it, but everything seems to work fine. Any suggestions on how to resolve it? Also, SageMaker Studio refreshes the page every second!

ntw-au commented 3 months ago

Something appears to have been updated in SageMaker Studio within the last few days. The model page now shows details from the linked training and evaluation jobs, including model metrics. The confusion matrix isn't displayed as a coloured matrix like it used to be, but in general this is a significant improvement.

[screenshot]

AbdelrahmanRagab38 commented 3 months ago

I tried to add a custom visual to appear with the metrics, but I can't see it. Is it possible to add custom plots to the evaluation? How do you add the confusion matrix? Please share the resources with me @ntw-au

ntw-au commented 3 months ago

@AbdelrahmanRagab38, you can't add custom plots, but you can add a confusion matrix by formatting your evaluation report according to Model Quality Metrics.
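
For anyone who finds this later: under that format the confusion matrix section of evaluation.json can look roughly like the snippet below (class labels and counts are made-up placeholders; one axis of the nested keys is the actual label and the other the predicted label):

{
    "multiclass_classification_metrics": {
        "confusion_matrix": {
            "0": {"0": 1180, "1": 510, "2": 34},
            "1": {"0": 268, "1": 1320, "2": 55},
            "2": {"0": 41, "1": 72, "2": 980}
        },
        "accuracy": {
            "value": 0.87,
            "standard_deviation": 0.004
        }
    }
}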

ntw-au commented 2 months ago

Confirming that the confusion matrix and metric table are now working as they did before, so from our perspective this issue is resolved.

AbdelrahmanRagab38 commented 2 months ago

@ntw-au Yes, the issue is solved now. The visuals also appear in Studio (not only in Classic Studio, as before), right?

I am still searching for a way to add custom metrics. Is there no way to do it so far? What is your opinion / alternative?

ntw-au commented 2 months ago

There is an example of writing an evaluation.json file at https://github.com/aws/amazon-sagemaker-examples/blob/main/sagemaker_processing/scikit_learn_data_processing_and_model_evaluation/scikit_learn_data_processing_and_model_evaluation.ipynb.

The file needs to be in the format given by Model Quality Metrics, and you can add custom entries such as the one below, as long as they are numeric only.

{
    "multiclass_classification_metrics": {
        "my_custom_metric": {
            "value": 0.53
        }
    }
}

You need to add a value key, but standard_deviation is optional.
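
And a minimal sketch of how the end of an evaluation script could write that file into the processing output directory, assuming the step's ProcessingOutput source is /opt/ml/processing/evaluation (metric names and values here are placeholders):

import json
import pathlib

# Placeholder values; in a real script these come from evaluating the trained model.
report = {
    "multiclass_classification_metrics": {
        "weighted_f1": {"value": 0.81, "standard_deviation": 0.02},
        "my_custom_metric": {"value": 0.53},
    }
}

# The ProcessingOutput for this step is assumed to point at /opt/ml/processing/evaluation,
# so evaluation.json is uploaded to S3 where a ModelMetrics / MetricsSource can reference it.
output_dir = pathlib.Path("/opt/ml/processing/evaluation")
output_dir.mkdir(parents=True, exist_ok=True)
(output_dir / "evaluation.json").write_text(json.dumps(report))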