aws / aws-sdk-go

AWS SDK for the Go programming language.
http://aws.amazon.com/sdk-for-go/
Apache License 2.0

Cannot use ml.t3.medium in region due to validation #5229

Closed · Shaked closed this 5 months ago

Shaked commented 5 months ago

Describe the bug

Hey, I'm working with https://github.com/hashicorp/terraform-provider-aws, which uses the aws-sdk-go client. While using aws_sagemaker_endpoint_configuration, the provider runs an instance type validation:

https://github.com/hashicorp/terraform-provider-aws/blob/624d56a4205acbbc4cf59b6ccf2ae96e8e3b1f7a/internal/service/sagemaker/endpoint_configuration.go#L310-L316

This validation uses https://github.com/aws/aws-sdk-go/blob/main/service/sagemaker/api.go, which lists ml.t2.medium but not ml.t3.medium:

(screenshot: the generated ProductionVariantInstanceType values in service/sagemaker/api.go, which include ml.t2.medium but no ml.t3.medium)
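For context, here is a minimal sketch of that plan-time check run in isolation. This is illustrative only: the use of terraform-plugin-sdk's validation.StringInSlice helper is an assumption about how the provider wires the check, and the field name is taken from the error message below, not from the provider's actual schema.

```go
package main

import (
	"fmt"

	"github.com/aws/aws-sdk-go/service/sagemaker"
	"github.com/hashicorp/terraform-plugin-sdk/v2/helper/validation"
)

func main() {
	// Build the same kind of validator the provider appears to use: the
	// allowed values come straight from the SDK's generated enum, so any
	// type missing from the API model is rejected at plan time.
	validate := validation.StringInSlice(sagemaker.ProductionVariantInstanceType_Values(), false)

	for _, instanceType := range []string{"ml.t2.medium", "ml.t3.medium"} {
		_, errs := validate(instanceType, "production_variants.0.instance_type")
		fmt.Printf("%-14s -> %d validation error(s)\n", instanceType, len(errs))
	}
}
```

With the enum as generated in v1.51.23, the first value passes and the second produces the error quoted under Current Behavior.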

Expected Behavior

The plan-time validation should match what actually works at apply time. Currently, terraform plan accepts ml.t2.medium even though terraform apply then fails, and it rejects ml.t3.medium even though that type would work via apply.

Current Behavior

The following error is thrown during a terraform plan with ml.t3.medium:

expected production_variants.0.instance_type to be one of ["ml.t2.medium" "ml.t2.large" "ml.t2.xlarge" "ml.t2.2xlarge" "ml.m4.xlarge" "ml.m4.2xlarge" "ml.m4.4xlarge" "ml.m4.10xlarge" "ml.m4.16xlarge" "ml.m5.large" "ml.m5.xlarge" "ml.m5.2xlarge" "ml.m5.4xlarge" "ml.m5.12xlarge" "ml.m5.24xlarge" "ml.m5d.large" "ml.m5d.xlarge" "ml.m5d.2xlarge" "ml.m5d.4xlarge" "ml.m5d.12xlarge" "ml.m5d.24xlarge" "ml.c4.large" "ml.c4.xlarge" "ml.c4.2xlarge" "ml.c4.4xlarge" "ml.c4.8xlarge" "ml.p2.xlarge" "ml.p2.8xlarge" "ml.p2.16xlarge" "ml.p3.2xlarge" "ml.p3.8xlarge" "ml.p3.16xlarge" "ml.c5.large" "ml.c5.xlarge" "ml.c5.2xlarge" "ml.c5.4xlarge" "ml.c5.9xlarge" "ml.c5.18xlarge" "ml.c5d.large" "ml.c5d.xlarge" "ml.c5d.2xlarge" "ml.c5d.4xlarge" "ml.c5d.9xlarge" "ml.c5d.18xlarge" "ml.g4dn.xlarge" "ml.g4dn.2xlarge" "ml.g4dn.4xlarge" "ml.g4dn.8xlarge" "ml.g4dn.12xlarge" "ml.g4dn.16xlarge" "ml.r5.large" "ml.r5.xlarge" "ml.r5.2xlarge" "ml.r5.4xlarge" "ml.r5.12xlarge" "ml.r5.24xlarge" "ml.r5d.large" "ml.r5d.xlarge" "ml.r5d.2xlarge" "ml.r5d.4xlarge" "ml.r5d.12xlarge" "ml.r5d.24xlarge" "ml.inf1.xlarge" "ml.inf1.2xlarge" "ml.inf1.6xlarge" "ml.inf1.24xlarge" "ml.dl1.24xlarge" "ml.c6i.large" "ml.c6i.xlarge" "ml.c6i.2xlarge" "ml.c6i.4xlarge" "ml.c6i.8xlarge" "ml.c6i.12xlarge" "ml.c6i.16xlarge" "ml.c6i.24xlarge" "ml.c6i.32xlarge" "ml.g5.xlarge" "ml.g5.2xlarge" "ml.g5.4xlarge" "ml.g5.8xlarge" "ml.g5.12xlarge" "ml.g5.16xlarge" "ml.g5.24xlarge" "ml.g5.48xlarge" "ml.p4d.24xlarge" "ml.c7g.large" "ml.c7g.xlarge" "ml.c7g.2xlarge" "ml.c7g.4xlarge" "ml.c7g.8xlarge" "ml.c7g.12xlarge" "ml.c7g.16xlarge" "ml.m6g.large" "ml.m6g.xlarge" "ml.m6g.2xlarge" "ml.m6g.4xlarge" "ml.m6g.8xlarge" "ml.m6g.12xlarge" "ml.m6g.16xlarge" "ml.m6gd.large" "ml.m6gd.xlarge" "ml.m6gd.2xlarge" "ml.m6gd.4xlarge" "ml.m6gd.8xlarge" "ml.m6gd.12xlarge" "ml.m6gd.16xlarge" "ml.c6g.large" "ml.c6g.xlarge" "ml.c6g.2xlarge" "ml.c6g.4xlarge" "ml.c6g.8xlarge" "ml.c6g.12xlarge" "ml.c6g.16xlarge" "ml.c6gd.large" "ml.c6gd.xlarge" "ml.c6gd.2xlarge" "ml.c6gd.4xlarge" "ml.c6gd.8xlarge" "ml.c6gd.12xlarge" "ml.c6gd.16xlarge" "ml.c6gn.large" "ml.c6gn.xlarge" "ml.c6gn.2xlarge" "ml.c6gn.4xlarge" "ml.c6gn.8xlarge" "ml.c6gn.12xlarge" "ml.c6gn.16xlarge" "ml.r6g.large" "ml.r6g.xlarge" "ml.r6g.2xlarge" "ml.r6g.4xlarge" "ml.r6g.8xlarge" "ml.r6g.12xlarge" "ml.r6g.16xlarge" "ml.r6gd.large" "ml.r6gd.xlarge" "ml.r6gd.2xlarge" "ml.r6gd.4xlarge" "ml.r6gd.8xlarge" "ml.r6gd.12xlarge" "ml.r6gd.16xlarge" "ml.p4de.24xlarge" "ml.trn1.2xlarge" "ml.trn1.32xlarge" "ml.trn1n.32xlarge" "ml.inf2.xlarge" "ml.inf2.8xlarge" "ml.inf2.24xlarge" "ml.inf2.48xlarge" "ml.p5.48xlarge" "ml.m7i.large" "ml.m7i.xlarge" "ml.m7i.2xlarge" "ml.m7i.4xlarge" "ml.m7i.8xlarge" "ml.m7i.12xlarge" "ml.m7i.16xlarge" "ml.m7i.24xlarge" "ml.m7i.48xlarge" "ml.c7i.large" "ml.c7i.xlarge" "ml.c7i.2xlarge" "ml.c7i.4xlarge" "ml.c7i.8xlarge" "ml.c7i.12xlarge" "ml.c7i.16xlarge" "ml.c7i.24xlarge" "ml.c7i.48xlarge" "ml.r7i.large" "ml.r7i.xlarge" "ml.r7i.2xlarge" "ml.r7i.4xlarge" "ml.r7i.8xlarge" "ml.r7i.12xlarge" "ml.r7i.16xlarge" "ml.r7i.24xlarge" "ml.r7i.48xlarge"], got ml.t3.medium

A similar error is thrown when running terraform apply with ml.t2.medium, since that type is not available in the region I'm working in.

Reproduction Steps

  1. Create a Terraform configuration
  2. Use the https://github.com/hashicorp/terraform-provider-aws provider
  3. Use the aws_sagemaker_endpoint_configuration resource
  4. Set ml.t3.medium as the instance type

Alternatively, you should be able to call ProductionVariantInstanceType_Values() and check it for the missing instance type, which should prove the point.
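A minimal sketch of that check, assuming the standard aws-sdk-go v1 import path:

```go
package main

import (
	"fmt"

	"github.com/aws/aws-sdk-go/service/sagemaker"
)

func main() {
	// List the instance types the SDK was generated with and check whether
	// ml.t3.medium is among them.
	const want = "ml.t3.medium"
	found := false
	for _, v := range sagemaker.ProductionVariantInstanceType_Values() {
		if v == want {
			found = true
			break
		}
	}
	fmt.Printf("%s present in ProductionVariantInstanceType_Values(): %v\n", want, found)
}
```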

Possible Solution

Either add the ml.t3.* instance types to the enum, or generate the list automatically from the AWS instance type list.

Additional Information/Context

No response

SDK version used

v1.51.23

Environment details (Version of Go (go version)? OS name and version, etc.)

Mac/Linux/Go 1.21

RanVaknin commented 5 months ago

Hi @Shaked ,

The AWS SDK is code-generated from the API model of each AWS service it interacts with. In this case, the SageMaker service API was not modeled with the ml.t3.* instance types. I was not sure whether this was intentional or simply a modeling oversight, so I reached out to the SageMaker service team internally and got this response:

SageMaker inference does not support ml.t3. instance type hence this is intentional. We enable only the instance types that are supported by our service.

Thanks, Ran~

github-actions[bot] commented 5 months ago

Comments on closed issues are hard for our team to see. If you need more assistance, please either tag a team member or open a new issue that references this one. If you wish to keep having a conversation with other community members under this issue feel free to do so.