aws / aws-step-functions-data-science-sdk-python

Step Functions Data Science SDK for building machine learning (ML) workflows and pipelines on AWS
Apache License 2.0

partitions (aws-cn, us-gov) support #120

Closed zxkane closed 3 years ago

zxkane commented 3 years ago

All task resources hard-code the aws partition. The partition info should be fetched from the boto3 client.
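For reference, boto3 can report the partition for the configured region. A minimal sketch (not the SDK's implementation) of building a partition-aware resource ARN from it:

import boto3

# Sketch only: look up the partition ("aws", "aws-cn", "aws-us-gov") for the
# session's region and build the service integration ARN from it.
# Assumes a recent boto3 that provides Session.get_partition_for_region.
session = boto3.session.Session()
region = session.region_name  # e.g. "cn-north-1"
partition = session.get_partition_for_region(region)  # e.g. "aws-cn"
resource = "arn:{}:states:::glue:startJobRun.sync".format(partition)
print(resource)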

zxkane commented 3 years ago

We are using the workaround below, for reference:

from stepfunctions.steps.fields import Field

# Override the hard-coded "aws" partition in the step's resource ARN
# with the "aws-cn" partition.
attrs = getattr(step, 'fields')
attrs[Field.Resource.value] = 'arn:aws-cn:states:::glue:startJobRun.sync'
setattr(step, 'fields', attrs)
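For illustration, the full workaround applied to a Glue step might look like this (the step name and job name are examples only):

from stepfunctions.steps import GlueStartJobRunStep
from stepfunctions.steps.fields import Field

# Build the step as usual, then patch the resource ARN before adding
# the step to the workflow definition.
step = GlueStartJobRunStep(
    "Extract, Transform, Load",
    parameters={"JobName": "my-glue-job"},
)
attrs = getattr(step, 'fields')
attrs[Field.Resource.value] = 'arn:aws-cn:states:::glue:startJobRun.sync'
setattr(step, 'fields', attrs)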
wong-a commented 3 years ago

Noted. Thanks for raising this!

Steps such as this one, which set Field.Resource.value with a hard-coded partition, need to be updated.

hlmiao commented 3 years ago

How do I test the fixed version? I'm hitting the same issue as before in cn-north-1:

{ "error": "States.Runtime", "cause": "An error occurred while executing the state 'Extract, Transform, Load' (entered at the event id #2). The resource belongs to a different partition from the running execution." }

wong-a commented 3 years ago

@hlmiao https://github.com/aws/aws-step-functions-data-science-sdk-python/pull/131 hasn't been merged or released yet. If you want to test it out, you can pull the source from the PR's branch.
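If it helps, one way to install straight from a branch is pip's git support (replace <branch> with the PR's source branch, or point at the fork's URL if the PR comes from a fork):

pip install git+https://github.com/aws/aws-step-functions-data-science-sdk-python.git@<branch>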

hlmiao commented 3 years ago

Got it.

wong-a commented 3 years ago

Released in v2.1.0: https://github.com/aws/aws-step-functions-data-science-sdk-python/releases/tag/v2.1.0
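To pick up the fix, upgrade the PyPI package:

pip install --upgrade stepfunctions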