Closed darkcrawler01 closed 9 years ago
The splat (`*`) operator in arguments is used to unpack a list/tuple into individual items as if you were passing them in directly. You can use the double-splat (`**`) to expand a dictionary into keyword arguments:
```python
redshift_db = pline.RedshiftDatabase(
    **{'connectionString': connectionString,
       'databaseName': self.params.redshift_db,
       '*password': self.params.redshift_pass,
       'name': "redshiftDb",
       'id': "redshiftDb",
       'username': self.params.redshift_user})
```
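As a general illustration of how `**` unpacking works (the function and values here are made up for the example, not part of `pline`):

```python
def connect(host, port, user):
    return "{user}@{host}:{port}".format(user=user, host=host, port=port)

# A dict whose keys match the function's keyword arguments...
opts = {'host': 'db.example.com', 'port': 5439, 'user': 'admin'}

# ...expanded with ** as if each key were passed explicitly:
print(connect(**opts))  # admin@db.example.com:5439
```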
Alternatively, you could omit the `*password` argument from the initialization and add it after the fact by calling `setattr()`:

```python
setattr(redshift_db, '*password', self.params.redshift_pass)
```
Or add it to the `fields` attribute directly:

```python
redshift_db.fields['*password'] = self.params.redshift_pass
```
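A minimal sketch of why the two approaches are equivalent, using a stand-in class (hypothetical, not `pline`'s actual implementation) whose attribute assignments land in a `fields` dict, as the thread describes:

```python
class FieldsObject(object):
    """Stand-in for an object that stores its attributes in `fields`."""
    def __init__(self):
        # Write through __dict__ to avoid triggering __setattr__ below.
        self.__dict__['fields'] = {}

    def __setattr__(self, key, value):
        # Every attribute assignment is routed into the fields dict.
        self.fields[key] = value

obj = FieldsObject()
setattr(obj, '*password', 'hunter2')   # via setattr()
obj.fields['*username'] = 'admin'      # via the fields dict directly
print(obj.fields)  # {'*password': 'hunter2', '*username': 'admin'}
```

Either way, keys like `'*password'` that are not valid Python identifiers end up in the same place.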
pretty cool, thanks!
No problem! I hope this repo helps
I think it's missing data pipeline parameters; other than that it's clean and simple. Thanks :+1:
If you link me to the AWS page describing the parameters you mean, I can see about adding them.
Parameters allow values to be set at runtime from the UI. I use them for setting up constants like the S3 bucket, etc. Here's the doc: http://docs.aws.amazon.com/datapipeline/latest/DeveloperGuide/dp-custom-templates.html
Additionally, exposing pipeline validation is very useful:
http://boto.readthedocs.org/en/latest/ref/datapipeline.html#boto.datapipeline.layer1.DataPipelineConnection.validate_pipeline_definition
```python
print self.pipeline.region.validate_pipeline_definition(
    map(dict, self.pipeline.objects),
    self.pipeline.pipeline_id)
```
Sorry for piling up the different features on a single issue.
I've opened a ticket with Amazon to figure out how to pass parameters into the pipeline definition. I tried creating a pipeline with parameters from a template, then using boto to pull the definition JSON and pushing that same JSON back onto the pipeline and it threw an error.
I've opened a pull request to make this easier in boto: https://github.com/boto/boto/pull/3297 I will update `pline` with a workaround in the meantime.
Oops, I didn't know that `boto` doesn't support parameters.
FYI, boto3 does support parameters:
http://boto3.readthedocs.org/en/latest/reference/services/datapipeline.html#DataPipeline.Client.put_pipeline_definition
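For reference, a rough sketch of what passing parameters through boto3's `put_pipeline_definition` looks like. The `myEnv` parameter and pipeline id are placeholders; the actual call (commented out) would need AWS credentials and a real pipeline:

```python
# Shapes follow the DataPipeline.Client.put_pipeline_definition API:
# parameterObjects declare the parameters, parameterValues bind them.
parameter_objects = [{
    'id': 'myEnv',
    'attributes': [
        {'key': 'type', 'stringValue': 'String'},
        {'key': 'default', 'stringValue': 'prod'},
    ],
}]
parameter_values = [{'id': 'myEnv', 'stringValue': 'staging'}]

# import boto3
# client = boto3.client('datapipeline')
# client.put_pipeline_definition(
#     pipelineId='df-XXXXXXXXXXXX',          # placeholder pipeline id
#     pipelineObjects=pipeline_objects,      # your serialized pipeline objects
#     parameterObjects=parameter_objects,
#     parameterValues=parameter_values)
print(parameter_values)
```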
Okay, I've bumped `pline` to `0.2.0` and added support. See README sections:
Let me know if that works out for you.
Awesome! One tiny use case is missed in parameters: using parameters within strings, e.g. `s3://bucket/config-#{myEnv}.sh`
I am guessing the `ShellCommandActivity` might look something like this:

```python
pline.ShellCommandActivity(
    ...
    command = ('s3://bucket/config-', param_env, '.sh'))
```
Or:

```python
command = 's3://bucket/config-%s' % param_env.lookup_key
```
Again, thanks for the super quick feedback!
```python
command = "s3://bucket/config-#{%s}.sh" % param_env.id
```

Assuming `param_env.id` is `"myEnv"`, this would yield: `'s3://bucket/config-#{myEnv}.sh'`
Perfect! Thanks!
I am trying to do this:

```python
redshift_db = pline.RedshiftDatabase(
    connectionString=connectionString,
    databaseName=self.params.redshift_db,
    *password=self.params.redshift_pass,
    name="redshiftDb",
    id="redshiftDb",
    username=self.params.redshift_user)
```

Any suggestions to achieve that?