amancevice / pline

AWS Pipeline Wrapper for boto3
MIT License

Is there a way to set `*password` in Database #1

Closed darkcrawler01 closed 9 years ago

darkcrawler01 commented 9 years ago

I am trying to do this:

redshift_db = pline.RedshiftDatabase(
    connectionString=connectionString,
    databaseName=self.params.redshift_db,
    *password=self.params.redshift_pass,
    name="redshiftDb",
    id="redshiftDb",
    username=self.params.redshift_user)

    *password=self.params.redshift_pass,
             ^
SyntaxError: invalid syntax

Any suggestions on how to achieve this?

amancevice commented 9 years ago

The splat (*) operator in arguments is used to "unzip" a list/tuple into individual items as if you were passing them in directly. You can use the double-splat (**) to expand a dictionary into keyword-args:

redshift_db = pline.RedshiftDatabase(
    **{ 'connectionString' : connectionString, 
        'databaseName'     : self.params.redshift_db, 
        '*password'        : self.params.redshift_pass, 
        'name'             : "redshiftDb", 
        'id'               : "redshiftDb", 
        'username'         : self.params.redshift_user })

Alternatively, you could omit the *password argument from the initialization and add it after the fact by calling setattr():

setattr(redshift_db, '*password', self.params.redshift_pass)

Or add it to the fields attribute directly:

redshift_db.fields['*password'] = self.params.redshift_pass
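
To see why the double-splat works here even though `*password` is not a valid Python identifier, here is a minimal standalone sketch; the `demo` function is hypothetical, standing in for pline's keyword-accepting initializer:

```python
def demo(**fields):
    # Collects arbitrary keyword arguments into a dict.
    return fields

# Dict keys expanded via ** need not be valid Python identifiers,
# so '*password' can be forwarded even though it could never be
# written as a literal keyword argument.
result = demo(**{'*password': 'secret', 'name': 'redshiftDb'})
print(result['*password'])  # -> secret
```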
darkcrawler01 commented 9 years ago

Pretty cool, thanks!

amancevice commented 9 years ago

No problem! I hope this repo helps.

darkcrawler01 commented 9 years ago

I think it's missing data pipeline parameters; other than that it's clean and simple. Thanks :+1:

amancevice commented 9 years ago

If you link me to the page on AWS describing the parameters you mean I can see about adding them.

darkcrawler01 commented 9 years ago

Parameters allow values to be set at run time from the UI. I use it for setting up constants like the S3 bucket, etc. Here's the doc: http://docs.aws.amazon.com/datapipeline/latest/DeveloperGuide/dp-custom-templates.html

Additionally, exposing validate pipeline is very useful: http://boto.readthedocs.org/en/latest/ref/datapipeline.html#boto.datapipeline.layer1.DataPipelineConnection.validate_pipeline_definition

    print self.pipeline.region.validate_pipeline_definition(
        map(dict, self.pipeline.objects),
        self.pipeline.pipeline_id)

Sorry for piling up the different features on a single issue.
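
As a sketch of what that validation call looks like when built for boto3 (which boto was eventually superseded by), here are the request arguments only; the pipeline id and objects are hypothetical stand-ins, not taken from pline:

```python
# Build the request arguments for DataPipeline's ValidatePipelineDefinition.
# 'df-EXAMPLE' and the object fields below are placeholder values.
pipeline_id = 'df-EXAMPLE'
pipeline_objects = [
    {'id': 'Default',
     'name': 'Default',
     'fields': [{'key': 'scheduleType', 'stringValue': 'cron'}]},
]

request = {
    'pipelineId': pipeline_id,
    'pipelineObjects': pipeline_objects,
}
# With a configured client this would be invoked as:
#   boto3.client('datapipeline').validate_pipeline_definition(**request)
```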

amancevice commented 9 years ago

I've opened a ticket with Amazon to figure out how to pass parameters into the pipeline definition. I tried creating a pipeline with parameters from a template, then using boto to pull the definition JSON and pushing that same JSON back onto the pipeline and it threw an error.

amancevice commented 9 years ago

I've opened a pull request to make this easier in boto: https://github.com/boto/boto/pull/3297 I will update pline with a workaround in the meantime.

darkcrawler01 commented 9 years ago

Oops, I didn't know that boto doesn't support parameters. FYI, boto3 does support them: http://boto3.readthedocs.org/en/latest/reference/services/datapipeline.html#DataPipeline.Client.put_pipeline_definition
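
Following that boto3 link, a hedged sketch of the parameter payload shapes that put_pipeline_definition accepts; the ids and values here are hypothetical examples, not pline output:

```python
# parameterObjects declare each parameter and its attributes;
# parameterValues supply the concrete runtime values.
parameter_objects = [
    {'id': 'myEnv',
     'attributes': [{'key': 'type', 'stringValue': 'String'}]},
]
parameter_values = [
    {'id': 'myEnv', 'stringValue': 'prod'},
]
# With a configured client this would be invoked roughly as:
#   boto3.client('datapipeline').put_pipeline_definition(
#       pipelineId=pipeline_id,
#       pipelineObjects=pipeline_objects,
#       parameterObjects=parameter_objects,
#       parameterValues=parameter_values)
```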

amancevice commented 9 years ago

Okay, I've bumped pline to 0.2.0 and added support. See README sections:

Let me know if that works out for you.

darkcrawler01 commented 9 years ago

Awesome! One tiny use case is missed in parameters:

Or

  command  = 's3://bucket/config-%s' %  param_env.lookup_key

Again .. Thanks for super quick feedback !

amancevice commented 9 years ago
command  = "s3://bucket/config-#{%s}.sh" %  param_env.id

Assuming param_env.id is "myEnv", this would yield:

's3://bucket/config-#{myEnv}.sh'
darkcrawler01 commented 9 years ago

Perfect! Thanks!