Open timnoinc opened 7 years ago
+1000
+1
+1
In the meantime, I was able to get this working in a pipeline script like so:
step([$class: 'AWSEBDeploymentBuilder', zeroDowntime: false,
awsRegion: '****', applicationName: '****', environmentName: '****',
bucketName: '****', rootObject: 'app.zip',
versionLabelFormat: '****',
versionDescriptionFormat: '****'])
Hi, for the pipeline script above, has anybody tried it yet? How do you pass AWS credentials in the script?
How to pass AWS credentials in the script?
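One option (it appears in a later comment, and assumes your plugin version exposes it) is the step's credentialId parameter, which takes the ID of an AWS credential stored in Jenkins. A minimal sketch, with placeholder values:

```groovy
// '<JENKINS_AWS_CREDENTIAL_ID>' is a placeholder for the ID of an AWS
// credential configured in Jenkins (Manage Jenkins > Credentials).
step([$class: 'AWSEBDeploymentBuilder',
      credentialId: '<JENKINS_AWS_CREDENTIAL_ID>',
      awsRegion: '****', applicationName: '****', environmentName: '****',
      bucketName: '****', rootObject: 'app.zip',
      versionLabelFormat: '****', versionDescriptionFormat: '****'])
```

If no credentialId is set, the plugin falls back to whatever the default AWS credential chain resolves, such as an IAM instance role.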
I tried wrapping the step in a withCredentials() block, and also just setting environment variables, but for some reason neither worked. Using an IAM role on the Jenkins EC2 instance succeeded. If anyone has this all set up perfectly, please post an example. Thanks.
For the love of Pete, after a good deal of trial and error I got it to work in a declarative pipeline. Here is my sanitized stage; the gcs var is optional. Make sure you have proper IAM permissions for the S3 bucket and CloudFormation.
environment {
    jk_aws_id = '<JENKINS_AWS_CREDENTIAL_ID>'
    GIT_BRANCH_NAME = "${sh(script:'echo ${GIT_BRANCH##*/}', returnStdout: true).trim()}"
}
stages {
    stage('Saving to Elastic Beanstalk and Deploy') {
        environment {
            gcs = "${sh(script:'echo -n ${GIT_BRANCH_NAME,,}-${GIT_COMMIT:0:8}', returnStdout: true).trim()}"
        }
        when { anyOf { branch 'develop'; branch 'master' } }
        steps {
            step([$class: 'AWSEBDeploymentBuilder',
                  credentialId: "${jk_aws_id}",
                  awsRegion: 'us-east-1',
                  applicationName: '<APP_NAME>',
                  environmentName: '<ENVIRON_NAME>',
                  rootObject: '.',
                  includes: '**/*',
                  excludes: '',
                  bucketName: '<S3_BUCKET_NAME>',
                  keyPrefix: '<S3_FOLDER_NAME>',
                  versionLabelFormat: "$gcs",
                  versionDescriptionFormat: "$gcs",
                  sleepTime: '10',
                  checkHealth: 'true',
                  maxAttempts: '12'])
        }
    }
}
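The sh() interpolations in the stage above are plain bash parameter expansions; here is a standalone sketch with hypothetical branch and commit values standing in for Jenkins' GIT_BRANCH and GIT_COMMIT variables:

```shell
#!/usr/bin/env bash
# Hypothetical stand-ins for the env vars Jenkins would provide.
GIT_BRANCH="origin/develop"
GIT_COMMIT="0123abcdef456789"

BRANCH_NAME="${GIT_BRANCH##*/}"   # strip everything up to the last '/'
SHORT_SHA="${GIT_COMMIT:0:8}"     # first 8 characters of the commit sha
# ${VAR,,} lowercases the value (requires bash 4+)
echo "${BRANCH_NAME,,}-${SHORT_SHA}"   # prints: develop-0123abcd
```

Note that ${VAR,,} is a bash 4+ feature, so the sh steps must actually run under bash, not a POSIX /bin/sh.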
Hey, I was trying to use this with Jenkins Pipeline, but I don't see any Beanstalk-related steps. It would be great if those were included!