alberttwong closed this issue 7 years ago
What version of Ansible are you using?
Ahh... so I had to do a force upgrade. Now I have this.
Alberts-MacBook-Pro:ansible_aws_deployer alwong$ ansible-playbook -i 127.0.0.1 ansible/bu-workshop.yml -e "config=bu-workshop" -e "aws_region=us-west-2" -e "guid=atlanta"
[WARNING]: Host file not found: 127.0.0.1
[WARNING]: provided hosts list is empty, only localhost is available
[DEPRECATION WARNING]: Specifying include variables at the top-level of the task
is deprecated. Please see:
http://docs.ansible.com/ansible/playbooks_roles.html
#task-include-files-and-encouraging-reuse
for currently supported syntax
regarding included files and variables.
This feature will be removed in a future
release. Deprecation warnings can be disabled by setting
deprecation_warnings=False in ansible.cfg.
[WARNING]: While constructing a mapping from /Users/alwong/workshop/ansible_aws
_deployer/ansible/roles/common/tasks/config_repos.yml, line 14, column 3, found
a duplicate dict key (tags). Using last defined value only.
PLAY [Starting environment deployment] *****************************************
TASK [Generate CloudFormation Template] ****************************************
changed: [localhost]
TASK [Launch CloudFormation template] ******************************************
fatal: [localhost]: FAILED! => {"changed": false, "failed": true, "msg": ""}
to retry, use: --limit @/Users/alwong/workshop/ansible_aws_deployer/ansible/bu-workshop.retry
PLAY RECAP *********************************************************************
localhost : ok=1 changed=1 unreachable=0 failed=1
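Incidentally, the duplicate `tags` warning above points at `config_repos.yml` line 14; that warning is typically produced by a task that defines the same key twice. A hypothetical sketch of the task shape that triggers it:

```yaml
# Hypothetical task shape that triggers the warning -- YAML keeps
# only the last value for a repeated key in the same mapping:
- name: Configure repos
  tags: create      # first definition is discarded
  template:
    src: repos.j2
    dest: /etc/yum.repos.d/custom.repo
  tags: configure   # "Using last defined value only"
```

The warning is harmless but usually means one of the two `tags` lines was meant to be something else.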
That's weird that there's no error.
Do you have your AWS CLI configured correctly? Can you rerun with -vvvv?
Alberts-MacBook-Pro:ansible_aws_deployer alwong$ ansible-playbook -i workshop ansible/bu-workshop.yml -e "config=bu-workshop" -e "aws_region=us-west-2" -e "guid=atlanta" -vvvv
Using /Users/alwong/workshop/ansible_aws_deployer/ansible.cfg as config file
[WARNING]: Host file not found: workshop
[WARNING]: provided hosts list is empty, only localhost is available
[DEPRECATION WARNING]: Specifying include variables at the top-level of the task
is deprecated. Please see:
http://docs.ansible.com/ansible/playbooks_roles.html
#task-include-files-and-encouraging-reuse
for currently supported syntax
regarding included files and variables.
This feature will be removed in a future
release. Deprecation warnings can be disabled by setting
deprecation_warnings=False in ansible.cfg.
statically included: /Users/alwong/workshop/ansible_aws_deployer/ansible/provision_cf.yml
statically included: /Users/alwong/workshop/ansible_aws_deployer/ansible/dynamic_inventory.yml
[WARNING]: While constructing a mapping from /Users/alwong/workshop/ansible_aws
_deployer/ansible/roles/common/tasks/config_repos.yml, line 14, column 3, found
a duplicate dict key (tags). Using last defined value only.
statically included: /Users/alwong/workshop/ansible_aws_deployer/ansible/roles/common/tasks/./config_repos.yml
statically included: /Users/alwong/workshop/ansible_aws_deployer/ansible/roles/common/tasks/./subscription_manager_repos.yml
statically included: /Users/alwong/workshop/ansible_aws_deployer/ansible/roles/common/tasks/./packages.yml
statically included: /Users/alwong/workshop/ansible_aws_deployer/ansible/roles/nfs/tasks/./packages.yml
statically included: /Users/alwong/workshop/ansible_aws_deployer/ansible/roles/nfs/tasks/./config_nfs_mount_storage.yml
statically included: /Users/alwong/workshop/ansible_aws_deployer/ansible/roles/nfs/tasks/./nfs_exports.yml
statically included: /Users/alwong/workshop/ansible_aws_deployer/ansible/roles/openshift-provisioner/tasks/./packages.yml
statically included: /Users/alwong/workshop/ansible_aws_deployer/ansible/roles/openshift-node/tasks/./packages.yml
statically included: /Users/alwong/workshop/ansible_aws_deployer/ansible/roles/openshift-node/tasks/./install_docker_for_openshift.yml
Loading callback plugin default of type stdout, v2.0 from /Library/Python/2.7/site-packages/ansible/plugins/callback/__init__.pyc
PLAYBOOK: bu-workshop.yml ******************************************************
18 plays in ansible/bu-workshop.yml
PLAY [Starting environment deployment] *****************************************
TASK [Generate CloudFormation Template] ****************************************
task path: /Users/alwong/workshop/ansible_aws_deployer/ansible/provision_cf.yml:3
<127.0.0.1> ESTABLISH LOCAL CONNECTION FOR USER: alwong
<127.0.0.1> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo $HOME/.ansible/tmp/ansible-tmp-1481660362.15-45586138520695 `" && echo ansible-tmp-1481660362.15-45586138520695="` echo $HOME/.ansible/tmp/ansible-tmp-1481660362.15-45586138520695 `" ) && sleep 0'
Using module file /Library/Python/2.7/site-packages/ansible/modules/core/files/stat.py
<127.0.0.1> PUT /var/folders/by/4pgnv4fx5pv1n4hjc7b_5plr0000gn/T/tmpFR1Vii TO /Users/alwong/.ansible/tmp/ansible-tmp-1481660362.15-45586138520695/stat.py
<127.0.0.1> EXEC /bin/sh -c 'chmod u+x /Users/alwong/.ansible/tmp/ansible-tmp-1481660362.15-45586138520695/ /Users/alwong/.ansible/tmp/ansible-tmp-1481660362.15-45586138520695/stat.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c '/usr/bin/python /Users/alwong/.ansible/tmp/ansible-tmp-1481660362.15-45586138520695/stat.py && sleep 0'
Using module file /Library/Python/2.7/site-packages/ansible/modules/core/files/file.py
<127.0.0.1> PUT /var/folders/by/4pgnv4fx5pv1n4hjc7b_5plr0000gn/T/tmpRTYfLJ TO /Users/alwong/.ansible/tmp/ansible-tmp-1481660362.15-45586138520695/file.py
<127.0.0.1> EXEC /bin/sh -c 'chmod u+x /Users/alwong/.ansible/tmp/ansible-tmp-1481660362.15-45586138520695/ /Users/alwong/.ansible/tmp/ansible-tmp-1481660362.15-45586138520695/file.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c '/usr/bin/python /Users/alwong/.ansible/tmp/ansible-tmp-1481660362.15-45586138520695/file.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c 'rm -f -r /Users/alwong/.ansible/tmp/ansible-tmp-1481660362.15-45586138520695/ > /dev/null 2>&1 && sleep 0'
ok: [localhost] => {
"changed": false,
"diff": {
"after": {
"path": "../workdir/cf.bu-workshop.atlanta.json"
},
"before": {
"path": "../workdir/cf.bu-workshop.atlanta.json"
}
},
"gid": 20,
"group": "staff",
"invocation": {
"module_args": {
"backup": null,
"content": null,
"delimiter": null,
"dest": "../workdir/cf.bu-workshop.atlanta.json",
"diff_peek": null,
"directory_mode": null,
"follow": true,
"force": false,
"group": null,
"mode": null,
"original_basename": "cf.bu-workshop.template.j2",
"owner": null,
"path": "../workdir/cf.bu-workshop.atlanta.json",
"recurse": false,
"regexp": null,
"remote_src": null,
"selevel": null,
"serole": null,
"setype": null,
"seuser": null,
"src": null,
"state": null,
"unsafe_writes": null,
"validate": null
}
},
"mode": "0644",
"owner": "alwong",
"path": "../workdir/cf.bu-workshop.atlanta.json",
"size": 12734,
"state": "file",
"uid": 501
}
TASK [Launch CloudFormation template] ******************************************
task path: /Users/alwong/workshop/ansible_aws_deployer/ansible/provision_cf.yml:8
Using module file /Library/Python/2.7/site-packages/ansible/modules/core/cloud/amazon/cloudformation.py
<127.0.0.1> ESTABLISH LOCAL CONNECTION FOR USER: alwong
<127.0.0.1> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo $HOME/.ansible/tmp/ansible-tmp-1481660362.9-153000454478082 `" && echo ansible-tmp-1481660362.9-153000454478082="` echo $HOME/.ansible/tmp/ansible-tmp-1481660362.9-153000454478082 `" ) && sleep 0'
<127.0.0.1> PUT /var/folders/by/4pgnv4fx5pv1n4hjc7b_5plr0000gn/T/tmpcRsY2C TO /Users/alwong/.ansible/tmp/ansible-tmp-1481660362.9-153000454478082/cloudformation.py
<127.0.0.1> EXEC /bin/sh -c 'chmod u+x /Users/alwong/.ansible/tmp/ansible-tmp-1481660362.9-153000454478082/ /Users/alwong/.ansible/tmp/ansible-tmp-1481660362.9-153000454478082/cloudformation.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c '/usr/bin/python /Users/alwong/.ansible/tmp/ansible-tmp-1481660362.9-153000454478082/cloudformation.py; rm -rf "/Users/alwong/.ansible/tmp/ansible-tmp-1481660362.9-153000454478082/" > /dev/null 2>&1 && sleep 0'
fatal: [localhost]: FAILED! => {
"changed": false,
"failed": true,
"invocation": {
"module_args": {
"aws_access_key": null,
"aws_secret_key": null,
"disable_rollback": false,
"ec2_url": null,
"notification_arns": null,
"profile": null,
"region": "us-west-2",
"security_token": null,
"stack_name": "bu-workshop-atlanta",
"stack_policy": null,
"state": "present",
"tags": {
"Stack": "project bu-workshop-atlanta"
},
"template": "../workdir/cf.bu-workshop.atlanta.json",
"template_format": null,
"template_parameters": {},
"template_url": null,
"validate_certs": true
},
"module_name": "cloudformation"
},
"msg": ""
}
to retry, use: --limit @/Users/alwong/workshop/ansible_aws_deployer/ansible/bu-workshop.retry
PLAY RECAP *********************************************************************
localhost : ok=1 changed=0 unreachable=0 failed=1
Alberts-MacBook-Pro:ansible_aws_deployer alwong$
Ansible isn't telling you anything.
I also notice in your output:
"aws_access_key": null,
"aws_secret_key": null,
Your secret vars file is also likely misconfigured.
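As an aside, Ansible's AWS modules (via boto) also fall back to the standard AWS environment variables, so one quick way to rule out the secret vars file is to export the credentials directly (placeholder values below, not real keys):

```shell
# Placeholder values -- substitute your real credentials before running
# the playbook. Ansible's cloud modules read these via boto when the
# aws_access_key/aws_secret_key module args are not set.
export AWS_ACCESS_KEY_ID="your-access-key-id"
export AWS_SECRET_ACCESS_KEY="your-secret-access-key"
```

If the playbook succeeds with the env vars set, the problem is in how the vars file is being loaded.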
Alberts-MacBook-Pro:ansible_aws_deployer alwong$ ansible --version
ansible 2.2.0.0
config file = /Users/alwong/workshop/ansible_aws_deployer/ansible.cfg
configured module search path = Default w/o overrides
The example secret vars file doesn't say to put in AWS keys.
Alberts-MacBook-Pro:ansible_aws_deployer alwong$ cat Environments/bu-workshop_secret_vars.yml
rhel_subscription_user: alwong@redhat.com
rhel_subscription_pass: XXXXXX
rhel_pool_id:
- "8a85f981523a11c501523ccc41107c51"
I tried
ansible-playbook -i workshop ansible/bu-workshop.yml -e "config=bu-workshop" -e "aws_region=us-west-2" -e "guid=atlanta" -vvvv -e "aws_access_key=AKIAJN44NFRNI3HUWGNA" -e aws_secret_key=/XpPh0EdaBIlCUDUHIRRqjDH2Bx6PsAnHgiEh80G
All three options fail to populate the keys.
You're correct. The example secret vars file does not suggest the keys are required. But they are. This is already reported in #29 and is not yet fixed.
The file is called bu-workshop_secret_vars.yml, not "secrets.yaml", since you are using the config bu-workshop. The relevant section of this repo's documentation says:
Vars files
Each "environment" has two vars files _vars and _secret_vars in the Environment folder. The example_secret_vars file shows the format for what to put in your bu-workshop_secret_vars file, if you were using the bu-workshop playbook.
The bu-workshop_vars file contains most of the configuration settings to use in the environment. Really the only ones you should expect to modify are the domain-related and number of (workshop) user options. All AMIs and sizing is preconfigured and automatic for the AWS region you deploy into.
It could be a little clearer, I suppose. But this is moving to a new implementation in the near future (group vars).
The correct variable names are aws_access_key_id and aws_secret_access_key, which is why your 3rd attempt did not work -- you are not providing the right vars. You took the literal output of the arguments Ansible reported it passed to the module and assumed those were the vars. I can take partial blame for that, since that's what I pasted. Restructuring some pieces of this. Also going to include an example secrets file for reference, including the AWS creds.
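For reference, a fully populated secret vars file would then look something like the sketch below (key names as discussed above; all values are placeholders):

```yaml
# Environments/bu-workshop_secret_vars.yml -- placeholder values only
rhel_subscription_user: you@example.com
rhel_subscription_pass: "your-password"
rhel_pool_id:
  - "your-pool-id"
# AWS credentials, using the var names the playbook expects:
aws_access_key_id: "your-access-key-id"
aws_secret_access_key: "your-secret-access-key"
```

With these set, the cloudformation task should no longer report null keys in its module args.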
I have all the AWS settings put in and added my ssh key to my shell. I modified the subdomain_base var in Environments/bu-workshop_vars.yml and populated bu-workshop_secret_vars.yml with my info. Trying to run, I get the following error. I don't quite know what is wrong.