Open magick93 opened 4 years ago
@magick93,
Could you exec into one of your AWX containers and run this (and share the results)?
```
awx-manage shell_plus
```
(then, once the shell comes up...)
```python
print(ProjectUpdate.objects.get(pk=1537).result_stdout_text)
print(ProjectUpdate.objects.get(pk=1537).job_env)
```
Hi @ryanpetrello
`print(ProjectUpdate.objects.get(pk=1537).result_stdout_text)` doesn't return anything.

`print(ProjectUpdate.objects.get(pk=1537).job_env)` returns:
{'AWX_RMQ_MGMT_SERVICE_PORT_RMQMGMT': '15672', 'LC_ALL': 'en_US.UTF-8', 'AWX_WEB_SVC_SERVICE_PORT_HTTP': '80', 'AWX_WEB_SVC_PORT_80_TCP': 'tcp://172.30.178.191:80', 'KUBERNETES_PORT_53_UDP': 'udp://172.30.0.1:53', 'LANG': 'en_US.UTF-8', 'RABBITMQ_SERVICE_HOST': '172.30.38.89', 'AWX_WEB_SVC_SERVICE_PORT': '80', 'HOSTNAME': 'awx-0', 'RABBITMQ_PORT_15672_TCP': 'tcp://172.30.38.89:15672', 'AWX_WEB_SVC_SERVICE_HOST': '172.30.178.191', 'AWX_RMQ_MGMT_SERVICE_HOST': '172.30.203.25', 'KUBERNETES_PORT_53_UDP_PORT': '53', 'POSTGRESQL_SERVICE_PORT': '5432', 'RABBITMQ_PORT_15672_TCP_PORT': '15672', 'RABBITMQ_SERVICE_PORT_AMQP': '5672', 'POSTGRESQL_PORT_5432_TCP_ADDR': '172.30.197.220', 'KUBERNETES_PORT_53_TCP': 'tcp://172.30.0.1:53', 'KUBERNETES_PORT_53_TCP_PORT': '53', 'KUBERNETES_SERVICE_PORT_DNS': '53', 'KUBERNETES_PORT_53_TCP_ADDR': '172.30.0.1', 'POSTGRESQL_PORT_5432_TCP_PORT': '5432', 'AWX_SKIP_MIGRATIONS': '1', 'KUBERNETES_PORT_443_TCP_PROTO': 'tcp', 'KUBERNETES_PORT_443_TCP_ADDR': '172.30.0.1', 'AWX_WEB_SVC_PORT_80_TCP_ADDR': '172.30.178.191', 'RABBITMQ_PORT_5672_TCP_PROTO': 'tcp', 'KUBERNETES_PORT': 'tcp://172.30.0.1:443', 'AWX_RMQ_MGMT_PORT_15672_TCP_PORT': '15672', 'POSTGRESQL_PORT_5432_TCP': 'tcp://172.30.197.220:5432', 'KUBERNETES_PORT_53_UDP_ADDR': '172.30.0.1', 'PWD': '/home/awx', 'HOME': '/var/lib/awx', 'RABBITMQ_PORT_5672_TCP_ADDR': '172.30.38.89', 'KUBERNETES_SERVICE_PORT_DNS_TCP': '53', 'RABBITMQ_PORT_15672_TCP_ADDR': '172.30.38.89', 'AWX_WEB_SVC_PORT_80_TCP_PROTO': 'tcp', 'AWX_RMQ_MGMT_PORT_15672_TCP_PROTO': 'tcp', 'KUBERNETES_PORT_53_UDP_PROTO': 'udp', 'KUBERNETES_SERVICE_PORT_HTTPS': '443', 'RABBITMQ_PORT_5672_TCP_PORT': '5672', 'KUBERNETES_PORT_443_TCP_PORT': '443', 'POSTGRESQL_SERVICE_HOST': '172.30.197.220', 'AWX_RMQ_MGMT_PORT_15672_TCP_ADDR': '172.30.203.25', 'AWX_WEB_SVC_PORT_80_TCP_PORT': '80', 'KUBERNETES_PORT_443_TCP': 'tcp://172.30.0.1:443', 'POSTGRESQL_SERVICE_PORT_POSTGRESQL': '5432', 'POSTGRESQL_PORT': 'tcp://172.30.197.220:5432', 
'RABBITMQ_PORT_15672_TCP_PROTO': 'tcp', 'AWX_WEB_SVC_PORT': 'tcp://172.30.178.191:80', 'RABBITMQ_SERVICE_PORT': '15672', 'SHLVL': '1', 'LANGUAGE': 'en_US.UTF-8', 'POSTGRESQL_PORT_5432_TCP_PROTO': 'tcp', 'AWX_RMQ_MGMT_PORT_15672_TCP': 'tcp://172.30.203.25:15672', 'KUBERNETES_SERVICE_PORT': '443', 'PATH': '/var/lib/awx/venv/ansible/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin', 'RABBITMQ_PORT_5672_TCP': 'tcp://172.30.38.89:5672', 'KUBERNETES_SERVICE_HOST': '172.30.0.1', 'RABBITMQ_SERVICE_PORT_HTTP': '15672', 'AWX_RMQ_MGMT_SERVICE_PORT': '15672', 'AWX_RMQ_MGMT_PORT': 'tcp://172.30.203.25:15672', 'KUBERNETES_PORT_53_TCP_PROTO': 'tcp', '_': '/usr/local/bin/supervisord', 'SUPERVISOR_ENABLED': '1', 'SUPERVISOR_SERVER_URL': 'unix:///tmp/supervisor.sock', 'SUPERVISOR_PROCESS_NAME': 'dispatcher', 'SUPERVISOR_GROUP_NAME': 'tower-processes', 'LC_CTYPE': 'en_US.UTF-8', 'DJANGO_SETTINGS_MODULE': 'awx.settings.production', 'DJANGO_LIVE_TEST_SERVER_ADDRESS': 'localhost:9013-9199', 'TZ': 'UTC', 'ANSIBLE_FACT_CACHE_TIMEOUT': '0', 'ANSIBLE_FORCE_COLOR': 'True', 'ANSIBLE_HOST_KEY_CHECKING': 'False', 'ANSIBLE_INVENTORY_UNPARSED_FAILED': 'True', 'ANSIBLE_PARAMIKO_RECORD_HOST_KEYS': 'False', 'ANSIBLE_VENV_PATH': '/var/lib/awx/venv/ansible', 'HTTP_PROXY': 'http://proxy.domains:8080/', 'HTTPS_PROXY': 'http://proxy.domains:8080/', 'NO_PROXY': '127.0.0.1,localhost,10.0.0.0/8,.domains,172.30.0.0/16', 'PROOT_TMP_DIR': '/tmp', 'AWX_PRIVATE_DATA_DIR': '/tmp/awx_1537_bj9iv7fk', 'VIRTUAL_ENV': '/var/lib/awx/venv/ansible', 'PYTHONPATH': '/var/lib/awx/venv/ansible/lib/python3.6/site-packages:', 'ANSIBLE_RETRY_FILES_ENABLED': 'False', 'ANSIBLE_ASK_PASS': 'False', 'ANSIBLE_BECOME_ASK_PASS': 'False', 'DISPLAY': '', 'TMP': '/tmp', 'PROJECT_UPDATE_ID': '1537', 'ANSIBLE_CALLBACK_PLUGINS': '/var/lib/awx/venv/awx/lib/python3.6/site-packages/awx/plugins/callback', 'ANSIBLE_GALAXY_SERVER_GALAXY_URL': 'https://galaxy.ansible.com', 'ANSIBLE_GALAXY_SERVER_LIST': 'galaxy', 
'ANSIBLE_STDOUT_CALLBACK': 'awx_display', 'AWX_ISOLATED_DATA_DIR': '/tmp/awx_1537_bj9iv7fk/artifacts/1537', 'RUNNER_OMIT_EVENTS': 'False', 'RUNNER_ONLY_FAILED_EVENTS': 'False'}
@magick93,
I can see the environment variables being set properly in your `job_env` (I assume their values are correct)?

```
'HTTP_PROXY': 'http://proxy.domains:8080/',
'HTTPS_PROXY': 'http://proxy.domains:8080/',
'NO_PROXY': '127.0.0.1,localhost,10.0.0.0/8,.domains,172.30.0.0/16'
```
Perhaps the `ansible-galaxy` command itself doesn't respect these?

https://github.com/ansible/awx/blob/devel/awx/playbooks/project_update.yml#L139

If that's the case, it's not likely to be a bug in AWX itself; all we can do is set these env vars in the pty we run the playbook and `ansible-galaxy` CLI in (and it looks like we're doing that).
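For context, the general mechanism here is Ansible's `environment:` keyword, which sets env vars for the process a single task spawns. A rough sketch (not the actual contents of the linked playbook; the task name and lookup pattern are illustrative):

```yaml
# Sketch only: how a playbook task can forward proxy vars to ansible-galaxy.
- name: fetch galaxy roles from requirements.yml
  command: ansible-galaxy install -r roles/requirements.yml -p roles/
  environment:
    HTTP_PROXY: "{{ lookup('env', 'HTTP_PROXY') }}"
    HTTPS_PROXY: "{{ lookup('env', 'HTTPS_PROXY') }}"
    NO_PROXY: "{{ lookup('env', 'NO_PROXY') }}"
```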
Git seems to be respecting these, as I have no problem pulling and updating projects in AWX. It's only since I added a role and a `requirements.yml` that contains a `src` pointing to a GitHub repo that this problem occurred. So I suspect the issue is more specifically related to `ansible-galaxy`.
@magick93 have you tried lower-casing the env var names, e.g., `http_proxy`?
Also, all of this (for `ansible-galaxy`, at least) is going to depend on Python's support for these proxy vars, and I'm not certain that Python supports CIDR notation in `no_proxy` as you've specified in your example:

https://github.com/ansible/ansible/issues/52705#issuecomment-466496745
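For what it's worth, that suspicion can be checked directly against the standard library's matcher (a quick sketch; `proxy_bypass_environment` is the helper `urllib` uses to evaluate `no_proxy` on non-Windows platforms, and the values are the ones from this thread):

```python
import urllib.request

# Python's no_proxy matching is exact-name/suffix matching, not CIDR:
no_proxy = {'no': '127.0.0.1,localhost,10.0.0.0/8,.domains'}

# Exact names and domain suffixes match...
print(urllib.request.proxy_bypass_environment('localhost', no_proxy))
print(urllib.request.proxy_bypass_environment('git.domains', no_proxy))

# ...but an address inside the 10.0.0.0/8 CIDR range does not:
print(urllib.request.proxy_bypass_environment('10.1.2.3', no_proxy))
```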
> I'm not certain that Python supports CIDR notation in no_proxy as you've specified in your example

I don't think this is the issue, as I'm using both internal git (GitLab) and external GitHub, and there is no issue updating projects from either source. Updating projects from the internally hosted GitLab relies on the `no_proxy` setting.

I'll try with lowercase env var names...
Yea, this may just require some experimenting. I can say with certainty, based on your `job_env`, that we're setting those env vars during the project update run; I can't speak to whether the underlying libraries properly support them in the format you've specified, though.

If you really wanted to dig in and confirm there's not some bug in the way AWX sets the env vars, you could stick a `sleep` somewhere in the project update playbook:

https://github.com/ansible/awx/blob/devel/awx/playbooks/project_update.yml#L139

...and then when you run another update, exec into the container and look for the (sleeping) `ansible-playbook` process that's running `project_update.yml` and get its pid. Then go check out `/proc/<pid>/environ` and see if it looks like the env vars are properly set.
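If it helps, that `/proc` check might look something like this (a Linux-only sketch; a background `sleep` with a proxy var set stands in here for the real `ansible-playbook` process):

```shell
# In the real scenario you would find the sleeping pid first, e.g.:
#   pid=$(pgrep -f project_update.yml)
# Stand-in process for illustration:
HTTP_PROXY='http://proxy.domains:8080/' sleep 30 &
pid=$!
sleep 1   # give the child a moment to start

# /proc/<pid>/environ is NUL-separated, so translate NULs to newlines:
tr '\0' '\n' < "/proc/${pid}/environ" | grep -i 'proxy'

kill "$pid"
```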
I changed the keys to lower case and then was able to access the remote repo. So my extra variables entry looks like:

```json
{
  "HOME": "/var/lib/awx",
  "http_proxy": "http://proxy.domain:8080/",
  "https_proxy": "http://proxy.domain:8080/",
  "no_proxy": "127.0.0.1,localhost,10.0.0.0/8,.domain"
}
```
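Consistent with that result, the lowercase form is the one Python's stdlib proxy detection scans the environment for; a quick sketch using the values from this thread:

```python
import os
import urllib.request

# Set the lowercase proxy vars, as in the working extra-vars entry above:
os.environ['http_proxy'] = 'http://proxy.domain:8080/'
os.environ['https_proxy'] = 'http://proxy.domain:8080/'
os.environ['no_proxy'] = '127.0.0.1,localhost,10.0.0.0/8,.domain'

# getproxies_environment() collects *_proxy entries from os.environ:
proxies = urllib.request.getproxies_environment()
print(proxies['http'])   # http://proxy.domain:8080/
print(proxies['no'])     # 127.0.0.1,localhost,10.0.0.0/8,.domain
```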
ISSUE TYPE
SUMMARY
Basically the same as described in https://github.com/ansible/awx/issues/1532
ENVIRONMENT
STEPS TO REPRODUCE
I have a project that uses a `requirements.yml` to fetch roles. This is failing because it doesn't use the specified proxy. In
SETTINGS / JOBS / Extra Environment Variables
I have the following:

EXPECTED RESULTS
Expect AWX to use the proxy, even when fetching roles.
ACTUAL RESULTS
Job fails with the below error:
ADDITIONAL
It seems AWX is using the proxy when fetching projects from git (e.g., GitHub), but not when fetching a role.