gundalow opened 1 year ago
- Build for ARM so people on the new M1 Macs don't have to keep rebuilding the images
- Fail fast
- Makefile improvements
- Ability to run "e2e-test" against the docker-compose development environment within VS Code
- Ability to use the VS Code debugger against the docker-compose development environment
- VS Code dev container
https://github.com/ansible/awx/pull/13641#issuecomment-1460804551:

> "for the django migration errors, just kill the containers and rerun `make docker-compose`. there is some kind of race condition the first time starting the containers (when migrations are needed)."
that threw me off when I ran the tests locally :)
@Klaas- thanks for mentioning that one! we fixed it recently here https://github.com/ansible/awx/pull/13670
`make ui-devel` is undocumented in the README.

`/var/lib/awx` needs to be manually created (why?) with certain permissions.
It is documented, but almost impossible to follow the docs.

I'll start from the README and walk through the key parts of the documentation, and then maybe it will become more obvious how things need to get reshaped.
Starting at https://github.com/ansible/awx/blob/devel/README.md#contributing
We get taken to: https://github.com/ansible/awx/blob/devel/CONTRIBUTING.md
After some scrolling, we end up at https://github.com/ansible/awx/blob/devel/CONTRIBUTING.md#build-and-run-the-development-environment
Which sends us over to https://github.com/ansible/awx/blob/devel/tools/docker-compose/README.md
Buried deep down on that page are the instructions for building the UI: https://github.com/ansible/awx/blob/devel/tools/docker-compose/README.md#clean-and-build-the-ui
At the end of the day, as long as `make`, `ansible`, and `docker-compose` are installed, you really just need to run:

```shell
$ make docker-compose
$ docker exec tools_awx_1 make clean-ui ui-devel
```
...and then wait for initial migrations to run. After that, a randomized admin password will be printed in the terminal output.
Note how we didn't need to build any image as our documentation describes. This is because we publish a pre-built devel image, which should work most of the time.
[Makefile/build]: probably a small thing, but the `setuptools-scm` dependency seems like a wart to me. 1. We don't support anything other than git anywhere else, and 2. it fails for me all the time. One or two of the recipes have conditionals for installing it, but a lot of them fail because the dependency is so implicit (the `VERSION` macro always needs it, and it's evaluated at the top level). If I understand correctly, we can get the same version strings we're after by running `git` or checking the directory instead of relying on `setuptools-scm`.
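As a sketch of that idea, the version could be derived by shelling out to git directly; the `describe` flags and the fallback string here are illustrative assumptions, not AWX's actual versioning scheme:

```shell
# Derive a version string straight from git, with a fallback for trees
# that aren't git checkouts (flags and fallback string are assumptions):
VERSION=$(git describe --tags --always 2>/dev/null || echo "0.0.0+unknown")
echo "$VERSION"
```

In a Makefile this could replace the `setuptools-scm` invocation behind the `VERSION` macro, and it degrades gracefully when run from an exported tarball.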
[Makefile/build]: Nearly every place where the Makefile checks for `VENV_BASE` is wrong. They all expand to just `if [ "staticstring" ]`, which is always true, so `$VENV_BASE/bin/activate` is always called.
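To illustrate the bug with a minimal sketch (not the actual Makefile recipe): `[ "string" ]` just tests whether the string is non-empty, and interpolating a path suffix into the quotes makes it non-empty even when the variable is unset. Testing the variable itself behaves as intended:

```shell
VENV_BASE=""

# Buggy pattern: the quoted string still contains "/bin/activate" when
# VENV_BASE is empty, so this branch is always taken:
if [ "$VENV_BASE/bin/activate" ]; then buggy=taken; else buggy=skipped; fi

# Fixed pattern: test the variable itself for non-emptiness:
if [ -n "$VENV_BASE" ]; then fixed=taken; else fixed=skipped; fi

echo "buggy=$buggy fixed=$fixed"   # prints: buggy=taken fixed=skipped
```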
There needs to be documentation and a complete package installer for people behind very strict corporate policies. I have to download everything prior to install and try to figure out how to force it to work.

Other apps/packages I have worked with release a bundled RPM (or .deb) installer that contains all of the dependency files and installs the packages in the order needed.

I realize this is a big task, however there are many engineers out here who will never have command-line access to git, PyPI, etc. in a prod environment.

The lack of a complete installer, combined with hundreds of old documentation repos containing instructions for installing AWX v17 and older (which are completely different), makes for a very long untangling of a big install mess.
Good point by @jjwatt. Similarly, the Makefile is opinionated about `PYTHON`, which leads to distracting error messages when running any target:

`make: python3.9: No such file or directory`

Personally, I have done fairly well with my workaround of calling things like `PYTHON=$(which python) make foobar`, which accommodates the fact that I'm running in a virtual environment with a more recent Python version. However, I feel like this is one of the stumbling blocks newcomers will encounter, as it will be a red herring for basically any other issue they may be trying to diagnose. I understand that it is important that we default to 3.9 in one scenario - I think running targets from inside the container. I have wondered if using `python3` instead would still suffice for that use case without causing other problems. Maybe we could check for a version of 3.9 or higher and print an error if not.
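A guard along those lines might look like the following; the variable name matches the Makefile's `PYTHON`, but the check itself is only a sketch, and the warning text is made up:

```shell
# Fall back to whatever python3 is on PATH, then verify it is at least
# 3.9 instead of hard-failing on a missing python3.9 binary (sketch):
PYTHON="${PYTHON:-python3}"
if ! "$PYTHON" -c 'import sys; sys.exit(0 if sys.version_info >= (3, 9) else 1)'; then
    echo "warning: $PYTHON is older than 3.9; some targets may misbehave" >&2
fi
```

This keeps the 3.9 expectation visible without turning every target into a red herring on machines that only have a newer interpreter.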
Ability to use the VS Code debugger against the docker-compose/kube development environment
Switching branches while running the development environment breaks the environment and forces us to start over (due to DB migration mismatches between branches).
A `make kube-devel` target to simply deploy a Kubernetes-based dev environment. Currently you need to do this from the operator repo rather than directly from the awx repo.
```
[program:dispatcher]
{% if kube_dev | bool %}
command = make dispatcher
directory = /awx_devel
{% else %}
command = awx-manage run_dispatcher
directory = /var/lib/awx
{% endif %}
```
This is annoying and clutters up our Makefile.
https://github.com/ansible/awx-operator/issues/1308 - we need this to make CI pass for awx and awx-operator on feature branches.
https://github.com/ansible/awx/issues/13901 - we didn't even know or understand that tests do not work the same when run in the local environment as when run on GitHub.
I just updated https://github.com/ansible/awx/pull/13521 to add some clarity to another persistent error from our Makefile. That logic throws errors on all checks and is used to generate the version, but logically... the version must not be necessary for the checks, since it only throws a non-fatal error rather than failing them. I do think it will make things a lot smoother for contributors if we can clean up our own mess with things like this.
I want to also call out https://github.com/ansible/awx/pull/13938, which will get people accurate check results if they make pull requests against other people's forks, which sometimes happens.
[Makefile/docker-compose-container-group]: Starting with Kubernetes 1.24, Secrets are no longer automatically generated when ServiceAccounts are created. I suggest we create a Secret of type `kubernetes.io/service-account-token` with the same name as the service account and use that for the rest of the config. #14596
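For reference, such a Secret might look like the manifest below; the `awx` names are placeholders for whatever the role actually creates, not the real object names:

```yaml
# Hypothetical manifest: a long-lived token Secret bound to a
# ServiceAccount named "awx" (names are placeholders):
apiVersion: v1
kind: Secret
metadata:
  name: awx
  annotations:
    kubernetes.io/service-account.name: awx
type: kubernetes.io/service-account-token
```

Once the annotated ServiceAccount exists, Kubernetes populates the Secret's `token` field, matching the pre-1.24 auto-generated behavior.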
Please confirm the following
Bug Summary
Let's use this GitHub Issue to brain-dump problems we've seen. Including:
AWX version
devel
Select the relevant components
Installation method
kubernetes
Modifications
no
Ansible version
No response
Operating system
No response
Web browser
No response
Steps to reproduce
n/a
Expected results
Easier to develop.
Actual results
Not as easy as it could be.
Additional information
No response