Closed sydoluciani closed 3 years ago
Judging by the size of the instructions, setting up Selenium is the most difficult/problematic part of contributing. To address this, it would be nice to at least support running Selenium in a container. This can be achieved with the following setup code:
```python
def setUpModule():
    global browser
    from selenium.webdriver import DesiredCapabilities, Remote

    browser = Remote(
        command_executor="http://localhost:4444/wd/hub",
        desired_capabilities=DesiredCapabilities.FIREFOX,
    )
    browser.set_window_size(1920, 1080)
    return browser
```
To use this, one can install and start Selenium and the browser with a single command:

```shell
docker run -d -p 4444:4444 -p 5900:5900 -v /dev/shm:/dev/shm selenium/standalone-firefox:4.0.0-alpha-7-prerelease-20200826
```
To visually inspect progress, a VNC server runs inside the Selenium container (exposed on port 5900).
Perhaps a test setup can be made that allows switching between running Selenium locally and in a container.
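That switch could be as small as an environment variable. Below is a minimal sketch; the `SELENIUM_URL` variable name and the `make_browser` helper are my own assumptions, not anything the project defines:

```python
import os


def selenium_mode(env=os.environ):
    """Return ("remote", url) when SELENIUM_URL is set, else ("local", None)."""
    url = env.get("SELENIUM_URL")
    return ("remote", url) if url else ("local", None)


def make_browser(env=os.environ):
    # Imported lazily so the decision logic above is testable without
    # selenium installed.
    from selenium.webdriver import DesiredCapabilities, Firefox, Remote

    mode, url = selenium_mode(env)
    if mode == "remote":
        # Containerized Selenium, e.g. SELENIUM_URL=http://localhost:4444/wd/hub
        browser = Remote(
            command_executor=url,
            desired_capabilities=DesiredCapabilities.FIREFOX,
        )
    else:
        # Local geckodriver/Firefox, as the current instructions set up
        browser = Firefox()
    browser.set_window_size(1920, 1080)
    return browser
```

Keeping the mode decision in a separate pure function means contributors who never touch Docker get the current behaviour by default.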
As the tests also require deformdemo to be running (which is already containerized), it would make even more sense to be able to run the entire test suite with a docker-compose file.
That docker-compose file could look something like this:

```yaml
version: "3.2"
services:
  deformdemo:
    hostname: deformdemo
    build:
      context: deformdemo
  selenium:
    hostname: selenium
    image: selenium/standalone-firefox:4.0.0-alpha-7-prerelease-20200826
    ports:
      - 5900:5900  # for vnc debugging
    environment:
      - VNC_PASSWORD=secret
      - SCREEN_WIDTH=1920
      - SCREEN_HEIGHT=1080
    volumes:
      - /dev/shm:/dev/shm
  tester:
    container_name: tester
    build:
      context: .
      dockerfile: Dockerfile-tests
    depends_on:
      - deformdemo
      - selenium
```
Running the entire test suite from scratch could then be as simple as (provided the prerequisites git and docker are available):

```shell
git clone deform
docker-compose up
docker exec -it tester bash -c nosetests
```
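One caveat with this flow: `depends_on` only orders container startup; it does not wait for Selenium to be ready to accept sessions. The tester entrypoint could poll before running the suite. A hedged sketch (the `/status` endpoint and the `wait_for` helper are assumptions, not part of the project):

```python
import time
import urllib.request


def wait_for(url, timeout=60, probe=None, clock=time.monotonic, sleep=time.sleep):
    """Poll `url` until it answers or `timeout` seconds pass.

    `probe`, `clock`, and `sleep` are injectable so the loop can be
    tested without a network or real delays.
    """
    if probe is None:
        def probe(u):
            try:
                urllib.request.urlopen(u, timeout=5)
                return True
            except OSError:
                return False
    deadline = clock() + timeout
    while clock() < deadline:
        if probe(url):
            return True
        sleep(1)
    raise TimeoutError(f"{url} did not come up within {timeout}s")


# Hypothetical usage before the tests start:
# wait_for("http://selenium:4444/wd/hub/status")
```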
All the setup instructions can be codified inside the two Dockerfiles instead of being scattered across bash scripts, instructions, and YAML files. For development purposes, a second Dockerfile can be added that mounts the local source code inside the container, so it can be updated without having to rebuild the container.
@tdamsma Thank you.
@stevepiercy We did discuss having a Docker container in Pull Request 397; however, at that point the priority was to fix the failing tests. Maybe just create a local_functional3 environment, excluded from the tox envlist, for local testing only, and leave the rest the same.
Oh, that's right. I forgot about that conversation.
My concerns are:

- Would it solve the chicken and egg problem, making it possible for us to test simultaneous changes in both Deform and deformdemo in GitHub Actions or Travis, without specifying from where to pull the changes?
Understandable and valid concerns.
> not increase the maintenance burden
Once it is set up properly it should stay pretty stable, so I think this will be OK.
> not to alienate contributors by imposing a new requirement, allowing users to continue what they do now
Should be possible to make a setup that allows for both.
> clear documentation of how to use and troubleshoot issues
can be done
> Would it solve the chicken and egg problem, making it possible for us to test simultaneous changes in both Deform and deformdemo in GitHub Actions or Travis, without specifying from where to pull the changes?
Not by itself; containers just make running the tests reproducible across different systems. The core problem, I think, is that deform and deformdemo are by their nature tightly coupled, so their releases, PRs, features, etc. should be completely in sync. The best way I know to do that is a monorepo, i.e. merging deformdemo into deform. I would understand if that is not an option, but I think it is fundamentally the easiest solution.
But to keep the current split, a few options crossed my mind:
- Create a "next" branch on both repositories that is used in the tests. The next branch is either equal to master, or has some extras bolted on to progress to the next step. In the deform repo, a PR to master is tested using the next branch of deformdemo; a PR for deformdemo@master is tested using deform@next. So a new feature that influences both repos requires PRs to both next branches. These PRs are merged without testing, and then new PRs are made for merging next into master. This all feels quite cumbersome, especially when iterations are needed to get stuff ready. It also doesn't allow multiple PRs for different features to co-exist. Effectively what I would want is that when I make a PR into Pylons/deform@master from tdamsma/deform@
- Use git submodules and test against whatever the current commit of the submodule is. I am not so experienced with this, so I'm not really sure whether it would actually work.
- Require a certain order for changes: first commit/merge a PR in deformdemo, and only then make a PR for deform. Use some kind of decorator to skip tests for unavailable features depending on the deform version, something like `pytest.mark.skipif(deform.__version__ <= "3.11")`.
To be honest, I don't like any of these options much. Perhaps this deserves a discussion separate from the containerization?
FTR, we have a Dockerfile in deformdemo as an alternative to installing it from source. It could be used as a starting point.
I opened a separate issue for the chicken-and-egg problem, and responded there.
Replace run-selenium-tests.bash with a Dockerfile, and run Docker instead of run-selenium-tests.bash for local tests; then replace lines 36 to 39 of .travis.yml to run the deformdemo Docker container on Travis.