The new PhysioNet platform is built using Django. The new site is currently hosted at https://physionet.org/
To run a local instance with the Django development server:

- Install sqlite3: `sudo apt-get install sqlite3`
- Create a Python virtual environment and install the packages listed in `requirements.txt`.
- Copy the `.env.example` file to `.env`.
- Within the `physionet-django` directory:
  - Run `python manage.py resetdb` to reset the database.
  - Run `python manage.py loaddemo` to load the demo fixtures and set up the example files.
  - Run `python manage.py runserver` to run the server.

The local development server will be available at http://localhost:8000.
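The `.env` file holds KEY=VALUE pairs that the settings read at startup (the project may use a dedicated library for this; the parser below is only an illustration of the file format, not the actual implementation):

```python
# Minimal sketch of parsing a .env file into a dict of settings.
def parse_env(text):
    """Parse KEY=VALUE lines, skipping blanks and # comments."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip().strip('"').strip("'")
    return env

sample = """
# Example settings (hypothetical keys)
SECRET_KEY=abc123
DEBUG=True
"""
settings = parse_env(sample)
print(settings["SECRET_KEY"])  # abc123
```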
To run a local instance using Docker:

- Copy the example environment file: `cp .env.example .env`
- Build the images: `docker compose build`
- Run `docker compose up` to start the postgres database, development, and test containers.
- Run `docker compose exec dev /bin/bash` to enter the development container shell. Within the `physionet-django` directory:
  - Run `python manage.py resetdb` to reset the database.
  - Run `python manage.py loaddemo` to load the demo fixtures and set up the example files.
- Run `docker compose exec test /bin/bash` to enter the test container shell. Within the `physionet-django` directory:
  - Run `python manage.py resetdb` to reset the database.
  - Run `python manage.py loaddemo` to load the demo fixtures and set up the example files.
  - Run `python manage.py test` to run the tests.

The local development server will be available at http://localhost:8000.
All of the management commands should be executed inside the desired container (after `docker compose exec dev /bin/bash` or `docker compose exec test /bin/bash`).
The code should reload dynamically in development. If there are any issues, you can stop the `docker compose up` command and run `docker compose up --build`, which will rebuild the physionet image.
Docker Compose uses volumes to persist the database contents and the data directories (media and static files). To clean up the created containers, networks, and volumes, stop `docker compose up` and run `docker compose down -v`. Do not run `docker compose down -v` if you want to retain the current database contents.
Background tasks are managed by Django Q2, "a native Django task queue, scheduler and worker application using Python multiprocessing". If you would like to run background tasks on your development server, you will need to start the task manager with `python manage.py qcluster`.
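Django Q2's own API is not shown here; as a rough sketch of the queue-and-worker model that such a task manager implements (using a stdlib thread and `queue.Queue` purely for illustration, whereas Django Q2 itself uses multiprocessing):

```python
import queue
import threading

# Tasks are enqueued as (function, args) pairs and executed
# in the background by a worker pulling from the queue.
tasks = queue.Queue()
results = []

def worker():
    while True:
        func, args = tasks.get()
        results.append(func(*args))
        tasks.task_done()

threading.Thread(target=worker, daemon=True).start()

tasks.put((pow, (2, 10)))
tasks.put((len, ("physionet",)))
tasks.join()  # block until the queued tasks have been processed
print(results)  # [1024, 9]
```

The `qcluster` process plays the worker role here: it runs separately from the web server and drains the task queue as requests enqueue work.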
To access a debug prompt raised using `breakpoint()`:

- Run `docker container ls` to get a list of active containers.
- Run `docker attach CONTAINER_ID` to attach to the container running the server.

The debugger should now be available in the new shell.
To contribute, create a new branch off the `dev` branch, titled after the new feature/change to be implemented. When ready, make a pull request against the `dev` branch with a clear title and description of the changes. Tips for a good pull request: http://blog.ploeh.dk/2015/01/15/10-tips-for-better-pull-requests/

If using Docker, all of the testing commands below should be run inside the test container (`docker compose exec test /bin/bash`). You may need to `pip install coverage` beforehand if not using Docker.
- Unit tests for each app are kept in their `test*.py` files. To run them, enter the `physionet-django` directory and run `python manage.py test`.
- To check test coverage, enter the `physionet-django` directory and run `coverage run --source='.' manage.py test`. Next, run `coverage html` to generate an HTML report of the coverage results.
- To check code style, enter the `physionet-django` directory and run `flake8 [PATH_TO_FILE(s)]`. As part of the `physionet-build-test` workflow, flake8 will be run only against modified code relative to `dev` or the base PR branch.
Note: `flake8` is only installed in the workflow. To install it for local testing, see here.

To run the `test_browser.py` files, Selenium and the Firefox driver are required. If you want to see the tests run in your browser, remove the `options.set_headless(True)` lines in the `setUpClass` of the browser testing modules.

During development, the following workflow is applied for convenience:
- Demo data fixtures are kept in the `fixtures` subdirectory of each app. Example file: `<BASE_DIR>/<appname>/fixtures/demo-<appname>.json`
- To conveniently obtain a clean database with the latest applied migrations, run `python manage.py resetdb`. This does not populate the database with any data.
- When using Docker, the migrated and empty database will be the default state, and only `python manage.py loaddemo` has to be called in both the `dev` and `test` containers.
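Each fixture file is a JSON list of serialized model instances. As an illustration of the general shape (the app/model name and fields below are made up, not taken from the actual fixtures):

```python
import json

# Hypothetical fixture entry in Django's serialization format:
# a list of objects, each with "model", "pk", and a "fields" dict.
demo_fixture = [
    {
        "model": "project.project",   # hypothetical app.model label
        "pk": 1,
        "fields": {
            "title": "Demo Project",
            "abstract": "An example record loaded by loaddemo.",
        },
    }
]

text = json.dumps(demo_fixture, indent=2)
loaded = json.loads(text)
print(loaded[0]["fields"]["title"])  # Demo Project
```

`loaddemo` (like Django's built-in `loaddata`) reads files of this shape and inserts the corresponding rows into the database.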
If you need to add, remove, or modify any models or fields, your branch will also need to include the necessary migration script(s). In most cases, Django can generate these scripts for you automatically, but you should still review them to be sure that they are doing what you intend.
After making a change (such as adding a field or changing options), run `./manage.py makemigrations` to generate a corresponding migration script. Then run `./manage.py migrate` to run that script against your local SQLite database.
If you make changes and later decide to undo them without committing, the easiest way is to run `rm */migrations/*.py && git checkout */migrations` to revert to your current HEAD. Then run `./manage.py makemigrations` again if necessary, followed by `./manage.py resetdb && ./manage.py loaddemo`.
If other migrations are committed to `dev` in the meantime, you will need to resolve the resulting conflicts before your feature branch can be merged back into `dev`. There are two ways to do this:
If the two sets of changes are independent, they can be combined by merging `dev` into the feature branch and adding a "merge migration":

```
git checkout my-new-feature && git pull && rm */migrations/*.py && git checkout */migrations
git merge --no-ff --no-commit origin/dev
./manage.py makemigrations --merge
```

The latter command will ask you to confirm that the changes do not conflict (it will not detect conflicts automatically). Read the list of changes carefully before answering. If successful, you can then run:

```
./manage.py migrate && ./manage.py test
git add */migrations/ && git commit
```

As with any pull request, have someone else review your changes before merging the result back into `dev`.

If the migration behavior interacts with other changes that have been applied to `dev` in the meantime, the migration scripts will need to be rewritten:

```
rm */migrations/*.py; git checkout origin/dev */migrations
./manage.py makemigrations
./manage.py migrate && ./manage.py test
git add */migrations/ && git commit
```
The theme of the deployed website can be configured by changing the following environment variables:
- `LIGHT`
The management command `compilestatic` generates a `theme.scss` file and compiles the corresponding CSS files.

Note: the generated CSS files are not tracked by git and are created only when you run the `compilestatic` command.
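Conceptually, `compilestatic` turns a theme selection into SCSS variables before compilation. A minimal sketch of that idea (the environment variable name `SITE_THEME`, the color values, and the variable names are all assumptions, not the actual implementation):

```python
import os

# Hypothetical theme palettes; the real command's variables differ.
THEMES = {
    "LIGHT": {"background": "#ffffff", "text": "#212529"},
    "DARK": {"background": "#212529", "text": "#f8f9fa"},
}

def render_theme_scss(theme_name):
    """Emit one SCSS variable declaration per theme color."""
    colors = THEMES[theme_name]
    lines = [f"${name}: {value};" for name, value in colors.items()]
    return "\n".join(lines) + "\n"

theme = os.environ.get("SITE_THEME", "LIGHT")  # hypothetical variable name
print(render_theme_scss(theme))
```

The generated `theme.scss` would then be imported by the stylesheets that `compilestatic` compiles to CSS.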
If you want to set up cron jobs, you can do so by adding a new file or updating an existing cron jobs file based on your requirements. Here are the locations where you might want to add your cron jobs:

- `deploy/common/etc/cron.d/`
- `deploy/staging/etc/cron.d/` (for cron jobs that should run on the staging environment)
- `deploy/production/etc/cron.d/` (for cron jobs that should run on the production environment)

Here is an example of an existing cron job from `deploy/production/etc/cron.d/physionet`:

```
31 23 * * * www-data env DJANGO_SETTINGS_MODULE=physionet.settings.production /physionet/python-env/physionet/bin/python3 /physionet/physionet-build/physionet-django/manage.py clearsessions
```
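The five leading fields of a crontab entry are minute, hour, day of month, month, and day of week, so the example above runs `clearsessions` at 23:31 every day. A small sketch that splits a schedule into those parts:

```python
# Split a cron schedule into its five time-and-date fields.
# "31 23 * * *" means: at minute 31 of hour 23 (23:31), every day.
FIELD_NAMES = ["minute", "hour", "day_of_month", "month", "day_of_week"]

def parse_cron_schedule(entry):
    fields = entry.split()[:5]
    return dict(zip(FIELD_NAMES, fields))

schedule = parse_cron_schedule("31 23 * * *")
print(schedule)
```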
`pyproject.toml` is the primary record of dependencies and is used by Poetry for package management. Dependencies are also exported to `requirements.txt`, which is typically used by pip.

The process for updating packages is:

1. Update the dependency in `pyproject.toml`.
2. Regenerate the `poetry.lock` file with: `poetry lock --no-update`
3. Export the dependencies with: `poetry export -f requirements.txt --output requirements.txt --with dev`
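For illustration, a Poetry-managed `pyproject.toml` records dependencies under the `[tool.poetry.dependencies]` table, with development tools in a `dev` group (the package names and version constraints below are examples, not the project's actual pins):

```toml
[tool.poetry.dependencies]
python = "^3.9"
django = "^4.2"

[tool.poetry.group.dev.dependencies]
coverage = "^7.0"
flake8 = "^6.0"
```

After editing such an entry, `poetry lock --no-update` refreshes `poetry.lock`, and the export step regenerates `requirements.txt` so pip-based installs stay in sync.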