COSMOS is a web application designed to manage collections indexed in NASA's Science Discovery Engine (SDE), facilitating precise content selection and allowing metadata modification before indexing.
$ docker-compose -f local.yml build
$ docker-compose -f local.yml up
If you prefer to run the project without Docker, follow these steps:
$ psql postgres
postgres=# create database <some database>;
postgres=# create user <some username> with password '<some password>';
postgres=# grant all privileges on database <some database> to <some username>;
# This next one is optional, but it will allow the user to create databases for testing
postgres=# alter role <some username> with superuser;
Copy .env_sample to .env and update the DATABASE_URL variable with your Postgres credentials:
DATABASE_URL='postgresql://<user>:<password>@localhost:5432/<database>'
Ensure READ_DOT_ENV_FILE is set to True in config/settings/base.py.
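The .env setup above can be sketched as a small shell snippet. The credential values here (sde_user, s3cret, sde_db) are placeholders for illustration, not project defaults — substitute the ones you created in Postgres:

```shell
# Compose a DATABASE_URL from placeholder credentials and write it to .env.
# sde_user / s3cret / sde_db are hypothetical values -- use your own.
DB_USER=sde_user
DB_PASSWORD=s3cret
DB_NAME=sde_db
echo "DATABASE_URL='postgresql://${DB_USER}:${DB_PASSWORD}@localhost:5432/${DB_NAME}'" > .env
cat .env  # prints the DATABASE_URL line for a quick visual check
```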
Run the initial migration if necessary:
$ python manage.py migrate
Then start the development server:
$ python manage.py runserver
$ docker-compose -f local.yml run --rm django python manage.py createsuperuser
Create additional users through the admin interface (/admin).
To load collections:
$ docker-compose -f local.yml run --rm django python manage.py loaddata sde_collections/fixtures/collections.json
Navigate to the production server, then to the project folder. Run the following command to create a backup:
docker-compose -f production.yml run --rm --user root django python manage.py dumpdata --natural-foreign --natural-primary --exclude=contenttypes --exclude=auth.Permission --indent 2 --output /app/backups/prod_backup-20240812.json
The backups folder is mounted outside the Docker container, so the file is accessible on the host. Copy it to your local machine:
scp sde:/home/ec2-user/sde_indexing_helper/backups/prod_backup-20240812.json prod_backup-20240812.json
mv ~/prod_backup-20240812.json <project_path>/prod_backup-20240812.json
Finally, load the backup into your local database:
docker-compose -f local.yml run --rm django python manage.py loaddata prod_backup-20240812.json
$ docker-compose -f local.yml run --rm django python manage.py shell
>>> from django.contrib.contenttypes.models import ContentType
>>> ContentType.objects.all().delete()
>>> exit()
$ docker cp /path/to/your/backup.json container_name:/path/inside/container/backup.json
$ docker-compose -f local.yml run --rm django python manage.py loaddata /path/inside/container/backup.json
$ docker-compose -f local.yml run --rm django python manage.py migrate
If the JSON file is particularly large (>1.5 GB), Docker might struggle with this method. In that case, use PostgreSQL's pg_dump and pg_restore commands as an alternative.
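One possible shape for that alternative is sketched below. It assumes the Postgres service in the compose files is named postgres (as in the standard Cookiecutter Django layout); the <user> and <database> placeholders are the credentials from your .env file — check your own compose configuration before running:

```shell
# Dump the production database to a compressed custom-format archive.
docker-compose -f production.yml exec postgres pg_dump -U <user> -Fc <database> > prod_backup.dump

# Restore it into the local database (-T disables the TTY so stdin redirection works).
docker-compose -f local.yml exec -T postgres pg_restore -U <user> -d <database> --clean --no-owner < prod_backup.dump
```

The custom format (-Fc) is compressed and lets pg_restore drop and recreate objects with --clean, which avoids the memory pressure of loading a huge JSON fixture through Django.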
$ mypy sde_indexing_helper
To run tests and check coverage:
$ coverage run -m pytest
$ coverage html
$ open htmlcov/index.html
To run the test suite directly:
$ pytest
Refer to the Cookiecutter Django documentation.
$ pip install celery
$ cd sde_indexing_helper
$ celery -A config.celery_app worker -l info
Please note: for Celery's import magic to work, it matters where the celery commands are run. Run them from the same directory as manage.py.
$ cd sde_indexing_helper
$ celery -A config.celery_app beat
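For local development, Celery can also run the beat scheduler embedded in the worker process via the -B flag, so you only need one terminal. This is a convenience for development only; the Celery documentation advises against -B in production:

```shell
# Run a worker with an embedded beat scheduler (development only).
cd sde_indexing_helper
celery -A config.celery_app worker -B -l info
```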
To install pre-commit hooks:
$ pip install pre-commit
$ pre-commit install
$ pre-commit run --all-files
Sign up for a free account at Sentry and set the DSN URL in production.
Refer to the detailed Cookiecutter Django Docker documentation.
Documented here.
We welcome contributions to improve the project! Before you begin, please take a moment to review our Contributing Guidelines. These guidelines will help you understand the process for submitting new features, bug fixes, and other improvements.
Eventually, job creation will be done seamlessly by the webapp. Until then, edit the config.py file with the details of the sources you want to create jobs for, then run generate_jobs.py.
/sde_indexing_helper/templates/
/sde_indexing_helper/static/js
/sde_indexing_helper/static/css
/sde_indexing_helper/static/images
tmux new -s docker_django
Once you are inside, you can run dmshell.
To reattach to the session later:
tmux attach -t docker_django
To delete the session:
tmux kill-session -t docker_django