The new and improved DDW Analyst UI
Make sure you're starting with a clean DB volume, so Docker knows to create the new User
docker-compose down
docker volume rm metadata2
Create a persistent dev volume
docker volume create --name=metadata2
Create a self-signed certificate
mkdir -p ssl && openssl req -newkey rsa:2048 -new -nodes -x509 -days 3650 -keyout ssl/privkey.pem -out ssl/fullchain.pem
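You can sanity-check the generated certificate with openssl. The sketch below is self-contained: it creates a certificate in a temporary directory (passing -subj so the command runs non-interactively, which the interactive command above does not do) and prints its subject to confirm it was created.

```shell
# Sketch: generate a self-signed certificate into a temp dir, then
# print its subject to confirm creation. -subj makes it non-interactive.
tmp=$(mktemp -d)
openssl req -newkey rsa:2048 -new -nodes -x509 -days 3650 \
  -subj "/CN=localhost" \
  -keyout "$tmp/privkey.pem" -out "$tmp/fullchain.pem" 2>/dev/null
openssl x509 -in "$tmp/fullchain.pem" -noout -subject -enddate
```

The same `openssl x509 -noout -subject -enddate` check can be pointed at ssl/fullchain.pem after running the command above.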
Build your app
docker-compose up --build -d
Fetch CSV files from GitHub
docker-compose exec web python manage.py update_csv_files
Migrate the database.
docker-compose exec web python manage.py migrate
Load test data
docker-compose exec web python manage.py loaddata test_data
docker-compose exec web python manage.py loaddata --database=datasets test_datasets
Alternatively, load the real data
export FTSUSER=X
export FTSPASS=Y
docker-compose exec web data_updates/completed_scripts.sh
Create a superuser.
docker-compose exec web python manage.py createsuperuser
Add the bit registry to npm config to install bit dependencies
npm config set @bit:registry https://node.bitsrc.io
Install frontend dependencies
npm install
Bundle frontend code and collect static files
npm run build
Restart the app.
docker-compose restart
Manually run the management command to download CSV files:
docker-compose exec web python manage.py update_csv_files
Create a scheduled event to periodically download updates from the Git repo. The bash script is update_csv_files.sh.
The GitHub repo with the CSV files can be found here.
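For example, the periodic download could be driven by a cron entry like the one below (the hourly schedule and the script path are assumptions; adjust both to your deployment):

```
0 * * * * /root/ddw-analyst-ui/update_csv_files.sh >/dev/null 2>&1
```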
To create a test development DB for local development (e.g. for the virtualenv steps below):
Ensure the line in pg_hba.conf that normally appears as local all postgres peer instead reads local all postgres trust.
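The change to pg_hba.conf looks like the following (a sketch; the column spacing in your file may differ):

```
# before
local   all   postgres   peer
# after
local   all   postgres   trust
```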
Run the script ./dev-db-setup.sh
A database analyst_ui will be created in your local postgres instance.
Access the sample database as the default postgres user using
psql -d postgres
\c analyst_ui
For additional users, edit the script analyst_ui_users.sql, adding the usernames you need, then run the script to grant permissions on all the schemas and tables of analyst_ui.
Follow the steps under the virtualenv section below to integrate with your local environment.
Prerequisites
virtualenv env
source env/bin/activate
pip install -r requirements.txt
python manage.py update_csv_files
python manage.py migrate
python manage.py loaddata test_data
python manage.py loaddata --database=datasets test_datasets
python manage.py createsuperuser
npm install
npm run dev
NB: npm run dev is set to watch for changes and recompile.
export DJANGO_DEV='True' && python manage.py runserver
Make sure you're starting with a clean DB volume, so Docker knows to create the new User:
docker-compose down
docker volume rm metadata2
Create a persistent dev volume:
docker volume create --name=metadata2
Create a self-signed certificate:
mkdir -p ssl && openssl req -newkey rsa:2048 -new -nodes -x509 -days 3650 -keyout ssl/privkey.pem -out ssl/fullchain.pem
Build & run your app with the dev docker config:
docker-compose -f docker-compose.dev.yml up --build
Fetch CSV files from GitHub
docker-compose -f docker-compose.dev.yml exec web python manage.py update_csv_files
Migrate the database:
docker-compose -f docker-compose.dev.yml exec web python manage.py migrate
Load test data:
docker-compose -f docker-compose.dev.yml exec web python manage.py loaddata test_data
docker-compose -f docker-compose.dev.yml exec web python manage.py loaddata --database=datasets test_datasets
Alternatively, you can acquire a db dump of the live data (binary) and import it into your database:
docker-compose -f docker-compose.dev.yml exec db psql -U analyst_ui_user -d analyst_ui -c 'drop schema public CASCADE;'
docker-compose -f docker-compose.dev.yml exec db psql -U analyst_ui_user -d analyst_ui -c 'create schema public;'
docker cp [DB DUMP FILE NAME].backup ddw-analyst-ui_db_1:/var/lib/postgresql/data
docker exec ddw-analyst-ui_db_1 pg_restore -U analyst_ui_user -d analyst_ui /var/lib/postgresql/data/[DB DUMP FILE NAME].backup
docker-compose -f docker-compose.dev.yml exec web python manage.py update_csv_files
docker-compose -f docker-compose.dev.yml exec web python3 manage.py migrate
Create a superuser:
docker-compose -f docker-compose.dev.yml exec web python manage.py createsuperuser
Add the bit registry to npm config to install bit dependencies:
npm config set @bit:registry https://node.bitsrc.io
Install frontend dependencies:
npm install
Dynamic API base URL setting
Add an API_BASE_URL entry in the .env file and assign it either a localhost, staging, or production URL. If not set, it defaults to the URL of the current environment in which the application is running.
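A local .env entry might look like the following (the port is an assumption based on the Django runserver default; use your staging or production URL instead as needed):

```
API_BASE_URL=http://localhost:8000
```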
Start frontend dev environment which watches and collects static files:
npm start
Configure a cronjob to run the run-schedules.sh script, which in turn runs the command that checks for scheduled events every minute:
* * * * * /root/run-schedules.sh >/root/cron-logs.txt 2>&1
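The script itself is not shown here; a minimal sketch of what run-schedules.sh might contain is below. The management command name is a placeholder, not the real one, and the project path is taken from the other cron examples in this document; check the repository for the actual script.

```
#!/bin/bash
# Change into the project directory (path as used elsewhere in this doc)
cd /root/ddw-analyst-ui
# Run the management command that checks for scheduled events
# (<scheduled-events-command> is a hypothetical placeholder)
docker-compose exec -T web python manage.py <scheduled-events-command>
```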
Make sure the following schemas are created:
archives, i.e. CREATE SCHEMA archives;
dataset, i.e. CREATE SCHEMA dataset;
On the first run (i.e. if you have used a database dump without the FTS precoded tables included, or a clean DB setup), run the following scripts:
docker-compose exec web data_updates/manual_data.sh
docker-compose exec web data_updates/manual_data_fts.sh
These scripts add the FTS metadata into the DB. They should only be run once, on initial deployment of the feature, and only if using a clean, fresh DB setup with no data, or a DB dump that does not include the FTS metadata.
To pull the latest FTS updates from the APIs, run:
docker-compose exec web data_updates/fts.sh
Note that at this point the analyst may download the updated codelists, edit them, and re-upload them using the https://ddw.devinit.org/update/ feature.
To precode and join the dependency tables, run:
docker-compose exec web data_updates/fts_precode.sh
This should be run every time a change is made to the codelist entries, or every time the script in step 3 above is run.
To update the manual FTS tables with missing codelist items, finally run the script below:
docker-compose exec web data_updates/fts_diff.sh
Steps 4 and 5 above can be run in one step using:
docker-compose exec web data_updates/finalise_precode.sh
This will be the preferred way of running them from the front end as a scheduled event.
NOTE:
docker-compose exec web data_updates/finalise_precode.sh
which combines steps 4 and 5 into one step. This can be run from the Scheduled Events page on the front end.
Testing is set up to run with Cypress.
To test locally:
Set the baseUrl option in the cypress.json file to one that suits your current need.
Update the frontend/cypress/fixtures/users.json file as needed.
Run npm run cy:run for headless tests and npm run cy:open for interactive tests in a browser.
If you're using Postman for testing the REST API, you can use the following setup:
Send a POST request to http://localhost:8000/api/auth/login/ with Basic Auth and the Username and Password.
Add the following to the request's Tests tab, which will save the token to the environment:
var jsonData = JSON.parse(responseBody);
postman.setEnvironmentVariable("token", jsonData.token);
Then authenticate subsequent requests with the header:
Authorization: Token {{token}}
If certbot has not been installed already, install it with the following commands:
sudo add-apt-repository ppa:certbot/certbot
sudo apt-get install certbot
Run the script below to generate certificates:
certbot renew --dry-run --webroot -w /root/ddw-analyst-ui/static/letsencrypt
If the command above runs successfully, copy the certificates to the ssl folder of the ddw app:
cp -f /etc/letsencrypt/live/ddw.devinit.org/privkey.pem /root/ddw-analyst-ui/ssl/
cp -f /etc/letsencrypt/live/ddw.devinit.org/fullchain.pem /root/ddw-analyst-ui/ssl/
From the ddw-analyst-ui root folder, reload nginx so that the certificates are picked up:
docker exec ddw-analyst-ui_nginx_1 nginx -s reload
Check if there is a cron job set to renew the certificates. If there is none, add the cron task below. This will try to renew the certificate twice a day, every day.
0 */12 * * * /root/ddw-analyst-ui/certbot.sh >/dev/null 2>&1
Read more here
To release, create a release/v[VERSION NUMBER] branch, update the version in package.json, then tag the release v[VERSION NUMBER] - this will deploy to production.
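The release steps above can be sketched as follows. The remote name origin and the commit message are assumptions, and [VERSION NUMBER] is a placeholder as in the text:

```
git checkout -b release/v[VERSION NUMBER]
# update the version field in package.json, then commit
git commit -am "Bump version to v[VERSION NUMBER]"
git tag v[VERSION NUMBER]
git push origin v[VERSION NUMBER]   # this will deploy to production
```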