An API and UI for searching, using, and maintaining Data Sources.
git clone https://github.com/Police-Data-Accessibility-Project/data-sources-app.git
cd data-sources-app
If you don't already have virtualenv, install the package:
pip install virtualenv
Then run the following command to create a virtual environment:
virtualenv -p python3.11 venv
source venv/bin/activate
pip install -r requirements.txt
Either add a .env file to your local root directory or manually export these secrets. Reach out to contact@pdap.io or make noise in Discord if you'd like access.

.env file:

# Local development
VITE_VUE_API_BASE_URL=http://localhost:5000
VITE_VUE_APP_BASE_URL=http://localhost:8888
# Deployed app
# VITE_VUE_API_BASE_URL=https://data-sources.pdap.io/api
# VITE_VUE_APP_BASE_URL=https://data-sources.pdap.io/
# Production database and API
DO_DATABASE_URL=secret
SECRET_KEY=secret
# Mailgun key for notifications
MAILGUN_KEY=secret
# Discord key for #dev-alerts channel
WEBHOOK_URL=secret
export VITE_VUE_API_BASE_URL=http://localhost:5000
export VITE_VUE_APP_BASE_URL=http://localhost:8888
export DO_DATABASE_URL=secret
export SECRET_KEY=secret
export MAILGUN_KEY=secret
export WEBHOOK_URL=secret
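If you go the .env route, the key=value lines can be loaded into the environment with a few lines of standard-library Python (many projects use the python-dotenv package for this instead; the sketch below is just to show what the file format implies):

```python
import os

def load_env(path=".env"):
    """Load KEY=VALUE lines from a .env-style file into os.environ."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            # Skip blank lines and comments like "# Local development"
            if not line or line.startswith("#"):
                continue
            key, _, value = line.partition("=")
            # setdefault: values already exported in the shell win
            os.environ.setdefault(key.strip(), value.strip())
```

Manually exported variables take precedence here, which matches the "either/or" setup above.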
To connect to the database, your IP address will need to be added to the "allow" list in DigitalOcean database settings. Reach out to someone with admin access to get your IP address added.
python3 app.py
cd client
npm install
npm run dev
All unit tests for the API live in app_test.py. It is best practice to add tests for any new feature to ensure it works as expected and that future code changes do not break it. All tests run automatically when a PR into dev is opened, to catch regressions before merge. If a test fails, either the new code needs a second look or the test itself needs updating. Tests are currently run with pytest and can be run locally with the pytest command.
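A new test is just a function whose name starts with test_ added to app_test.py; pytest discovers it automatically and plain assert statements are enough. The function under test below is a stand-in for illustration, not real app code:

```python
# Sketch of the pytest convention used in app_test.py.
# format_source_name is a hypothetical helper, shown only to
# demonstrate the test_* naming and plain-assert style.

def format_source_name(raw: str) -> str:
    """Normalize a data source name for display."""
    return raw.strip().title()

def test_format_source_name():
    # pytest collects any function named test_*; a bare assert suffices
    assert format_source_name("  calls for service ") == "Calls For Service"
```

Running `pytest` from the repo root will pick this up along with the existing suite.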
Endpoints are structured for simplified testing and debugging. Code that interacts with the database lives in functions suffixed with "_results" and is tested against a local SQLite database instance. A limited set of rows (stored in the DATA_SOURCES_ROWS and AGENCIES_ROWS variables in app_test_data.py) is inserted into this local instance on setup; you may need to add rows to fully test other functionality.
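The pattern looks roughly like this; the table, columns, and function name below are illustrative stand-ins, not the real schema:

```python
import sqlite3

# Rows analogous to DATA_SOURCES_ROWS in app_test_data.py (illustrative)
DATA_SOURCES_ROWS = [
    ("rec-1", "Calls for Service", "approved"),
    ("rec-2", "Arrest Records", "pending"),
]

def approved_sources_results(conn):
    """Database-facing half of an endpoint: suffixed with _results,
    so tests can point it at a local SQLite instance."""
    cursor = conn.cursor()
    cursor.execute(
        "SELECT airtable_uid, name FROM data_sources WHERE approval_status = ?",
        ("approved",),
    )
    return cursor.fetchall()

# Test setup: build the throwaway SQLite instance and insert the rows
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE data_sources (airtable_uid TEXT, name TEXT, approval_status TEXT)"
)
conn.executemany("INSERT INTO data_sources VALUES (?, ?, ?)", DATA_SOURCES_ROWS)

print(approved_sources_results(conn))  # [('rec-1', 'Calls for Service')]
```

If your change needs data the seed rows don't cover, extend the ROWS variables rather than the real database.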
The remaining API code lives in functions suffixed with "_query" and is tested against static query results stored in app_test_data.py. Tests that hit the endpoints directly belong in regular_api_checks.py; make sure to add the test function's name to the list at the bottom of that file so it is included in the GitHub Actions run every 15 minutes.
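The split means the "_query" half never touches a database in tests: it is fed canned rows like those in app_test_data.py. A sketch, with illustrative names:

```python
# Static query results, as they might be stored in app_test_data.py (illustrative)
STATIC_RESULTS = [
    ("rec-1", "Calls for Service"),
    ("rec-2", "Arrest Records"),
]

def data_sources_query(results):
    """Formatting half of an endpoint: suffixed with _query and tested
    against static rows instead of a live database connection."""
    return [{"airtable_uid": uid, "name": name} for uid, name in results]

print(data_sources_query(STATIC_RESULTS))
```

Keeping the database access in "_results" and the shaping logic in "_query" is what lets each half be tested in isolation.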
pip install pytest
pytest
Linting is enforced with black on PR creation. Run black on your files before committing so your PR passes this check; any files that require reformatting will be listed in the failed check on the PR.
black app_test.py
Docstrings and type checking are checked using the pydocstyle and mypy modules, respectively. When you open a pull request, a GitHub Action (python_checks.yml) will run and, if it detects missing docstrings or type hints in files you have modified, post them as comments on the pull request. These checks do not block a pull request; they exist primarily as advisory comments to encourage good coding standards. Note that python_checks.yml only runs on pull requests made from within the repo, not from a forked repo.
A few things to know:
The app uses pinia for state management. This works much better with the Composition API than the Options API, so using the Composition API is recommended if you need data from one of the pinia stores.
npm run build
npm run preview
npm run lint
npm run lint:fix
npm run test
npm run test:ci
npm run test:changed
npm run coverage