
CCFP Asset Dashboard

CIP Planner and Asset Dashboard application

Developing

Development requires a local installation of Docker and Docker Compose.

Build application containers:

docker-compose build

Run the app:

docker-compose up

The app will be available at http://localhost:8000. The application database will be exposed on port 32001.

Load the development data:

docker-compose run --rm app python manage.py loaddata asset_dashboard/fixtures/data.json

Import the district boundaries:

docker-compose run --rm app make districts

Restore the FPDCC database

Download the database dump from Dropbox and save the tar file in this repo's root directory. You'll need the PostgreSQL client tools (pg_restore and psql) installed on your machine to run the commands below.

Load the database:

pg_restore -U postgres -h localhost -p 32002 -d fpdcc -O FPDCC_DataMade_backup112221.tar

The password is postgres (as defined in docker-compose.yml).

Connect to the database with psql:

psql -U postgres -h localhost -p 32002

Examine the tables. In the postgres shell:

postgres=# \c fpdcc
psql (14.0, server 12.5)
You are now connected to database "fpdcc" as user "postgres".
fpdcc=# \dt *.*

You should see a list of all the tables.

Debugging

Run the app with a debugger:

docker-compose run --rm -p 8000:8000 app
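Running the container this way binds port 8000 and attaches your terminal, so an interactive debugger will stop at any breakpoint you set. As an illustration (the view and template names here are placeholders, not the app's actual code), you can drop into pdb from a view with Python's built-in breakpoint():

# Hypothetical example view, for illustration only
from django.views.generic import TemplateView

class ExampleView(TemplateView):
    template_name = "example.html"

    def get_context_data(self, **kwargs):
        context = super().get_context_data(**kwargs)
        breakpoint()  # drops you into pdb in the attached terminal
        return context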

Compiling Sass to CSS

This project uses Sass to compile a custom Bootstrap build with house styles. Making changes to the Sass? Use our develop script to auto-compile your changes to CSS, then commit the compiled file.

# On a running app container
docker-compose exec app npm run-script develop

# OR, in a one-off container
docker-compose run --rm app npm run-script develop

# Add your changes to version control
git add asset_dashboard/static/css/bootstrap.custom.css
git commit -m "Update custom Bootstrap build"

Note that you only need to update the Sass to override base Bootstrap styles. See the Bootstrap documentation on theming for more information.

To extend Bootstrap styles and add new styles, edit app.css directly.

Running tests

Run tests without testing the GIS models:

docker-compose -f docker-compose.yml -f tests/docker-compose.yml run --rm app

To test the GIS models in your local environment with the restored database, use this command with the TEST_GIS environment variable:

docker-compose -f docker-compose.yml -f tests/docker-compose.yml run -e TEST_GIS=True --rm app
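How the TEST_GIS flag gates the GIS tests is an implementation detail of the test suite; as a rough, assumed sketch (not a copy of the project's tests), a pytest marker along these lines would skip GIS-model tests unless the variable is set:

import os

import pytest

# Hypothetical marker: skip GIS-model tests unless TEST_GIS=True is set.
requires_gis = pytest.mark.skipif(
    os.getenv("TEST_GIS") != "True",
    reason="Set TEST_GIS=True to run tests against the restored FPDCC database.",
)

@requires_gis
def test_example_gis_query(db):
    # Placeholder assertion; the real tests live in the tests/ directory.
    assert True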

Dumping and Loading Fixture Data

Dump the data:

docker-compose run --rm app python manage.py dumpdata \
    --natural-foreign \
    --indent 2 \
    -e contenttypes \
    -e sessions \
    -o asset_dashboard/fixtures/data.json \
    asset_dashboard auth

Tech Stack

This application is a Django app backed by Postgres and managed with Docker. Parts of the user interface are built with React (all map interfaces and the CIP planner page).

You'll only need Docker on your machine for local development; Docker takes care of all the other dependencies. Read DataMade's how-to documentation for details on the Docker configuration.

The React code is baked into the Django templates. Read more about this approach in DataMade's documentation about Django/React integration.
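As a rough sketch of that pattern (the view, template, and prop names below are illustrative, not the app's actual code), a Django view serializes its data to JSON and passes it through the template context, where the bundled React entry point reads it:

import json

from django.views.generic import TemplateView

class ExamplePlannerView(TemplateView):
    """Hypothetical view showing how data can be handed to a React bundle."""

    template_name = "example_planner.html"

    def get_context_data(self, **kwargs):
        context = super().get_context_data(**kwargs)
        # Rendered into the template and read by the React entry point as props.
        context["props"] = json.dumps({"projects": []})
        return context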

How to Deploy

This application is deployed via Heroku. The Heroku pipeline is set up so that the master branch deploys to the staging app and the deploy branch deploys to the production app. This configuration is documented in DataMade's how-to repo.

There are a few ways to prompt a deployment:

  1. Whenever code is pushed to GitHub and merged to master, Heroku will automatically deploy the master branch to the staging environment. You can trigger this by merging a pull request, or by pushing your local master to GitHub (git push origin master). Once your master branch is ready for production, deploy from your local command line with: git push origin master:deploy. This updates the deploy branch to mirror master, which kicks off the production deploy. This is the preferred way to deploy.
  2. Via the Heroku dashboard's user interface. You'll only ever do this in the rare case that the GitHub/Heroku integration is broken.
  3. Using the Heroku CLI. You'll only ever do this in the rare case that the GitHub/Heroku integration is broken.

Whenever you open a new PR, the Heroku integration is set up to create a new review app. The link for this review app will show up on the GitHub PR page. The PR skeleton description includes instructions for turning on the connection with the remote GIS database (this DB is detailed below).

Details about how databases are connected

The application uses two databases:

  1. A Postgres instance on AWS RDS. This is the application's main database that we write to.

The RDS instance is named ccfp-asset-dashboard. We created three databases within the instance:

  1. production
  2. staging
  3. review

Each app in the Heroku pipeline is configured to connect with the corresponding environment's database.

The RDS security group is configured to accept connections from the application on Heroku. Since Heroku doesn't have static IP addresses, the QuotaGuard Static add-on helps establish a connection with the remote database. For details on how this works, see PR #91 and PR #70, as well as discussions in issues #59 and #60.

  2. The Forest Preserves of Cook County's GIS database. We've set up a remote connection with the FPDCC's database. This connection also requires QuotaGuard Static. (A sketch of how both connections might look in Django settings follows below.)
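For orientation, a two-database Django configuration along these lines is what lets the app write to RDS while reading from the FPDCC GIS database. This is a sketch only; the engine, names, and environment variables below are assumptions, and the real values live in the project's settings module:

# Illustrative sketch of a two-database configuration; not the project's
# actual settings. Environment variable names are assumptions.
import os

DATABASES = {
    # Main application database on AWS RDS (read/write).
    "default": {
        "ENGINE": "django.contrib.gis.db.backends.postgis",
        "NAME": os.getenv("POSTGRES_DB", "asset_dashboard"),
        "HOST": os.getenv("POSTGRES_HOST", "localhost"),
        "USER": os.getenv("POSTGRES_USER", "postgres"),
        "PASSWORD": os.getenv("POSTGRES_PASSWORD", "postgres"),
    },
    # Remote FPDCC GIS database, read only through the unmanaged models.
    "fpdcc": {
        "ENGINE": "django.contrib.gis.db.backends.postgis",
        "NAME": os.getenv("FPDCC_DB", "fpdcc"),
        "HOST": os.getenv("FPDCC_DB_HOST", "localhost"),
        "USER": os.getenv("FPDCC_DB_USER", "postgres"),
        "PASSWORD": os.getenv("FPDCC_DB_PASSWORD", "postgres"),
    },
}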

What you can set in the admin account

In the admin interface, located at the /admin path on the site, an admin user can do these things:

Where to find things in the code

Models

All of the models are in the asset_dashboard/models.py file. We're using two types of models: managed and unmanaged. Models that inherit from the standard Django models.Model class are readable and writable; these are the managed models. Models that inherit from the GISModel class are read-only and unmanaged; they let us use the Django ORM to access the Forest Preserves' GIS database.
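In the app itself the unmanaged models inherit from the GISModel base class; the sketch below uses managed = False directly to show the underlying mechanism, and the class, field, and table names are placeholders rather than the app's actual schema:

from django.contrib.gis.db import models

class ExampleProject(models.Model):
    """Managed model: Django creates and migrates this table."""

    name = models.CharField(max_length=255)

class ExampleTrail(models.Model):
    """Unmanaged model: maps onto an existing table in the FPDCC GIS database."""

    geom = models.MultiLineStringField()

    class Meta:
        managed = False  # Django will not create or migrate this table
        db_table = "example_trails"  # hypothetical table name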

Views

This application uses both standard Django class-based views and the Django REST Framework for JSON and GeoJSON. The views are located at:

URLs

asset_dashboard/urls.py contains the URL patterns that connect to the Django views and DRF endpoints.
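As a rough illustration of how the pieces hook together (the route and class names below are placeholders), the URL patterns point at both the class-based views and the DRF endpoints:

from django.urls import path

from asset_dashboard import endpoints, views

# Illustrative sketch only; the real patterns live in asset_dashboard/urls.py.
urlpatterns = [
    # Standard Django class-based view rendering a template
    path("example/", views.ExampleView.as_view(), name="example"),
    # DRF endpoint returning JSON for the React front end
    path(
        "api/example-projects/",
        endpoints.ExampleProjectListApiView.as_view(),
        name="example-project-api",
    ),
]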

Serializers and Forms

The forms rendered in the HTML templates use Django form classes and are located at asset_dashboard/forms.py. These forms are served up by the views.

The Django REST Framework serializers are located at asset_dashboard/serializers.py. These serializers are only used by the asset_dashboard/endpoints.py file. Together, these pieces of the Django REST Framework handle AJAX requests from the React code: any GET or POST request made from React happens outside of the typical Django view/template cycle and is handled on the backend by the Django REST Framework.
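A minimal sketch of that serializer/endpoint pairing, with placeholder class and model names standing in for the app's real ones:

from rest_framework import serializers
from rest_framework.generics import ListCreateAPIView

from asset_dashboard import models

# Hypothetical serializer, in the style of asset_dashboard/serializers.py.
class ExampleProjectSerializer(serializers.ModelSerializer):
    class Meta:
        model = models.Project  # placeholder model name
        fields = "__all__"

# Hypothetical endpoint, in the style of asset_dashboard/endpoints.py, that
# React calls with GET to list projects and POST to create them.
class ExampleProjectListApiView(ListCreateAPIView):
    queryset = models.Project.objects.all()
    serializer_class = ExampleProjectSerializer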

React

As mentioned, parts of this codebase use React. You'll need to dive into the React code if you're dealing with anything related to the maps and CIP planner. All of the React code is located in asset_dashboard/static/js.

The CIP planner is located in the ~/js/PortfolioPlanner.js file (and you should be able to find all of the component's local imports through that).

The relevant map components are located in:

For understanding this Django/React integration, see DataMade's documentation on that approach. Reading it will help you understand how the React code is packaged within the Django templates' HTML, as well as how to use the Django views to pass data to React.

Templates

All of the HTML is located in the asset_dashboard/templates directory.

Static

All JavaScript and CSS files are contained in the asset_dashboard/static directory.

Management commands

You'll be able to use any of the built-in Django management commands, but you'll need to run them inside the Docker container. For example, to create a new migration: docker-compose run --rm app python manage.py makemigrations.

We've created some extra commands, located in asset_dashboard/management/commands; a sketch of the general command structure follows the list below.

  1. clear_cache.py automatically runs whenever a new version of the app deploys.
  2. Create the zones and political boundaries. These all run whenever the application deploys, but also need to be run when developing locally (as documented in the README with the docker-compose run --rm app make districts command). These are orchestrated with a Makefile and should be run together.
    • create_zone_geojson.py creates the zone boundaries based on the GIS database.
    • import_boundaries.py creates political boundaries from public data
  3. load_development_data.py loads some fake data for local development. This is documented in the README steps for setting up local development.
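All of these follow the standard Django management command structure; a minimal sketch (not one of the real commands) looks like this:

# General shape of a custom management command; the real commands in
# asset_dashboard/management/commands define their own arguments and logic.
from django.core.management.base import BaseCommand

class Command(BaseCommand):
    help = "Example command illustrating the structure of the custom commands."

    def handle(self, *args, **options):
        self.stdout.write(self.style.SUCCESS("Ran the example command."))

Like the built-in commands, the custom ones run inside the container, e.g. docker-compose run --rm app python manage.py clear_cache.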

Docker commands

See the development section above for setting up the application for local development.

Some other helpful Docker commands: