HakaiInstitute / hakai-datasets

Hakai Datasets that are going into https://catalogue.hakai.org/erddap/

Hakai Datasets

This repository contains the components needed to produce and maintain Hakai's datasets on the Hakai ERDDAP servers.

All datasets defined within the datasets.d folder, in the ERDDAP XML format, are made available on the production server.
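For reference, here is a minimal sketch of what a single-dataset file in datasets.d could look like; the dataset type, datasetID, and source URL are purely illustrative and do not correspond to an actual Hakai dataset:

<!-- datasets.d/example_dataset.xml (illustrative sketch only) -->
<dataset type="EDDTableFromErddap" datasetID="example_dataset" active="true">
    <!-- re-serves a tabledap dataset already published on another ERDDAP -->
    <sourceUrl>https://catalogue.hakai.org/erddap/tabledap/example_dataset</sourceUrl>
</dataset>

Each file holds a single dataset element, which the deployment tooling is assumed to combine into ERDDAP's datasets.xml.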

Hakai deploys ERDDAP as Docker containers using the docker-erddap image. Continuous Integration is handled via the erddap-deploy actions, and the container configuration is handled via CapRover applications.

See GitHub Deployments for all active deployments maintained via this repository.

Configuration

This repository's deployments are managed via CapRover applications. To configure a deployment, follow these steps:

Testing environment

For local development, make a copy of the sample.env file as .env. Update the environment variables to match the deployed parameters, but omit the email parameters, baseHttpsUrl, and baseUrl.
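For example (HOST_PORT is the only variable named elsewhere in this README; see sample.env for the full list):

cp sample.env .env
# edit .env and set the port the local instance will be published on, e.g.
# HOST_PORT=8080
# leave the email parameters, baseHttpsUrl and baseUrl unset for local development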

Add test files if needed within the datasets/ directory.

Run docker-compose

docker-compose up -d

If successful, you should be able to access your local ERDDAP instance at http://localhost:{HOST_PORT}/erddap (default: http://localhost:8080/erddap)
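A quick way to confirm the stack came up correctly (the erddap service name is an assumption; check docker-compose.yml for the actual name):

docker-compose ps                                   # the container should be listed as "Up"
docker-compose logs -f erddap                       # watch ERDDAP start (assumed service name)
curl -I http://localhost:8080/erddap/index.html     # should return HTTP 200 once ready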

Hakai Database integration

The SQL queries available in the view directory, which generate the different views and tables, are run nightly on the hecate.hakai.org server from the master branch.
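As a rough illustration only (the schedule, paths, host, and database name below are assumptions, not the actual hecate.hakai.org configuration), a nightly run of those queries could look like this:

# hypothetical crontab entry on hecate.hakai.org
# 0 2 * * *  bash /opt/hakai-datasets/refresh_views.sh

# refresh_views.sh (sketch): re-run every SQL file from the master branch
git -C /opt/hakai-datasets pull origin master
for f in /opt/hakai-datasets/view/*.sql; do
    psql -h localhost -d hakai -f "$f"
done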

Continuous Integration

All commits to this repository are tested by different linters whenever a PR is opened against, or a commit is pushed to, the development and master branches.

We are using the super-linter action to run the different automated linting checks.
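super-linter can also be run locally to reproduce CI failures before pushing; the image tag and mount point below follow the super-linter documentation and may need to be adjusted to match the version pinned in this repository's workflow:

docker run --rm \
  -e RUN_LOCAL=true \
  -v "$(pwd)":/tmp/lint \
  ghcr.io/super-linter/super-linter:latest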

If the linter tests and erddap_deploy tests pass, changes will automatically be reflected on the associated deployment via the erddap-deploy action.