
Datacube Open Web Services
==========================

.. image:: https://github.com/opendatacube/datacube-ows/actions/workflows/lint.yml/badge.svg
    :target: https://github.com/opendatacube/datacube-ows/actions/workflows/lint.yml

.. image:: https://github.com/opendatacube/datacube-ows/actions/workflows/test.yml/badge.svg
    :target: https://github.com/opendatacube/datacube-ows/actions/workflows/test.yml

.. image:: https://github.com/opendatacube/datacube-ows/actions/workflows/docker.yml/badge.svg
    :target: https://github.com/opendatacube/datacube-ows/actions/workflows/docker.yml

.. image:: https://github.com/opendatacube/datacube-ows/actions/workflows/scan.yml/badge.svg
    :target: https://github.com/opendatacube/datacube-ows/actions/workflows/scan.yml

.. image:: https://codecov.io/gh/opendatacube/datacube-ows/branch/master/graph/badge.svg
    :target: https://codecov.io/gh/opendatacube/datacube-ows

.. image:: https://img.shields.io/pypi/v/datacube?label=datacube
    :alt: PyPI

Datacube-OWS provides a way to serve data indexed in an Open Data Cube as visualisations, through open web services (OGC WMS, WMTS and WCS).
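Once an OWS instance is up, it can be exercised with any standard OGC client. Below is a minimal sketch using the third-party OWSLib package; the endpoint URL (host, port and path) is an assumption to be adjusted for your deployment:

.. code-block:: python

    # Minimal sketch: query a running OWS instance with OWSLib.
    # The URL below is a placeholder -- substitute your deployment's address.
    from owslib.wms import WebMapService

    wms = WebMapService("http://localhost:8000/wms", version="1.3.0")
    print(wms.identification.title)  # service title from the OWS config
    print(list(wms.contents))        # layer names the service advertises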

Features
--------

Known CRS Limitations
^^^^^^^^^^^^^^^^^^^^^

  1. ODC datasets with WKT-format CRSs will not work with OWS: data from such datasets will never be displayed. OWS currently only works with EPSG-coded CRSs.

  2. Datasets that straddle the anti-meridian or the north or south polar regions will cause issues with the legacy postgres driver.

These are fundamental limitations of the way OWS works with the postgres ODC index driver. They will be addressed in v1.9.0, but only for the new ODC postgis index driver.
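If you are unsure whether the first limitation affects you, a check along the following lines will list datasets whose CRS carries no EPSG code. This is a sketch using the standard datacube Python API; ``my_product`` is a placeholder:

.. code-block:: python

    # Sketch: flag datasets whose CRS has no EPSG code (i.e. WKT-only CRSs),
    # which OWS cannot currently display. "my_product" is a placeholder.
    import datacube

    dc = datacube.Datacube()
    for ds in dc.find_datasets(product="my_product"):
        if ds.crs is not None and ds.crs.epsg is None:
            print(f"{ds.id}: non-EPSG CRS, will not display in OWS")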

Community
---------

This project welcomes community participation.

Join the `ODC Discord <https://discord.com/invite/4hhBQVas5U>`__ if you need help setting up or using this project, or the Open Data Cube more generally. Conversation about datacube-ows is mostly concentrated in the Discord channel #wms.

Please help us to keep the Open Data Cube community open and inclusive by reading and following our `Code of Conduct <code-of-conduct.md>`__.

Setup
-----

Datacube-OWS (and datacube-core itself) has many complex dependencies on particular versions of geospatial libraries. Dependency conflicts are almost unavoidable in environments that also contain other large, complex geospatial software packages. We therefore strongly recommend some kind of containerised solution, and we supply scripts for building appropriate Docker containers.

Linting
-------

.. code-block:: console

    flake8 . --exclude Dockerfile --ignore=E501 --select=F401,E201,E202,E203,E502,E241,E225,E306,E231,E226,E123,F811
    isort --check --diff **/*.py
    autopep8 -r --diff . --select F401,E201,E202,E203,E502,E241,E225,E306,E231,E226,E123,F811

Configuration and Environment
-----------------------------

The configuration file format for OWS is fully documented `here <https://datacube-ows.readthedocs.io/en/latest/configuration.html>`_.

An example configuration file, ``datacube_ows/ows_cfg_example.py``, is also provided, but may not be as up-to-date as the formal documentation.
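For orientation, the rough shape of a config file is sketched below. This is heavily abridged and illustrative only; the documentation linked above is authoritative on required fields and values:

.. code-block:: python

    # Abridged, illustrative sketch of an OWS configuration object.
    # Not a complete or valid config -- see the formal documentation.
    ows_cfg = {
        "global": {
            "title": "My Open Data Cube OWS",
            "allowed_urls": ["http://localhost:8000"],
            "services": {"wms": True, "wmts": True, "wcs": False},
        },
        "layers": [
            # One entry per layer (or folder of layers) to be served.
        ],
    }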

Environment variables that directly or indirectly affect the running of OWS are documented `here <https://datacube-ows.readthedocs.io/en/latest/environment_variables.html>`_.
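Several of those variables configure the database connection (the same names appear in the Docker example later in this README). One quick way to check that they are set in your current environment (a sketch only):

.. code-block:: python

    # Sketch: report whether the database-related environment variables
    # used elsewhere in this README are set in the current environment.
    import os

    for var in ("DB_HOSTNAME", "DB_PORT", "DB_DATABASE", "DB_USERNAME", "DB_PASSWORD"):
        print(f"{var} = {os.environ.get(var, '<unset>')}")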

Docker-Compose
--------------

setup env by export
^^^^^^^^^^^^^^^^^^^

We use docker-compose to make development and testing of the containerised ows images easier.

Set up your environment by creating a .env file (see below).

To start OWS with flask connected to a pre-existing database on your local machine::

    docker-compose up

The first time you run docker-compose, you will need to add the ``--build`` option::

    docker-compose up --build

To start OWS with a pre-indexed database::

    docker-compose -f docker-compose.yaml -f docker-compose.db.yaml up

To start OWS with the database and gunicorn instead of flask (production)::

    docker-compose -f docker-compose.yaml -f docker-compose.db.yaml -f docker-compose.prod.yaml up

The default environment variables (in the .env file) can be overridden by setting local environment variables, e.g. to enable pydev for PyCharm (needs a rebuild to install the python libs; hot reload is not supported, so FLASK_DEV must be set to production)::

    export PYDEV_DEBUG=yes
    export FLASK_DEV=production
    docker-compose -f docker-compose.yaml -f docker-compose.db.yaml up --build

setup env with .env file
^^^^^^^^^^^^^^^^^^^^^^^^

.. code-block:: console

    cp .env_simple .env   # for a single ows config file setup
    cp .env_ows_root .env # for multi-file ows config with ows_root_cfg.py
    docker-compose up
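Once the containers are up, a GetCapabilities request makes a quick smoke test. The port and ``/wms`` path below are assumptions; adjust them to match your compose setup:

.. code-block:: python

    # Minimal smoke test against a local OWS instance. Port 8000 and the
    # /wms path are assumptions -- adjust to match your docker-compose setup.
    import requests

    resp = requests.get(
        "http://localhost:8000/wms",
        params={"service": "WMS", "request": "GetCapabilities"},
        timeout=30,
    )
    resp.raise_for_status()
    print(resp.headers.get("Content-Type"))  # expect an XML capabilities document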

Docker
------

To run the standard Docker image, create a docker volume containing your ows config files and use something like::

    docker build --tag=name_of_built_container .

    docker run --rm \
        -e DATACUBE_OWS_CFG=datacube_ows.config.test_cfg.ows_cfg \  # Location of config object
        -e AWS_NO_SIGN_REQUEST=yes \                 # Allowing access to AWS S3 buckets
        -e AWS_DEFAULT_REGION=ap-southeast-2 \       # AWS Default Region (supply even if NOT accessing files on S3! See Issue #151)
        -e SENTRY_DSN=https://key@sentry.local/projid \  # Key for Sentry logging (optional)
        -e DB_HOSTNAME=172.17.0.1 -e DB_PORT=5432 \  # Hostname/IP address and port of the ODC postgres database
        -e DB_DATABASE=datacube \                    # Name of the ODC postgres database
        -e DB_USERNAME=cube -e DB_PASSWORD=DataCube \  # Username and password for the ODC postgres database
        -e PYTHONPATH=/code \                        # The default PATH is under env; change this to target /code
        -p 8080:8000 \                               # Publish the gunicorn port (8000) on the container at port 8080 on the host machine
        --mount source=test_cfg,target=/code/datacube_ows/config \  # Mount the docker volume where the config lives
        name_of_built_container

The image is based on the standard ODC container.

Installation with Conda
-----------------------

The following instructions are for installing on a clean Linux system.

The following approaches have also been tested:

Flask Dev Server
----------------

Local Postgres database
^^^^^^^^^^^^^^^^^^^^^^^

  1. Create an empty database and db_user.

  2. Run ``datacube system init`` after creating a datacube config file.

  3. Add a product to your datacube with ``datacube product add <url>``. Some examples are here: https://github.com/GeoscienceAustralia/dea-config/tree/master/products

  4. Index datasets into your product, for example (see also https://datacube-ows.readthedocs.io/en/latest/usage.html):

     ::

         aws s3 ls s3://deafrica-data/jaxa/alos_palsar_mosaic/2017/ --recursive \
             | grep yaml | awk '{print $4}' \
             | xargs -n1 -I {} datacube dataset add s3://deafrica-data/{}

  5. Write an OWS config file to identify the products you want available in OWS; see the example here: https://github.com/opendatacube/datacube-ows/blob/master/datacube_ows/ows_cfg_example.py

  6. Run ``datacube-ows-update --schema --role <db_read_role>`` to create the OWS-specific tables.

  7. Run ``datacube-ows-update`` to generate OWS extents (a quick way to sanity-check the index first is sketched below).
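Before running the ``datacube-ows-update`` steps, it can be worth confirming from Python that your products and datasets are actually visible in the ODC index. A minimal sketch using the standard datacube API (``my_product`` is a placeholder product name):

.. code-block:: python

    # Sketch: confirm products and datasets are visible in the ODC index
    # before running datacube-ows-update. "my_product" is a placeholder.
    import datacube

    dc = datacube.Datacube()
    print(dc.list_products()[["name", "description"]])  # indexed products
    print(len(dc.find_datasets(product="my_product")))  # dataset count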

Apache2 mod_wsgi
----------------

Getting things working with Apache2 mod_wsgi is not trivial and probably not the best approach in most circumstances, but it may make sense for you.

If you use the pip install approach described above, your OS's pre-packaged python3 Apache2 mod-wsgi package should suffice.

::

    cd /etc/apache2/mods-enabled
    ln -s ../mods-available/wsgi.load .
    ln -s ../mods-available/wsgi.conf .

Credits
-------

This package was created with `Cookiecutter`_ and the `audreyr/cookiecutter-pypackage`_ project template.

.. _Cookiecutter: https://github.com/audreyr/cookiecutter
.. _`audreyr/cookiecutter-pypackage`: https://github.com/audreyr/cookiecutter-pypackage