A software stack that automatically submits jobs to HyP3 over a specified area, making it easy to monitor areas of interest.
Events represent an area of interest and a timeframe for which RTC and InSAR products will be generated. Events are managed (manually) as records in a DynamoDB table:
```json
{
  "event_id": "myEvent",
  "wkt": "POINT (0 0)",
  "processing_timeframe": {
    "start": "2021-02-01T00:00:00+00:00",
    "end": "2021-03-01T00:00:00+00:00"
  }
}
```
The `processing_timeframe.end` attribute is optional and can extend into the future. Any additional attributes required by a client application can be included in an event record.
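Since events are managed manually, registering a new event amounts to writing a record like the one above to the table. Below is a minimal sketch using boto3; the table name is a placeholder for the table created by your deployment.

```python
# Sketch: register an event by writing a record to the DynamoDB event table.
import boto3

dynamodb = boto3.resource('dynamodb')
table = dynamodb.Table('hyp3-event-monitoring-EventTable')  # placeholder table name

table.put_item(
    Item={
        'event_id': 'myEvent',
        'wkt': 'POINT (0 0)',
        'processing_timeframe': {
            'start': '2021-02-01T00:00:00+00:00',
            'end': '2021-03-01T00:00:00+00:00',
        },
    }
)
```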
Event monitoring routinely searches ASF's inventory for Sentinel-1 IW SLC granules matching any registered events. For each such granule, one RTC job and two InSAR jobs (for the nearest and next-nearest neighbors) are automatically submitted to HyP3. Output products of HyP3 jobs are automatically migrated to an S3 bucket with public read permissions for long-term archival and distribution.
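The sketch below illustrates this search-and-submit pattern using the `asf_search` and `hyp3_sdk` packages; it is not the actual `find_new` implementation, and the event geometry, timeframe, and credentials are placeholders.

```python
# Illustrative sketch of the search-and-submit pattern (not the find_new code).
import asf_search
import hyp3_sdk

event_wkt = 'POINT (0 0)'  # placeholder event geometry

# Find Sentinel-1 IW SLC granules intersecting the event area and timeframe
results = asf_search.geo_search(
    intersectsWith=event_wkt,
    platform=asf_search.PLATFORM.SENTINEL1,
    processingLevel=asf_search.PRODUCT_TYPE.SLC,
    beamMode=asf_search.BEAMMODE.IW,
    start='2021-02-01T00:00:00Z',
    end='2021-03-01T00:00:00Z',
)

hyp3 = hyp3_sdk.HyP3(username='myUsername', password='myPassword')  # EDL credentials
for product in results:
    granule = product.properties['sceneName']
    hyp3.submit_rtc_job(granule, name='myEvent')
    # The monitoring also pairs each granule with its nearest and next-nearest
    # neighbors and submits two InSAR jobs, e.g.:
    # hyp3.submit_insar_job(granule, neighbor_granule, name='myEvent')
```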
A public REST API is provided to query events and products:

- `/events` returns all registered events
- `/events/<event_id>` returns the requested event with a list of its products
- `/recent_products` returns all products processed in the last week
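The API can be queried with any HTTP client; a small sketch is shown below. The base URL is a placeholder for your deployment's API endpoint, and the `products` key is assumed from the description above.

```python
# Sketch: query the event monitoring REST API (base URL is a placeholder).
import requests

API_URL = 'https://<your-api-endpoint>'

events = requests.get(f'{API_URL}/events').json()
for event in events:
    detail = requests.get(f'{API_URL}/events/{event["event_id"]}').json()
    print(event['event_id'], len(detail.get('products', [])))

recent_products = requests.get(f'{API_URL}/recent_products').json()
```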
## Deployment

Some resources referenced below are required for a successful deployment but are managed separately: a CloudFormation artifact bucket, Earthdata Login credentials for submitting jobs, and the HyP3 deployment the stack will use.
Review the parameters in `cloudformation.yml` for deploy-time configuration options.

Install the dependencies for each component:

```sh
python -m pip install -r requirements-find-new.txt -t find_new/src
python -m pip install -r requirements-api.txt -t api/src
python -m pip install -r requirements-harvest-products.txt -t harvest_products/src
```
Package the CloudFormation template:

```sh
aws cloudformation package \
    --template-file cloudformation.yml \
    --s3-bucket <CloudFormation artifact bucket> \
    --output-template-file packaged.yml
```
Deploy to AWS with CloudFormation:

```sh
aws cloudformation deploy \
    --stack-name <name of your HyP3 Event Monitoring Stack> \
    --template-file packaged.yml \
    --role-arn <arn for your deployment user/role> \
    --capabilities CAPABILITY_IAM \
    --parameter-overrides \
        "EDLUsername=<EDL Username to submit jobs to HyP3>" \
        "EDLPassword=<EDL Password to submit jobs to HyP3>" \
        "HyP3URL=<URL to a HyP3 deployment for the stack to use>"
```
## Testing
The HyP3 Event Monitoring source contains test files in `tests/`. To run them you need to do a bit of setup first.
- Add components to the Python path:

  ```sh
  export PYTHONPATH="${PYTHONPATH}:${PWD}/find_new/src:${PWD}/api/src:${PWD}/harvest_products/src"
  ```

- Set up environment variables:

  ```sh
  export $(cat tests/cfg.env | xargs)
  ```

- Install test requirements:

  ```sh
  python -m pip install -r requirements-all.txt
  ```

- Run tests:

  ```sh
  pytest tests/
  ```