# Tidmarsh Data Processing

This repository outlines the standard workflow for all Living Observatory datasets.

## Workflow
1. Create an issue for the new dataset
Note: This step should be done as soon as you have made a plan to collect data.
- Go to the issues page
- Click on "New Issue"
- Fill out required/relevant fields in the issue template for the dataset
- Add labels to the issue if appropriate (e.g. vegetation plots, DTS, piezometer)
- Move the issue into the "On Deck" column on the data workflow page
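Issues can also be opened programmatically rather than through the web UI. As a sketch, the request body for GitHub's "create an issue" REST endpoint can be assembled as below; the dataset name, body text, and labels are hypothetical examples, and the real fields should come from the issue template:

```python
import json

def build_issue_payload(dataset_name, labels=None):
    """Assemble the JSON body for GitHub's "create an issue" REST endpoint.

    The title convention and placeholder body are assumptions; fill in the
    required/relevant fields from the actual dataset issue template.
    """
    return {
        "title": f"Dataset: {dataset_name}",
        "body": "Fill out the required/relevant fields from the issue template here.",
        "labels": labels or [],
    }

# Example: an issue for a hypothetical piezometer dataset
payload = build_issue_payload("piezometer-2023", labels=["piezometer"])
print(json.dumps(payload, indent=2))
```

POSTing this payload to the repository's `/issues` endpoint (with an authenticated token) creates the issue; moving it into the "On Deck" column is still done on the data workflow page.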
2. Clean and reformat the data according to your lab group's standards
- Move the issue into the "Clean & Standardize" column on the data workflow page until this step is completed
- Reference the page in this repository that documents your lab group's data standards
- Once done, check the "QA/QC" and "Format / standardize data" boxes in the issue's to-do list
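As an illustration of this step, the sketch below normalizes column names to snake_case and rewrites timestamps as ISO 8601 UTC using only the standard library. The column names, input timestamp format, and target conventions are assumptions; substitute whatever your lab group's documented standards specify:

```python
import csv
import io
from datetime import datetime, timezone

def standardize(raw_csv, timestamp_col="time"):
    """Normalize column names to snake_case and timestamps to ISO 8601 UTC.

    Assumes input timestamps look like "06/01/2023 12:30"; adjust the
    strptime format to match the raw data actually collected.
    """
    reader = csv.DictReader(io.StringIO(raw_csv))
    rows = []
    for row in reader:
        clean = {}
        for key, value in row.items():
            # "Water Depth" -> "water_depth"
            clean[key.strip().lower().replace(" ", "_")] = value.strip()
        # Rewrite the timestamp column as an ISO 8601 UTC string
        ts = datetime.strptime(clean[timestamp_col], "%m/%d/%Y %H:%M")
        clean[timestamp_col] = ts.replace(tzinfo=timezone.utc).isoformat()
        rows.append(clean)
    return rows

raw = "Time,Water Depth\n06/01/2023 12:30,1.42\n"
print(standardize(raw))
```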
3. Perform a final check on the dataset
Note: This final review may need to be done by a specific person within the lab group or by the database manager. If you are not personally performing the final check, assign the dataset issue to the person responsible for the final review.
- Move the issue into the "Final Check" column on the data workflow page until this step is completed
- Confirm that the dataset has all required fields for the dataset type
- Confirm that there are no errors in the dataset
- Confirm that the dataset has the appropriate structure and file format (e.g. CSV, JSON)
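Parts of this check can be automated. A minimal sketch, assuming the cleaned dataset has been loaded as a list of dicts (e.g. from a CSV) and that `required_fields` comes from the documented schema for the dataset type:

```python
def final_check(rows, required_fields):
    """Return a list of problems found; an empty list means the checks pass.

    This only covers the required-field check; structural and
    domain-specific error checks still need a human (or more code).
    """
    problems = []
    for i, row in enumerate(rows):
        missing = [f for f in required_fields if not row.get(f)]
        if missing:
            problems.append(f"row {i}: missing {', '.join(missing)}")
    return problems

rows = [
    {"time": "2023-06-01T12:30:00+00:00", "water_depth": "1.42"},
    {"time": "", "water_depth": "1.40"},  # missing timestamp
]
print(final_check(rows, ["time", "water_depth"]))
```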
4. Import the dataset into the Living Observatory database
Note: This may be done on your own using the chain-API, or by the database manager if appropriate
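A minimal sketch of such an import, assuming one sensor reading is posted as JSON over HTTP. The resource URL and field names below are placeholders, not the actual chain-API layout; discover the real resources through the API's own links, or ask the database manager to run the import:

```python
import json
import urllib.request

# Placeholder URL; substitute the real chain-API data resource for your site
DATA_URL = "http://chain-api.example.org/sites/1/devices/1/sensors/1/data/"

def build_post(timestamp, value, url=DATA_URL):
    """Build an HTTP POST request for one sensor reading.

    The {"timestamp", "value"} field names are assumptions; check the
    API's responses for the fields it actually expects.
    """
    body = json.dumps({"timestamp": timestamp, "value": value}).encode()
    return urllib.request.Request(
        url,
        data=body,
        method="POST",
        headers={"Content-Type": "application/json"},
    )

req = build_post("2023-06-01T12:30:00+00:00", 1.42)
print(req.full_url, req.get_method())
```

Sending the request (e.g. with `urllib.request.urlopen(req)`) would perform the import; it is built but not sent here so the sketch has no side effects.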
Once this is done:
- Click on "Close Issue" on the issues page
- Comment the dataset's API link on the issue so any team member can easily locate and extract it
- Check the last two boxes in the to-do list for the dataset issue
- Move the issue into the "Done" column on the data workflow page