Open rburghol opened 10 months ago
@rburghol A simple batch script for running multiple download and import scripts. Note that I've hard-coded the start/end years of the datasets; that's probably not necessary given our set-up of the download script.
```shell
# Set needed environment vars
MODEL_ROOT=/backup/meteorology/
MODEL_BIN=$MODEL_ROOT
SCRIPT_DIR=/opt/model/model_meteorology/sh
MET_SCRIPT_PATH=$SCRIPT_DIR
export MODEL_ROOT MODEL_BIN SCRIPT_DIR MET_SCRIPT_PATH

# Download and import PRISM and daymet rasters between dates as available
startYear=1983
endYear=2024

# Set availability dates for ease
daymetStartAvailable=1980
daymetEndAvailable=2023
PRISMStartAvailable=1895
PRISMEndAvailable=2024

for (( YYYY=startYear; YYYY<=endYear; YYYY++ )); do
  echo "Running download and import sbatch for $YYYY"

  # Daymet download script: only submit daymet jobs if data are available
  if [ "$YYYY" -ge "$daymetStartAvailable" ] && [ "$YYYY" -le "$daymetEndAvailable" ]; then
    metsrc="daymet"
    doy=$(date -d "${YYYY}-12-31" +%j)   # 365 or 366
    # Run a slurm job for the download and import script for each day of the year
    i=0
    while [ "$i" -lt "$doy" ]; do
      thisdate=$(date -d "${YYYY}-01-01 +$i days" +%Y-%m-%d)
      sbatch /opt/model/meta_model/run_model raster_met "$thisdate" "$metsrc" auto met
      i=$((i + 1))
    done
  fi

  # PRISM download script: only submit PRISM jobs if data are available
  if [ "$YYYY" -ge "$PRISMStartAvailable" ] && [ "$YYYY" -le "$PRISMEndAvailable" ]; then
    metsrc="PRISM"
    doy=$(date -d "${YYYY}-12-31" +%j)
    # Run a slurm job for the download and import script for each day of the year
    i=0
    while [ "$i" -lt "$doy" ]; do
      thisdate=$(date -d "${YYYY}-01-01 +$i days" +%Y-%m-%d)
      sbatch /opt/model/meta_model/run_model raster_met "$thisdate" "$metsrc" auto met
      i=$((i + 1))
    done
  fi
done
```
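The daymet and PRISM branches above repeat the same day-of-year loop. As a sketch (not the committed version), that loop can be factored into a small helper that emits one date per day of a given year, which any submission loop can then consume; `dates_in_year` is a hypothetical name:

```shell
# dates_in_year: hypothetical helper printing every YYYY-MM-DD date in a year.
# Uses %j of Dec 31 to get the day count (365 or 366), as in the script above.
dates_in_year() {
  local year=$1
  local doy d
  doy=$(date -d "${year}-12-31" +%j)
  for (( d=0; d<doy; d++ )); do
    date -d "${year}-01-01 +${d} days" +%Y-%m-%d
  done
}

# Example: submit one job per day for a source (echo stands in for sbatch here)
for thisdate in $(dates_in_year 2024); do
  echo sbatch /opt/model/meta_model/run_model raster_met "$thisdate" daymet auto met
done
```

This keeps the availability checks in the outer script and reduces the two near-identical while-loops to one shared helper.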
Figure: PRISM (points) versus NLDAS2 (blocks), Rapidan River.
Current needs:
Overview
Project Brief Goals and Outline
DEQ needs a process for identifying the times and locations of precipitation input errors, using that information to rank the available precipitation inputs by accuracy, and assembling an aggregate dataset from the best available data, both spatially and temporally. This analytical process should integrate into existing DEQ workflows, and Virginia Tech should work with DEQ to design updated workflows where necessary to support these new analytical processes.
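As an illustration only (the data layout and helper name below are assumptions, not DEQ's actual format), the "rank by accuracy, keep the best available" step could look like: given per-date error scores for each precipitation source, keep the lowest-error source for each date to form the amalgamated series.

```shell
# pick_best: hypothetical helper. Reads "date,source,abs_error" lines on stdin
# and prints "date source" for the lowest-error source per date.
pick_best() {
  # sort by date, then by error (general numeric); keep first row per date
  sort -t, -k1,1 -k3,3g | awk -F, '!seen[$1]++ { print $1, $2 }'
}

# Example with made-up error values for two sources on two days
printf '%s\n' \
  "2024-01-01,PRISM,0.4" \
  "2024-01-01,daymet,0.2" \
  "2024-01-02,PRISM,0.1" \
  "2024-01-02,daymet,0.3" | pick_best
# 2024-01-01 daymet
# 2024-01-02 PRISM
```

The real workflow would compute the error scores from observed-versus-input comparisons; this sketch only shows the final selection step.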
Tasks
- geo
- workflows:amalgamate
- workflowsDiagram