HARPgroup / meta_model


Add Local runoff inflows after ETM step #9

Open rburghol opened 1 year ago

rburghol commented 1 year ago

Tasks

Prepare Model for DSN 10

Include in hsp2->river->prep: https://github.com/HARPgroup/meta_model/tree/main/models/hsp2_cbp6/river/prep

Adding a blank DSN

DSNs must exist in the WDM file before data can be written to them, so we create them with wdmtoolbox (install via `sudo pip install wdmtoolbox`; see https://pypi.org/project/wdmtoolbox/ and https://timcera.bitbucket.io/wdmtoolbox/docs/index.html)
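A minimal sketch of creating the blank DSN from the command line. The DSN number matches DSN 10 above, but the attribute flags (`--tcode`, `--tsstep`, `--base_year`) are illustrative assumptions; check `wdmtoolbox --help` for the options in your installed version:

```shell
# create an empty WDM file (skip if PS2_5550_5560.wdm already exists)
wdmtoolbox createnewwdm PS2_5550_5560.wdm

# add a blank DSN 10 so wdm_insert_one has somewhere to put the data;
# assumed attributes: tcode 3 = hourly step, base_year = simulation start
wdmtoolbox createnewdsn PS2_5550_5560.wdm 10 --tcode 3 --tsstep 1 --base_year 1984
```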

rburghol commented 1 year ago

Check/set variables after sourcing `hspf_config`, then run the model:

. hspf_config
MODEL_ROOT=$CBP_ROOT
META_MODEL_ROOT=/opt/model/meta_model
export MODEL_ROOT META_MODEL_ROOT

/opt/model/meta_model/run_model hsp2_cbp6 hsp2_2022 PS2_5550_5560 auto river 

echo PS2_5550_5560.wdm PS2_5550_5560_0011.csv 10 1 w message.wdm | wdm_insert_one

# Note: there is strange behavior when using wdm2text to export, then wdm_insert_one to import that file,
# then wdm2text to export again. The original wdm2text output has data through hour 23 on 12/31, but
# wdm_insert_one expects hours numbered 1 to 24 instead of 0 to 23; the re-imported series is therefore
# one hour shy, and the final `wdm2text` fails if you request the final year of the simulation.
# So, instead of: echo PS2_5550_5560.wdm,1984,2020,11 | wdm2text
# We do:
echo PS2_5550_5560.wdm,1984,2019,10 | wdm2text
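A stdlib-only sketch of the two hour-labelling conventions described above (this is my interpretation of the behavior, not `wdm_insert_one`'s actual code): under WDM's 1-to-24 convention, hour 0 of a day is really hour 24 of the previous day, so a 0-to-23-labelled series both starts and ends one slot "early".

```python
from datetime import datetime, timedelta

def to_wdm_hour(ts):
    """Map a 0-23-labelled hourly timestamp to a 1-24 hour convention:
    hour 0 of a day becomes hour 24 of the previous day."""
    if ts.hour == 0:
        d = ts.date() - timedelta(days=1)
        return (d.year, d.month, d.day, 24)
    return (ts.year, ts.month, ts.day, ts.hour)

# the series' first record, stamped hour 0 of 1984-01-01, lands on
# hour 24 of 1983-12-31 in WDM terms...
print(to_wdm_hour(datetime(1984, 1, 1, 0)))    # (1983, 12, 31, 24)
# ...and its last record stops at hour 23, one short of the day's end
print(to_wdm_hour(datetime(2020, 12, 31, 23)))  # (2020, 12, 31, 23)
```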

rburghol commented 1 year ago

Verify location of data in h5 file

cd tmp/scratch/1587 # this was the scratch dir from the last model run 
python3
# the rest is python
import h5py
import numpy as np
import pandas as pd
from collections import defaultdict
from pandas import DataFrame, read_hdf, HDFStore
from numba.typed import Dict

# define the file path before opening it
fpath = 'PS2_5550_5560.h5'
f = h5py.File(fpath, 'a')
ts = f['TIMESERIES'].items()          # list the stored time series groups
dsn10 = f['TIMESERIES/TS010/table']   # DSN 10 lands in the TS010 table
dsn10[0]
# (441763200000000000, 20.68031311)
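The first field of that tuple decodes as nanoseconds since the Unix epoch, which is how pandas HDF tables store a datetime index; a quick stdlib check shows it lines up with the 1984 simulation start used above:

```python
from datetime import datetime, timezone

# first field of the h5 row shown above: ns since the Unix epoch
ns = 441763200000000000
ts = datetime.fromtimestamp(ns / 1e9, tz=timezone.utc)
print(ts.isoformat())  # 1984-01-01T00:00:00+00:00
```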