ChristinaB opened 4 years ago
The task I was thinking of is to:
1) Convert and test the mean and stdev from Nicoleta's Step_2 notebook, going from an xarray to a vector or numpy array that can be exported/saved as text, so that we can use it directly in a lognormal notebook similar to 20191207_netcdf_Lognormal_spatial_Depth_Synthetic_LandlabLandslide.ipynb (line 13), but with real data. We just need to make sure the flattened matrix starts with the first value representing grid node 0, the bottom-left corner. Thus, we need to test this. The 'chunk' function may do this already; if so, we just need to bring it into Nicoleta's Step_2 notebook.
I think we could do this with an array that flattens from the bottom left up to the top right:

```python
array = np.array([[6, 7, 8], [3, 4, 5], [0, 1, 2]])
fliparray = np.flip(array, axis=0)   # flips only the y axis
landlabarray = fliparray.flatten()   # flattens by rows
type(landlabarray)                   # should be a numpy array
landlabarray
# [0 1 2 3 4 5 6 7 8]
```
These are the files to continue coding in:
20191206_netcdf_DataDriven_spatial_Depth_SCL_LandlabLandslide
or
20191206_netcdf_DataDriven_spatial_Depth_Synthetic_LandlabLandslide.ipynb
March 20, 2020 Update
1) I have two folders of dates in this Slippery Future Data HS resource: this is the link to the Skagit dates.
2) I finished postprocessing the Skagit historic model, but it's a huge model folder that is still getting zipped. Is there any other processing we need to do? Streamflow? Trying not to get distracted, but I want to check.
[ ] Upload data to DHSVM/models_skagit Skagit Map(col x rows*datesout).dtw.asc
[x] Decide whether we want other data: ice, snow, streamflow, climate forcings. No.
3) Landlab input dictionary: Nicoleta's code was:

```python
mean_dtw_scl = data.wt.mean("time")
```
From here I sorted out these four outputs, which are arrays the length of the node count. I'm running it on both her netcdf built from the ascii AND the flipped version of my netcdf, so we can compare. [One line with xarray, or 100 lines with me hacking it... let's go with xarray.] The second block is how we will build the dictionary for the lognormal forcing. Look good? We can query and compare the two to be sure each gets flattened to an array as expected.
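A minimal sketch of that flow, assuming the netCDF variable is named `wt` with dims `(time, y, x)` and row 0 at the top of the grid (both assumptions; check against the real file):

```python
import numpy as np
import xarray as xr

# Toy stand-in for the DHSVM depth-to-water-table netCDF (hypothetical
# variable name "wt" and dims (time, y, x), with row 0 at the top).
rng = np.random.default_rng(0)
data = xr.Dataset({"wt": (("time", "y", "x"),
                          rng.uniform(0.1, 2.0, size=(5, 3, 3)))})

# Reduce over time, as in Nicoleta's Step_2 notebook.
mean_dtw = data.wt.mean("time")
std_dtw = data.wt.std("time")

# Flip y so the first flattened value is the bottom-left cell
# (Landlab node 0), then flatten row by row.
mean_nodes = np.flip(mean_dtw.values, axis=0).flatten()
std_nodes = np.flip(std_dtw.values, axis=0).flatten()

print(mean_nodes.shape)  # one value per Landlab node: (9,)
```

The flip-then-flatten step is the same trick as the small example in item 1, so the two notebooks should produce identically ordered arrays.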
[ ] Rerun this code with the Skagit dates and Skagit map files. Input: one map file. Use xarray to calculate the mean/std from the netcdf. Test the flip. Save two arrays as two pickles for each model for Landlab I/O. See 20200331_map2netcdf2array_lognormal_spatial_Depth_SCL_LandlabLandslide.ipynb and https://www.hydroshare.org/resource/4cac25933f6448409cab97b293129b4f/
[ ] Load arrays into Landlab Notebook for lognormal_spatial. Output ascii files for probability.
[ ] Testing notebook to loop through the 7 model instances - check space and memory limits on HydroShare
[ ] Setup runs on XSEDE for I/O and data driven spatial
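For the "save as text" option mentioned in item 1, a sketch using `np.savetxt`/`np.loadtxt` (array values and file names are illustrative):

```python
import numpy as np

# Hypothetical flattened node arrays (in practice, the mean/std DTW
# arrays computed from the Skagit netCDF).
mean_nodes = np.array([0.52, 0.71, 0.93])
std_nodes = np.array([0.05, 0.06, 0.07])

# One text file per array, per model instance, for Landlab input.
np.savetxt("mean_dtw_skagit.txt", mean_nodes)
np.savetxt("std_dtw_skagit.txt", std_nodes)

# Round-trip check: reload and confirm the values survive.
mean_back = np.loadtxt("mean_dtw_skagit.txt")
```

The same round-trip check works for the pickle route; text files are just easier to eyeball on HydroShare.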
4) Planning for figures and data management
5) Landlab component updates
[ ] Test notebook covering the four synthetic examples.
[ ] Zoom into the Goodell Creek fire case study for lognormal spatial
Import libraries and data from HydroShare
Run Notebook from CUAHSI JupyterHub server (public browser access)
Process DHSVM model output: input DTW ascii for a model instance (a DHSVM model with a unique time series of storms, saved based on maximum-saturation annual events, unique for each climate forcing: historic (1) and future (6))
[ ] Add timestamp as a date, not a string
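For the timestamp item, a sketch with pandas (the date-string format here is a guess; adjust it to the actual DHSVM file naming):

```python
import pandas as pd

# Hypothetical date strings as stored in the model output.
date_strings = ["10.01.1990-00", "10.02.1990-00"]

# Parse into real datetimes so the xarray "time" coordinate is a date,
# not a string.
times = pd.to_datetime(date_strings, format="%m.%d.%Y-%H")
print(times[0])  # 1990-10-01 00:00:00
```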
Scale/Resample Hydro grid (150m) to Landlab grid (30m)
Use numpy and xarray to process and analyze the mean and std of a single model instance (30 m)
[ ] Add Markdown discussing how to design the Landlab utility.
[ ] Add and test a Topmodel approach to resampling.
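A nearest-neighbor sketch of the 150 m to 30 m resampling (each coarse cell becomes a 5x5 block of fine cells; a Topmodel-based approach, still to be tested, would replace this simple repeat):

```python
import numpy as np

# Hypothetical 150 m DTW grid (values only; georeferencing omitted).
coarse = np.array([[1.0, 2.0],
                   [3.0, 4.0]])

# 150 m / 30 m = 5: each coarse cell maps to a 5x5 block of fine cells.
factor = 150 // 30
fine = np.repeat(np.repeat(coarse, factor, axis=0), factor, axis=1)

print(fine.shape)  # (10, 10)
```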
Visualize netcdf and landlab grid maps to demonstrate spatial orientation of array outputs
[ ] Output mean and standard deviation DTW arrays as model instance for Landlab lognormal spatial landslide model input (format.txt)
[ ] Loop through time to extract one array per node
Output dictionary DTW pickle as model instance for Landlab data-driven spatial landslide model input (format.pickle)
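A sketch of the data-driven dictionary output: one DTW time series per node id, pickled per model instance (array values and file name are illustrative):

```python
import pickle
import numpy as np

# Hypothetical DTW stack with shape (time, node): 4 time steps, 3 nodes.
dtw = np.array([[0.5, 0.6, 0.7],
                [0.4, 0.5, 0.6],
                [0.6, 0.7, 0.8],
                [0.5, 0.6, 0.7]])

# One time series per node id, keyed for the Landlab data-driven input.
dtw_dict = {node: dtw[:, node] for node in range(dtw.shape[1])}

with open("dtw_dict_instance.pickle", "wb") as f:
    pickle.dump(dtw_dict, f)
```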
Version 1: multiple hydrologic models with various climate forcings and lognormal spatial landslides for the SCL domain
Notebook 3.1 Synthetic domain: run historic climate data with four landslide models
Notebook 3.2 SCL domain Lognormal Spatial Climate Forcing Comparison (1-7 model instances):
This Jupyter Notebook runs the Landlab LandslideProbability component on a Seattle City Light
Landlab grid, using four depth-to-water-table options to replace the recharge options described in the paper:
[ ] Use code in .py to create the mean and std from a netcdf and save the pickle.
[ ] Test the code inside 20191206_netcdf_DataDriven_spatial_Depth_Synthetic_LandlabLandslide.ipynb and save the output.
[ ] Upload the pickle into the lognormal spatial synthetic notebook, and input the pickle instead of the random input data.
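Sketching those steps end to end (file name and array values are illustrative; the point is that the pickle replaces the random input in the synthetic notebook):

```python
import pickle
import numpy as np

# Step 1 (stand-in): mean/std arrays that would come from the netcdf code.
mean_dtw = np.array([0.52, 0.71, 0.93])
std_dtw = np.array([0.05, 0.06, 0.07])
with open("dtw_params.pickle", "wb") as f:
    pickle.dump({"mean": mean_dtw, "std": std_dtw}, f)

# Step 3: in the lognormal spatial synthetic notebook, load the pickle
# and use the real per-node parameters instead of np.random values.
with open("dtw_params.pickle", "rb") as f:
    params = pickle.load(f)

mean_in = params["mean"]
std_in = params["std"]
```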