qjhart / dwr-spatial-cimis

This is an updated set of files from DWR's spatial CIMIS program
MIT License

Spatial CIMIS 3.0 #1

Open qjhart opened 2 months ago

qjhart commented 2 months ago

Spatial CIMIS 3.0

September Timeframe

To make your September timeframe, I'd plan on making no modifications to the existing code and concentrate on getting the groundwater basins in place. In general, I'd discourage you from moving away from GRASS; instead, look to take advantage of the GRASS/Python interface and keep the processing, especially the raster processing, in GRASS.

  1. Copy Zipcode Methodology

    The current zipcode methodology assigns every cell to one zipcode, and assigns at least one cell to every zipcode. The average of the cells is used for the zipcode. This seems like a good method for the groundwater vectors as well. The current code uses a simple script for this, but there might be a better GRASS command for it. I can't remember why we did it that way.

    1. Use Groundwater shapes

      Similar to the zipcodes, the methodology is to convert the groundwater areas into sets of 500m ET pixels.

  2. Distribution

    In our current methodology, we are constrained in our distribution in that the zipcode data needed to go into the CIMIS Oracle distribution. In addition, we are constrained to the 2km raster processing. We need to understand whether we have any similar constraints here.

    1. Use CIMIS Mobile app template

      If you don't have any requirements on the distribution method, then for the September timeframe I would abandon the interactive geometry interface and instead focus on a simpler implementation. Back in 2018, we developed a CIMIS Mobile App. This app uses a simple Redis database to provide access to pixels and zipcodes, kept as a two-week ring buffer. I would use the same methodology, extended with ring buffers for the groundwater basins.

      1. Keep 2 weeks data

        We found a two-week ring buffer was great for graphs. Another good feature is that you can use this data for better predictions later, when you add (our awesome) ETo prediction functions.

      2. Summarize beyond that at 14 days (I think)

        One thing you could add to this application is a summarization step for the two-week feed. What we ended up with in our ETo zone maps was 14-day moving-window averages, provided at 1-week increments. We had previously determined these were the appropriate aggregations, and that's what I'd use initially.
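To make step 1 above concrete, the assignment rule (every cell goes to exactly one zone, every zone gets at least one cell, and the zone value is the average of its cells) can be sketched in plain Python. This is only an illustration: nearest-centroid assignment stands in for the real polygon containment, and the grids and zone names are made up — the actual pipeline would do this with GRASS commands (e.g. `v.to.rast` plus zonal statistics) via the GRASS/Python interface.

```python
from collections import defaultdict

def zonal_means(cells, centroids):
    """Toy version of the zipcode/groundwater assignment: every cell goes
    to exactly one zone (nearest centroid stands in for polygon
    containment), every zone gets at least one cell, and the zone value
    is the average of its member cells.

    cells: dict {(row, col): ET value}; centroids: dict {zone: (row, col)}.
    """
    def dist2(a, b):
        return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2

    # Step 1: each cell belongs to exactly one zone.
    members = defaultdict(list)
    for cell, value in cells.items():
        zone = min(centroids, key=lambda z: dist2(centroids[z], cell))
        members[zone].append(value)

    # Step 2: a zone smaller than a pixel may end up with no cells; give
    # it the single cell nearest its centroid so every zone has a value.
    for zone, c in centroids.items():
        if not members[zone]:
            nearest = min(cells, key=lambda cell: dist2(c, cell))
            members[zone].append(cells[nearest])

    # Step 3: the zone value is the mean of its member cells.
    return {z: sum(v) / len(v) for z, v in members.items()}
```

The empty-zone fallback in step 2 is the part the current zipcode script handles that a plain zonal-statistics command might not, which may be why it was done with a script originally.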
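The two-week ring buffer plus the 14-day summarization step could look roughly like the following. This is a pure-Python stand-in for illustration only — the actual Mobile App keeps these feeds in Redis keyed by pixel/zipcode, and the class and field names here are invented:

```python
from collections import deque

class PixelFeed:
    """Two-week ring buffer of daily ETo values for one pixel or zone,
    plus a summarization step: 14-day moving-window averages emitted at
    1-week increments, matching the ETo zone map aggregation.
    """
    WINDOW = 14   # days kept in the live feed (the ring buffer)
    STEP = 7      # days between emitted summary values

    def __init__(self):
        self.days = deque(maxlen=self.WINDOW)  # oldest values fall off
        self.summaries = []                    # long-term 14-day means
        self._since_summary = 0

    def push(self, eto):
        """Append one day's ETo; emit a 14-day mean every 7th day."""
        self.days.append(eto)
        self._since_summary += 1
        if len(self.days) == self.WINDOW and self._since_summary >= self.STEP:
            self.summaries.append(sum(self.days) / self.WINDOW)
            self._since_summary = 0
```

In Redis this maps naturally onto a capped list per key (push the new day, trim to 14), with the weekly summaries written to a separate, unbounded key.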

New Algorithmic Development

  1. Divide into radiation / interpolation

    1. Radiation

      1. Turbidity

      2. Environmental Radiation?

        1. Slope, Aspect, Horizon
    2. Interpolation

      1. Re-evaluate Normalization Step

      2. Review for better Wind Speed models
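On the slope/aspect piece of the environmental radiation item: the standard way to fold slope and aspect into the direct-beam term is the tilted-surface incidence cosine below. This is a generic illustration, not necessarily what r.iheliosat does internally; horizon shading would additionally zero the beam whenever the sun sits below the local horizon angle for that cell.

```python
import math

def incidence_cosine(sun_zenith, sun_azimuth, slope, aspect):
    """Cosine of the sun's incidence angle on a tilted cell (all angles
    in radians). Flat terrain (slope = 0) reduces to cos(zenith); a cell
    tilted toward the sun by exactly the zenith angle sees the beam
    head-on (cosine = 1).
    """
    return (math.cos(sun_zenith) * math.cos(slope)
            + math.sin(sun_zenith) * math.sin(slope)
              * math.cos(sun_azimuth - aspect))
```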

qjhart commented 2 months ago

I completely forgot the modifications for moving to internet-based GOES processing. I would still put that past the September timeframe. I'd have to do some timing on the GOES data, but I would probably try to maintain the near-up-to-date radiation calculation and fetch data multiple times over the day. You might be able to get away with mesoscale downloads, but my guess would be to use CONUS and keep it simple. After getting that organized, you could look to update with cloud cover as we discussed.
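Fetching multiple times over the day pairs naturally with an incremental integral: each new GOES scan extends the day's radiation total, so the calculation stays near up to date instead of waiting for end-of-day. A toy accumulator along those lines (scan cadence and units are illustrative, not the actual processing):

```python
class DailyRadiation:
    """Running daily integral of irradiance (W/m^2 accumulated into
    J/m^2), updated as each new GOES scan arrives so the day's total is
    always near current.
    """
    def __init__(self):
        self.last = None        # (seconds since midnight, W/m^2)
        self.total_j = 0.0      # accumulated J/m^2 so far today

    def add_scan(self, t_seconds, irradiance):
        """Fold one new scan into the running total; return the total."""
        if self.last is not None:
            t0, v0 = self.last
            # Trapezoidal rule over the interval since the last scan.
            self.total_j += 0.5 * (v0 + irradiance) * (t_seconds - t0)
        self.last = (t_seconds, irradiance)
        return self.total_j
```

Each fetch (CONUS or mesoscale) just feeds its scan time and per-pixel irradiance into the accumulator; the daily total needs no reprocessing of earlier scans.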

qjhart commented 2 months ago

The final solar calculations only use the r.iheliosat binary and the g.cimis.daily_solar script. No other applications are required. I'll tag these releases as 1.0.0, so you and DWR can tell if I've made any modifications. This is probably most important for r.iheliosat, since that binary should be used as-is.

The script leaves only the 20m B2 and P rasters, which are the ones that need external data or other mapsets. Everything else can be reproduced with a call to g.cimis.daily_solar -s -f to force a rebuild and save intermediate files. It's still 2GB of data, but that's far better than the current 32GB :)