glmcr opened this issue 1 year ago
SQLite is probably more than enough for our DB requirements; if not, we will have to run an instance of a heavier open-source database such as MySQL or Postgres, which could increase the resource requirements of the server where pygeoapi is running.
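As a quick illustration of why SQLite should suffice here, a minimal sketch of storing per-station error statistics with Python's built-in `sqlite3` module (the table name, columns, and values are hypothetical, not an agreed schema):

```python
import sqlite3

# SQLite needs no server process: a single file (or :memory:) is the database.
conn = sqlite3.connect(":memory:")  # in practice, a file alongside pygeoapi data
conn.execute(
    """CREATE TABLE IF NOT EXISTS wl_error_stats (
           station_id TEXT PRIMARY KEY,
           mean_error REAL,
           variance REAL,
           median_error REAL
       )"""
)
# Invented example values for one TG station (metres)
conn.execute(
    "INSERT INTO wl_error_stats VALUES (?, ?, ?, ?)",
    ("TG-001", 0.012, 0.0034, 0.010),
)
conn.commit()
row = conn.execute(
    "SELECT mean_error FROM wl_error_stats WHERE station_id = ?", ("TG-001",)
).fetchone()
print(row[0])
```

A few thousand stations with a handful of statistics each is trivially small for SQLite, so a heavier database would only be needed if concurrency or data volume grows well beyond this.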
Related to #35 and #28
1). We will begin with simple error statistics (arithmetic mean, variance, median(?)) for tidal predictions at some of the most important TG stations (keeping in mind that we could do these calculations for all TG stations that have WL obs data).
2). We will use these error statistics for the "uncertainty" attributes in the S111 and S104 DCF8 products (and also in the S111 and S104 DCF2 and DCF3 products).
3). We will also use this code to produce WL forecast error stats for both the H2D2 and Spine-OneD model results, and we will compare those forecast errors to show that H2D2 forecasted WLs (adjusted using the last observed WLs, à la Spine-OneD) provide forecast quality that is better than or similar to the Spine-OneD model's.
4). We will then add error covariance calculations between the TGs.
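Steps 1) and 4) above can be sketched with numpy; the station names, error series, and sample sizes below are all made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)
n_times = 200

# errors = observed WL - predicted WL, one series per TG station (fake data)
errors = {
    "TG_A": rng.normal(0.02, 0.05, n_times),
    "TG_B": rng.normal(-0.01, 0.08, n_times),
}

# Step 1): simple per-station error statistics
stats = {
    name: {
        "mean": float(np.mean(e)),
        "variance": float(np.var(e, ddof=1)),
        "median": float(np.median(e)),
    }
    for name, e in errors.items()
}

# Step 4): sample error covariance matrix between the stations
cov = np.cov(np.vstack([errors["TG_A"], errors["TG_B"]]))
print(stats["TG_A"], cov.shape)
```

The diagonal of `cov` reproduces each station's error variance, and the off-diagonal terms are the inter-station error covariances to be added afterwards.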
Things to keep in mind:
The error statistics code should be as generic (data agnostic) as we can make it, since we will probably also use it for comparing HADCP measurements with model-forecasted currents.
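One way to keep the statistics code data agnostic is to operate on plain paired series, so the same function serves WLs and currents alike. A hedged sketch (this helper is hypothetical, not existing project code):

```python
import numpy as np

def error_stats(observed, predicted):
    """Error statistics for any paired 1-D series (WLs, current speeds, ...).
    Ignores time steps where either series is missing (NaN)."""
    obs = np.asarray(observed, dtype=float)
    pred = np.asarray(predicted, dtype=float)
    ok = ~(np.isnan(obs) | np.isnan(pred))
    err = obs[ok] - pred[ok]
    return {
        "n": int(err.size),
        "mean": float(err.mean()),
        "variance": float(err.var(ddof=1)),
        "median": float(np.median(err)),
    }

# Works identically whether the units are metres (WLs) or m/s (currents)
s = error_stats([1.0, 1.2, np.nan, 0.9], [0.9, 1.1, 1.0, 1.0])
print(s["n"], s["mean"])
```

Keeping units and variable names out of the core computation means only the data-extraction layer needs to know whether it is handling TG or HADCP data.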
We will need to implement code that can extract data (WLs, currents) from model results (after conversion to S104, S111 (and S4**)) at the locations of the TGs and HADCPs. It will be possible to reuse our already developed code (on the science network) that assigns model grid points to tiles.
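The extraction step could start as simply as a nearest-grid-node lookup; a rough sketch with invented grid and station coordinates (real code would use the existing tile-assignment code and a proper geodesic distance):

```python
import numpy as np

def nearest_model_values(grid_lon, grid_lat, grid_values, stn_lon, stn_lat):
    """For each station, return the model value at the nearest grid node.
    Grid arrays are flat 1-D arrays of equal length (hypothetical helper)."""
    out = np.empty(len(stn_lon))
    for i, (lo, la) in enumerate(zip(stn_lon, stn_lat)):
        # Crude equirectangular distance; adequate for a short-range lookup
        d2 = (np.cos(np.radians(la)) * (grid_lon - lo)) ** 2 + (grid_lat - la) ** 2
        out[i] = grid_values[np.argmin(d2)]
    return out

glon = np.array([-66.0, -65.5, -65.0])
glat = np.array([45.0, 45.2, 45.4])
gval = np.array([1.1, 1.3, 1.5])  # e.g. forecast WLs in metres (fake values)
vals = nearest_model_values(glon, glat, gval, [-65.45], [45.25])
print(vals)
```

For many stations against a large grid, a spatial index (e.g. a KD-tree) would replace the brute-force search, but the interface stays the same.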
Things to discuss: