Open dschlaep opened 8 years ago
Currently, accessing climate scenarios and downscaling is a very slow operation, and huge amounts of data are read (and re-read) from disk.
It may be (much) faster (likely depending on how many sites are processed) if the algorithm isn't organized around sites and their downscaling needs, but instead around accessing individual netCDF files. That is, instead of looping over sites (calls to 'try.ScenarioWeather' for each site), we should consider looping over netCDF files, extracting information for all sites/conditions at once, temporarily storing it, and then, once all netCDF files are extracted, looping over sites to downscale.
What do you think -- is this worth checking out?
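A minimal base-R sketch of the proposed restructuring (all function names here are hypothetical, not from the wrapper): instead of one extraction pass per site, do one pass per netCDF file that pulls every site at once, cache the results, and only then loop over sites to downscale.

```r
# stand-in for one netCDF read; in the real code this would be a
# single nc_open()/ncvar_get() per file, returning a sites x time matrix
read_file_all_sites <- function(file_id, n_sites, n_time = 4) {
  matrix(file_id + seq_len(n_sites * n_time), nrow = n_sites)
}

extract_then_downscale <- function(file_ids, n_sites, downscale) {
  # pass 1: each file is opened and read exactly once
  cache <- lapply(file_ids, read_file_all_sites, n_sites = n_sites)
  # pass 2: per-site downscaling on the cached slices, no further I/O
  lapply(seq_len(n_sites), function(s) {
    downscale(lapply(cache, function(m) m[s, ]))
  })
}

# toy "downscaling": average each site's values across files and time
res <- extract_then_downscale(file_ids = 1:2, n_sites = 2,
                              downscale = function(x) mean(unlist(x)))
```

With F files and S sites, this reads each file once (F reads total) instead of once per site (F × S reads), which is where the speed-up for many sites would come from.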
Hi Daniel,
I definitely think this is worth checking out. While not egregiously slow, extracting and downscaling the future weather scenarios is the slowest of the external extractions that I have experienced (I've never used NCEP CFSR for daily; that might be slower!).
I need to look more into the NCO formatting and functionality... I imagine we could call the program through system commands in R?
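On the NCO idea, one common way to drive it from R would be to build an `ncks` hyperslab command and hand it to `system()`; everything below (variable name, index ranges, file names) is made up for illustration.

```r
## Hypothetical sketch: build an ncks (NCO) call that subsets a
## variable to a lon/lat window. Variable name, indices, and file
## names are placeholders, not from the wrapper.
nco_cmd <- function(var, lon, lat, infile, outfile) {
  paste("ncks", "-v", var,
        "-d", paste0("lon,", lon[1], ",", lon[2]),
        "-d", paste0("lat,", lat[1], ",", lat[2]),
        infile, outfile)
}

cmd <- nco_cmd("tas", c(100, 110), c(40, 45), "in.nc", "subset.nc")
# system(cmd) would execute it; an exit status of 0 indicates success
```

`ncks -d dim,min,max` subsets along a dimension by index (or by coordinate value when min/max are given as decimals), so a single call could pre-cut a file to the sites' bounding box before any per-site work.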
I wouldn't mind taking on this task, or distributing it to the undergraduate if time permits. They are currently still working on setting up additional functionality for external data extractions and then are going to work on adding CO2 effects.
-Caitlin
Caitlin Andrews, Ecologist, Colorado Plateau Research Station, Southwest Biological Science Center, U.S. Geological Survey
Hi Caitlin-
I agree, and adding the CO2 effects has much higher priority!
There is also the ‘RCMIP5’ package out there, but it uses ncdf4 and dplyr and doesn’t seem to be parallelized.
Thanks -Daniel
Labels: task, goal, information