ua-snap / snap-arctic-portal


Verify NCEP future data set with John #70

Closed brucecrevensten closed 8 years ago

brucecrevensten commented 8 years ago

Are the .grb files we're finding the right ones?

BobTorgerson commented 8 years ago

Yes, the .grb files are the correct ones, as confirmed by John. He stated that the CFSv2 data would be the most relevant and important. We can safely ignore the probabilistic data for now, as John said that the anomalies would tell the best story when placed next to or on top of the historical data.

Here is his email to me:


Good morning John,

In my exploration of the NCEP data for the Arctic Today Portal project, I've found the GRB files that you mention in your previous email at: ftp://ftp.cpc.ncep.noaa.gov/NMME/realtime_anom/

These are real-time forecast anomalies from the FTP site above, split out by the individual organizations whose models contribute to the ensemble. Is this the data you were referring to?

I can also see probabilistic forecasts on the data site at: ftp://ftp.cpc.ncep.noaa.gov/NMME/prob/ but this site does not explain what these .dat files are or how to make use of them. They are simply large binary data files whose use Michael Lindgren and I have had difficulty determining.
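(Editor's note: a common first sanity check on an undocumented flat binary file like these .dat files is to see whether its size is consistent with one or more float32 grids of a plausible shape. The grid shapes below are placeholders, not the actual NMME grid dimensions, which would need to come from CPC documentation.)

```python
import numpy as np

def plausible_grids(path, shapes=((181, 360), (190, 384))):
    """Return the candidate (rows, cols) shapes that evenly divide the
    file when read as 4-byte floats -- a quick plausibility check on an
    undocumented flat binary data file. Byte order does not matter for
    this size check; big-endian float32 is assumed as a common default."""
    data = np.fromfile(path, dtype=">f4")
    return [s for s in shapes if data.size % (s[0] * s[1]) == 0]
```

If exactly one candidate shape divides the file evenly, that shape (possibly repeated for multiple lead times or members) is a reasonable first guess for how the records are laid out.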

I have a couple of questions for you stemming from the above few sentences.

  1. Which of these real-time anomaly outputs would you want us to use? I don't see the combined ensemble results as an option, only the data from the individual organizations whose models are used to create the NMME ensemble. In your opinion, which would be best to compare or contrast with the historical NCEP output?
  2. Do you know how to make use of the .dat files found in the probabilistic forecasts? Do you think these files take priority over the forecast anomalies for this project?

Thank you in advance,

Bob Torgerson

Bob,

That NMME site definitely has more than we would want, as it is a clearinghouse for the long-range simulations from all the major modeling centers. The model that makes the most sense for our use is the NCEP CFSv2. That model is a variant of the one used for the past temperatures -- the NCEP reanalysis that we have been accessing for the temperatures over the past two days. So going with the CFSv2 would give us some continuity. (CFSv2 stands for Coupled Forecast System, version 2; "coupled" means that it has an interactive ocean and sea ice coupled to the atmospheric component of the model.)
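(Editor's note: the "anomaly" in these files is simply the forecast value minus the long-term climatological mean for the same location and time of year. A minimal sketch with made-up numbers:)

```python
import numpy as np

# Synthetic illustration: a forecast anomaly is the forecast value minus
# the climatological mean for the same place and time of year.
# All values below are invented for the example, in degrees C.
climatology = np.array([-22.0, -18.5, -10.0])  # long-term mean Jan/Feb/Mar temps
forecast    = np.array([-19.5, -17.0, -11.2])  # CFSv2-style forecast temps

anomaly = forecast - climatology
print(anomaly)  # [ 2.5  1.5 -1.2]
```

Because the anomaly is already expressed relative to "normal," it can be plotted directly next to, or on top of, the historical record without further adjustment.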

I have not used the .dat files, so I cannot offer much help there. I view the probability maps as secondary priorities. The CFSv2 temperature anomalies are easy for users to relate to, as they are simply the number of degrees above or below normal (although degrees C in this case). The probabilities apply to the different terciles of temperature, and a display of those would require that users understand the meaning of terciles -- some users would understand, but others would be confused without a thorough explanation.
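(Editor's note: to make the tercile idea above concrete, here is a minimal sketch using invented historical values. Terciles split the historical distribution into thirds, so a forecast probability "for the upper tercile" is the chance of landing in the warmest third of past outcomes.)

```python
import numpy as np

# Made-up record of past January mean temperatures, degrees C.
history = np.array([-24.0, -23.0, -21.5, -20.0, -19.0,
                    -18.0, -17.5, -16.0, -15.0])

# Tercile boundaries: the 33.3rd and 66.7th percentiles of the record.
lower, upper = np.percentile(history, [100 / 3, 200 / 3])

def tercile(temp):
    """Classify a temperature against the historical terciles."""
    if temp <= lower:
        return "below normal"
    if temp <= upper:
        return "near normal"
    return "above normal"
```

A user-facing display built on this would need to explain that "above normal" means "in the warmest third of past years," which is the explanation burden John describes.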