University-of-Reading-Space-Science / HUXt

HUXt - a lightweight solar wind model.
MIT License

What is the difference between this model and "BRaVDA-HUXt"? #12

Closed · eabase closed this issue 4 months ago

eabase commented 4 months ago

On the web page: https://research.reading.ac.uk/met-spate/huxt-forecast/

There are 2 different models shown.

(a) What is the difference, and how is this repo connected to that? (b) Where can I find more info (or the code) for testing the experimental model?

There is also the data issue. (c) Where/How can I get the latest data for running these models?

LukeBarnard commented 4 months ago

Hi @eabase

(a) What is the difference, and how is this repo connected to that?

HUXt is a heliospheric numerical model that simulates the flow of the solar wind from near the Sun out through the heliosphere, typically from the Sun to the Earth for our purposes.
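To give a flavour of the numerics, here is an illustrative sketch only, not the actual HUXt code: the grid, time step, units, and boundary series below are all invented for the example. The essential ingredient of a HUXt-style calculation is a first-order upwind update of the radial solar wind speed, driven by a time-dependent inner boundary condition.

```python
import numpy as np

# Toy 1D upwind scheme of the kind HUXt is built on (illustrative only).
r = np.linspace(30, 215, 400) * 6.96e8      # radial grid, ~30 solar radii to ~1 AU [m]
dr = r[1] - r[0]
v = np.full(r.size, 400e3)                  # initial solar wind speed everywhere [m/s]

t_total = 5 * 86400.0                       # simulate 5 days [s]
dt = 0.8 * dr / 800e3                       # time step satisfying CFL for the fastest wind
n_steps = int(t_total / dt)

for step in range(n_steps):
    t = step * dt
    # Hypothetical time-varying inner boundary: slow wind plus a fast-stream pulse
    v_inner = 350e3 + 300e3 * np.exp(-((t - 2 * 86400.0) / 21600.0) ** 2)
    v_new = v.copy()
    v_new[0] = v_inner
    # First-order upwind update: information only propagates outward with the flow
    v_new[1:] = v[1:] - dt * v[1:] * (v[1:] - v[:-1]) / dr
    v = v_new

print(f"Speed at outer boundary after {t_total/86400:.0f} days: {v[-1]/1e3:.0f} km/s")
```

The full model adds further physics (for example an empirical residual acceleration term), steps over longitude bins, and perturbs the flow with CMEs, but the simplicity of this kind of update is what keeps HUXt lightweight.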

BRaVDA is a data assimilation scheme for improving how well HUXt simulations represent the real solar wind. BRaVDA assimilates in-situ measurements of solar wind speed from monitors such as ACE, Wind, DSCOVR, and STEREO-A. From these data it computes updated boundary conditions that, when used with HUXt (or potentially other solar wind numerical models), result in simulations that better match the observed evolution at Earth.
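For intuition only, the sketch below shows the kind of update a variational assimilation scheme makes: pull a prior boundary speed toward the value implied by an in-situ observation, weighted by the assumed uncertainties. BRaVDA itself is considerably more sophisticated (see the paper linked below), and every number and the trivial forward model here are invented.

```python
import numpy as np

# Conceptual sketch of a variational analysis step (not BRaVDA code).
v_prior = 380.0        # prior inner-boundary speed, e.g. from a coronal model [km/s]
sigma_prior = 50.0     # assumed prior uncertainty [km/s]

v_obs = 450.0          # in-situ speed observed near Earth [km/s]
sigma_obs = 20.0       # assumed observation uncertainty [km/s]

def forward(v_boundary):
    """Toy forward model mapping boundary speed to speed at Earth (invented offset)."""
    return v_boundary + 25.0

# Minimise J(v) = (v - v_prior)^2 / sigma_prior^2 + (forward(v) - v_obs)^2 / sigma_obs^2
v_grid = np.linspace(200.0, 700.0, 5001)
cost = ((v_grid - v_prior) / sigma_prior) ** 2 + ((forward(v_grid) - v_obs) / sigma_obs) ** 2
v_analysis = v_grid[np.argmin(cost)]

print(f"Prior boundary speed:    {v_prior:.1f} km/s")
print(f"Analysis boundary speed: {v_analysis:.1f} km/s")  # pulled toward the observation
```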

Here is a paper that describes the development of BRaVDA: https://agupubs.onlinelibrary.wiley.com/doi/10.1029/2020SW002698

The BRaVDA codebase has its own repo on GitHub: https://github.com/University-of-Reading-Space-Science/BRaVDA

(b) Where can I find more info (or the code) for testing the experimental model?

In the experimental forecast, these two modules have been combined. This is in a repo that is not currently public, because it depends on access to an API that cannot be made publicly accessible, and so the forecasting codebase would not function for people not directly involved in the work. We are working on finding/supplying public data sources, at which point the whole forecast codebase can be made open.

(c) Where/How can I get the latest data for running these models?

This is currently a challenge in our field. The boundary conditions for the forecast model require specification of both the ambient solar wind speed and the properties of any coronal mass ejections. The former can be obtained from services provided by NASA's CCMC (https://ccmc.gsfc.nasa.gov/). The latter requires analysis of coronagraph data to describe the CME structures. Some services do provide this data, e.g. https://kauai.ccmc.gsfc.nasa.gov/DONKI/. However, as far as I am aware, there is no public repository that provides real-time access to the latest WSA and CME classifications for use in the forecast model.
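As one concrete starting point, DONKI exposes a public web service. The hedged example below pulls recent CME catalogue entries from it; the endpoint, parameters, and field names reflect my understanding of the public API and may change, and DEMO_KEY is NASA's rate-limited demonstration key.

```python
import requests

# Fetch CME entries from NASA's DONKI web service for a fixed (example) date range.
url = "https://api.nasa.gov/DONKI/CME"
params = {
    "startDate": "2023-01-01",
    "endDate": "2023-01-07",
    "api_key": "DEMO_KEY",   # rate-limited demo key; register for a personal key at api.nasa.gov
}

resp = requests.get(url, params=params, timeout=30)
resp.raise_for_status()

for cme in resp.json():
    # Each entry carries the catalogued start time and, where available, fitted
    # analyses (speed, width, direction) that a forecast model could ingest.
    print(cme.get("startTime"), cme.get("sourceLocation"))
```

Note this gives catalogued CME analyses rather than the WSA ambient solar wind solution, so it only covers part of the boundary-condition problem described above.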

As this isn't directly an issue with the function or development of HUXt, I'm going to close this issue.

eabase commented 4 months ago

Hi @LukeBarnard ,

Thank you so much for the very detailed explanation! Absolutely great! I will have a look at the links and resources you gave.

That said, it seems to be a common issue for the community to find live sources and data feeds that are easily and openly accessible.

The latter requires analysis of coronagraph data to describe the CME structures. ... However, as far as I am aware, there is no public repository that provides real-time access to the latest WSA and CME classifications for use in the forecast model.

So where do you get that data? Perhaps I could at least ask/request access?

LukeBarnard commented 4 months ago

This forecast dashboard was developed as part of the UKRI-funded SWIMMR programme (https://www.ralspace.stfc.ac.uk/Pages/SWIMMR.aspx), specifically as part of SWIMMR STFC Project S4, the Space Weather Empirical Ensemble Package (SWEEP). As part of this, the UK Met Office have developed a service providing API access to the analysis of CMEs produced by their forecasters, as well as the latest results from WSA. This is where our forecast dashboard gets its data from.

I don't have permission to share access to this API. Whilst you could in principle request access to the API, I suspect you will find that the answer will unfortunately be no; to the best of my knowledge the policy is that access is only given to SWEEP project partners. Sorry that I can't be more helpful with this issue!