Open JonCook opened 1 year ago
One thing that might pose a small challenge is HAPI's streaming requirement: data is sent out as it is read in. This server anticipates this by breaking data requests into chunks. For example, the server will request a day, send out that day, request the next day, send it out, and so on. One performance enhancement would be to read the second day while the first is being streamed out, and I'll see if I can put that in.
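The read-ahead idea above can be sketched with a one-day prefetch buffer. This is just an illustration of the technique, not the server's actual code; `read_day` and `stream_days` are hypothetical names, and `read_day` stands in for whatever reads a day's worth of records.

```python
# Sketch of one-day read-ahead: while the caller streams day N out,
# a background thread is already reading day N+1.
from concurrent.futures import ThreadPoolExecutor

def stream_days(read_day, days):
    """Yield each day's data, prefetching the next day in the background.

    read_day -- callable taking a day identifier and returning its data
    days     -- ordered sequence of day identifiers to serve
    """
    if not days:
        return
    with ThreadPoolExecutor(max_workers=1) as pool:
        future = pool.submit(read_day, days[0])   # start reading the first day
        for next_day in days[1:]:
            data = future.result()                # wait for the current day
            future = pool.submit(read_day, next_day)  # kick off the next read
            yield data                            # stream the current day out
        yield future.result()                     # last day, nothing to prefetch
```

With a single worker thread the reads stay strictly ordered, so the output is identical to the sequential version; only the wall-clock time improves when reading and streaming can overlap.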
There are really no requirements for the landing page, though the one the software generates does a catalog request and generates example queries based on that.
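The catalog-driven landing page described above could look something like the following sketch. The `/hapi/catalog` and `/hapi/data` paths and the `id`/`time.min`/`time.max` parameters follow the HAPI specification; the function name and the idea of one example link per dataset are my own illustration, not the generated page's actual code.

```python
# Sketch: turn a HAPI /hapi/catalog response into example /hapi/data URLs,
# one per dataset, for display on a landing page.
import json
from urllib.parse import urlencode

def example_queries(server, catalog_json, start, stop):
    """Return one example data-request URL per dataset in the catalog.

    server       -- base URL of the HAPI server (no trailing slash)
    catalog_json -- JSON text of the /hapi/catalog response
    start, stop  -- placeholder time range for the example links
    """
    catalog = json.loads(catalog_json)
    urls = []
    for entry in catalog.get("catalog", []):
        params = urlencode({"id": entry["id"],
                            "time.min": start,
                            "time.max": stop})
        urls.append(f"{server}/hapi/data?{params}")
    return urls
```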
I expect to be working on this in earnest after the new year.
The bulk of this was completed in early 2024, and at the DASH conference in Madrid Jon and I will finish off some of the loose ends. In particular:
Also, it's been mis-branded as hapi-esdc, when hapi-soar seems a more appropriate name. I'll ask Jon about this today.
Jon suggested this might be fine. Right now the intent is that it is just for SOAR files, but other products will probably be added.
Hi @jbfaden @jvandegriff
Following on from the HAPI servers at ESAC telecon held on 22nd November 2023, we would like to create a HAPI server for serving in-situ data, initially for Solar Orbiter. This will later be extended, or new HAPI servers will be created, for other missions that serve data in a similar way.
As we discussed, the in-situ instruments for Solar Orbiter (EPD, SWA, MAG, RPW) serve CDF files via a TAP service. This use case is quite generic and not limited to Solar Orbiter: SMILE (due to launch in 2025) will follow the same approach, and at some point in the future the Helio Multi Mission Platform will host applicable data like this as well.
As a starting point, here I provide a TAP metadata query that returns the file paths and filenames for a given date and two in-situ instruments, ordered by begin time.
TAP is very powerful; you can adapt the query as you please to include an end time, restrict to just one instrument or a given processing level - whatever you want. We have plenty of documentation at:
But of course, please let me know if you need help with the queries.
You can then download all of those files as a tar archive using the following query (as I understood it, this was to obtain the files for test purposes)
NOTE - I have changed the year in this request to 2023 because there is a limit (50 GB) on the total amount of data you can download this way. This will give you a tar of 852 MB.
You can also download the files individually using a query like:
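As a rough illustration of an individual-file request, the sketch below builds a download URL from a data item identifier. The `/data` path and the `retrieval_type`/`data_item_id` parameter names are assumptions for illustration, not the exact query referred to above.

```python
# Hypothetical sketch: build a single-product download URL from an item id.
from urllib.parse import urlencode

def product_url(tap_base, data_item_id):
    """Return a download URL for one data item (parameter names assumed)."""
    return tap_base + "/data?" + urlencode(
        {"retrieval_type": "PRODUCT", "data_item_id": data_item_id}
    )
```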
But I understood that having a mount point inside the Docker container which accesses the /soar/soarps file area would probably be more efficient (no external calls needed to download files; everything is done locally).
I have some questions about the landing/home page: do you need a query to get a list of the available datasets? We also have that available.
Many thanks, Jonathan