Currently phantom data files are hosted on my website https://users.monash.edu.au/~dprice/phantom/data
However, this is not a great long-term solution, and the server now returns an error from the curl command used to download files.
It would be better for these files to be permanently hosted on Zenodo, where they could be versioned. Ideally, any user could then upload a data file and have it downloadable by the code, provided a mapping between the local file and its Zenodo download URL is stored in the code (e.g. in datafiles.f90).
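The mapping idea can be sketched as a simple lookup table. This is a Python illustration only (the real table would live in Fortran in datafiles.f90), and the file name and Zenodo record URL below are placeholders, not real entries:

```python
# Hypothetical mapping from a local data-file name to its Zenodo download URL.
# Both the key and the URL are placeholders for illustration.
DATAFILE_URLS = {
    "forcing.dat": "https://zenodo.org/record/0000000/files/forcing.dat",
}

def url_for(datafile):
    """Return the Zenodo URL registered for a data file.

    Raises KeyError if the file has no registered URL.
    """
    return DATAFILE_URLS[datafile]
```

The code's download routine would then consult this table instead of hard-coding a single base URL, so adding a new data file only requires adding one entry.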
The main limitation is that Zenodo only allows a "flat" upload of files with no directory structure, so we would need either one Zenodo dataset per directory, or some way to download and unpack a .zip file as part of the process.