timtrice closed this issue 5 years ago
More fstadv products may be located in another subdirectory of the FTP server. The archives (ftp://ftp.nhc.noaa.gov/atcf/archive/) contain yearly subdirectories. In 1998, for example, there are dat.gz files for each cyclone.
There are also two subdirectories.
It is messages that may contain what I want. For example, there is an al011998_msg.zip.
So, we may be able to use the FTP server to get all data without going through the front-end.
Additionally, it seems the archives for these text products may go all the way back to 1991, giving another seven years of data.
Computer forecast model data may exist for even earlier cyclones.
Allow either a key or a year (extracting the year from the key).
Given the year, determine which link to access:
If not the current year: ftp://ftp.nhc.noaa.gov/atcf/archive/MESSAGES/2017/
Otherwise: ftp://ftp.nhc.noaa.gov/atcf/
After this point, the functionality should remain the same.
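The URL selection above could be sketched like this. Note that fstadv_url is a hypothetical helper name, not an existing function in the package; this is only a sketch of the proposed logic.

```r
# Sketch of the proposed URL selection (hypothetical helper, not in the package).
# Accepts either a storm key (e.g., "AL011998") or a bare year (e.g., "1998").
fstadv_url <- function(x) {
  # Both a key and a bare year end in four digits; extract them as the year.
  year <- regmatches(x, regexpr("\\d{4}$", x))
  current_year <- format(Sys.Date(), "%Y")
  if (identical(year, current_year)) {
    # Current-year products live at the top-level atcf directory.
    "ftp://ftp.nhc.noaa.gov/atcf/"
  } else {
    # Prior years live in the archived MESSAGES subdirectories.
    sprintf("ftp://ftp.nhc.noaa.gov/atcf/archive/MESSAGES/%s/", year)
  }
}

fstadv_url("AL011998")
# "ftp://ftp.nhc.noaa.gov/atcf/archive/MESSAGES/1998/"
```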
Example of retrieving an FTP list of URLs:
url <- "ftp://ftp.nhc.noaa.gov/atcf/archive/MESSAGES/2017/mar/"
# Request a bare directory listing (file names only)
hdl <- curl::new_handle(dirlistonly = TRUE)
con <- curl::curl(url, open = "r", handle = hdl)
tbl <- read.table(con, stringsAsFactors = FALSE, fill = TRUE)
close(con)
More or less a duplicate of #113.
As it stands now, the flow of get_fstadv (or the equivalent get_storm_data(products = "fstadv")) is: get_storm_data calls extract_product_contents, where the text for each product is extracted and passed to fstadv(), loaded into a dataframe, and returned. With this, any cyclone in the AL or EP basin since 1998 has a fstadv product.
These products do exist on the FTP server. However, they are incomplete for that same time period across both basins.
The most recent cyclone fstadv products can be found here: ftp://ftp.nhc.noaa.gov/atcf/mar/
Archived fstadv products can be found here: ftp://ftp.nhc.noaa.gov/atcf/archive/MESSAGES/
However, not all years have these products (and it can probably be assumed that not all cyclones' products will exist here, either).