r-spatial / stars

Spatiotemporal Arrays, Raster and Vector Data Cubes
https://r-spatial.github.io/stars/
Apache License 2.0

An error occurred while using the st_get_dimension_values function to obtain the time values #688

Open PanfengZhang opened 6 months ago

PanfengZhang commented 6 months ago

An error occurred while using the st_get_dimension_values function to obtain the time values.

The example .nc data is from NCEP (https://psl.noaa.gov/thredds/fileServer/Datasets/ncep.reanalysis/Monthlies/surface/slp.mon.ltm.1991-2020.nc)

library(stars)
diri <- "C:/Rmet/data/slp.mon.ltm.1991-2020.nc"        # local copy of the NCEP file
fin <- read_ncdf(diri, var = "slp")                    # read the "slp" variable
Time <- st_get_dimension_values(fin, which = "time")   # extract the time dimension
Time

##  [1] "0000-12-30 UTC" "0001-01-30 UTC" "0001-02-27 UTC" "0001-03-30 UTC"
##  [5] "0001-04-29 UTC" "0001-05-30 UTC" "0001-06-29 UTC" "0001-07-30 UTC"
##  [9] "0001-08-30 UTC" "0001-09-29 UTC" "0001-10-30 UTC" "0001-11-29 UTC"

According to the file's metadata, the actual time range is "0001/01/01 00:00:00 - 0001/12/01 00:00:00".

[screenshot of the file's time metadata]
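
For reference, the raw time coordinate stored in the file can be inspected directly to confirm the expected range, for example with the ncdf4 package (a minimal sketch, not part of the original report; it assumes ncdf4 is installed):

library(ncdf4)
nc <- nc_open(diri)
ncatt_get(nc, "time", "units")   # units string of the time coordinate
ncvar_get(nc, "time")            # raw numeric offsets as stored in the file
nc_close(nc)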

pvanlaake commented 6 months ago

This file contains climatology data, which is why the year seems to be off (in fact it is off: it should be year 0 according to the COARDS convention this file claims to follow). It is also off by 2 days, which appears to be an issue with POSIXt for years that far back. This is thus not an error in stars, but rather an unexpected outcome.

I do not know how to resolve this in stars, but a quick-and-dirty patch is Time <- st_get_dimension_values(fin, which = "time") + days(2).
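
Spelled out, that workaround might look as follows (a sketch; days() is assumed to come from lubridate, which is not loaded in the original example, and a base-R alternative is shown as a comment):

library(stars)
library(lubridate)  # provides days()

fin  <- read_ncdf(diri, var = "slp")
Time <- st_get_dimension_values(fin, which = "time") + days(2)
# base R alternative, without lubridate:
# Time <- st_get_dimension_values(fin, which = "time") + as.difftime(2, units = "days")
Time  # first value should now read "0001-01-01 UTC"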

The climatology covers the period 1991/01/01 - 2020/12/31 (attribute climo_period of dimension time), i.e. the current climatological normal period. It is a bit of a mystery why NCEP continues to use the obsolete COARDS conventions instead of the current CF Metadata Conventions to produce current data.
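
For completeness, that attribute can be read straight from the file, for instance with ncdf4 (a sketch; the attribute name climo_period is taken from the file's metadata as described above):

library(ncdf4)
nc <- nc_open(diri)
ncatt_get(nc, "time", "climo_period")  # averaging period of the climatology
nc_close(nc)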

pvanlaake commented 2 months ago

The inverse situation also happens...

PanfengZhang commented 2 months ago

Thank you very much for your reply!