alexander-petkov / wfas

A placeholder for the WFAS project.

Add Historical Blended_4km_VH NDVI data to Geoserver #40

alexander-petkov commented 3 years ago

Add historical Blended_4km_VH NDVI data from

ftp://ftp.star.nesdis.noaa.gov/pub/corp/scsb/wguo/data/Blended_VH_4km/geo_TIFF/

There are also real-time files, which are updated weekly.

Initial questions:

  1. Documentation for this data?
    Edit: Found a description of the files here: https://www.star.nesdis.noaa.gov/smcd/emb/vci/VH/vh_ftp.php and PDF documentation here: https://www.star.nesdis.noaa.gov/smcd/emb/vci/VH_doc/VHP_uguide_v2.0_2018_0727.pdf
  2. There are SMN, SMT, TCI, VCI, and VHI files--which ones do we need? I will need to calculate storage needs for them and bump up the EBS volume size (see the sketch after this list).
  3. Is country-level data sufficient?
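
A rough way to answer the storage question, as a sketch: sum the file sizes reported in the FTP directory listing for one product. This assumes curl and awk are available and that the server returns a Unix-style listing with the size in the fifth column:

# Sum the sizes of the SMN GeoTIFFs from the FTP LIST output (size is
# column 5 in a Unix-style listing); prints an estimate in GB.
curl -s ftp://ftp.star.nesdis.noaa.gov/pub/corp/scsb/wguo/data/Blended_VH_4km/geo_TIFF/ \
  | awk '/\.SMN\.tif/ {sum += $5} END {printf "%.1f GB\n", sum / 1024^3}'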
wmjolly commented 3 years ago

I want to keep the global data here, since it's only once per week.

We want SMN, which is smoothed NDVI. We might also want to grab the VHI data in the future, but for now, just SMN.

alexander-petkov commented 3 years ago

> I want to keep the global data here, since it's only once per week. We want SMN, which is smoothed NDVI.

The SMN files total 60 GB for the archive going back to 2015, and they keep growing as we accumulate weekly files. I can increase the volume size by 100 GB or so if the plan is to retain the whole archive and keep accumulating.

Edit: the archive actually goes back to 1981.
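
A rough sanity check on that figure, assuming ~26 MB per weekly SMN file (the size observed below) and roughly 40 years of weekly data:

# Back-of-the-envelope total; both numbers are approximations.
echo $(( 40 * 52 * 26 )) MB   # 54080 MB, i.e. ~54 GB -- consistent with ~60 GB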

alexander-petkov commented 3 years ago

Interesting...

The source data is LZW-compressed, with no tiling or overviews. After converting to GeoTIFF with DEFLATE compression, internal tiling, and four levels of overviews, the new files are still ~20% smaller than the originals:

# Rewrite as an internally tiled GeoTIFF with DEFLATE compression:
gdal_translate -of GTiff \
  -co COMPRESS=DEFLATE \
  -co TILED=YES \
  -co NUM_THREADS=ALL_CPUS \
  VHP.G04.C07.npp.P2020001.SM.SMN.tif \
  VHP.G04.C07.npp.P2020001.SM.SMN.tif.new

# Build DEFLATE-compressed internal overviews at levels 2, 4, 8, and 16:
gdaladdo --config COMPRESS_OVERVIEW DEFLATE \
  VHP.G04.C07.npp.P2020001.SM.SMN.tif.new 2 4 8 16

ls -ahl VHP.G04.C07.npp.P2020001.SM.SMN.tif VHP.G04.C07.npp.P2020001.SM.SMN.tif.new
-rw-r--r-- 1 root root 26M Feb 18 11:27 VHP.G04.C07.npp.P2020001.SM.SMN.tif
-rw-r--r-- 1 root root 21M Feb 18 11:54 VHP.G04.C07.npp.P2020001.SM.SMN.tif.new
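
A quick way to confirm the new file actually has internal tiling and overviews, using gdalinfo from the same GDAL install:

# Tiled files report Block=256x256 (or similar) per band, and the
# overview levels show up on an "Overviews:" line.
gdalinfo VHP.G04.C07.npp.P2020001.SM.SMN.tif.new | grep -E 'Block=|Overviews:'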
alexander-petkov commented 3 years ago

Layer added, although I still have to write an update script.

So far I have added data from 2020-01-01 onward.
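
For reference, a minimal sketch of what such an update script could look like, reusing the gdal_translate/gdaladdo steps above. The staging and output paths (/tmp/smn_incoming, /var/geoserver/data/ndvi) are hypothetical placeholders, and the wget accept pattern assumes the file naming shown above:

# Hypothetical weekly update sketch (paths are placeholders, not the
# actual deployment layout): mirror new SMN files, then optimize each.
wget -r -nd -N -A 'VHP.G04.C07.*.SM.SMN.tif' -P /tmp/smn_incoming \
  ftp://ftp.star.nesdis.noaa.gov/pub/corp/scsb/wguo/data/Blended_VH_4km/geo_TIFF/

for f in /tmp/smn_incoming/*.SMN.tif; do
  out=/var/geoserver/data/ndvi/$(basename "$f")
  [ -e "$out" ] && continue   # skip files already converted
  gdal_translate -of GTiff -co COMPRESS=DEFLATE -co TILED=YES \
    -co NUM_THREADS=ALL_CPUS "$f" "$out"
  gdaladdo --config COMPRESS_OVERVIEW DEFLATE "$out" 2 4 8 16
done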