Hi
I know I'm a bit late to the party, as it's been a while since the last update, but first of all: nice project!
Other implementations seem to base themselves on an RG image (RGB but with no blue component), where the vertex shader parses the "texture" and the U and V wind components are inferred from the R and G channels of the raster image. I like your approach better, as it's purely data driven.
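For context, the RG-texture approach I mean is roughly this (just a sketch with made-up value ranges, not anyone's actual code):

```python
# Sketch of the RG-texture encoding other projects use (hypothetical ranges).
# U and V wind components are normalised into 0..255 and written into the
# red and green channels of a PNG; the shader later reverses the mapping.
import numpy as np
from PIL import Image

def encode_uv_texture(u, v, u_range=(-30.0, 30.0), v_range=(-30.0, 30.0)):
    """u, v: 2-D arrays of wind components (m/s) on a lat/lon grid."""
    def to_byte(a, lo, hi):
        return np.clip((a - lo) / (hi - lo) * 255.0, 0, 255).astype(np.uint8)

    rgb = np.zeros((*u.shape, 3), dtype=np.uint8)
    rgb[..., 0] = to_byte(u, *u_range)   # R channel <- U component
    rgb[..., 1] = to_byte(v, *v_range)   # G channel <- V component
    # B channel stays 0 -- hence the "RG image"
    return Image.fromarray(rgb, mode="RGB")

# The shader then reconstructs e.g. u = mix(u_min, u_max, texture(wind, uv).r)
```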
The question I had: would it be possible to share the tool you used to extract the information from the GRIB2 files? I assume you were using GRIB2 files from the NCEP SFTP server? ecCodes is a potential avenue, but how did you create the supplied JSON that serves as the data source for the visualisation? Is it a custom implementation, or is that output generated by the tool in question?
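For reference, this is roughly what I had in mind with ecCodes (a sketch only; the field names, input file name and JSON layout are my guesses, not your actual format):

```python
# Rough sketch: pull the 10 m U/V wind fields out of a GFS GRIB2 file with
# the ecCodes Python bindings and dump them to JSON. File name and output
# schema are assumptions for illustration.
import json
import eccodes

def grib_to_json(path, out_path, wanted=("10u", "10v")):
    records = []
    with open(path, "rb") as f:
        while True:
            gid = eccodes.codes_grib_new_from_file(f)
            if gid is None:
                break
            short_name = eccodes.codes_get(gid, "shortName")
            if short_name in wanted:
                records.append({
                    "shortName": short_name,
                    "nx": eccodes.codes_get(gid, "Ni"),
                    "ny": eccodes.codes_get(gid, "Nj"),
                    "lo1": eccodes.codes_get(gid, "longitudeOfFirstGridPointInDegrees"),
                    "la1": eccodes.codes_get(gid, "latitudeOfFirstGridPointInDegrees"),
                    "dx": eccodes.codes_get(gid, "iDirectionIncrementInDegrees"),
                    "dy": eccodes.codes_get(gid, "jDirectionIncrementInDegrees"),
                    "data": eccodes.codes_get_values(gid).tolist(),
                })
            eccodes.codes_release(gid)
    with open(out_path, "w") as out:
        json.dump(records, out)

# Hypothetical GFS file from the NCEP server
grib_to_json("gfs.t00z.pgrb2.1p00.f000", "wind.json")
```

If your JSON is produced by something like this (or by a dedicated converter), I'd love to know which.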
Thx!