Closed PaulWessel closed 10 months ago
I am downloading the original file to double-check.
Great. I have processed all Earth data and Mars (which will depend on what you find out) and am running the Moon now. In a few hours all planets and Earth will be done, and then I might as well refresh the candidate server with all of it.
OK, I have already made Venus, Pluto and Mercury.
Great, then once the Mars mystery is solved you could look at the Makefile and scripts and see how to update candidate with those planets. I can do the Earth data, and then we can see where we are with Mars and the Moon.
Moon is half-way through on my system.
Perhaps @Esteban82 can double-check and verify that the original indeed has some NaNs?
I ran this command and got almost 40,000 NaNs. I couldn't do it for the whole grid due to memory issues.
gmt grdinfo Mars_HRSC_MOLA_BlendDEM_Global_200mp_v2.tif -M -R179:59/180/-90/90
Mars_HRSC_MOLA_BlendDEM_Global_200mp_v2.tif: 39957 nodes (15.0%) set to NaN
Hm, makes no sense. If I do that on the TIF I still get 6474 NaN nodes. If I run gmt grd2xyz -s+r I get the same. Same with the nc version of the file. This is Mars_HRSC_MOLA_BlendDEM_Global_200mp_v2.*.
gmt grdinfo Mars_HRSC_MOLA_BlendDEM_Global_200mp_v2.nc -R178/180/-90/90 -M | grep NaN
grdinfo [WARNING]: (w - x_min) must equal (NX + eps) * x_inc), where NX is an integer and |eps| <= 0.0001.
grdinfo [WARNING]: w reset from 178 to 177.999137721
Mars_HRSC_MOLA_BlendDEM_Global_200mp_v2.nc: 6474 nodes (0.0%) set to NaN
@joa-quim, how is this possible? I have 64 GB of memory, can read the whole file, and get 6474 NaN nodes.
https://planetarymaps.usgs.gov/mosaic/Mars/HRSC_MOLA_Blend/Mars_HRSC_MOLA_BlendDEM_Global_200mp_v2.tif
I think that is the same as issue #189, where I posted that:
Mars_HRSC_MOLA_BlendDEM_Global_200mp_v2.nc: 39956 nodes (0.0%) set to NaN
From gdalinfo, the Mars TIF file is 106694 x 53347, has AREA_OR_POINT=Area, so it is pixel registered, and states the region is -R-180/179.9984479/-89.9992240/90. Pixel size is 0.003374120830641, which is 12.1468349903 arc seconds. Running grdconvert to netCDF obviously shows exactly the same.
Since this is pixel format, any "missing column" at the east boundary cannot be obtained from lon = -180. There is no reason actual data columns should be missing in the original; n_columns is even, as expected for pixel grids. I wonder if 179.9984479 is just wrong and should be 180. However, the discrepancy (180 - 179.9984479) = 5.5876 arc sec, not the expected 12.1468349903. Likewise -89.9992240 is odd: -89.9992240 - (-90) = 2.79 arc sec. Smells like round-off issues by the producers. n_rows is odd (53347), but that is exactly half the number of longitudes, so it should be correct.
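The round-off bookkeeping above can be sketched in a few lines of Python (the constants are simply the gdalinfo values quoted here; nothing is read from the actual TIF):

```python
# Values reported by gdalinfo for the Mars TIF (copied from the thread,
# not computed from the file itself).
inc = 0.003374120830641   # pixel size in degrees (~12.1468 arc seconds)
n_cols = 106694           # number of longitude pixels

# East edge implied by a west edge of -180 plus n_cols pixels of width inc:
east = -180 + inc * n_cols
print(east)                 # ~179.9984479, matching the stated region

# Gap between that east edge and +180, in arc seconds:
print((180 - east) * 3600)  # ~5.5876, i.e. less than half a 12.1468s pixel
```

So the stated pixel size and the stated east boundary are mutually inconsistent: the leftover gap is not a whole pixel, supporting the round-off interpretation.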
Our recipe file from January (I guess I started then) says to do
SRC_CUSTOM="gmt grdedit Mars_HRSC_MOLA_BlendDEM_Global_200mp_v2.tif -Rd -fg -GMars_HRSC_MOLA_BlendDEM_Global_200mp_v2.nc"
So clearly it thought (or knew) that we needed to shake some round-off out of that -R and set it straight. Regardless of how I do things (grdedit, or grd2xyz | xyz2grd -Rd -I12...s -rp, etc.) I find 39957 NaNs. All of them occur for lon = 179.998312932, but only at ~75% of the latitudes. Odd, but it seems to be a feature of the original data, probably due to their processing and round-off in building their TIF.
I think we will just serve it as is. A 12s gap on Mars is only ~200 meters (the data resolution per the MOLA docs) and won't be noticed. 15s and up will be filtered and will have no NaNs anyway.
This all smells like a mess due to mixing up grid & pixel registration. NASA used to release grids in grid registration (remember the original SRTMs), but then they process with GDAL tools where everything has to be pixel reg. The grid increment is clearly wrong; see:
julia> 360 / 0.003374120830641
106694.46000000209
julia> 360 / 106694 # Compute new inc
0.0033741353778094364
julia> 180 / 0.003374120830641 # Look at the lats
53347.230000001044
julia> 180 / 0.0033741353778094364 # Use the inc computed from longitudes
53347.0 # <= Bingo, an integer number.
So, what we could probably do is declare it a grid-registered grid over [-180, 180 - 0.0033741353778094364], but for latitudes the limits would have to be [-90 + inc/2, 90 - inc/2]. Very weird limits, but they clearly screwed up this grid's parameters.
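To make the proposal concrete, here is the same arithmetic as the Julia session in Python, checking that the suggested limits reproduce the grid dimensions exactly (a sketch using only numbers from this thread):

```python
n_cols, n_rows = 106694, 53347   # dimensions from gdalinfo

# Increment derived from the full 360-degree longitude span:
inc = 360 / n_cols               # 0.0033741353778094364

# Proposed grid-registered longitude limits and shifted latitude limits:
west, east = -180.0, 180.0 - inc
south, north = -90 + inc / 2, 90 - inc / 2

# Both dimensions are recovered exactly from these limits:
print(round((east - west) / inc) + 1)    # 106694
print(round((north - south) / inc) + 1)  # 53347
```

In other words, with this increment the odd latitude limits are self-consistent: the 53347 rows fit exactly between -90 + inc/2 and 90 - inc/2.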
Great, then once the Mars mystery is solved you could look at the Makefile and scripts and see how to update candidate with those planets.
I have updated the data for Venus, Pluto and Mercury.
@PaulWessel, I got this error:
Setting permissions on venus/venus_relief
esteban82@candidate.generic-mapping-tools.org's password:
chmod: changing permissions of ‘/export/gmtserver/gmt/candidate/server/venus’: Operation not permitted
make: *** [Makefile:273: place-venus-relief] Error 1
I entered the server and I didn't see anything wrong:
-bash-4.2$ ls -la
total 26264
drwxrwxr-x 11 esteban82 gmt 4096 sep 25 12:59 .
drwxrwxr-x 3 pwessel gmt 26 sep 25 12:50 ..
-rw-rw-r-- 1 esteban82 gmt 109992 sep 25 12:57 venus_relief_01d_g.grd
-rw-rw-r-- 1 esteban82 gmt 109815 sep 25 12:57 venus_relief_01d_p.grd
drwxrwxr-x 2 esteban82 gmt 4096 sep 25 12:54 venus_relief_01m_g
drwxrwxr-x 2 esteban82 gmt 4096 sep 25 12:58 venus_relief_02m_g
drwxrwxr-x 2 esteban82 gmt 4096 sep 25 12:57 venus_relief_02m_p
drwxrwxr-x 2 esteban82 gmt 310 sep 25 12:55 venus_relief_03m_g
drwxrwxr-x 2 esteban82 gmt 310 sep 25 12:50 venus_relief_03m_p
drwxrwxr-x 2 esteban82 gmt 82 sep 25 12:58 venus_relief_04m_g
drwxrwxr-x 2 esteban82 gmt 82 sep 25 12:51 venus_relief_04m_p
drwxrwxr-x 2 esteban82 gmt 82 sep 25 12:58 venus_relief_05m_g
drwxrwxr-x 2 esteban82 gmt 82 sep 25 12:59 venus_relief_05m_p
-rw-rw-r-- 1 esteban82 gmt 7767317 sep 25 12:59 venus_relief_06m_g.grd
-rw-rw-r-- 1 esteban82 gmt 7771345 sep 25 12:59 venus_relief_06m_p.grd
-rw-rw-r-- 1 esteban82 gmt 2977777 sep 25 12:57 venus_relief_10m_g.grd
-rw-rw-r-- 1 esteban82 gmt 2972864 sep 25 12:58 venus_relief_10m_p.grd
-rw-rw-r-- 1 esteban82 gmt 1381238 sep 25 12:58 venus_relief_15m_g.grd
-rw-rw-r-- 1 esteban82 gmt 1383724 sep 25 12:58 venus_relief_15m_p.grd
-rw-rw-r-- 1 esteban82 gmt 803002 sep 25 12:51 venus_relief_20m_g.grd
-rw-rw-r-- 1 esteban82 gmt 801347 sep 25 12:54 venus_relief_20m_p.grd
-rw-rw-r-- 1 esteban82 gmt 384412 sep 25 12:55 venus_relief_30m_g.grd
-rw-rw-r-- 1 esteban82 gmt 383325 sep 25 12:59 venus_relief_30m_p.grd
-rw-rw-r-- 1 esteban82 gmt 4701 sep 25 12:54 venus_relief_server.txt
BTW, I had to type my password 3 times (once for each planet). It was a bit annoying. Is there any way to avoid this? Should I use something like an SSH key?
EDIT: I guess this only applies if using an SSH key (which you should).
Using ssh(1)? Try adding -A to your command.
If you're logging on to that server regularly, you could also add something like this to .ssh/config:
Host <alias>
Hostname <ip or host>
Port 22
ForwardAgent yes
ForwardAgent yes is the equivalent of -A.
@Esteban82, I think I am done fixing issues in the scripts. Updated the earth mask script to be like the others. Doing make earth-mask will create the various resolutions and write the server_data snippet, all in the staging directory.
Things to do:
- Running make earth-mask now. OK, it is done.
- Run make candidate-info. After that step the candidate server is finalised.
- When 6.5 is released we will run make server-release, which syncs the entire candidate/server tree plus candidate/gmt_data_server.txt over to oceania.
Note: The cache is separate from all this and is synced via crontab automatically.
Questions? Please check that everything else is there and was made late'ish September!
Ooops, running make earth-geoid now. I think that is the last one older than ~25 September. You can check whether the _01d files say the filter width is 111 (the old bad way) or > 300 (the latest version).
geoid placed.
Hi @PaulWessel
Things to do:
- Make sure you git pull from gmtserver-admin - lots of changes today too.
I checked the changes. It looks fine.
- Double-check that everything of remote data sets are placed on the candidate server. Update what is missing.
I could do some of this for a small grid (I only have 8 GB RAM). Do you know any grid to make? grav? mag?
BTW, I see that there is now a Mean Sea Surface grid at https://topex.ucsd.edu/pub/MSS_CNES_CLS22/. I wonder if it would be a good addition to the remote data sets.
gmt grdinfo CNES_CLS_22_H.nc
CNES_CLS_22_H.nc: Title:
CNES_CLS_22_H.nc: Command: gmt xyz2grd -bi3d -RCNES_CLS_22_H.nc -fg -Gjunk.grd
CNES_CLS_22_H.nc: Remark:
CNES_CLS_22_H.nc: Gridline node registration used [Geographic grid]
CNES_CLS_22_H.nc: Grid file format: nf = GMT netCDF format (32-bit float), CF-1.7
CNES_CLS_22_H.nc: x_min: 0 x_max: 360 x_inc: 0.0166666666667 (1 min) name: longitude n_columns: 21601
CNES_CLS_22_H.nc: y_min: -79.4166669828 y_max: 88.0666670172 y_inc: 0.016666666733 (1 min) name: latitude n_rows: 10050
CNES_CLS_22_H.nc: v_min: -105.553001404 v_max: 84.2519989014 name: z
CNES_CLS_22_H.nc: scale_factor: 1 add_offset: 0
CNES_CLS_22_H.nc: format: classic
CNES_CLS_22_H.nc: Default CPT:
2. Double-check that everything of remote data sets are placed on the candidate server. Update what is missing.
Mm. I think that the only thing left to do is moon-relief. I'm afraid it is too big for me.
OK, working on gmt so that it works with the various servers as well as GMT 6.4. I can do the Moon. I have tested make candidate-info and it now correctly builds the gmt_data_server.txt file, including the correct count (479) and the commented records. GMT 6.5 parses those two and gets 481, while GMT 6.4 only sees comments.
The CNES is a good idea. Perhaps you can add a recipe and build it in your staging directory? git pull stuff first since I have made a few minor script changes but they should not affect you anyway - but stay current.
The CNES is a good idea.
In the grav directory there is also this grid (https://topex.ucsd.edu/pub/global_grav_1min/mss_32.1.nc) that is almost the same. I like mss_32 more: it has a better definition of the landmasses, but it extends less to the north. The resolution is the same.
gmt grdinfo mss_32.1.nc
mss_32.1.nc: Title: Produced by grdmath
mss_32.1.nc: Command: grdmath d_mss.grd mss_egm_cls22.grd ADD = mss_cls22_updated.grd
mss_32.1.nc: Remark:
mss_32.1.nc: Gridline node registration used [Geographic grid]
mss_32.1.nc: Grid file format: nf = GMT netCDF format (32-bit float), CF-1.7
mss_32.1.nc: x_min: 0 x_max: 360 x_inc: 0.0166666666667 (1 min) name: longitude n_columns: 21601
mss_32.1.nc: y_min: -79.5 y_max: 82 y_inc: 0.0166666666667 (1 min) name: latitude n_rows: 9691
mss_32.1.nc: v_min: -105.552474976 v_max: 84.3210220337 name: z
mss_32.1.nc: scale_factor: 1 add_offset: 0
mss_32.1.nc: format: netCDF-4 chunk_size: 129,130 shuffle: on deflation_level: 3
mss_32.1.nc: Default CPT:
I agree with you. The MSS ought to be close to the geoid. Does it differ much from the geoid?
The geoid is global but the MSS is over ocean only, of course, so probably good. The MSS is probably (?) satellite/altimetry tracking while the geoid is high-resolution spherical harmonics.
The geoid is global but the MSS is over ocean only, of course, so probably good
Yes, and also the geoid is from 2008. It seems that the 2020 version has not been released.
So, which one should I process? The MSS_CNES_CLS22 has a reference (https://topex.ucsd.edu/pub/MSS_CNES_CLS22/Schaeffer_MSS.pdf).
Seems like the CNES is good (global) and is a variation on the theme. I don't think we need two such products, so let's just pick the most complete (CNES) and stick with that.
BTW, candidate is now up to date with all the 22 data sets. The CNES will make it 23. Please make a PR for the recipe file.
Just updated gmtserver-admin scripts and the recipe. Works for me now - not sure about repeated tries yet, but went all the way through the tiler. Give it a try after updating.
@PaulWessel, I got this error:
make earth-mss
scripts/srv_downsampler.sh earth_mss
/usr/bin/env: ‘bash -e’: No such file or directory
/usr/bin/env: use -[v]S to pass options in shebang lines
make: *** [Makefile:170: earth-mss] Error 127
I added -S in that line and it is running now, but I am not sure if it is the proper solution for every OS:
#!/usr/bin/env -S bash -e
I added -S in both scripts (srv_downsampler.sh and srv_tiler.sh) and now I think it is working fine.
@PaulWessel, let me know if you want me to put -S in all the scripts with this shebang: #!/usr/bin/env bash -e
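To illustrate why -S helps (assuming GNU coreutils env 8.30 or later): the kernel passes everything after the interpreter path in a shebang as a single argument, which the single-quoted word below mimics.

```shell
# The quoted word simulates the kernel passing "-S bash -e" as ONE argument;
# -S makes env split it into "bash" and "-e" before exec'ing.
/usr/bin/env '-S bash -e' -c 'echo shebang works'
```

Without -S, env looks for a program literally named "bash -e", which is exactly the "No such file or directory" error above.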
Sure, you can do that. I got earth_mss up and added an earth_mss.cpt to the cache. Will now try to make all the test plots in remote_datasets
I think that this can be closed too.
Since our processing scripts now check if there are NaNs in the input or output grids, I was surprised to see that Mars had 6474 NaN nodes. That is of course tiny relative to the 106694 x 53347 grid, but why? Dumping them out I find a column next to the +180 meridian for the latitude range 68.1575346318/89.9983129323. A tiny subset shows the red NaN nodes:
[image of the NaN column near +180]
Perhaps @Esteban82 can double-check and verify that the original indeed has some NaNs? We then have a few options:
We will do 3 for sure. I am leaning towards 1 as well unless someone else wants to take this on. Note that the full resolution data set here has spacing of 12.1468873601 arc seconds (we are calling that file 12s).