marinebon / mbon-dashboard-server

server software for MBON early alert dashboard using Docker

fix OC_c7fe_e1ee_913c dataset #36

Closed. 7yl4r closed this issue 1 year ago.

7yl4r commented 2 years ago

The ABI images for FWC are not showing up because the ERDDAP dataset is not loading:

http://131.247.136.200:8080/erddap/status.html

I removed an old dataset from the erddap-config that was confusing us. Next, debug the dataset using the instructions in that readme.

7yl4r commented 2 years ago

DasDds output:

[root@dune ~]# docker exec -it erddap bash -c "cd /usr/local/tomcat/webapps/erddap/WEB-INF/ && bash DasDds.sh OC_c7fe_e1ee_913c"

////**** ERD Low Level Startup
localTime=2022-01-16T21:19:18+00:00
erddapVersion=2.11
Java 1.8.0_275 (64 bit, Oracle Corporation) on Linux (4.18.0-240.el8.x86_64).
MemoryInUse=    40 MB (highWaterMark=    40 MB) (Xmx ~= 958 MB)
logLevel=info: verbose=true reallyVerbose=false
ERROR in File2.deleteIfOld: dir=/erddapData/dataset/_FileVisitor/ isn't a directory.
bigParentDirectory=/erddapData/
webInfParentDirectory=/usr/local/tomcat/webapps/erddap/
java.awt.HeadlessException:
No X11 DISPLAY variable was set, but this program performed an operation which requires it.
 at gov.noaa.pfel.coastwatch.sgt.SgtUtil.isBufferedImageAccelerated(SgtUtil.java:368)
 at gov.noaa.pfel.erddap.util.EDStatic.<clinit>(EDStatic.java:1601)
 at gov.noaa.pfel.erddap.DasDds.<clinit>(DasDds.java:31)

bufferedImage isAccelerated=[unknown]
Custom messages.xml not found at /usr/local/tomcat/content/erddap/messages.xml
Using default messages.xml from  /usr/local/tomcat/webapps/erddap/WEB-INF/classes/gov/noaa/pfel/erddap/util/messages.xml
CfToFromGcmd static loading /usr/local/tomcat/webapps/erddap/WEB-INF/classes/gov/noaa/pfel/erddap/util/CfToGcmd.txt
*** ERD Low Level Startup finished successfully.

logFileMaxSize=20000000
*** Starting DasDds 2022-01-16T21:19:19+00:00 erddapVersion=2.11
logFile=/erddapData/logs/DasDds.log
Java 1.8.0_275 (64 bit, Oracle Corporation) on Linux (4.18.0-240.el8.x86_64).
MemoryInUse=   242 MB (highWaterMark=   242 MB) (Xmx ~= 958 MB)

*** DasDds ***
This generates the DAS and DDS for a dataset and puts it in
/erddapData/logs/DasDds.out
Press ^D or ^C to exit at any time.

Which datasetID? OC_c7fe_e1ee_913c

*** DasDds OC_c7fe_e1ee_913c
*** deleting cached dataset info for datasetID=OC_c7fe_e1ee_913c
File2.deleteIfOld(/erddapData/dataset/3c/OC_c7fe_e1ee_913c/              ) nDir=   0 nDeleted=   4 nRemain=   0

EDD.oneFromDatasetsXml(OC_c7fe_e1ee_913c)...

*** constructing EDDGridFromFiles(xmlReader)...

*** constructing EDDGridFromFiles OC_c7fe_e1ee_913c
axis0 ***fileName format=yyyyDDD class=DOUBLE regex=MODA_(\d{7})_(\d{7})_7D_FK_OC\.nc captureGroup=2
dir/file table doesn't exist: /erddapData/dataset/3c/OC_c7fe_e1ee_913c/dirTable.nc
dir/file table doesn't exist: /erddapData/dataset/3c/OC_c7fe_e1ee_913c/fileTable.nc
creating new dirTable and fileTable (dirTable=null?true fileTable=null?true badFileMap=null?false)
WARNING: FileVisitorSubdir.visitFileFailed: /srv/imars-objects/fk/MEAN_7D_MODA/OC/MODA_2018204_2018210_7D_FK_OC.nc
WARNING: FileVisitorSubdir.visitFileFailed: /srv/imars-objects/fk/MEAN_7D_MODA/OC/MODA_2014092_2014098_7D_FK_OC.nc
WARNING: FileVisitorSubdir.visitFileFailed: /srv/imars-objects/fk/MEAN_7D_MODA/OC/MODA_2016267_2016273_7D_FK_OC.nc
[...]
*** An error occurred while trying to load OC_c7fe_e1ee_913c:
java.lang.RuntimeException: datasets.xml error on or before line #1721: 0 files found in /srv/imars-objects/fk/MEAN_7D_MODA/OC/
regex=.*\.nc recursive=true pathRegex=.* time=92ms
 at gov.noaa.pfel.erddap.dataset.EDD.fromXml(EDD.java:443)
 at gov.noaa.pfel.erddap.dataset.EDD.oneFromDatasetsXml(EDD.java:524)
 at gov.noaa.pfel.erddap.dataset.EDD.testDasDds(EDD.java:11123)
 at gov.noaa.pfel.erddap.DasDds.doIt(DasDds.java:129)
 at gov.noaa.pfel.erddap.DasDds.main(DasDds.java:155)
Caused by: java.lang.RuntimeException: 0 files found in /srv/imars-objects/fk/MEAN_7D_MODA/OC/
regex=.*\.nc recursive=true pathRegex=.* time=92ms
 at gov.noaa.pfel.erddap.dataset.EDDGridFromFiles.<init>(EDDGridFromFiles.java:799)
 at gov.noaa.pfel.erddap.dataset.EDDGridFromNcLow.<init>(EDDGridFromNcLow.java:99)
 at gov.noaa.pfel.erddap.dataset.EDDGridFromNcFiles.<init>(EDDGridFromNcFiles.java:105)
 at gov.noaa.pfel.erddap.dataset.EDDGridFromFiles.fromXml(EDDGridFromFiles.java:296)
 at gov.noaa.pfel.erddap.dataset.EDD.fromXml(EDD.java:403)
 ... 4 more

*** closed logFile=/erddapData/logs/DasDds.log at 2022-01-16T21:19:20+00:00

Permissions are showing question marks in this directory, which definitely isn't right. Seeing the same thing in the SST & SST4 directories.

[root@dune ~]# ls -lh /srv/imars-objects/fk/MEAN_7D_MODA/SST4/
ls: cannot access '/srv/imars-objects/fk/MEAN_7D_MODA/SST4/MODA_2018288_2018294_7D_FK_SST4.nc': Permission denied
ls: cannot access '/srv/imars-objects/fk/MEAN_7D_MODA/SST4/MODA_2007134_2007140_7D_FK_SST4.nc': Permission denied
ls: cannot access '/srv/imars-objects/fk/MEAN_7D_MODA/SST4/MODA_2005092_2005098_7D_FK_SST4.nc': Permission denied
[...]
?????????? ? ? ? ?            ? MODA_2021344_2021350_7D_FK_SST4.nc
?????????? ? ? ? ?            ? MODA_2021351_2021357_7D_FK_SST4.nc
?????????? ? ? ? ?            ? MODA_2021358_2021365_7D_FK_SST4.nc
?????????? ? ? ? ?            ? MODA_2022001_2022007_7D_FK_SST4.nc
?????????? ? ? ? ?            ? MODA_2022008_2022014_7D_FK_SST4.nc

Something odd must be going on with the autofs NFS share. I see that server-status#24 is currently open, so maybe that is related. Nope: /fk/ is being served from thing1.

Permissions on thing1 look right:

[root@thing1 ~]# namei -mo /mnt/raid/imars_objects/fk/MEAN_7D_MODA/OC/MODA_2022008_2022014_7D_FK_OC.nc
f: /mnt/raid/imars_objects/fk/MEAN_7D_MODA/OC/MODA_2022008_2022014_7D_FK_OC.nc
 dr-xr-xr-x root root         /
 drwxr-xr-x root root         mnt
 drwxr-xr-x root root         raid
 drwxr-xr-x root root         imars_objects
 drwxrwxr-x 4509 imars-common fk
 drwxr-xr-x 4509 imars-common MEAN_7D_MODA
 drwxrwxrw- 4509 imars-common OC
 -rwxrwxrw- 4509 imars-common MODA_2022008_2022014_7D_FK_OC.nc

I am going to do a system update on dune, reboot, & see if the issue persists.

7yl4r commented 2 years ago

Oh, I am realizing the question marks are a result of NFS disallowing root account access (root squash). The permissions show correctly from my user account:

[tylar@dune ~]$ ls -lh /srv/imars-objects/fk/MEAN_7D_MODA/SST4/
total 4.9G
-rwxrwxrw-. 1 4509 imars-common 4.9M Dec 22 07:36 MODA_2002185_2002189_7D_FK_SST4.nc
-rwxrwxrw-. 1 4509 imars-common 4.9M Dec 22 07:36 MODA_2002190_2002196_7D_FK_SST4.nc
-rwxrwxrw-. 1 4509 imars-common 4.9M Dec 22 07:36 MODA_2002197_2002203_7D_FK_SST4.nc
-rwxrwxrw-. 1 4509 imars-common 4.9M Dec 22 07:36 MODA_2002204_2002210_7D_FK_SST4.nc
-rwxrwxrw-. 1 4509 imars-common 4.9M Dec 22 07:36 MODA_2002219_2002224_7D_FK_SST4.nc
-rwxrwxrw-. 1 4509 imars-common 4.9M Dec 22 07:36 MODA_2002225_2002231_7D_FK_SST4.nc
[...]

So docker exec uses root:

[tylar@dune ~]$ docker exec -it erddap bash -c "whoami"
root

So we see the same thing (???). But is ERDDAP itself trying to access the files as root? I think it is, so I am going to try setting the user option in the docker-compose.yml.
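
A quick check here, since docker exec whoami only reports the exec'd shell's user and not necessarily the user Tomcat was started as: docker top lists the container's processes with their owning user as seen from the host.

docker top erddap
# the UID column shows which user the Tomcat/java process actually runs as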

7yl4r commented 2 years ago

Looking at the problem from the other direction while waiting on that: I could try using no_root_squash on thing1. Here is the relevant config file:

[root@thing1 ~]# cat /etc/exports
# This file is configured through the nfs::server puppet module
/exports/3d_wetlands_dem 192.168.1.0/24(rw,sync,no_subtree_check,insecure_locks) js-17-123.jetstream-cloud.org(rw,sync,no_subtree_check,insecure_locks) imars-airflow-tubastraea.marine.usf.edu(rw,sync,no_subtree_check,insecure_locks) imars-airflow-rugosa.marine.usf.edu(rw,sync,no_subtree_check,insecure_locks)
[...]
/exports/fk 192.168.1.0/24(rw,sync,no_subtree_check,insecure_locks) js-17-123.jetstream-cloud.org(rw,sync,no_subtree_check,insecure_locks) imars-airflow-tubastraea.marine.usf.edu(rw,sync,no_subtree_check,insecure_locks) imars-airflow-rugosa.marine.usf.edu(rw,sync,no_subtree_check,insecure_locks)
[...]
/exports 192.168.1.0/24(rw,fsid=0,insecure,no_subtree_check,async,root_squash) js-17-186.jetstream-cloud.org(rw,fsid=0,insecure,no_subtree_check,async,root_squash) imars-airflow-tubastraea.marine.usf.edu(rw,fsid=0,insecure,no_subtree_check,async,root_squash) imars-airflow-rugosa.marine.usf.edu(rw,fsid=0,insecure,no_subtree_check,async,root_squash)

So that root_squash on the last line needs to be changed to no_root_squash. Easy enough as a workaround, but as the header of the file says, it is managed by puppet. The proper fix is to change the puppet config; however, the puppet master server is down, so bringing it back up is a prerequisite to fixing this the right way.
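
A minimal sketch of the manual workaround on thing1 (puppet owns /etc/exports, so it may revert this edit):

# change root_squash to no_root_squash on the /exports line, leaving the rest alone
vi /etc/exports
# re-read /etc/exports and re-export all entries without unmounting clients
exportfs -ra
# or restart the NFS service entirely: systemctl restart nfs-server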

7yl4r commented 2 years ago

DasDds still reports the same problem after the dune reboot, but just to be sure the docker-compose config change took effect, let's do:

[root@dune docker_volumes]# cd erddap-config/
[root@dune erddap-config]# docker-compose down --volumes --rmi all && docker-compose up --build -d
Stopping erddap ... done
Removing erddap ... done
Removing network erddap-config_default
Removing image axiom/docker-erddap
Creating network "erddap-config_default" with the default driver
Pulling erddap (axiom/docker-erddap:)...
latest: Pulling from axiom/docker-erddap
0e29546d541c: Pull complete
9b829c73b52b: Pull complete
cb5b7ae36172: Pull complete
6494e4811622: Pull complete
668f6fcc5fa5: Pull complete
c0879393b07e: Pull complete
bef50c41a74d: Pull complete
0bcabf45ee90: Pull complete
6e2e221e1126: Pull complete
57f0208b026f: Pull complete
40af7d565b7a: Pull complete
85026bf1e0e8: Pull complete
c43c34149edf: Pull complete
5da8b0a402ea: Pull complete
88630c61d310: Pull complete
d9a7b4ebf6ed: Pull complete
bb0d2493452e: Pull complete
c42fad8c0f88: Pull complete
5f956e57b988: Pull complete
03969136d218: Pull complete
Digest: sha256:66036406c9575b368b35963b464d69bf3449e197f1978ec08282c3242e8e04ba
Status: Downloaded newer image for axiom/docker-erddap:latest
Creating erddap ... done

okay so we have created a bigger problem:

[root@dune erddap-config]# docker logs erddap
groupadd: Permission denied.
groupadd: cannot lock /etc/group; try again later.
chown: invalid user: ‘tomcat:tomcat’
chown: invalid user: ‘tomcat:tomcat’

looks like we need root to groupadd.
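
Those groupadd/chown calls come from the image's startup script, which assumes it is running as root. One way to see what the entrypoint actually does (hedged; not verified here):

# show the entrypoint/cmd baked into the image that runs groupadd and chown at startup
docker inspect --format '{{.Config.Entrypoint}} {{.Config.Cmd}}' axiom/docker-erddap:latest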

7yl4r commented 2 years ago

setting user: root:imars-common (0:4???) seems to have worked. DasDds doesn't show any issues; let's see if it is working for real tomorrow.
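
For reference, a minimal sketch of that override (assumed compose syntax; the gid below is a placeholder for the real imars-common gid), plus a check that it took effect:

# docker-compose.yml sketch (placeholder gid, not the real value):
#   services:
#     erddap:
#       user: "0:<imars-common-gid>"   # run as root but with the NFS group
docker-compose up -d --force-recreate erddap
docker exec -it erddap bash -c "id"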

7yl4r commented 2 years ago

Lol. Nope. Now they are all down (from status.html):

Last major LoadDatasets started 12m 35s ago and finished after 5 seconds.
nGridDatasets  = 0
nTableDatasets = 0
nTotalDatasets = 0
n Datasets Failed To Load (in the last major LoadDatasets) = 14
    anom_9km_2b83_52f7_ef91, chlor_a_l3_pass_db6d_37b4_723b, sst_avhrr_1km_fe92_3143_5288,
    sst_avhrr_1km_53f3_eede_5d2f, OC_e122_8547_dab0, SST4_9c29_08ac_eccb, SST4_b24c_5823_8b7c,
    jplMURSST41anom1day, OC_c7fe_e1ee_913c, SST_2963_b743_ee2e, SST4_aac4_99a3_6549,
    SST_66bb_c258_cd59, SSTN_2f10_d542_b535, OC_2a67_61be_9fd6, (end)
7yl4r commented 2 years ago

and yet DasDds is running fine:

[root@dune docker_volumes]# docker exec -it erddap bash -c "cd /usr/local/tomcat/webapps/erddap/WEB-INF/ && bash DasDds.sh OC_c7fe_e1ee_913c"

////**** ERD Low Level Startup
localTime=2022-01-18T00:18:31+00:00
erddapVersion=2.14
Java 1.8.0_312 (64 bit, Oracle Corporation) on Linux (4.18.0-348.7.1.el8_5.x86_64).
MemoryInUse=    40 MB (highWaterMark=    40 MB) (Xmx ~= 958 MB)
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
logLevel=info: verbose=true reallyVerbose=false
ERROR in File2.deleteIfOld: dir=/erddapData/dataset/_FileVisitor/ isn't a directory.
bigParentDirectory=/erddapData/
webInfParentDirectory=/usr/local/tomcat/webapps/erddap/
java.awt.HeadlessException:
No X11 DISPLAY variable was set, but this program performed an operation which requires it.
 at gov.noaa.pfel.coastwatch.sgt.SgtUtil.isBufferedImageAccelerated(SgtUtil.java:368)
 at gov.noaa.pfel.erddap.util.EDStatic.<clinit>(EDStatic.java:1628)
 at gov.noaa.pfel.erddap.DasDds.<clinit>(DasDds.java:31)

bufferedImage isAccelerated=[unknown]
  copying images/ file: erddapStart2.css
Custom messages.xml not found at /usr/local/tomcat/content/erddap/messages.xml
Using default messages.xml from  /usr/local/tomcat/webapps/erddap/WEB-INF/classes/gov/noaa/pfel/erddap/util/messages.xml
CfToFromGcmd static loading /usr/local/tomcat/webapps/erddap/WEB-INF/classes/gov/noaa/pfel/erddap/util/CfToGcmd.txt
*** ERD Low Level Startup finished successfully.

logFileMaxSize=20000000
*** Starting DasDds 2022-01-18T00:18:32+00:00 erddapVersion=2.14
logFile=/erddapData/logs/DasDds.log
Java 1.8.0_312 (64 bit, Oracle Corporation) on Linux (4.18.0-348.7.1.el8_5.x86_64).
MemoryInUse=    77 MB (highWaterMark=    77 MB) (Xmx ~= 958 MB)

*** DasDds ***
This generates the DAS and DDS for a dataset and puts it in
/erddapData/logs/DasDds.out
Press ^D or ^C to exit at any time.

Which datasetID? OC_c7fe_e1ee_913c

*** DasDds OC_c7fe_e1ee_913c
*** deleting cached dataset info for datasetID=OC_c7fe_e1ee_913c

EDD.oneFromDatasetsXml(OC_c7fe_e1ee_913c)...

*** constructing EDDGridFromFiles(xmlReader)...

*** constructing EDDGridFromFiles OC_c7fe_e1ee_913c
axis0 ***fileName format=yyyyDDD class=DOUBLE regex=MODA_(\d{7})_(\d{7})_7D_FK_OC\.nc captureGroup=2
dir/file table doesn't exist: /erddapData/dataset/3c/OC_c7fe_e1ee_913c/dirTable.nc
dir/file table doesn't exist: /erddapData/dataset/3c/OC_c7fe_e1ee_913c/fileTable.nc
creating new dirTable and fileTable (dirTable=null?true fileTable=null?true badFileMap=null?false)
doQuickRestart=false
1015 files found in /srv/imars-objects/fk/MEAN_7D_MODA/OC/
regex=.*\.nc recursive=true pathRegex=.* time=40ms
old nBadFiles size=0
old fileTable size=0   nFilesMissing=0
#0 inserted in cache
#1 inserted in cache
#2 inserted in cache
[...]
#1012 inserted in cache
#1013 inserted in cache
#1014 inserted in cache
fileTable updated; time=39472ms

  tFileNamePA.size=1015
  dirTable.nRows=1
  fileTable.nRows=1015
    fileTableInMemory=false
    nUnchanged=0
    nRemoved=0 (nNoLastMod=0, nNoSize=0)
    nReadFile=1015 (nDifferentModTime=0 nNew=1015) readFileCumTime=38.174 s avg=37ms
getting metadataFrom /srv/imars-objects/fk/MEAN_7D_MODA/OC/MODA_2022008_2022014_7D_FK_OC.nc
  ftLastMod first=2021-12-22T12:40:43Z last=2022-01-15T05:01:09Z
  time: DoubleArray isn't evenly spaced: [0]=1.0260864E9, [1]=1.0266912E9, spacing=604800.0, average spacing=607526.6272189349.
    smallest spacing=518400.0: [279]=1.1959488E9, [280]=1.1964672E9
    biggest  spacing=1209600.0: [3]=1.0279008E9, [4]=1.0291104E9
  latitude: FloatArray isn't sorted in ascending order: [0]=27.4 > [1]=27.390911.

*** EDDGridFromFiles OC_c7fe_e1ee_913c constructor finished. TIME=39965ms  (>10s!)

**************************** The .das for OC_c7fe_e1ee_913c ****************************
Attributes {
  time {
    String _CoordinateAxisType "Time";
    Float64 actual_range 1.0260864e+9, 1.6421184e+9;
    String axis "T";
    String ioos_category "Time";
    String long_name "Time";
    String standard_name "time";
    String time_origin "01-JAN-1970 00:00:00";
    String units "seconds since 1970-01-01T00:00:00Z";
  }
  latitude {
    String _CoordinateAxisType "Lat";
    Float32 actual_range 23.01037, 27.4;
    String axis "Y";
    String ioos_category "Location";
    String long_name "Latitude";
    String standard_name "latitude";
    String units "degrees_north";
  }
  longitude {
    String _CoordinateAxisType "Lon";
    Float32 actual_range -85.0, -78.51099;
    String axis "X";
    String ioos_category "Location";
    String long_name "Longitude";
    String standard_name "longitude";
    String units "degrees_east";
  }
  chlor_a_median {
    Float64 colorBarMaximum 30.0;
    Float64 colorBarMinimum 0.03;
    String colorBarScale "Log";
    String ioos_category "Ocean Color";
    String long_name "Concentration Of Chlorophyll In Sea Water";
    String Product "chlor_a";
    String standard_name "concentration_of_chlorophyll_in_sea_water";
    String units "mg m^-3";
  }
  chlor_a_anom {
    Float64 colorBarMaximum 10.0;
    Float64 colorBarMinimum -10.0;
    String ioos_category "Ocean Color";
    String long_name "Chlor A Anom";
    String Product "chlor_a_anomaly";
    String units "mg m^-3";
  }
  ABI_median {
    String ioos_category "Unknown";
    String long_name "ABI Median";
    String Product "ABI";
    String units "W m^-2 um^-1 sr^-1";
  }
  ABI_anom {
    Float64 colorBarMaximum 10.0;
    Float64 colorBarMinimum -10.0;
    String ioos_category "Unknown";
    String long_name "ABI Anom";
    String Product "ABI_anomaly";
    String units "W m^-2 um^-1 sr^-1";
  }
  Rrs_667_median {
    Float64 colorBarMaximum -25000.035;
    Float64 colorBarMinimum -25000.055;
    String ioos_category "Optical Properties";
    String long_name "Rrs 667 Median";
    String Product "Rrs_667";
    String units "sr^-1";
  }
  Rrs_667_anom {
    Float64 colorBarMaximum 0.01;
    Float64 colorBarMinimum -0.01;
    String ioos_category "Optical Properties";
    String long_name "Rrs 667 Anom";
    String Product "Rrs_667_anomaly";
    String units "sr^-1";
  }
  NC_GLOBAL {
    String cdm_data_type "Grid";
    String Composite_end_date "01/14/2022";
    String Composite_start_date "01/08/2022";
    String contact "Dan Otis - dotis@usf.edu";
    String Conventions "COARDS, CF-1.6, ACDD-1.3";
    String CreationDate "01/15/2022 05:01:08";
    String creator_email "dotis@usf.edu";
    String creator_name "DOTIS";
    String creator_type "institution";
    Float64 Easternmost_Easting -78.51099;
    Float64 geospatial_lat_max 27.4;
    Float64 geospatial_lat_min 23.01037;
    Float64 geospatial_lat_resolution 0.00908826086956521;
    String geospatial_lat_units "degrees_north";
    Float64 geospatial_lon_max -78.51099;
    Float64 geospatial_lon_min -85.0;
    Float64 geospatial_lon_resolution 0.009088249299719879;
    String geospatial_lon_units "degrees_east";
    String history "2022-01-18T00:19:12Z (local files)
2022-01-18T00:19:12Z http://131.247.136.200:8080/erddap/griddap/OC_c7fe_e1ee_913c.das";
    String Image_size "484 pixels(N-S) x 715 pixels(E-W)";
    String infoUrl "http://imars.marine.usf.edu";
    String institution "USF IMaRS";
    String keywords "abi, ABI_anom, ABI_median, anomaly, chemistry, chlor, chlor_a_anom, chlor_a_median, chlorophyll, color, concentration, concentration_of_chlorophyll_in_sea_water, data, earth, Earth Science > Oceans > Ocean Chemistry > Chlorophyll, florida, local, median, ocean, ocean color, oceans, optical, optical properties, properties, rrs, Rrs_667_anom, Rrs_667_median, science, sea, seawater, source, south, university, usf, water";
    String keywords_vocabulary "GCMD Science Keywords";
    String Lat_Lon_Limits "23.0104N to 27.4N -85W to -79.0108W";
    String license "The data may be used and redistributed for free but is not intended
for legal use, since it may contain inaccuracies. Neither the data
Contributor, ERD, NOAA, nor the United States Government, nor any
of their employees or contractors, makes any warranty, express or
implied, including warranties of merchantability and fitness for a
particular purpose, or assumes any legal liability for the accuracy,
completeness, or usefulness, of this information.";
    Float64 Northernmost_Northing 27.4;
    String Ocean_color_masks_based_on_L2_flags "LAND,CLDICE,HIGLINT";
    String Original_Image_Format "Level-2(NetCDF)";
    String Original_Image_Source "NASA Ocean Biology Processing Group";
    String Processing_and_binning "USF IMaRS";
    String Projection "Equidistant Cylindrical";
    String Region "Florida Keys (FK)";
    String Sensor "MODIS-Aqua";
    String sourceUrl "(local files)";
    Float64 Southernmost_Northing 23.01037;
    String standard_name_vocabulary "CF Standard Name Table v55";
    String summary "USF IMaRS MODIS FK 7 Day Mean Ocean Color.";
    String time_coverage_end "2022-01-14T00:00:00Z";
    String time_coverage_start "2002-07-08T00:00:00Z";
    String Time_interval "7-Day Composite (median)";
    String title "IMaRS MODA 7D.";
    Float64 Westernmost_Easting -85.0;
  }
}

**************************** The .dds for OC_c7fe_e1ee_913c ****************************
Dataset {
  Float64 time[time = 1015];
  Float32 latitude[latitude = 484];
  Float32 longitude[longitude = 715];
  GRID {
    ARRAY:
      Float64 chlor_a_median[time = 1015][latitude = 484][longitude = 715];
    MAPS:
      Float64 time[time = 1015];
      Float32 latitude[latitude = 484];
      Float32 longitude[longitude = 715];
  } chlor_a_median;
  GRID {
    ARRAY:
      Float64 chlor_a_anom[time = 1015][latitude = 484][longitude = 715];
    MAPS:
      Float64 time[time = 1015];
      Float32 latitude[latitude = 484];
      Float32 longitude[longitude = 715];
  } chlor_a_anom;
  GRID {
    ARRAY:
      Float64 ABI_median[time = 1015][latitude = 484][longitude = 715];
    MAPS:
      Float64 time[time = 1015];
      Float32 latitude[latitude = 484];
      Float32 longitude[longitude = 715];
  } ABI_median;
  GRID {
    ARRAY:
      Float64 ABI_anom[time = 1015][latitude = 484][longitude = 715];
    MAPS:
      Float64 time[time = 1015];
      Float32 latitude[latitude = 484];
      Float32 longitude[longitude = 715];
  } ABI_anom;
  GRID {
    ARRAY:
      Float64 Rrs_667_median[time = 1015][latitude = 484][longitude = 715];
    MAPS:
      Float64 time[time = 1015];
      Float32 latitude[latitude = 484];
      Float32 longitude[longitude = 715];
  } Rrs_667_median;
  GRID {
    ARRAY:
      Float64 Rrs_667_anom[time = 1015][latitude = 484][longitude = 715];
    MAPS:
      Float64 time[time = 1015];
      Float32 latitude[latitude = 484];
      Float32 longitude[longitude = 715];
  } Rrs_667_anom;
} OC_c7fe_e1ee_913c;

************************* The .timeGaps for OC_c7fe_e1ee_913c *************************
Time gaps greater than the median (7 days):
[3]=2002-07-29T00:00:00Z -> [4]=2002-08-12T00:00:00Z, gap=14 days
[23]=2002-12-23T00:00:00Z -> [24]=2002-12-31T00:00:00Z, gap=8 days
[75]=2003-12-23T00:00:00Z -> [76]=2003-12-31T00:00:00Z, gap=8 days
[127]=2004-12-22T00:00:00Z -> [128]=2004-12-31T00:00:00Z, gap=9 days
[179]=2005-12-23T00:00:00Z -> [180]=2005-12-31T00:00:00Z, gap=8 days
[231]=2006-12-23T00:00:00Z -> [232]=2006-12-31T00:00:00Z, gap=8 days
[280]=2007-12-01T00:00:00Z -> [281]=2007-12-09T00:00:00Z, gap=8 days
[283]=2007-12-23T00:00:00Z -> [284]=2007-12-31T00:00:00Z, gap=8 days
[335]=2008-12-22T00:00:00Z -> [336]=2008-12-31T00:00:00Z, gap=9 days
[387]=2009-12-23T00:00:00Z -> [388]=2009-12-31T00:00:00Z, gap=8 days
[439]=2010-12-23T00:00:00Z -> [440]=2010-12-31T00:00:00Z, gap=8 days
[491]=2011-12-23T00:00:00Z -> [492]=2011-12-31T00:00:00Z, gap=8 days
[543]=2012-12-22T00:00:00Z -> [544]=2012-12-31T00:00:00Z, gap=9 days
[595]=2013-12-23T00:00:00Z -> [596]=2013-12-31T00:00:00Z, gap=8 days
[647]=2014-12-23T00:00:00Z -> [648]=2014-12-31T00:00:00Z, gap=8 days
[699]=2015-12-23T00:00:00Z -> [700]=2015-12-31T00:00:00Z, gap=8 days
[751]=2016-12-22T00:00:00Z -> [752]=2016-12-31T00:00:00Z, gap=9 days
[803]=2017-12-23T00:00:00Z -> [804]=2017-12-31T00:00:00Z, gap=8 days
[855]=2018-12-23T00:00:00Z -> [856]=2018-12-31T00:00:00Z, gap=8 days
[907]=2019-12-23T00:00:00Z -> [908]=2019-12-31T00:00:00Z, gap=8 days
[959]=2020-12-22T00:00:00Z -> [960]=2020-12-31T00:00:00Z, gap=9 days
[1011]=2021-12-23T00:00:00Z -> [1012]=2021-12-31T00:00:00Z, gap=8 days
nGaps=22

*** closed logFile=/erddapData/logs/DasDds.log at 2022-01-18T00:19:12+00:00

The log.txt mentioned by the docs does not exist:

[root@dune docker_volumes]# docker exec -it erddap bash -c "ls /erddapData/logs/"
DasDds.log  DasDds.log.previous  DasDds.out  DasDds.out.previous

In fact, log.txt is nowhere:

[root@dune docker_volumes]# docker exec -it erddap bash -c "find / -name log.txt"
find: ‘/proc/1/map_files’: Permission denied
7yl4r commented 2 years ago

Oh, the issue is probably this excerpt from docker logs erddap:

2022-01-17T19:48:38 ERROR: while creating new logFile=/erddapData/logs/log.txt
java.io.FileNotFoundException: /erddapData/logs/log.txt (Permission denied)
 at com.cohort.util.String2.setupLog(String2.java:4010)
 at gov.noaa.pfel.erddap.Erddap.<init>(Erddap.java:257)

from docker-compose.yml:

    volumes:
      #- ./erddap/erddap_data:/erddapData

so we need to look at

[root@dune erddap-config]# ls -lah /erddapData/
total 0
drwxr-xr-x.  2 root root   6 May 31  2021 .
dr-xr-xr-x. 18 root root 242 Jan 16 16:35 ..

the fix:

[root@dune erddap-config]# chmod 777 /erddapData/
[root@dune erddap-config]# ls -lah /erddapData/
total 0
drwxrwxrwx.  2 root root   6 May 31  2021 .
dr-xr-xr-x. 18 root root 242 Jan 16 16:35 ..

then

docker-compose down --volumes --rmi all && docker-compose up --build -d

and we'll check back tomorrow

7yl4r commented 2 years ago

Looking at the status.html page again, OC_c7fe_e1ee_913c is still not loading properly. The best step now seems to be to switch from root_squash to no_root_squash on thing1.

7yl4r commented 2 years ago

I manually edited thing1:/etc/exports to set no_root_squash, reset the nfs service on thing1, and did a docker hard reset to the ERDDAP instance on dune. Poking around a bit I think it worked but I will check back after >24hrs before closing.

7yl4r commented 2 years ago

Very frustrating. More datasets are broken now, but DasDds is throwing no errors (docker exec -it erddap bash -c "cd /usr/local/tomcat/webapps/erddap/WEB-INF/ && bash DasDds.sh OC_c7fe_e1ee_913c"):

Last major LoadDatasets started 14.000 s ago and finished after 10 seconds.
nGridDatasets  = 1
nTableDatasets = 1
nTotalDatasets = 2
n Datasets Failed To Load (in the last major LoadDatasets) = 13
    anom_9km_2b83_52f7_ef91, chlor_a_l3_pass_db6d_37b4_723b, sst_avhrr_1km_fe92_3143_5288,
    sst_avhrr_1km_53f3_eede_5d2f, OC_e122_8547_dab0, SST4_9c29_08ac_eccb, SST4_b24c_5823_8b7c,
    OC_c7fe_e1ee_913c, SST_2963_b743_ee2e, SST4_aac4_99a3_6549, SST_66bb_c258_cd59,
    SSTN_2f10_d542_b535, OC_2a67_61be_9fd6, (end)

Suspicious that it finishes so quickly.

The user:group when running DasDds looks like it is the same as what erddap itself runs as:

[root@dune erddap-config]# docker exec -it erddap bash -c "whoami"
root
[root@dune erddap-config]# docker exec -it erddap bash -c "groups"
groups: cannot find name for group ID 4504
4504

I don't see any hints in docker logs erddap or in /erddapData/logs/log.txt. It is, however, odd that only the gom/sst_avhrr_1km dataset is mentioned in log.txt. Maybe erddap is erroring silently on that one and thus never even getting to the others? An error-less crash like that would be worth reporting to the ERDDAP devs.

 [root@dune erddap-config]# docker exec -it erddap bash -c "grep 'files found in' /erddapData/logs/log.txt"
1773 files found in /srv/imars-objects/gom/sst_avhrr_1km/
77175 files found in /srv/imars-objects/gom/sst_avhrr_1km/
[root@dune erddap-config]# docker exec -it erddap bash -c "grep '/srv/imars-objects/' /erddapData/logs/log.txt"
1773 files found in /srv/imars-objects/gom/sst_avhrr_1km/
getting metadataFrom /srv/imars-objects/gom/sst_avhrr_1km/2018.10/n19.20181029.1001.usf.sst.hdf
77175 files found in /srv/imars-objects/gom/sst_avhrr_1km/
#30896 bad file: removing fileTable row for /srv/imars-objects/gom/sst_avhrr_1km/2003.03/n12.20030324.1005.fullpass.sst.hdf
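
One way to test that silent-crash hypothesis (not tried here) would be to mark the suspect dataset inactive in datasets.xml and see whether the rest load on the next major LoadDatasets, roughly:

# 1. in datasets.xml, set active="false" on the suspect dataset, e.g.
#      <dataset type="..." datasetID="sst_avhrr_1km_fe92_3143_5288" active="false">
# 2. restart the container (or wait for the next major LoadDatasets) and re-check status.html
docker restart erddap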

Anyway, first I will try rolling back the inclusion of user: in the docker-compose, reset things yet again, and check back next week.

After the reset, the user:group is root:root. I do not think this change is going to affect the issue in any way.

7yl4r commented 1 year ago

closing this b/c the dataset has been renamed & reconfigured.