geodesymiami / insarmaps


ingest failure `Unable to open datasource` #10

Closed: falkamelung closed this issue 4 years ago

falkamelung commented 5 years ago

When I ingest I get the error message below. Any idea what I could try?

FAILURE:
Unable to open datasource `/projects/scratch/insarlab/famelung/EifelSenAT15/mintpy/JSON/S1_IW1_015_0163_0165_20150122_20190419.mbtiles-journal' with the following drivers.
  -> `PCIDSK'
  -> `netCDF'
  -> `JP2OpenJPEG'
  -> `PDF'
  -> `MBTiles'
  -> `EEDA'
  -> `ESRI Shapefile'
  -> `MapInfo File'
  -> `UK .NTF'
  -> `OGR_SDTS'
  -> `S57'
  -> `DGN'
  -> `OGR_VRT'
  -> `REC'
  -> `Memory'
  -> `BNA'
  -> `CSV'
  -> `NAS'
  -> `GML'
  -> `GPX'
  -> `LIBKML'
  -> `KML'
  -> `GeoJSON'
  -> `GeoJSONSeq'
  -> `ESRIJSON'
  -> `TopoJSON'
  -> `Interlis 1'
  -> `Interlis 2'
  -> `OGR_GMT'
  -> `GPKG'
  -> `SQLite'
  -> `OGR_DODS'
  -> `WAsP'
  -> `PostgreSQL'
  -> `OpenFileGDB'
  -> `XPlane'
  -> `DXF'
  -> `CAD'
  -> `Geoconcept'
  -> `GeoRSS'
  -> `GPSTrackMaker'
  -> `VFK'
  -> `PGDUMP'
  -> `OSM'
  -> `GPSBabel'
  -> `SUA'
  -> `OpenAir'
  -> `OGR_PDS'
  -> `WFS'
  -> `WFS3'
  -> `HTF'
  -> `AeronavFAA'
  -> `EDIGEO'
  -> `GFT'
  -> `SVG'
  -> `CouchDB'
  -> `Cloudant'
  -> `Idrisi'
  -> `ARCGEN'
  -> `SEGUKOOA'
  -> `SEGY'
  -> `XLS'
  -> `ODS'
  -> `XLSX'
  -> `ElasticSearch'
  -> `Carto'
  -> `AmigoCloud'
  -> `SXF'
  -> `Selafin'
  -> `JML'
  -> `PLSCENES'
  -> `CSW'
  -> `VDV'
  -> `GMLAS'
  -> `MVT'
  -> `TIGER'
  -> `AVCBin'
  -> `AVCE00'
  -> `NGW'
  -> `HTTP'
Error inserting into the database. This is most often due to running out of Memory (RAM), or incorrect database credentials... quitting
Uploading json chunks...
Clearing old dataset, if it is there
Creating index on S1_IW1_015_0163_0165_20150122_20190419
Inserted chunk_1.json to db
Inserted chunk_2.json to db
Inserted chunk_3.json to db
Inserted chunk_4.json to db
Inserted chunk_5.json to db
Inserted chunk_6.json to db
Inserted chunk_7.json to db
Inserted chunk_8.json to db
Inserted chunk_9.json to db
Inserted chunk_10.json to db
Inserted chunk_11.json to db
Inserted chunk_12.json to db
Inserted chunk_13.json to db
Inserted chunk_14.json to db
Inserted chunk_15.json to db
Inserted chunk_16.json to db
Inserted chunk_17.json to db
Inserted chunk_18.json to db
Inserted chunk_19.json to db
2019-07-18 02:44:05,598 - INFO - -----------------Done ingesting insarmaps-------------------
stackTom commented 5 years ago

Can you tell me the commands you used to create the json and mbtiles files with hdfeos5_2json_mbtiles.py? It should be an easy fix. I already see a likely solution, but I want to recreate the json and mbtiles files first to make sure it is the correct one.
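For context, the `-journal` suffix in the failing path is SQLite's rollback journal: OGR is being handed `S1_...mbtiles-journal` (a temporary file left by an interrupted or still-running write) instead of the finished `.mbtiles` tileset, so every driver refuses it. A minimal guard (a hypothetical helper, `pick_mbtiles`, not the actual fix referenced in this comment) might reject journal files before they reach OGR:

```python
import os

def pick_mbtiles(path):
    """Accept only a finished .mbtiles tileset, rejecting the SQLite
    rollback journal ('.mbtiles-journal') left behind by an
    interrupted or in-progress write."""
    name = os.path.basename(path)
    if name.endswith("-journal"):
        # A lingering journal means the tileset was never cleanly
        # committed; handing it to OGR fails as in the log above.
        raise ValueError(f"{path} is a SQLite journal, not a tileset")
    if not name.endswith(".mbtiles"):
        raise ValueError(f"{path} is not an .mbtiles file")
    return path
```

A caller that globs the JSON directory could filter candidate files through such a check and fail early with a clear message instead of the long driver dump.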

Does this happen with any other dataset?

falkamelung commented 5 years ago

Does not seem to happen anymore. Closing until it recurs.

falkamelung commented 5 years ago

I closed this just yesterday, but now it has happened again. I do see an early `You will probably run out of disk space` message from hdfeos5_2json_mbtiles.py.

I then removed the data files from the insarmaps server; json_mbtiles2insarmaps.py threw the same error. But when I ran everything again (including hdfeos5_2json_mbtiles.py) it worked fine: no 'run out of disk space' message, and the dataset ingested.

What is going on? Is hdfeos5_2json_mbtiles.py also communicating with the server? I thought only json_mbtiles*.py did that. I could not find the disk-space message anywhere in the code. Does it come from a library? Or could it be related to a full (or nearly full) /tmp?
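For reference, a warning of the shape seen in the log below can be produced with `shutil.disk_usage`; this sketch is purely illustrative (the actual origin of the message is exactly what this comment is asking about):

```python
import shutil

def warn_if_low_disk(path, needed_bytes):
    """Print a disk-space warning, in the spirit of the log message,
    when the filesystem holding `path` has less free space than the
    ingest is expected to write. Illustrative helper, not the real code."""
    usage = shutil.disk_usage(path)
    if usage.free < needed_bytes:
        print("You will probably run out of disk space.")
        print(f"{usage.used} bytes used or committed, "
              f"of {usage.total} originally available")
        return True
    return False
```

If the real message comes from a similar pre-flight check, it would fire against whichever filesystem the script writes to, which is why a full /tmp is a plausible trigger.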

ingest_insarmaps.py /nethome/famelung/insarlab/infiles/famelung/TEMPLATES/KashgarSenAT129.template 

*************** Template Options ****************
Custom Template File:  /nethome/famelung/insarlab/infiles/famelung/TEMPLATES/KashgarSenAT129.template
Project Name:  KashgarSenAT129
Work Dir:  /projects/scratch/insarlab/famelung/KashgarSenAT129
1 2 3-->'1 2 3'
39.51 40.25 75.08 77.81-->'39.51 40.25 75.08 77.81'
auto-->auto
None-->None
generate template file: /projects/scratch/insarlab/famelung/KashgarSenAT129/KashgarSenAT129.template
1 2 3-->'1 2 3'
39.51 40.25 75.08 77.81-->'39.51 40.25 75.08 77.81'
20190807:045616 * ingest_insarmaps.py /nethome/famelung/insarlab/infiles/famelung/TEMPLATES/KashgarSenAT129.template
2019-08-07 04:56:16,809 - INFO - Removing directory: /projects/scratch/insarlab/famelung/KashgarSenAT129/mintpy/JSON
2019-08-07 04:56:16,860 - INFO - hdfeos5_2json_mbtiles.py /projects/scratch/insarlab/famelung/KashgarSenAT129/mintpy/S1_IW123_129_0128_0130_20141014_20190708.he5 /projects/scratch/insarlab/famelung/KashgarSenAT129/mintpy/JSON |& tee out_insarmaps.log
20190807:045616 * hdfeos5_2json_mbtiles.py /projects/scratch/insarlab/famelung/KashgarSenAT129/mintpy/S1_IW123_129_0128_0130_20141014_20190708.he5 /projects/scratch/insarlab/famelung/KashgarSenAT129/mintpy/JSON |& tee out_insarmaps.log
You will probably run out of disk space.
102485608 bytes used or committed, of 112979968 originally available
/nethome/famelung/test/operations/rsmas_insar/sources/MintPy/mintpy/hdfeos5_2json_mbtiles.py:235: DeprecationWarning: time.clock has been deprecated in Python 3.3 and will be removed from Python 3.8: use time.perf_counter or time.process_time instead
  start_time = time.clock()
/nethome/famelung/test/operations/rsmas_insar/sources/MintPy/mintpy/hdfeos5_2json_mbtiles.py:107: FutureWarning: `rcond` parameter will change to the default of machine precision times ``max(M, N)`` where M and N are the input matrix dimensions.
To use the future default and silence this warning we advise to pass `rcond=None`, to keep using the old, explicitly pass `rcond=-1`.
  m, c = np.linalg.lstsq(A, y)[0]
/nethome/famelung/test/operations/rsmas_insar/sources/MintPy/mintpy/hdfeos5_2json_mbtiles.py:307: DeprecationWarning: time.clock has been deprecated in Python 3.3 and will be removed from Python 3.8: use time.perf_counter or time.process_time instead
  end_time =  time.clock()
reading displacement data from file: /projects/scratch/insarlab/famelung/KashgarSenAT129/mintpy/S1_IW123_129_0128_0130_20141014_20190708.he5 ...
reading mask data from file: /projects/scratch/insarlab/famelung/KashgarSenAT129/mintpy/S1_IW123_129_0128_0130_20141014_20190708.he5 ...
//login3/projects/scratch/insarlab/famelung/KashgarSenAT129[1015] json_mbtiles2insarmaps.py -u insaradmin -p Insar123 --host insarmaps.miami.edu -P rsmastest -U rsmas\@gmail.com --json_folder /projects/scratch/insarlab/famelung/KashgarSenAT129/mintpy/JSON --mbtiles_file /projects/scratch/insarlab/famelung/KashgarSenAT129/mintpy/JSON/S1_IW123_129_0128_0130_20141014_20190708.mbtiles
Uploading json chunks...
Clearing old dataset, if it is there
Creating index on S1_IW123_129_0128_0130_20141014_20190708
Inserted chunk_1.json to db
Inserted chunk_2.json to db
Inserted chunk_3.json to db
[... Inserted chunk_4.json through chunk_80.json to db ...]
Inserted chunk_81.json to db
FAILURE:
Unable to open datasource `/projects/scratch/insarlab/famelung/KashgarSenAT129/mintpy/JSON/S1_IW123_129_0128_0130_20141014_20190708.mbtiles-journal' with the following drivers.
  -> [... same driver list as in the first failure above ...]
Error inserting into the database. This is most often due to running out of Memory (RAM), or incorrect database credentials... quitting
//login3/projects/scratch/insarlab/famelung/KashgarSenAT129[1016] rm -rf mintpy/JSON
stackTom commented 4 years ago

Does this keep occurring? I don't print `You will probably run out of disk space` anywhere in that script, and none of the mintpy libraries we call print that message either. If we could reliably reproduce this error, it would help me debug it.

falkamelung commented 4 years ago

No, I have not seen this in a long time. I will close this.

It would be helpful to have a quick health check that verifies disk space and whatever else could go wrong before ingesting.
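Such a pre-ingest health check could be sketched as below (a hypothetical helper, `health_check`; the paths, threshold, and port are assumptions, and the host defaults to the server used in the logs above):

```python
import shutil
import socket

def health_check(paths=("/tmp", "."), min_free_gb=5, host="insarmaps.miami.edu"):
    """Run quick sanity checks before ingesting: enough free disk space
    on the work and /tmp filesystems, and the insarmaps server reachable.
    Returns a list of problem descriptions (empty if all checks pass)."""
    problems = []
    for p in paths:
        free_gb = shutil.disk_usage(p).free / 1e9
        if free_gb < min_free_gb:
            problems.append(f"{p}: only {free_gb:.1f} GB free")
    if host is not None:
        try:
            # HTTPS reachability is a cheap stand-in for "server is up".
            socket.create_connection((host, 443), timeout=5).close()
        except OSError:
            problems.append(f"{host} unreachable")
    return problems
```

Run before ingest_insarmaps.py, a non-empty result would flag a full /tmp or an unreachable server up front instead of partway through the chunk uploads.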