I am trying to ingest data and am getting the error message below. Any idea what the reason could be? I already rebooted, but it did not help. I am at a meeting right now and was hoping to ingest more data tonight or tomorrow morning, so if you have a chance to look at this, that would be fantastic.
@stackTom
json_mbtiles2insarmaps.py -u insaradmin -p Insar123 --host insarmaps.miami.edu -P rsmastest -U rsmas\@gmail.com --json_folder /projects/scratch/insarlab/famelung/unittestGalapagosSenDT128/mintpy/JSON --mbtiles_file /projects/scratch/insarlab/famelung/unittestGalapagosSenDT128/mintpy/JSON/S1_IW1_128_0595_0597_20160605_XXXXXXXX.mbtiles
Uploading json chunks...
Clearing old dataset, if it is there
Creating index on S1_IW1_128_0595_0597_20160605_XXXXXXXX
Inserted chunk_1.json to db
Inserted chunk_2.json to db
Inserted chunk_3.json to db
Inserted chunk_4.json to db
Inserted chunk_5.json to db
Inserted chunk_6.json to db
Inserted chunk_7.json to db
Inserted chunk_8.json to db
Inserted chunk_9.json to db
Inserted chunk_10.json to db
Inserted chunk_11.json to db
Inserted chunk_12.json to db
Inserted chunk_13.json to db
Inserted chunk_14.json to db
Inserted chunk_15.json to db
Inserted chunk_16.json to db
Inserted chunk_17.json to db
Inserted chunk_18.json to db
Inserted chunk_19.json to db
Inserted chunk_20.json to db
Inserted chunk_21.json to db
Inserted chunk_22.json to db
Uploading mbtiles...
Traceback (most recent call last):
File "/nethome/famelung/test/test1/rsmas_insar/sources/PySAR/pysar/json_mbtiles2insarmaps.py", line 164, in <module>
main()
File "/nethome/famelung/test/test1/rsmas_insar/sources/PySAR/pysar/json_mbtiles2insarmaps.py", line 147, in main
dbContoller.upload_mbtiles(parseArgs.mbtiles_file)
File "/nethome/famelung/test/test1/rsmas_insar/sources/PySAR/pysar/add_attribute_insarmaps.py", line 237, in upload_mbtiles
curl.perform()
pycurl.error: (26, '')
Disk does not seem full
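For reference, pycurl error 26 corresponds to libcurl's CURLE_READ_ERROR, which usually means libcurl could not open or read the local file it was asked to upload. Since the disk is not full, it may be worth ruling out a missing, unreadable, or zero-byte mbtiles file before digging into the server side. A minimal stdlib-only sanity check, using the path from the command above purely as an illustration, might look like:

```python
import os

# Path taken from the failing command above (illustrative only).
mbtiles_file = ("/projects/scratch/insarlab/famelung/unittestGalapagosSenDT128"
                "/mintpy/JSON/S1_IW1_128_0595_0597_20160605_XXXXXXXX.mbtiles")

def check_upload_file(path):
    """Return a short diagnosis of why curl might fail to read `path`."""
    if not os.path.exists(path):
        return "missing"
    if not os.path.isfile(path):
        return "not a regular file"
    if not os.access(path, os.R_OK):
        return "not readable"
    if os.path.getsize(path) == 0:
        return "empty"
    return "ok"

print(check_upload_file(mbtiles_file))
```

If this prints anything other than "ok", the upload would fail with CURLE_READ_ERROR before any data reaches insarmaps.miami.edu; otherwise the problem is more likely in how add_attribute_insarmaps.py configures the curl handle or on the server side.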