Bump
Unsure, does this always happen or just sometimes? Does your OS clean out temp files regularly? Ogre utilizes the temp directory for processing.
My OS is Ubuntu Server 16.04.
It always happens. Here is an example of my code that imports a shapefile into PostgreSQL (it works 100%, except for the annoying error message, which is thrown at /node_modules/deasync/index.js:66:11):
```js
shapefile = ogr2ogr(uploadpath + "/" + uploadfile)
  .format('PostgreSQL')
  .timeout(48000000)
  .options(['-nlt', 'GEOMETRY', '-lco', 'GEOMETRY_NAME=the_geom', '-lco', 'FID=onemapfid', '-nln', uploadtable])
  .project('EPSG:4326')
  .skipfailures()
  .destination('PG:host=' + pgdatabaseConfig.host + ' port=' + pgdatabaseConfig.port + ' user=' + pgdatabaseConfig.user + ' dbname=' + pgdatabaseConfig.database + ' password=' + pgdatabaseConfig.password)
  .exec(function (er, data) {
    if (er) {
      lastErrorMessage = er.message // note: the callback parameter is `er`, not `err`
      console.error(er)
    } else {
      allOk = true
    }
    callback()
  })
```
Hmm, deasync isn't used in this project (or any of its dependencies); is that from some other part of the code?
Well, after ogr2ogr, I sleep for 30 seconds using:
https://github.com/jochemstoel/nodejs-system-sleep
Within the sleep function it throws the error above. I reduced the sleep to only 1 second (just for a test), and there were no more errors. It is as if ogr2ogr cleans up during the 30-second sleep and then finds that the temp file is no longer there (maybe it got deleted while busy in the sleep). Not sure. The error from deasync within the sleep is: ENOENT: no such file or directory, scandir '/tmp/ogr_53c53644160'
Spoke too soon. The messages are still being displayed.
What type of input do you have, a ZIP file?
No, a .shp file. I also tested with various other formats. It works 100%, except for the message above.
I now removed the sleep completely and used setTimeout instead.
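A non-blocking delay of that shape might look like the following (a minimal sketch of the change, not the exact code from the project):

```javascript
// Sketch: replace the blocking system-sleep call with a non-blocking
// Promise-based delay, so the event loop keeps running while waiting.
function delay (ms) {
  return new Promise(function (resolve) {
    setTimeout(resolve, ms)
  })
}

// Wait briefly, then continue with the post-import work.
var t0 = Date.now()
delay(100).then(function () {
  console.log('waited ' + (Date.now() - t0) + ' ms')
})
```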
It gives:
```json
{
  "message": "ENOENT: no such file or directory, scandir '/tmp/ogr_53d33293d87'",
  "stack": "Error: ENOENT: no such file or directory, scandir '/tmp/ogr_53d33293d87'",
  "errno": -2,
  "code": "ENOENT",
  "syscall": "scandir",
  "path": "/tmp/ogr_53d33293d87",
  "level": "error",
  "timestamp": "2018-05-31T15:54:15.325Z"
}
```
Personally, I don't think it has anything to do with the sleep/setTimeout.
I agree. So, to clarify: is the file passed to ogr2ogr(uploadpath + "/" + uploadfile) a .shp file, or a .zip containing the .shp, .dbf, etc.? Trying to pin down how a tmp directory would even need to be used in your case.
Yes, uploadpath + "/" + uploadfile is a .shp file (not a zip file).
OK, I made a small test where I export PostgreSQL data to a shapefile:
```js
shapefile = ogr2ogr('PG:host=xxxxx port=xxxxx user=xxxxx dbname=xxxxx password=xxxxx')
  .format('ESRI Shapefile')
  .timeout(48000000)
  .options(['-lco', 'GEOMETRY_NAME=the_geom', '-sql', mySQLString])
  .project(targetprojection, 'EPSG:4326')
  .skipfailures()
  .destination('/home/onemap/myfile.shp')
  .exec(function (er, data) {
    if (er) {
      console.error(er)
    }
  })
```
Although the file does get exported 100%, it gives an error:
```json
{
  "message": "ENOENT: no such file or directory, scandir '/tmp/ogr_53d33473220'",
  "stack": "Error: ENOENT: no such file or directory, scandir '/tmp/ogr_53d33473220'",
  "errno": -2,
  "code": "ENOENT",
  "syscall": "scandir",
  "path": "/tmp/ogr_53d33473220",
  "level": "error",
  "timestamp": "2018-05-31T16:09:21.872Z"
}
```
```
$ gdalinfo --version
GDAL 2.1.3, released 2017/20/01
```
Hmm, I cannot duplicate it so far. I can't even get a .shp file to work by itself. The underlying ogr2ogr command seems to need a matching .shx and .dbf at a minimum. Do you have a sample that works for you that you would be willing to share?
OK, I cannot duplicate the issue with that file; it seems to work well. Can you get this implementation to fail? (Just trying to test input issues, not output.) If not, maybe try adding in options to see when the failure occurs:
```js
shapefile = ogr2ogr(uploadpath + "/" + uploadfile)
  .format('GeoJSON')
  .exec(function (er, data) {
    if (er) {
      lastErrorMessage = er.message // note: the callback parameter is `er`, not `err`
      console.error(er)
    } else {
      allOk = true
    }
    callback()
  })
```
Managed to trace it to your code (index.js):
```js
function getInFilePath(er, fpath) {
  if (er) return one(er)
  ogr2ogr._inPath = fpath
  ogr2ogr._isZipIn = /zip|kmz/.test(path.extname(fpath)) && !/^\/vsizip\//.test(fpath)
  ogr2ogr._isCsvIn = /csv/.test(path.extname(fpath))
  ogr2ogr._isZipOut = ogr2ogr._driver.output == 'zip'
  ogr2ogr._ogrOutPath = ogr2ogr._isZipOut ? util.genTmpPath() : '/vsistdout/'
```
For SHP, ogr2ogr._driver.output == 'zip'; it should be 'shp'.
Think it is because of: https://github.com/wavded/ogr2ogr/blob/master/modules/drivers.json
```json
,{
  "format": "ESRI Shapefile",
  "aliases": [ "shp", "prj", "shx", "dbf" ],
  "output": "zip"
},
```
I will try with:
```json
,{
  "format": "ESRI Shapefile",
  "aliases": [ "shp", "prj", "shx", "dbf" ],
  "output": "shp"
},
```
On finish, the above code calls wrapUp. In this function ogr2ogr._isZipOut returns true, which is not what I want, because I want a plain SHP file returned, not a zip; I handle the zip myself.
```js
function wrapUp(er) {
  if (er) {
    ostream.emit('error', er)
    return ogr2ogr._clean()
  }
  if (!ogr2ogr._isZipOut) {
    ostream.emit('end')
    ostream.emit('close')
    return ogr2ogr._clean()
  }
  var zs = zip.createZipStream(ogr2ogr._ogrOutPath)
  zs.on('error', function (er2) { ostream.emit('error', er2) })
  zs.on('end', function () { ostream.emit('close'); ogr2ogr._clean() })
  zs.pipe(ostream)
}
```
The line that actually throws the error is in zip.js (the folder does not exist):
```js
exports.createZipStream = function (dpath) {
  var zs = archiver('zip')
  fs.readdir(dpath, function (er, files) {
    if (er) return zs.emit('error', er)
```
I changed drivers.json as follows (for a test):
```json
,{
  "format": "ESRI Shapefile",
  "aliases": [ "shp", "prj", "shx", "dbf" ],
  "output": "shp"
},
```
And the error message is gone...
Ahh OK, yeah, that makes sense. If you reference a shapefile directly, knowing that the relevant .shx and .dbf exist in the same directory, ogr2ogr would work fine but this tool would error. The ZIP-file convention existed to ensure all the required files were bundled together; it was originally used in the Ogre tool, which is how people provided the files (as a single ZIP). Seems we'll need an exception for this case.
Newest versions of GDAL (2.0+) should be able to handle a .shp file by itself, without .dbf, .shx, or .prj files. I've done this with shp2json: https://github.com/substack/shp2json/blob/master/index.js#L61-L140. However, I don't use that library anymore, since compiling GDAL as a node dependency with pre-gyp takes forever.
Perhaps you should try that approach and not assume there will be any complementary files. It's suboptimal, but it would save you the fuss of checking for each file, which would skyrocket the NPath complexity of the method.
@wavded @1Map Using the ogr2ogr CLI, transforming a single .shp with

```
ogr2ogr -f PGDump file.sql file.shp
```

throws an error containing:

```
ERROR 4: Unable to open file.shx or file.SHX.
Try --config SHAPE_RESTORE_SHX true to restore or create it
```

If I run

```
ogr2ogr -f PGDump file.sql file.shp --config SHAPE_RESTORE_SHX YES
```

an .shx file is generated and the file is converted without problems.
I tried the same with this library:

```js
var inStream = ogr2ogr(inputFile)
  .format('GeoJSON')
  .options(["--config", "SHAPE_RESTORE_SHX", "YES"])
  .timeout(options.timeout)
  .stream();
```

and it works.
However, the coordinates are passed to the GeoJSON literally. If you chain .project('EPSG:4326'), it will complain that the source layer has no coordinate system. Bottom line: if the extension is .shp, append ["--config", "SHAPE_RESTORE_SHX", "YES"] to the arguments.
Closing due to age, reopen if issue persists.
I see the following in my error log. Any ideas?