creativesapiens opened this issue 1 year ago
Hi @creativesapiens, thanks for the comprehensive bug report. We've been tracking this issue for a while, with reports in https://github.com/pelias/docker/issues/217 among other places. It appears to us that there's something a bit different about the Who's on First importer that causes it to hit this issue when other importers don't. However, none of the Pelias team has ever been able to reproduce it, so maybe you can help us track it down.
We've also seen the issue where invalid `requestTimeout` values are interpreted as 0ms, leading to timeout errors, though that was a long time ago and with clearly invalid values like `120_000`. However, I tested both `"120000"` and `120000` as timeout values and they both worked fine for me.
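For what it's worth, here is a hypothetical sketch of how a not-quite-numeric value could silently become 0ms. This illustrates the coercion pattern only; `toMillis` is an invented name, not actual Pelias code:

```javascript
// Hypothetical sketch: why an invalid requestTimeout could become 0ms.
// If the config value fails to parse as a number, Number() yields NaN,
// and a naive bitwise coercion silently turns NaN into 0.
function toMillis(value) {
  const n = Number(value);        // Number('120_000') -> NaN
  return n | 0;                   // NaN | 0 -> 0 (silent fallback to 0ms)
}

console.log(toMillis('120000'));  // 120000 (valid string works)
console.log(toMillis(120000));    // 120000 (valid number works)
console.log(toMillis('120_000')); // 0      (invalid value becomes 0ms)
```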
Can you answer a couple of questions for me?

- Why did you add a `requestTimeout` value to your `pelias.json` config in the first place? Was there an example config you found somewhere? If so, we'd really like to be able to update it so we can correct it.
- Are you sure changing the `requestTimeout` value from an integer to a string fixed it? What happens if you change it back or remove that line altogether from `pelias.json`? The default value is indeed the string `"120000"`, so I would expect your config to have no effect.
- Environment details such as the output of `node --version` and `npm ls` would be helpful.

Thanks!
I think the issue is here: https://github.com/pelias/dbclient/blob/master/src/configValidation.js#L34. If there is an error, it logs that the index doesn't exist, and the underlying error is silenced and never printed.
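As an illustration of that failure mode, here is a minimal hypothetical sketch; `checkIndex` and `existsCheck` are invented names, not the actual dbclient code:

```javascript
// Hypothetical sketch of the pattern: a rejected existence check (e.g. a
// timeout) falls into a catch handler that reports a fixed "does not exist"
// message, so the real error is never printed.
function checkIndex(existsCheck, indexName) {
  return existsCheck()
    .then((exists) => {
      if (exists) return 'ok';
      return `ERROR: index ${indexName} does not exist`;
    })
    .catch((err) => {
      // BUG: err may be a timeout, but it is dropped here
      return `ERROR: index ${indexName} does not exist`;
    });
}

// A timeout is misreported as a missing index:
checkIndex(() => Promise.reject(new Error('Request Timeout after 0ms')), 'pelias')
  .then((msg) => console.log(msg));  // ERROR: index pelias does not exist
```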
I'm also encountering this problem, and I'm using the default config timeout.
"esclient": {
"apiVersion": "7.x",
"keepAlive": true,
"requestTimeout": "120000",
Here's my full config:
{
  "esclient": {
    "apiVersion": "7.x",
    "keepAlive": true,
    "requestTimeout": "120000",
    "hosts": [
      {
        "env": "development",
        "protocol": "https",
        "host": "AWS.us-west-2.es.amazonaws.com",
        "port": 443,
        "auth": "negatron"
      }
    ],
    "log": [
      {
        "type": "stdio",
        "json": false,
        "level": [
          "error",
          "warning"
        ]
      }
    ]
  },
  "elasticsearch": {
    "settings": {
      "index": {
        "number_of_replicas": "0",
        "number_of_shards": "5",
        "refresh_interval": "1m"
      }
    }
  },
  "interpolation": {
    "client": {
      "adapter": "null"
    }
  },
  "dbclient": {
    "statFrequency": 10000,
    "batchSize": 500
  },
  "api": {
    "accessLog": "common",
    "host": "http://pelias",
    "indexName": "pelias",
    "version": "1.0",
    "targets": {
      "auto_discover": true,
      "canonical_sources": [
        "whosonfirst",
        "openstreetmap",
        "openaddresses",
        "geonames"
      ],
      "layers_by_source": {
        "openstreetmap": [
          "address",
          "venue",
          "street"
        ],
        "openaddresses": [
          "address"
        ],
        "geonames": [
          "country",
          "macroregion",
          "region",
          "county",
          "localadmin",
          "locality",
          "borough",
          "neighbourhood",
          "venue"
        ],
        "whosonfirst": [
          "continent",
          "empire",
          "country",
          "dependency",
          "macroregion",
          "region",
          "locality",
          "localadmin",
          "macrocounty",
          "county",
          "macrohood",
          "borough",
          "neighbourhood",
          "microhood",
          "disputed",
          "venue",
          "postalcode",
          "ocean",
          "marinearea"
        ]
      },
      "source_aliases": {
        "osm": [
          "openstreetmap"
        ],
        "oa": [
          "openaddresses"
        ],
        "gn": [
          "geonames"
        ],
        "wof": [
          "whosonfirst"
        ]
      },
      "layer_aliases": {
        "coarse": [
          "continent",
          "empire",
          "country",
          "dependency",
          "macroregion",
          "region",
          "locality",
          "localadmin",
          "macrocounty",
          "county",
          "macrohood",
          "borough",
          "neighbourhood",
          "microhood",
          "disputed",
          "postalcode",
          "ocean",
          "marinearea"
        ]
      }
    },
    "port": 3100,
    "attributionURL": "nope",
    "services": {
      "pip": {
        "url": "http://localhost:3102"
      },
      "libpostal": {
        "url": "http://localhost:4400"
      },
      "placeholder": {
        "url": "http://localhost:3000"
      }
    }
  },
  "schema": {
    "indexName": "pelias"
  },
  "logger": {
    "level": "debug",
    "timestamp": true,
    "colorize": true
  },
  "acceptance-tests": {
    "endpoints": {
      "local": "http://localhost:3100/v1/"
    }
  },
  "imports": {
    "adminLookup": {
      "enabled": true,
      "maxConcurrentRequests": 100,
      "usePostalCities": true
    },
    "blacklist": {
      "files": []
    },
    "csv": {},
    "geonames": {
      "datapath": "/data/pelias/geonames",
      "countryCode": "US"
    },
    "openstreetmap": {
      "datapath": "/data/pelias/openstreetmap",
      "leveldbpath": "/tmp",
      "import": [
        {
          "filename": "extract.osm.pbf"
        }
      ]
    },
    "openaddresses": {
      "datapath": "/mnt/pelias/openaddresses",
      "token": "oa.bbbcf5787bb4251445883cc417f811ba02b9fd64809fd56c5a972171fbcfb2f6",
      "files": []
    },
    "polyline": {
      "datapath": "/data/pelias/polyline",
      "files": [
        "north-america-valhalla.polylines.0sv"
      ]
    },
    "whosonfirst": {
      "datapath": "/data/pelias/whosonfirst",
      "importPostalcodes": true,
      "countryCode": "US"
    }
  }
}
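One quick way to rule out a malformed file is to parse it directly with Node. A minimal sketch, inlining the relevant fragment rather than reading the real file:

```javascript
// Sketch: confirm the esclient fragment parses and check the timeout's type.
const raw = `{
  "esclient": {
    "apiVersion": "7.x",
    "keepAlive": true,
    "requestTimeout": "120000"
  }
}`;

const config = JSON.parse(raw);  // throws SyntaxError if the JSON is malformed
const timeout = config.esclient.requestTimeout;
console.log(typeof timeout, timeout);  // string 120000
```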
If the root cause is a timeout (hard to know with the current logging, until https://github.com/pelias/dbclient/pull/129 is rolled out to the various client libraries), you can increase the timeout.
From https://github.com/pelias/docker/issues/217#issuecomment-1310547892
After having the planet-sized import fail a couple dozen times with the default 2-minute timeout, I specified a timeout of 10 minutes and was able to complete the import on the first try.
pelias config:
{
  "esclient": {
    "requestTimeout": "600000",
    ...
  },
  ...
}
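For reference, `requestTimeout` is given in milliseconds, so the ten-minute value above and the two-minute default work out as:

```javascript
// requestTimeout is in milliseconds.
const tenMinutesMs = 10 * 60 * 1000;  // the increased value
const twoMinutesMs = 2 * 60 * 1000;   // the default "120000"

console.log(tenMinutesMs);  // 600000
console.log(twoMinutesMs);  // 120000
```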
Describe the bug
The Pelias whosonfirst importer reports the following error when importing with `npm run start` with an improper timeout setting in `pelias.json`. The reported error claims the index does not exist, whereas one can clearly see that the index does exist.
The configuration was done as:
Steps to Reproduce
1. Set an improper `requestTimeout` value in `pelias.json`
2. Run `npm run start`
Expected behavior
A message should be presented indicating that this was a timeout issue, or perhaps an issue with the JSON file.
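A sketch of what such a message could look like; `reportIndexError` is a hypothetical helper, not Pelias code:

```javascript
// Hypothetical error reporter that distinguishes a timeout from a
// genuinely missing index instead of collapsing both into one message.
function reportIndexError(err, indexName) {
  if (err && /timeout/i.test(err.message)) {
    return `ERROR: request to Elasticsearch timed out (${err.message}); ` +
           'check esclient.requestTimeout in pelias.json';
  }
  return `ERROR: index ${indexName} does not exist`;
}

console.log(reportIndexError(new Error('Request Timeout after 0ms'), 'pelias'));
```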
Environment (please complete the following information):
Pastebin/Screenshots
Additional context
Complete command run with stack trace was given as:
References
What fixed it?
Having a proper `pelias.json` configuration with the timeout fixed it.

Note: `requestTimeout` was changed to a string with a value of `"120000"`.