Closed lazaruslarue closed 10 years ago
Hi @lazaruslarue, based on your issue and without having run it myself yet, I think you're DoS-ing your Neo4j instance. I suggest you use a batch query for these types of things.
The library itself has no queuing built in, so all calls you make go straight to Neo4j, which probably just can't keep up with it.
If it's not related to the number of connections, please let me know. I'll look into it then after my exams, which will end by the 27.1.
Cheers, Phil
thanks for the quick response, @philippkueng. i think the DoS theory is probably correct. for the time being, i've worked around the issue by creating a cypher query for each entry. this seems to do the trick. if i learn anything else about the issue i'll let you know. good luck with exams :)
Hi @lazaruslarue,
I still couldn't reproduce your scenario. However, going with the DoS theory, I think Neo4j prefers you to use a batch query
for those kinds of tasks: http://docs.neo4j.org/chunked/stable/rest-api-batch-ops.html
Let me know in case this still bugs you. I'll close the issue in the meantime (spring-cleaning).
Best, Phil
Trying to batch insert a lot of data with versions of this (a), or a timeout-wrapped version (b):
My file has something like 7000 lines, and the insert returns the error below. It works fine when I include only 50 or so lines of the file.
File format is like this: