Closed. jMyles closed this issue 7 years ago.
Hey @jMyles - I think this was fixed in #34 for the Python 3.5 branch. It wasn't backported to the Python 2 master branch, mostly because I think that'll just be deprecated here shortly.
Ahh, @bmuller, indeed you are correct.
We're still left hoping for maintenance of the twisted branch, as we have decided to use Twisted in the codebase in question.
Thanks as always! Hope you are well.
K - I'll try to keep that in mind for future bug fixes.
I've been hunting the cause of this error:
I think I've found it, but what I'm looking at doesn't make much sense.
`set_digest` calls `await spider.find()`, calling the result `nodes`. This usually works just fine, because `nodes` ends up being a list, which is used to emit a log on the following line: `self.log.info("setting '%s' on %s" % (dkey.hex(), list(map(str, nodes))))`
However, there is at least one execution path where `nodes` is a coroutine instead of a list. `set_digest` calls `NodeSpiderCrawl.find`, which calls `SpiderCrawl._find`, which calls `NodeSpiderCrawl._nodesFound`. You can see from the conclusion of this method that, if we haven't contacted all of `self.nearest`, then we return the `find()` method (a coroutine) instead of the list: https://github.com/bmuller/kademlia/blob/master/kademlia/crawling.py#L142
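
If I'm reading it right, the shape of the problem is roughly this (a boiled-down, self-contained sketch, not the actual crawler code; `ToySpider`, its method names, and the round counter are made up purely to show the un-awaited recursion leaking out to the caller):

```python
import asyncio

# Boiled-down sketch of the pattern (not the real kademlia code): a crawl
# coroutine that sometimes returns a plain list and sometimes returns the
# result of self.find() *without* awaiting it.
class ToySpider:
    def __init__(self):
        self.rounds = 0

    async def find(self):
        return await self._nodes_found()

    async def _nodes_found(self):
        self.rounds += 1
        if self.rounds >= 2:
            return ["node-a", "node-b"]   # the list the caller expects
        return self.find()                # a coroutine object leaks out instead

async def main():
    nodes = await ToySpider().find()
    print(type(nodes))   # <class 'coroutine'>, not <class 'list'>
    # (Python will also warn that the inner find() coroutine was never awaited.)

asyncio.run(main())
```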
`bootstrap` seems to handle this scenario, but `set_digest` does not. Am I doing something wrong?
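
In case it's useful, a caller-side guard along these lines seems like it would paper over it (untested sketch; `find_all` is just an illustrative name, not something in the repo):

```python
import asyncio

# Illustrative workaround (not from the repo): keep awaiting until the crawl
# actually hands back something that isn't a coroutine.
async def find_all(spider):
    result = await spider.find()
    while asyncio.iscoroutine(result):
        result = await result
    return result
```

The idea being that `set_digest` could do `nodes = await find_all(spider)` instead of awaiting `spider.find()` directly, but presumably the real fix belongs inside the crawler itself.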