**Closed** — jhs closed this issue 11 years ago
I've been able to replicate this within the tests for my sharedconfig library. In my case, I can recreate the problem by stopping the feed in test suite A; mocha then fires off suite suite B, which has setup code for the same database that was being monitored in A. You can see this pretty simply by forking sharedconfig and running `npm test` (each test runs fine in isolation, but together they fail).
I implemented a workaround by checking for `this.dead` in the `got_activity` handler:
https://github.com/DamonOehlman/follow/compare/patch-wait-timer-not-found
Not sure if this is the right way to handle things, but it definitely prevented the error from recurring.
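For context, here is a minimal stand-alone sketch of what that kind of guard looks like. This is a hypothetical reconstruction, not the exact patch: `got_activity`, `pending.wait_timer`, and `dead` mirror follow's internals, so treat the names as illustrative.

```javascript
// Hypothetical sketch of the this.dead guard (names mirror follow's
// internals; this is illustrative, not the merged patch itself).
function Feed() {
  this.dead = false;
  this.pending = { wait_timer: null };
}

Feed.prototype.got_activity = function() {
  // If the feed was stopped, the wait timer has already been cleared;
  // bail out instead of throwing "Cannot find wait timer".
  if (this.dead)
    return;

  if (this.pending.wait_timer === null)
    throw new Error('Cannot find wait timer');

  clearTimeout(this.pending.wait_timer);
  this.pending.wait_timer = null;
};

Feed.prototype.stop = function() {
  this.dead = true;
  if (this.pending.wait_timer !== null) {
    clearTimeout(this.pending.wait_timer);
    this.pending.wait_timer = null;
  }
};
```

The point of the guard is that activity arriving after `stop()` is simply ignored, rather than tripping over the already-cleared timer.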
Good catch! When you recreate the problem, you stop the feed in test A. Do you mean `.pause()` or `.stop()`? Thanks.
I'm using the patch from @DamonOehlman. It successfully prevents my `Cannot find wait timer` errors, too. What's stopping this from getting merged into `master`?
Very sorry! I thought I merged this months ago. Merged.
thanks!
I'm getting this now while piping the npm registry data into Elasticsearch. Here's a reduced example:
```js
// Some docs we index, some docs we don't.
// If you pause, then asynchronously resume, and then later pause and
// *immediately* resume, it gets pretty upset.
var follow = require('follow');

var immediate = false;
follow({
  db: 'http://isaacs.iriscouch.com/registry'
}, function(err, change) {
  if (err)
    throw err;

  var that = this;
  this.pause();
  console.log(change.id);

  // decide not to do anything with this one, I guess
  if (immediate)
    that.resume();
  else // write into elasticsearch index
    setTimeout(function() {
      that.resume();
    });

  immediate = !immediate;
});
```
I've worked around it in npm2es by only pausing if it's going to be async, but this does seem like a bug in follow.
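That workaround boils down to pausing only when the handler will actually do asynchronous work, so `pause()` is never followed by a `resume()` in the same tick. A rough sketch of the idea (`shouldIndex` and `indexDoc` are hypothetical stand-ins, and the handler is factored out so it can be exercised without a live feed):

```javascript
// Sketch of the "only pause if async" workaround. shouldIndex and
// indexDoc are hypothetical stand-ins for the real filtering and
// Elasticsearch-write logic.
function makeChangeHandler(shouldIndex, indexDoc) {
  return function onChange(feed, change) {
    if (!shouldIndex(change))
      return; // synchronous skip: never pause, so no same-tick resume

    feed.pause();
    indexDoc(change, function(err) {
      if (err)
        throw err;
      feed.resume(); // resume only after genuinely async work finishes
    });
  };
}
```

With follow you'd wire it up as something like `follow(opts, function(err, change) { handler(this, change); })`, so the feed only ever sees a pause/resume pair that spans a real async gap.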
(This is a note to self; it may be opaque to others.)
Figured out how to reproduce this.
Start the distributor from scratch and set the checkpoint high, like 100k or 250k. After about 3 million docs, it will hit this pretty consistently.