airportyh / fireworm

A crawling file watcher.

All events are broadcasted twice #7

Open · JsonHunt opened 9 years ago

JsonHunt commented 9 years ago

Adding, removing, or changing any file matching the glob triggers two events of the respective kind.

airportyh commented 9 years ago

Not able to reproduce with my basic manual tests. Could you give more detail? Perhaps a gist or a repo that reproduces it.

JsonHunt commented 9 years ago

I'm running it on Windows... I'll try to get more details to you when I have a moment

hexparrot commented 8 years ago

Here's a gist that reproduces this issue:

https://gist.github.com/hexparrot/fc431960d08749d9061b

Vanuan commented 8 years ago

The bug is somewhere here: https://github.com/airportyh/fireworm/blob/master/lib/dir.js#L57

It's possible that there's some sort of race condition. Do we need a mutex here?

Vanuan commented 8 years ago

Yeah, so if entryNames contains duplicates, fs.stat gets called multiple times for the same entry, resulting in multiple emitted events.

Vanuan commented 8 years ago

I.e., you'd need to use the sync version of fs.stat to avoid the bug.
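
For illustration, a minimal sketch of that approach (entryNames is from the thread above; the seen list and the console.log standing in for the emit are simplifications, not fireworm's actual internals). With fs.statSync, the duplicate check, the push, and the stat all happen in one synchronous block, so a second occurrence of the same entry can never slip past the check:

var fs = require('fs');

var entryNames = [__filename, __filename]; // duplicate entries, as in the bug
var seen = [];

entryNames.forEach(function(name) {
  if (seen.indexOf(name) !== -1) return; // already handled: skip the duplicate
  seen.push(name);                       // mark as seen before doing any work
  var stat = fs.statSync(name);          // synchronous: no window for interleaving
  console.log('emit change:', name, stat.mtime);
});

This prints the emit line once, even though the entry appears twice.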

Vanuan commented 8 years ago

Here's some simpler code that demonstrates the problem.

// Polyfill for Array.prototype.includes (not available in older Node versions)
Array.prototype.includes = function(el) {
  return this.indexOf(el) !== -1;
};

In synchronous mode, nothing unusual happens:

var seen = [];

['file1', 'file1'].forEach(function(filename) {
  if (!seen.includes(filename)) {
    console.log(filename);
    seen.push(filename); // check and push happen in the same tick, so the duplicate is caught
  }
});

file1 is printed once, as expected. But when we add asynchrony, file1 gets printed twice:

var seen = [];

['file1', 'file1'].forEach(function(filename) {
  if (!seen.includes(filename)) { // both iterations pass this check before either callback runs
    setTimeout(function() {
      console.log(filename);
      seen.push(filename); // the push happens too late to stop the second iteration
    }, 1000);
  }
});
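
Both iterations pass the includes check before either timeout callback has pushed to seen. A minimal sketch of the fix: move the push before the asynchronous part, so the check and the mark happen in the same tick.

var seen = [];

['file1', 'file1'].forEach(function(filename) {
  if (!seen.includes(filename)) {
    seen.push(filename); // record synchronously, before scheduling the callback
    setTimeout(function() {
      console.log(filename); // now prints file1 only once
    }, 1000);
  }
});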

Vanuan commented 8 years ago

Ok, noticed this: https://github.com/airportyh/fireworm/blob/master/lib/dir.js#L110

So the bug isn't there. The bug is that there's no such check for directories, so a directory can be added twice; each copy then watches the same files, resulting in duplicate events.
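
For illustration, a self-contained sketch of the missing guard (Dir, addDir, and this.dirs are hypothetical names, not fireworm's actual internals): skip a subdirectory that is already tracked, mirroring the existing check for files.

function Dir(name) {
  this.name = name;
  this.dirs = {}; // subdirectories keyed by name
}

Dir.prototype.addDir = function(name) {
  if (this.dirs[name]) return this.dirs[name]; // already tracked: don't add twice
  var dir = new Dir(name);
  this.dirs[name] = dir;
  return dir;
};

var root = new Dir('.');
root.addDir('sub');
root.addDir('sub'); // second add is a no-op: one watcher, one event per change
console.log(Object.keys(root.dirs)); // [ 'sub' ]

With a guard like this in place, a directory that shows up twice during crawling is only watched once, so its files only emit once.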