isaacs / node-graceful-fs

fs with incremental backoff on EMFILE
ISC License

"(libuv) Failed to create kqueue (24)" with fs.ReadFile() on 20,000+ files #24

Closed · christianbundy closed this issue 10 years ago

christianbundy commented 10 years ago

I'm sure this sounds ridiculous, but I'm opening, parsing, and closing 20,000 files asynchronously and having issues when my main() loop is slow. In the example below, I'm writing each of the files to MongoDB.

var fs = require('graceful-fs'); // Queues fs.readFile()
var dive = require('dive');      // Recursive dir walking
var databaseUrl = "127.0.0.1:3002/meteor";
var collections = ["Erowid"];
var db = require("mongojs").connect(databaseUrl, collections);

var main = function(e) {
    // Insert the parsed document into MongoDB
    db.Erowid.insert(e);
};

// Walk ./json recursively; the callback runs once per file found
dive(__dirname + '/json', { all: false }, function (err, file) {
    fs.readFile(file, function (err, contents) {
        if (err) throw err;
        var value = main(JSON.parse(contents));
        if (typeof value === 'object') {
            // Write the (possibly transformed) document back to disk
            fs.writeFile(file, JSON.stringify(value, null, 1) + "\r\n", function (err) {
                if (err) throw err;
            });
        }
    });
});

Should I be doing this differently, or is the kqueue error a bug? Thanks!
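For context, here is a minimal sketch of one way to do this differently by capping how many files are in flight at once. The LIMIT value and the queue are my own illustration, not code from this thread:

var fs = require('graceful-fs');
var dive = require('dive');

var LIMIT = 64;     // max files processed concurrently (arbitrary choice)
var queue = [];     // file paths waiting to be read
var inFlight = 0;

function processNext() {
    if (inFlight >= LIMIT || queue.length === 0) return;
    inFlight++;
    fs.readFile(queue.shift(), function (err, contents) {
        inFlight--;
        if (err) throw err;
        main(JSON.parse(contents));   // same main() as above
        processNext();                // a slot freed up, start the next waiting file
    });
}

dive(__dirname + '/json', { all: false }, function (err, file) {
    if (err) throw err;
    queue.push(file);   // queue instead of reading immediately
    processNext();
});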

isaacs commented 10 years ago

The kqueue issue does sound like a Node.js bug (or at least a limitation).

Have you tried just increasing your ulimit rather than using graceful-fs?
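For reference, a rough sketch of raising the descriptor limit from inside the process, assuming the third-party posix module from npm. The module and the limit value are assumptions on my part, not something suggested in the thread:

var posix = require('posix');   // npm install posix (native addon)

// 'nofile' is the open-file-descriptor limit; the soft limit must stay <= the hard limit.
console.log('before:', posix.getrlimit('nofile'));
posix.setrlimit('nofile', { soft: 10000 });
console.log('after:', posix.getrlimit('nofile'));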

christianbundy commented 10 years ago

I've increased my ulimit, but it doesn't fix anything. I'll poke around and see if I can figure something else out.

To anyone reading this in the future, my solution was to concatenate my files into a single file with 20,000 lines and read each line individually. Not only does this fix the error, but it's also much faster.
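A rough sketch of that line-per-record approach using Node's readline module; the file name is hypothetical and this is an illustration, not the exact code used:

var fs = require('fs');
var readline = require('readline');

// Assumes one JSON document per line in the concatenated file
var rl = readline.createInterface({
    input: fs.createReadStream(__dirname + '/json/all-records.ndjson')
});

rl.on('line', function (line) {
    if (!line.trim()) return;    // skip blank lines
    main(JSON.parse(line));      // same main() as in the original snippet
});

rl.on('close', function () {
    console.log('done');
});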