expressjs / multer

Node.js middleware for handling `multipart/form-data`.

File upload sometimes hangs - multer, busboy, or something else? #53

Closed filipabramowicz closed 9 years ago

filipabramowicz commented 10 years ago

Hi!

I encountered a strange problem while using multer: sometimes it hangs while uploading a file.

When I test this on my Ubuntu machine it succeeds every time, but when I switch to the development server it hangs quite often.

I compared the behavior in both environments and added some basic printouts for troubleshooting:

app.use(multer({
  dest: './uploads/',
  rename: function (fieldname, filename) {
    return filename.replace(/\W+/g, '-').toLowerCase();
  },
  onFileUploadStart: function (file) {
    process.stderr.write('Uploading file..........');
  },
  onFileUploadComplete: function (file) {
    process.stderr.write('done\n');
  },

...

}));

On the development server, "Uploading file.........." is printed to the console but "done" never shows up, which suggests it could be a multer or busboy issue.

Full code in which I am using multer can be found here: https://github.com/aplikacjespoleczne/WizualizacjaBudzetu

Also, for troubleshooting, I've added similar logging to the multer code (in index.js):

        fileStream.on('data', function(data) {
          console.log('Next part received!');        // ADDED
          if (data) { file.size += data.length; }
          // trigger "file data" event
          if (options.onFileUploadData) { options.onFileUploadData(file, data); }
        });

The difference between Ubuntu and the development server is in the last chunk of data. In both environments "Next part received!" shows up, but on the development server "done" doesn't, which may suggest that the 'finish' event is not emitted.

Could it be multer or busboy? Can you give me some advice on where to look further? What else could I trace?
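For reference, this is the kind of extra tracing I could add next around the busboy instance inside multer's index.js (the event names come from busboy's documentation; the variable names are just whatever multer uses locally, so treat this as a sketch rather than a drop-in patch):

['finish', 'error', 'partsLimit', 'filesLimit', 'fieldsLimit'].forEach(function (event) {
  busboy.on(event, function () {
    console.log('busboy emitted "' + event + '"');   // ADDED for tracing only
  });
});

fileStream.on('end', function () {
  console.log('file stream ended');                  // ADDED for tracing only
});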

Thanks!

samuelngs commented 10 years ago

Ya. I think I'm having the same problem as well.

yevmoroz commented 10 years ago

+1. The first time I used multer I wasn't using CORS and credentials; now that both are enabled there's a huge problem. Sometimes it goes well, but only right after Node.js is restarted: the first time it uploads small files just fine, but when I try further uploads it hangs, and it looks like a memory overflow.

UPD: it works fine only with very small files. It seems like something in Node.js, and I suspect there may also be memory overflows once it's on a production server. It may depend on the OS; I'm using OS X, for example.

Albert-IV commented 9 years ago

This isn't the best thought-out issue thread, but I am going to chime in here as well.

I've been having an issue where file uploads randomly fail when going through a reverse proxy, depending on the file. Larger files upload fine through the reverse proxy, yet certain troublesome files refuse to be received by the server. The request times out on the client side, and the server never seems to receive anything.

At first I thought it was an nginx issue. If you can upload the file fine without the proxy and it fails when going through the proxy, it must be a reverse proxy issue, right? However, the nginx logs tell a different story.

Looking at the debug logs, the proxy gets the connection and pipes it to the application. On good uploads, nginx sets up the proxy headers and the upstream responds, parsing the headers and handling the buffered file. On bad uploads, nginx sets the proxied headers and the upstream never responds.

So the issue seems to be with the node application itself. During the investigation, I set up multer like so:

app.use(require('multer')({
  limits: {
    fieldNameSize: 999999999,
    fieldSize: 999999999
  },
  includeEmptyFields: true,
  inMemory: true,
  onFileUploadStart: function(file) {
    console.log('Starting ' + file.fieldname);
  },
  onFileUploadData: function(file, data) {
    console.log('Got a chunk of data!');
  },
  onFileUploadComplete: function(file) {
    console.log('Completed file!');
  },
  onParseStart: function() {
    console.log('Starting to parse request!');
  },
  onParseEnd: function(req, next) {
    console.log('Done parsing!');
    next();
  },
  onError: function(e, next) {
    if (e) {
      console.log(e.stack);
    }
    next();
  }
}));

On correct uploads, logs look something like this:

Starting to parse request!
Starting file
Got a chunk of data!
[ ... ]
Got a chunk of data!
Completed file!
Done parsing!

On bad uploads:

Starting to parse request!
Starting file
Got a chunk of data!
[ ... ]
Got a chunk of data!
Completed file!

On these "bad" files, the parser never finishes parsing the request! Not only that, the onError event never gets caught/fired either! I'm completely open to this being a reverse proxy misconfiguration, but shouldn't onError or onParseEnd be fired?

That being said, there isn't an actual reduced use-case here yet. I'm going to work on getting one put together now.

Edit

Realized onParseEnd needed to have next() called. The issue persists after adding it either way.

I also checked the file sizes and they match from client > nginx file > multer temp file.

jpfluger commented 9 years ago

This thread has some good leads; thanks to @filipabramowicz and @droppedonjapan for posting code. I also use an nginx proxy myself, so I can try to duplicate @droppedonjapan's tests if you come up with a particular use-case.

Re @filipabramowicz: I will get around to trying the example you linked to, but maybe something I've been debugging will give you some leads on your problem. Maybe...

In my local tests, I've isolated multer hanging consistently, but within the callbacks specified for options.limits. This might be representative of the errors you are experiencing, in that the issue is with the error callback process.

Currently in Multer:

busboy.on('filesLimit', function() {
  if (options.onFilesLimit) { options.onFilesLimit(); } 
});

So in my tests, I added a call to onFinish after the options callback. This allowed my program to continue instead of hanging:

busboy.on('filesLimit', function() {
  if (options.onFilesLimit) { options.onFilesLimit(); } 
  onFinish(new Error('filesLimit reached: cannot upload more files than specified in options.limits.files'));
});

But onFinish doesn't handle errors. Here's the existing code currently in multer:

var onFinish = function () {
  if (!readFinished || fileCount > 0) return;

  for (var field in req.files) {
    if (req.files[field].length === 1) {
      req.files[field] = req.files[field][0];
    }
  }

  // Parse the body and create a best structure
  req.body = qs.parse(req.body);

  // when done parsing the form, pass the control to the next middleware in stack
  if (options.onParseEnd) { options.onParseEnd(req, next); }
  else { next(); }
};

I redesigned onFinish to exit more cleanly. (Again, this is code I'm working on locally; it is not in multer/develop yet.)

var onFinish = function (err) {
  if( err ) {
    req.files = {};
    req.body = {};

    return next(err);
  } else {
    if( !readFinished || fileCount > 0 ) {
      return;
    }

    for( var field in req.files ) {
      if( req.files[field].length === 1) {
        req.files[field] = req.files[field][0];
      }
    }

    req.body = qs.parse(req.body);
  }

  if (options.onParseEnd) { options.onParseEnd(req, next); }
  else { next(); }
};

In the code above, the error would be passed on to the next function (e.g. an error handler). Also, if there's an error, I explicitly empty req.body and req.files. My reasoning is that if there is an error, the data isn't complete, so the entire transaction should be treated as an error. Right/wrong?

Other thoughts: I'm tempted to either create a custom error or a special object (e.g. req.multerErr), and instead of emptying req.body and req.files, they could get assigned into it:

req.multerErr = {
    files: req.files,
    body: req.body,
    err: err
};

req.files = {};
req.body = {};

// onParseEnd is never reached b/c there's an error
return next(err);
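For what it's worth, a downstream Express error handler could then pick this up. A minimal sketch, assuming the hypothetical req.multerErr above (which is not a real multer property):

// Error-handling middleware registered after multer and the routes.
// The 4-argument signature is what marks it as an error handler for Express.
app.use(function (err, req, res, next) {
  // req.multerErr is the hypothetical object proposed above.
  var partial = req.multerErr ? req.multerErr.files : undefined;
  console.error('Upload failed:', err.message, partial);
  res.statusCode = 400;
  res.end('Upload failed');
});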

Not certain if this code is the best direction but it's better than not having a direction, I guess. Thoughts?

Albert-IV commented 9 years ago

I've worked for a few hours to come up with a minimal example, but I can't seem to get one to fail like I expect. This seems to support the idea that my particular issue might not be multer's fault (which is fairly likely at this point).

That being said, I can back up that the filesLimit error handler isn't firing as expected. Before the patch, the onFilesLimit handler was never called. Once I altered multer, the onFilesLimit error handler works as advertised. :+1:

(There was a missing brace somewhere in the code above, so I'm assuming you wanted it like this:)

var onFinish = function (err) {
  if( err ) {
    req.files = {};
    req.body = {};

    return next(err);
  } else {
    if( !readFinished || fileCount > 0 ) {
      return;
    }

    for( var field in req.files ) {
      if( req.files[field].length === 1) {
        req.files[field] = req.files[field][0];
      }
    }

    req.body = qs.parse(req.body);
  }

  if (options.onParseEnd) { options.onParseEnd(req, next); }
  else { next(); }
};

As far as req.files and req.body go, I'd support the idea that a malformed request should have no data set on the request. Adding a custom error field to the request might be nice for those who want to handle errors further down the line rather than directly in multer's onError handler. I can't see a ton of people needing the feature, but I could see some use-cases depending on the application. That being said, I'm neither a maintainer nor a regular here, so my opinion on this isn't super important!

In any case, thanks for looking into this @jpfluger, I appreciate it. If I do end up finding a use-case that blows up, I'll let you guys know!

jpfluger commented 9 years ago

Thanks for the brace fix; I updated the code. I had stripped away some of my debugging helpers before I originally posted. Once busboy gets updated with a fix I PR'ed, I'll put my proposed error-handling changes in multer/develop and clone @filipabramowicz's example for debugging.

prabhash1785 commented 9 years ago

Is this issue resolved? I am using multer to upload a file and am facing the same problem discussed in this thread. I see the file gets uploaded, but control never passes to the next middleware.

katachthonios commented 9 years ago

I'm actually having the same issue this week. I notice that onFinish is still the same and hasn't been redesigned; I don't know if this is because the onFinish logic isn't the culprit or... Anyway, I'll keep plugging away at it as well and post back here if I get anywhere good. I'll also try updating busboy once your pull request has been integrated.

Also figured I'd note that I am using CORS.

hacksparrow commented 9 years ago

Can you share some of the files which fail to upload?

katachthonios commented 9 years ago

It's not consistent (i.e. a file will work fine multiple times and then fail, or vice versa), but let me see if I can find one that fairly consistently causes a hang and post that.

Also noteworthy (perhaps): I'm currently running node from the console, and when the files fail to upload, hitting Ctrl+C doesn't actually end node. Instead, processing takes off again and the file causing the issue, as well as subsequent files, upload with no problems until everything hangs again. Changing the onFinish code as above changes this behavior, however: instead of Ctrl+C clearing the issue (without restarting node), everything remains hung, and a second (or third; some multiple, never the first) Ctrl+C kills node and drops back to the command line.

The final thing to note is that I'm using the latest binary of node and everything is on Windows Server 2012 R2, but I was having the same issue on my personal PC, which is Windows 7 Home.

prabhash1785 commented 9 years ago

I am running Node v.0.10.32 on Mac OS X 10.9.4.

In my experience, my file consistently gets uploaded. However, while the server is hung, multer runs twice (it retries) and uploads the file twice, and the request finally times out showing "No data received".

I am able to terminate my Node process with Ctrl+C on the console; no problem there.

I tried uploading everything from small txt files (2 KB) to larger files (a few MB), and I was able to upload both.

Here is my application code snippet:

index.js (the root-level index.js, at the same level as package.json):

...
.....
app = module.exports = express();
app.use(kraken(options));

app.on('start', function () {
    console.log('Application ready to serve requests.');
    console.log('Environment: %s', app.kraken.get('env:env'));
});

app.use(multer({
 dest: '/Users/abc/temp/fileuploads',
 limits: {
    fieldNameSize: 500,
    files: 2,
    fields: 5
 },
 rename: function (fieldname, filename) {
    return fieldname + filename + Date.now();
 },
 onFileUploadStart: function (file) {
    console.log('Upload starting for filename: ' + file.originalname);
 },
 onFileUploadData: function (file, data) {
    console.log(data.length + ' of ' + file.fieldname + ' arrived')
 },
 onParseStart: function () {
    console.log('Form parsing started at: ', new Date())
 },
 onParseEnd: function (req, next) {
    console.log('Form parsing completed at: ', new Date());
    next();
 },
 onFileUploadComplete: function (file) {
    console.log(file.fieldname + ' uploaded to  ' + file.path);
 },
 onFileSizeLimit: function (file) {
    console.log('Failed: ', file.originalname)
    fs.unlink('./' + file.path) // delete the partially written file
 },
 onFilesLimit: function () {
    console.log('Crossed file limit!')
 },
 onFieldsLimit: function () {
    console.log('Crossed fields limit!')
 },
 onPartsLimit: function () {
    console.log('Crossed parts limit!')
 },
 onError: function(error, next) {
    console.log("Error occurred while uploading the file!!");
    next(error);
 }
 }));

index.js (under controllers directory of Kraken application)

router.post('/upload', function (req, res) {

        console.log("File Uploaded");

        model.status = "File Uploaded!!";

        var body = req.body;
        console.log("File attributes: " + JSON.stringify(body));

        var files = req.files;
        console.log("Files: " + JSON.stringify(files));

        res.render('uploadfile/datauploadform', model);

    });

Here are the application logs after I start uploading the file (notice multer tried to upload the same file twice before timing out):

Form parsing started at: Sat Jan 10 2015 01:29:45 GMT-0800 (PST)
Upload starting for filename: HelloNashorn.js
35 of file arrived
file uploaded to /Users/abc/temp/fileuploads/fileHelloNashorn1420882185450.js
Form parsing completed at: Sat Jan 10 2015 01:29:45 GMT-0800 (PST)

Form parsing started at: Sat Jan 10 2015 01:31:45 GMT-0800 (PST)
Upload starting for filename: HelloNashorn.js
35 of file arrived
file uploaded to /Users/abc/temp/fileuploads/fileHelloNashorn1420882305450.js
Form parsing completed at: Sat Jan 10 2015 01:31:45 GMT-0800 (PST)

hacksparrow commented 9 years ago

It would be hard to debug if other frameworks are involved. Can you guys write a short test which fails?

katachthonios commented 9 years ago

Most definitely, I'll do that later today and post back.

hacksparrow commented 9 years ago

Great!

prabhash1785 commented 9 years ago

I tried Multer with plain Express and was able to upload files without any issues. So it looks like Multer isn't working with the Kraken framework.

Also, I notice that when I upload a *.txt file, the uploaded file has weird characters at the end of every line, like this: ^M

application.versionMinor=${v4.resource-externalize.minorVersion}^M
application.name=${rpm.name}^M

Any idea why it appends these characters? Shouldn't the file be intact after upload?

katachthonios commented 9 years ago

I'm still building up my test app, since I can't replicate the condition in a very basic implementation; I'm starting from a clean slate. At a very basic level, multer works perfectly, i.e. I can't replicate the behavior with just a basic upload, rename, etc. I did notice while poking around (though this is not a condition present in the code that was experiencing the issues) that if I leave a path where it's possible res.end() is never called, the resulting hang looks like the hang I described previously. Just mentioning it in case that helps anyone.

e.g.

app.post('/upload', function (req, res) {
        if (somePossiblyFalseBool) {
            res.end('File uploaded.');
        }
 });
katachthonios commented 9 years ago

So I stripped down the app I had; some of the code may seem a little odd because of debugging, huge chunks of missing functionality, and sloppiness, but this hung for me. Also, it took about 20 files, then eating dinner, then coming back, then 5 more files or so, then watching the first half of The Wolf of Wall Street, then coming back to it, and at that point it was hung. I was running node in Windows PowerShell, but I've used the command line as well (just running it like this: PS C:\nodetestingsimple> node server.js). Maybe it's just something dumb in my code or "image type" junk. Oh yeah, and fresh npm installs of multer, express and fs, just in case I had munged something up.

index.html:

<!DOCTYPE html>
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
    <title></title>
</head>
<body>
    <h1> Upload</h1>

    <form method='post' action='/upload' enctype='multipart/form-data'>

        <input type='file' name='image' />
        <input type='submit' />
    </form>

</body>
</html>

and the node script (server.js):

/// <reference path="node_modules/multer/index.js" />
/*Define dependencies.*/

var express = require("express");
var multer = require('multer');
var app = express();
var done = false;
var fs = require('fs');

var fileType = "";
var done = false;

var fileTypeEnum = {
    AUDIO: "audio",
    VIDEO: "video",
    IMAGE: "image",
    DOCUMENT: "document"
};

uploadsuccess = function(fileR) {
    console.log('uploadsuccess');

};

/*Configure the multer.*/

app.use(multer({
    ///////////////////////////////////////
    //dest
    dest: './uploads/',

    ///////////////////////////////////////
    //rename
    rename: function (fieldname, filename) {
        console.log('rename');
        return filename + Date.now();     
    },

    ///////////////////////////////////////
    // Include empty fields
    includeEmptyFields: true,

    onError: function (error, next) {
        console.log(error);
        next(error);
    },

        ///////////////////////////////////////
    //Start upload
    onFileUploadStart: function (file, req, res) {
        console.log("start");
        var propValue;
        for (var propName in req) {
            propValue = req[propName];
            console.log(propName, propValue);
        }

        console.log(file.originalname + ' is starting ... onFileUploadStart');
        fileType = "";
        done = false;

        if(file.mimetype == 'image/png' || file.mimetype == 'image/jpg' || file.mimetype == 'image/jpeg' || file.mimetype == 'image/x-tiff'
            || file.mimetype == 'image/tiff' || file.mimetype == 'image/gif' || file.mimetype == 'image/svg+xml') {

            fileType = fileTypeEnum.IMAGE;
        }

        else if(file.mimetype == 'application/x-troff-msvideo'
            || file.mimetype == 'video/avi' || file.mimetype == 'video/msvideo' || file.mimetype == 'video/x-msvideo' || file.mimetype == 'video/x-ms-wmv'
            || file.mimetype == 'video/vnd.uvvu.mp4' ||   file.mimetype == 'video/mp4' || file.mimetype == 'application/mp4'
            || file.mimetype == 'video/mpeg' ||  file.mimetype == 'video/quicktime' ) {
            fileType = fileTypeEnum.VIDEO;
        }

        else if(file.mimetype == 'application/pdf') {
            fileType = fileTypeEnum.DOCUMENT;
        }

        else if (file.mimetype == 'audio/mpeg' || file.mimetype == 'audio/x-mpeg' || file.mimetype == 'audio/mpeg3'
            || file.mimetype == 'audio/x-mpeg-3' || file.mimetype == 'audio/mp4') {
            fileType = fileTypeEnum.AUDIO;
        } else {
            done = true;
            return false;

        }

        console.log(fileType);
        return true;

    },

    ///////////////////////////////////////
    //Upload Complete
    onFileUploadComplete: uploadsuccess,

    ///////////////////////////////////////
    //Parse End
    onParseEnd: function(req, next) {

        console.log("Parse End");

        if (!(req.files.file === undefined)) {

            var resultsObj = { Verified: true, PostID: '1111-1111-22222222-2222' };

                var uploadFilePath = './uploads/' + resultsObj.PostID;

                console.log(req.files);

                fs.rename(req.files.file.path, uploadFilePath, function (er) {

                });

            next();
        }

        next();

    }
}));

/*Handling routes.*/

    app.get('/', function (req, res) {
        res.sendFile(__dirname + "/index.html");
    });

    app.post('/upload', function (req, res) {

        if (done == true) {
            res.end("File uploaded.");
        } else {
            res.end("File uploaded.");
        }
    });

/*Run the server.*/
    app.listen(4000, function () {
        console.log("Working on port 4000");
    });
katachthonios commented 9 years ago

This is the file that wouldn't go (it's fine now; I don't think the file is the problem, since on a retest it works fine):

inspire-by-ashleyrose

katachthonios commented 9 years ago

I can definitely reproduce (at apparently random intervals) using the code above; the time it takes to get a reproduction is variable, and the file used seems to have no bearing on the situation. Ctrl+C does clear the hang, which makes me wonder if there's something synchronous going on somewhere that is waiting on the OS (Windows in this case), which in turn occasionally doesn't return the expected signal. I added console logging in onParseStart just to see if it was even getting there when the hang condition occurs, but to no avail.

onParseStart: function () {
        console.log('Form parsing started at: ', new Date())
    },

Additionally, during the hang condition I can still telnet raw to port 4000 and it answers.

katachthonios commented 9 years ago

I set up an Ubuntu box running the exact same code and I'm not getting a hang there (yet, so at least the frequency is lower). I'm even more suspicious that this is an OS issue; I'll report back. If anyone has anything, let me know.

katachthonios commented 9 years ago

Update: no issues with multer this past week after moving off of a Windows machine to a Fedora 21 box for production. I still don't know why it didn't work on Windows... firewall? A problem with node on Windows? No idea.

pierre-b commented 9 years ago

Hi guys, I'm experiencing the same issue: the upload sometimes gets stuck at the last chunk of data with some photos.

The connection is still open though, and gets closed when I restart nginx. Nothing in the logs.

The stack is:

I found that when I cleared the cookies it worked again... but I haven't managed to reproduce that a second time yet.

Thank you for your help, I definitely don't know how to debug this...

prabhash1785 commented 9 years ago

If you are attaching a listener to the onParseEnd event, make sure you call next() in your listener. This was one of the causes of multer hanging for me. Once I added next() to my onParseEnd listener, it worked.
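A minimal sketch of what I mean, using the same multer 0.x options API shown elsewhere in this thread:

app.use(multer({
  dest: './uploads/',
  onParseEnd: function (req, next) {
    console.log('Done parsing!');
    next(); // without this call, the request never reaches the route handler
  }
}));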

grimor commented 9 years ago

I'm having similar problems. I tested it with 2 file upload inputs. These are my multer events:

app.use(multer({
        dest:'./public/uploads/test',
        rename: function (fieldname, filename) {
            return filename.replace(/\W+/g, '-').toLowerCase() + Date.now()
        },
        onFileUploadStart: function(file) {
            console.log('Starting ' + file.name);
            // console.log(file);
        },
        onFileUploadData: function(file, data) {
            // console.log('Got a chunk of data!');
            // console.log(file.path);
        },
        onFileUploadComplete: function(file) {
            console.log('Completed file!');
            // console.log(file);
        },
        onParseStart: function() {
            console.log('Starting to parse request!');
        },
        onParseEnd: function(req, next) {
            console.log('Done parsing!');
            // console.log(req.files, req.body);
            next();
        },
        onError: function(e, next) {
            if (e) {
                console.log(e.stack);
            }
            next();
        }
    }));

And in console I have:

Starting to parse request!
Starting video
Starting video
Completed file!
POST /content - - ms - -
Starting to parse request!
Starting video
Starting video
Completed file!
POST /content - - ms - -
Starting to parse request!
Starting video
Starting video
Completed file!
Completed file!
Done parsing!

Is that proper? Shouldn't I have one "Starting to parse request!", two "Starting video" and two "Completed file!" events, and at the end only one "Done parsing!" event?

That produces 6 files on disk instead of 2.

It looks like it is making 3 requests and each time trying to upload a new file.

kreegr commented 9 years ago

+1

meric426 commented 9 years ago

+1, this only happens to me when the connection is somewhat slow (low bandwidth)

meric426 commented 9 years ago

Maybe @mscdex can help with this? The busboy 'finish' event is never triggered when my upload bandwidth is low...

mscdex commented 9 years ago

@meric426 The only thing preventing 'finish' from being emitted on a busboy stream is if there were file streams emitted that weren't read from. busboy's parsing utilizes the built-in stream backpressure mechanism, so if you don't read from a file stream, it can prevent parsing of the rest of the multipart request (it may not prevent parsing if files fit within highWaterMark bytes of the file streams' buffers).
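For anyone using busboy directly, a minimal sketch of what "reading from the file stream" means in practice; the field name and paths here are just placeholders:

var fs = require('fs');
// `busboy` is assumed to be an already-constructed Busboy instance that the request is piped into.

busboy.on('file', function (fieldname, file, filename, encoding, mimetype) {
  if (fieldname !== 'image') {
    // A file we don't want: drain the stream so busboy can keep parsing the request.
    return file.resume();
  }
  file.pipe(fs.createWriteStream('/tmp/' + filename)); // consuming the stream also counts as reading
});

busboy.on('finish', function () {
  console.log('finished parsing the whole request');
});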

chaudhryjunaid commented 9 years ago

The latest version of multer, 0.1.8, still has this issue. After setting:

limits: {
  files: 1,
  fileSize: 1024 * 1024 * 3
},

the request hangs when the file limit or the file size limit is crossed. The onFileSizeLimit and onFilesLimit functions are called, but nothing happens after that and the request times out after 120s. Notably, this does not happen for small files, only for some files, possibly larger ones. I am worried this hang could be a DoS risk to the server. Please help! Thanks!

LinusU commented 9 years ago

@chaudhry-junaid I'm quite sure that I just fixed that particular behaviour, could you check out the branch in #131 and see if it's working for you? Thanks!

jeffmcmahan commented 9 years ago

@LinusU I'm using multer for a project and I'm having the issue described here; I'd like to know if you could offer any prediction as to when the changes in PR #131 might be shipped?

LinusU commented 9 years ago

@jeffmcmahan I really wish I had but all I can say is that I hope to get it out as soon as possible. It feels like #133 is the only thing left to get right, and then we should be able to ship.

Everything also needs to be reviewed by the other team members but I've had great response and they've been following along, so hopefully that won't take too long.

I can't promise that the changes will ever be released, since I'm not the one who has final say, but I would say that it's looking extremely plausible. I really want to get this shipped within a month, tops.

If you want to go ahead and try it out now, you can pin your dependency at fef01ed, that should keep you with a stable api. I have tested that code and believe it to be working, but I can't make any promises.

What I can promise is that I'll personally be active and respond to any problems you might have, just ping me (@LinusU) in any issues you open and I will try and help. Having someone who tests the code in a real project would be very helpful to us.

If you would also be so kind and give your thoughts on the api I proposed here https://github.com/expressjs/multer/issues/133#issuecomment-111963910 that would be great. I would like to implement that by this weekend and any input is very valuable at the time.

LinusU commented 9 years ago

Multer 1.0.0 is now released! This issue should now be fixed, please reopen if the problem persists.

ping @jeffmcmahan :balloon: :gift: :clap:

chaudhryjunaid commented 9 years ago

Thanks for the fix update! I will check out the latest version in my next project!

jonnolen commented 8 years ago

We are still seeing an issue that looks very much like this one with the latest multer (1.1), node 4.2.3, on Windows x64. The servers are behind a reverse proxy. @jeffmcmahan

Dynasty9 commented 8 years ago

:+1: I'm seeing this, getting more information now.

Dynasty9 commented 8 years ago

I've narrowed the failure down to indicateDone(): readFinished is still set to false.

function indicateDone () {
  if (readFinished && pendingWrites.isZero() && !errorOccured) done()
}

This section of code triggers indicateDone():

        storage._handleFile(req, file, function (err, info) {
          if (aborting) {
            appender.removePlaceholder(placeholder)
            uploadedFiles.push(extend(file, info))
            return pendingWrites.decrement()
          }

          if (err) {
            appender.removePlaceholder(placeholder)
            pendingWrites.decrement()
            return abortWithError(err)
          }

          var fileInfo = extend(file, info)

          appender.replacePlaceholder(placeholder, fileInfo)
          uploadedFiles.push(fileInfo)
          pendingWrites.decrement()
          indicateDone() // <---- this fires before the 'finish' handler below
        })

This did not trigger

    busboy.on('finish', function () {
      readFinished = true
      indicateDone()
    })
Dynasty9 commented 8 years ago

The JSON for the file looks like this as it's triggering _handleFile:

{
  "fieldname": "asset",
  "originalname": "blitzerSecuutus_diffuse.png",
  "encoding": "7bit",
  "mimetype": "image/png",
  "destination": "/var/folders/np/mr062xw97rg0gbbx5c91l8s00000gn/T",
  "filename": "09b59da77ed8b6d5b2842df90b6cd3b8",
  "path": "/var/folders/np/mr062xw97rg0gbbx5c91l8s00000gn/T/09b59da77ed8b6d5b2842df90b6cd3b8",
  "size": 457924,
  "hash": "9fccd8e2fa9f3b9dcdc0839a1b85e45a"
}

The file is present in the directory. This is on Mac OSX 10.10.2

Dynasty9 commented 8 years ago

It turns out that if I change one line here, it works:

storage._handleFile(req, file, function (err, info) {
          if (aborting) {
            appender.removePlaceholder(placeholder)
            uploadedFiles.push(extend(file, info))
            return pendingWrites.decrement()
          }

          if (err) {
            appender.removePlaceholder(placeholder)
            pendingWrites.decrement()
            return abortWithError(err)
          }

          var fileInfo = extend(file, info)

          appender.replacePlaceholder(placeholder, fileInfo)
          uploadedFiles.push(fileInfo)
          pendingWrites.decrement()
          readFinished = true <-- put this here
          indicateDone() 
        })
LinusU commented 8 years ago

@Dynasty9 The fix that you proposed would break for multiple files. I appreciate the help on debugging though!

Do you think you could make a failing test case? If I get that, it should be fairly easy for me to fix the issue. As it is now, I can't reproduce it myself.

yangboxed commented 8 years ago

I am facing this issue too. I wrote a mocha test for my image upload function and it constantly hangs. My node version is v0.12.9 and my npm version is 2.14.9. I wrote a small test (uploads.zip).

In my test, I am trying to upload two images ('original', 'thumbnail') in one API call. I can see that only one fileInfo is pushed to uploadedFiles, and then it hangs somewhere. The first time I ran the test, everything passed; all the following runs had some failures where the code hung and timed out. And, not sure if this could be a hint: if I only upload one file, it seems to work fine.
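For reference, the shape of the test is roughly this (a sketch using supertest and mocha; the route, app path, and fixture paths are placeholders, not the actual contents of uploads.zip):

var request = require('supertest');
var app = require('../app'); // the express app under test

describe('test upload', function () {
  it('uploads two images in one API call', function (done) {
    request(app)
      .post('/api/upload')                        // placeholder route
      .attach('original', 'test/fixtures/a.png')  // placeholder fixture files
      .attach('thumbnail', 'test/fixtures/b.png')
      .expect(200, done);
  });
});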

~/workspace/uploads$ mocha  --timeout 15000 test/setup

  test upload
In before function ....
    ✓ Health check, should return 200
    ✓ Upload files set 1
    ✓ Upload files set 2
    ✓ Upload files set 3
In after function ....

  4 passing (74ms)

~/workspace/uploads$ mocha  --timeout 15000 test/setup

  test upload
In before function ....
    ✓ Health check, should return 200
    ✓ Upload files set 1
    ✓ Upload files set 2
    1) Upload files set 3
In after function ....

  3 passing (15s)
  1 failing

  1) test upload Upload files set 3:
     Error: timeout of 15000ms exceeded. Ensure the done() callback is being called in this test.

~/workspace/uploads$ mocha  --timeout 15000 test/setup

  test upload
In before function ....
    ✓ Health check, should return 200
    1) Upload files set 1
    ✓ Upload files set 2
    ✓ Upload files set 3
In after function ....

  3 passing (15s)
  1 failing

  1) test upload Upload files set 1:
     Error: timeout of 15000ms exceeded. Ensure the done() callback is being called in this test.
s1riedel commented 8 years ago

I am also facing this issue, using Busboy and NGINX. Was a resolution ever found?

I am uploading 2 files at a time, and every 5 or 10 uploads the client times out and the server does not appear to even see the request come in.

baigao2015 commented 8 years ago

I also have the same problem, but looking around at the answers there is always still a problem. I use "connect-busboy" like this (node.js + express, app.js):

 var busboy = require('connect-busboy');   
  app.use(busboy());

//route setting

    var express = require('express');
    var fs = require('fs');

    var router = express.Router();

    /* GET home page. */
    router.get('/', function (req, res, next) {
        res.render('formatdata', {title: 'use FormData upload'});
    });

    //node.js  Express  Connect-Busboy to upload
    router.post('/fileUpload', function (req, res) {
        var fstream;

        req.pipe(req.busboy);

        // existsSync
        if (!fs.existsSync('public/images/upload')) {
            fs.mkdirSync('public/images/upload');
        }

        req.busboy.on('file', function (fieldname, file, filename) {

            var extension = filename.toLowerCase().split('.');// 
            var extensionTemp = extension[extension.length - 1];//get file extension 
            var extensions = ["png", "jpg", "gif", 'bmp'];//
            var newFileName = '';
            for (var key in extensions) {
                if (extensions[key] == extensionTemp) {
                    // use now time to rename
                    newFileName = Date.now() + '.' + extensionTemp;
                    break;
                }
            }

            //fstream = fs.createWriteStream('public/images/upload/' + filename);
            var targetImg = 'public/images/upload/' + newFileName;

            fstream = fs.createWriteStream(targetImg);

            file.pipe(fstream);
            fstream.on('close', function () {
                //  markdown img url
                var wmdImgUrl = '![' + newFileName.toLowerCase().split('.')[0] + '](' + req.protocol + '://' + req.headers.host + targetImg.replace(/^public/, "") + ')';
                var resObj = {
                    code: 1,
                    msg: 'succ',
                    result: {
                        imgName: newFileName.toLowerCase().split('.')[0],// 
                        img: targetImg.replace(/^public/, ""),
                        wmdImgUrl: wmdImgUrl //
                    }
                };
                res.send(resObj);
            });

            fstream.on('error', function(err) {// err event
                var resObj = {
                    code: 0,
                    msg: err.stack,
                    result: {}
                };
                res.send(resObj);
            });
        });
    });

    module.exports = router;
Albert-IV commented 8 years ago

Something I've found out between the time I posted here and now: nginx will re-send POST requests if the connection "times out". There's an nginx issue created on the subject and a recent Hacker News discussion about it.

I'm not saying nginx is the reason this is happening, but it's something to keep in mind when trying to debug the situation.

kimvex commented 8 years ago

Hello, the problem is apparently in NGINX. After fighting with it, I ran into error 413, which comes from the server's upload size limit; it has to be set in NGINX. This is the way to fix the problem:

Open the file:

sudo vim /etc/nginx/nginx.conf

Then look for the client_max_body_size line and change its value to the size we want (e.g. 50M). If the line doesn't exist, add it inside the http { ... } block:

client_max_body_size 100M;

and then restart NGINX

sudo service nginx restart

http://atulhost.com/fix-nginx-error-413-request-entity-too-large

nikonovak commented 8 years ago

Just to clarify: in my case the above didn't work. The solution that did work was adding client_max_body_size to the location block of the nginx virtual host config responsible for the proxy settings.

/etc/nginx/conf.d/virtual.conf

.....
server {
    server_name example.com;
    location / {
        client_max_body_size 100m;
        proxy_pass http://127.0.0.1:3000;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_set_header Host $host;
        proxy_cache_bypass $http_upgrade;
        proxy_request_buffering off;
    }
}
.....

Hope this helps someone, NiKO

mafarja commented 8 years ago

@kimvex Solution worked for me. This thread was awesome. Thanks!

cabloo commented 8 years ago

I think I'm having this same issue when uploading multiple files using multer ^1.1.0. It only seems to happen frequently between two distant servers, so it sounds like the bandwidth issue mentioned above. There is no proxy in use, just a PHP GuzzleHttp client talking to an Express API running on port 3000 on a different server.

a9urv commented 8 years ago

We faced this issue while trying to upload images from an iOS app to Node.js/Express/Multer via an nginx reverse proxy. While the majority of the images went through fine, a few specific ones would just not upload. It wasn't a size issue either, since all images were of similar size and it wasn't the largest ones that had the problem. It was exactly as @droppedoncaprica described earlier in this issue: the file was written to the temp location used by Multer, but control was not handed to next.

What finally worked for us was placing all the text fields in the multipart post request before the image file field - i.e. making the image file the last part of the multipart post request. Since then no upload failures. Hope this helps.
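For anyone wanting to try the same ordering from a JavaScript client, a small sketch (the endpoint and field names are made up; FormData preserves the order in which fields are appended):

var form = new FormData();
form.append('title', 'My photo');           // text fields first...
form.append('description', 'From holiday');
form.append('photo', fileInput.files[0]);   // ...and the file as the last part
// fileInput is assumed to be an <input type="file"> element.

var xhr = new XMLHttpRequest();
xhr.open('POST', '/upload');                // placeholder endpoint
xhr.send(form);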

Nginx: 1.4.6 Node.js: 0.10.25 Multer: 1.0.0

zhaoxingguang14 commented 8 years ago

Hello, I'm facing the same issue again with the latest multer version. The problem is exactly that multer does not hand control to next: the files are uploaded to the server, but the query that should run afterwards (sql.query) is never executed. I've looked at @droppedoncaprica's example for making a custom error, but I think I need something like the scenario @a9urv described, putting the image file as the last part of the multipart POST request. On the other hand, my sql.query needs the details from req.files. How can I solve this problem?