Closed rootslab closed 13 years ago
I have written a simple gist for these two issues.
I'll be happy to learn that I am mistaken!
dvv / rootslab: Could you provide an example that doesn't use connect?
We are doing 1-2 GB file uploads with this library in production and have never seen any such problems.
Hi felix! I'm using the upload.js example in formidable.
If I add this line after form.uploadDir ->
form.maxFieldsSize = 200*1024*1024;
I get the 'process out of memory' error with this curl command:
curl -i -F name=test -F filedata="@file1.jpg,file2.jpg" http://servername/upload/
and without that line it throws an unhandled exception.
(in my test, file1.jpg is ~60 MB)
P.S. I think you could also rewrite your benchmark example: in my tests I see that the buffer is never actually created in RAM, so the results are below the real potential of the library. I think you should fill it with some data before launching the test.
Are you sure that curl command is creating a multipart request?
YES, IT IS.
However, I think that a library must handle exceptions.
For now, I'm sure that this is a security hole.
What about your test?
> However, I think that a library must handle exceptions.

There is no way to handle out-of-memory exceptions in node.js. And your code disables my library's safety mechanisms on purpose. I'm not sure how I can help with this.
> YES, IT IS.

Actually, your curl command seems to create a nested multipart request, which is not supported by formidable. The correct command is:
curl -i -F name=test -F filedata="@file1.jpg" -F filedata="@file2.jpg" http://servername/upload/
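A rough sketch of the difference on the wire (the boundary strings and helper names below are made up for illustration): the comma form produces one `form-data` part whose body is itself `multipart/mixed`, while separate `-F` flags produce two flat file parts:

```javascript
// Illustrative only: simplified bodies for the two curl invocations.
// curl -F filedata="@a.jpg,b.jpg"             -> ONE nested multipart/mixed part
// curl -F filedata="@a.jpg" -F filedata="@b.jpg" -> TWO flat file parts
function nestedBody(files) {
  const outer = "outerBoundary", inner = "innerBoundary";
  const subparts = files.map(f =>
    `--${inner}\r\nContent-Disposition: attachment; filename="${f}"\r\n\r\n<bytes>\r\n`
  ).join("");
  return `--${outer}\r\n` +
    `Content-Disposition: form-data; name="filedata"\r\n` +
    `Content-Type: multipart/mixed; boundary=${inner}\r\n\r\n` +
    subparts + `--${inner}--\r\n--${outer}--\r\n`;
}

function flatBody(files) {
  const outer = "outerBoundary";
  return files.map(f =>
    `--${outer}\r\n` +
    `Content-Disposition: form-data; name="filedata"; filename="${f}"\r\n\r\n<bytes>\r\n`
  ).join("") + `--${outer}--\r\n`;
}

console.log(nestedBody(["file1.jpg", "file2.jpg"]).includes("multipart/mixed")); // true
console.log(flatBody(["file1.jpg", "file2.jpg"]).includes("multipart/mixed"));   // false
```

A parser that does not support nesting sees the whole nested part as a single field, which is where the memory problem below comes from.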
> For now, I'm sure that is a security hole.

I appreciate your help with probing my library, but I'm still unconvinced there is a security hole here.
I never said you should handle out-of-memory exceptions. I mean that you have to handle exceptions in the request.
YES, IT IS.
It's not MY CURL COMMAND, it's a curl command, I know how to use it.
Do you really think that is not a bug?
... no comment..
> It's not MY CURL COMMAND, it's a curl command, I know how to use it.

Whose command is it, then? When you use the variation I gave you, things will work as expected.
> Do you really think that is not a bug?

At this point I think this is a user problem, yes.
Is it possible to spot the error? Does formidable make assumptions about the sanity of user input?
dvv: There is a minor problem here that causes higher-than-necessary memory usage when receiving invalid/unsupported form data. This can certainly be improved, but considering that it only causes problems when disabling the safeguards for limiting memory usage, I would not call it a security issue at this point.
As to whether it's a bug or not: yes. But I think it's a rather benign one, as it only causes problems when misconfiguring the parser on purpose.
You don't have to point out anything to me, thanks. I know the curl command; I have used it for testing my library. You don't. Seems that you don't care about security. Good luck.
Is it a BUG or not?
rootslab: If it really bothers you, there are two options. Fork the project and fix the issue or use your own library formaline. But please, stop using caps.
> Seems that you don't care about security.

I care about security. But you are simply disabling the security mechanisms of my library. This doesn't mean there is no potential problem here, but I don't think it's exploitable. If you can show an actual exploit that doesn't increase the built-in memory limits, I will fix this ASAP.
> Is it a BUG or not?

Please read my previous comment; I clarified my initial statement.
Also, please consider what masylum said. You're being pretty rude to me, considering that I wrote this library for free and I'm also providing free support to you here.
What is this I hear about formidable not scaling?
It doesn't bother me. I don't use it. I posted this issue a week ago, and I simply do social coding. And yes, I USE CAPS. Bye.
> ..when misconfiguring the parser on purpose..

Where?
> It doesn't bother me. I don't use it. I posted this issue a week ago, and I simply do social coding. And yes, I USE CAPS. Bye.

Again, thanks for reporting this. I won't close the ticket, as there are some improvements to be made here.
Your library looks very nice, so I'm happy if it works better for you than mine.
> ..when misconfiguring the parser on purpose.. ? Where?

form.maxFieldsSize = 200*1024*1024;
I can't upload files/fields of 200 MB? If it is a mistake by the user, I think you have to handle it.
> I can't upload files of 200 MB?

You can. This is the limit for non-file parts. Your curl command does some weird multipart nesting that is not supported, so the enclosed file is treated as a non-file field, and the limits for fields apply to it.
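A minimal sketch of why that limit matters, with hypothetical names (this is not formidable's internal code): non-file parts are buffered in memory up to `maxFieldsSize`, so a nested part misread as a field only stays bounded if that limit is kept small:

```javascript
// Sketch of a field-size guard (hypothetical, not formidable's actual code).
// Parts without a filename are buffered in RAM; file parts stream to disk.
// Raising maxFieldsSize lets a misclassified "field" buffer huge data.
class FieldGuard {
  constructor(maxFieldsSize) {
    this.maxFieldsSize = maxFieldsSize;
    this.bytesBuffered = 0;
    this.chunks = [];
  }
  push(chunk) {
    this.bytesBuffered += chunk.length;
    if (this.bytesBuffered > this.maxFieldsSize) {
      throw new Error("maxFieldsSize exceeded");
    }
    this.chunks.push(chunk);
  }
}

const guard = new FieldGuard(2 * 1024 * 1024); // formidable's documented 2 MB default
let rejected = false;
try {
  // a 60 MB "field" arriving in 1 MB chunks, as in the nested-multipart case
  for (let i = 0; i < 60; i++) guard.push(Buffer.alloc(1024 * 1024));
} catch (e) {
  rejected = true;
}
console.log(rejected); // true: the default limit stops the buffering early
```

With the limit raised to 200 MB, the same loop would buffer the whole 60 MB in RAM, which matches the behaviour reported in this ticket.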
Thanks, felixge, but the point is that I don't want to criticize you. I am telling you about a possible security bug. Then, after a week, you answered me by explaining the exact curl command. (I had written the same curl command in my opening issue.)
Finally, if it is not a problem for you, it's not a problem for me.
Have a good job ;)
> after a week

Sorry, I was on vacation when you opened the issue. I'm also busy with other things sometimes.
> you have answered me by explaining the exact curl command. (I have written the same curl command in my opening issue.)

Please have a look again. The curl command I suggested is different and behaves differently than the one you are using.
> Thanks, felixge, but the point is that I don't want to criticize you. I am telling you about a possible security bug.

I know you do. But you've made it frustrating for both of us by being passive-aggressive in some of your comments. I'm more than willing to improve things, but so far this does not seem like an exploitable issue, and I just can't guarantee specific service levels on my open source projects.
> your command: curl -i -F name=test -F filedata="@file1.jpg" -F filedata="@file2.jpg" http:...

*The command is the same; I have written ->*
I tested it also with curl (2 files and my URL for upload) ->
curl -i -F name=test -F filedata="@file1.jpg" -F filedata="@file2.jpg" http://server/upload ...
> .. passive aggressive in some of your comments ..

I only think that you have to read what I have written before answering me, and **we don't play poker here**.
Stop here, please. Have a good job. ;)
@felixge: I'd suggest that form.maxFieldsSize be set implicitly in the constructor to a really small (and thus possibly useless, say 1) value, to force the user to explicitly re-set it in his code.
@rootslab: does your lib, in turn, make assumptions about the sanity of user input? The problem of a forged multipart body hits more or less any parser.
Any server which I'm aware of is vulnerable to a forged Content-Length attack, where the client sets its value to N octets but sends X<N and then waits forever, effectively preventing the server from ever receiving the expected data and forcing it to keep resources bound to such a stale request.
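The stale-request scenario dvv describes can be sketched as a simple policy check (the field names and the idle-timeout policy are assumptions, not any particular server's API):

```javascript
// Sketch of a stale-request check against a forged Content-Length:
// the client promised contentLength bytes, sent fewer, and has been
// idle longer than the server is willing to wait.
function isStaleRequest(req, nowMs, idleLimitMs) {
  return req.bytesReceived < req.contentLength &&
         nowMs - req.lastActivityMs > idleLimitMs;
}

const req = { contentLength: 1000, bytesReceived: 400, lastActivityMs: 0 };
console.log(isStaleRequest(req, 31000, 30000)); // true: drop it, free the resources
console.log(isStaleRequest(req, 5000, 30000));  // false: still within the idle window
```

An idle timeout like this is the usual mitigation: the server cannot tell a forged Content-Length from a slow client, so it bounds how long it will wait instead.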
Hello, Vladimir.
**In my library I handle this situation.**
You can decide to throw an exception when content-length is missing, or continue without checking content-length at all.
> the command is the same, I have written ->

Indeed. I did not see that alternative command in your initial ticket because it was not formatted as a code block. Sorry.
If you're saying that this version of the command causes the same issue for you, I'm surprised. It certainly doesn't cause issues on my end. What curl version are you using? I'm on:
curl 7.20.0 (i386-apple-darwin10.2.0) libcurl/7.20.0 OpenSSL/0.9.8n zlib/1.2.5 libidn/1.18
> I think only that you have to read what I have written, before answering me, and we doesn't play poker here.

Sorry, I missed a detail here. However, if you had read the documentation about form.maxFieldsSize, we could have saved half of the conversation here as well.
> I'd suggest that form.maxFieldsSize be set implicitly in the constructor to a really small (and thus possibly useless, say 1) value, to force the user to explicitly re-set it in his code.

The default value is indeed set to 2 MB here, see docs.
Fold.
Raise.
Have you read this? -->

> I think you could also rewrite your benchmark example, because in my tests I see that the buffer is never created in RAM; this means the results are below the real potential of the library. I think you could fill it with some data before launching the test.
@felixge: I saw the default value in the source and docs, sure. However, this is quite a big value. I haven't dug deep into the code, but I imagine that the magnitude of this value may cause the issue in point. In @rootslab's gist, he set it to 200 MB and issued the forged request, so I guess the junk data occupied all of those 200 MB.
I meant to make things so that the user has to explicitly set this fuse value on his own, thus making him clearly take responsibility for coping with forged data.
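dvv's suggestion could look roughly like this (a hypothetical constructor, not formidable's actual API): refuse to run until the caller has explicitly chosen a limit, so the responsibility for a large value is visibly theirs:

```javascript
// Sketch of dvv's suggestion (hypothetical API, not formidable's):
// make the field-size limit opt-in so the caller explicitly owns the risk.
class StrictForm {
  constructor(options = {}) {
    if (typeof options.maxFieldsSize !== "number") {
      throw new Error("maxFieldsSize must be set explicitly");
    }
    this.maxFieldsSize = options.maxFieldsSize;
  }
}

let threw = false;
try { new StrictForm(); } catch (e) { threw = true; }
console.log(threw);                                                  // true
console.log(new StrictForm({ maxFieldsSize: 1024 }).maxFieldsSize);  // 1024
```

The trade-off is ergonomics: a sane default (like the 2 MB mentioned above) works out of the box, while a mandatory option makes every caller think about the limit once.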
Hi, it seems that connect-form (/ formidable) doesn't handle fields of the same name.
curl -i -F name=test -F filedata="@file1.jpg" -F filedata="@file2.jpg" http://server/upload
I can see, with a tool like htop, the process growing to 1 GB of RAM with a 60 MB file upload!
Then, if I'm not mistaken, it is possible to burn all the RAM in the server.
I tried these upload tests with formaline:
1) it writes the two files correctly and returns them as results
2) with this weird curl request, formaline doesn't write any files to disk, and it doesn't burn the RAM.
I hope it helps.