node-formidable / formidable

The most used, flexible, fast and streaming parser for multipart form data. Supports uploading to serverless environments, AWS S3, Azure, GCP or the filesystem. Used in production.

File Size Over 30mb is crashing app #645

Closed shellac85 closed 3 years ago

shellac85 commented 4 years ago

Context

What are you trying to achieve or the steps to reproduce?

Hi, just trying to upload a file. Someone noticed an issue when they tried to upload a 65mb file, but I am noticing that when I go over 30mb I get errors in the console of VS Code. Please see the stack trace from the error. I was on version 1.2.1, but it's not working on 1.2.2 either. If you need any more info, I will be happy to provide it. Thank you :)

What was the result you got?

<--- Last few GCs --->

[15084:000001BC6302C4B0]   287751 ms: Mark-sweep 1053.9 (1070.7) -> 1053.9 (1067.2) MB, 718.5 / 0.0 ms  (average mu = 0.602, current mu = 0.000) last resort GC in old space requested
[15084:000001BC6302C4B0]   288447 ms: Mark-sweep 1053.9 (1067.2) -> 1053.9 (1067.2) MB, 696.1 / 0.0 ms  (average mu = 0.413, current mu = 0.000) last resort GC in old space requested

<--- JS stacktrace --->

==== JS stack trace =========================================

    0: ExitFrame [pc: 000003DE8C2DC5C1]
    1: StubFrame [pc: 000003DE8C95C882]
Security context: 0x032fc8b1e6e9 <JSObject>
    2: DoJoin(aka DoJoin) [0000032FC8B05E91] [native array.js:1] [bytecode=00000333C45FB9D1 offset=182](this=0x02f50b7826f1 <undefined>,l=0x01df193c2dc1 <JSArray[60171968]>,m=60171968,A=0x02f50b7828c9 <true>,w=0x02f50b7829f1 <String[0]: >,v=0x02f50b7829a1 <false>)
    3: Join(aka Join) [0000032FC8B05EE1] [native arra...

FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - JavaScript heap out of memory
 1: 00007FF6E0D47DDA v8::internal::GCIdleTimeHandler::GCIdleTimeHandler+4506
 2: 00007FF6E0D22876 node::MakeCallback+4534
 3: 00007FF6E0D231F0 node_module_register+2032
 4: 00007FF6E103B6BE v8::internal::FatalProcessOutOfMemory+846
 5: 00007FF6E103B5EF v8::internal::FatalProcessOutOfMemory+639
 6: 00007FF6E1221DE4 v8::internal::Heap::MaxHeapGrowingFactor+9620
 7: 00007FF6E12202AB v8::internal::Heap::MaxHeapGrowingFactor+2651
 8: 00007FF6E134A2B8 v8::internal::Factory::AllocateRawArray+56
 9: 00007FF6E134AC32 v8::internal::Factory::NewFixedArrayWithFiller+66
10: 00007FF6E10C82DF v8::internal::HashTable<v8::internal::NumberDictionary,v8::internal::NumberDictionaryShape>::NewInternal+63
11: 00007FF6E10C8252 v8::internal::HashTable<v8::internal::NumberDictionary,v8::internal::NumberDictionaryShape>::EnsureCapacity+226
12: 00007FF6E10C8811 v8::internal::Dictionary<v8::internal::NumberDictionary,v8::internal::NumberDictionaryShape>::Add+129
13: 00007FF6E13652D7 v8::internal::Factory::NewCallHandlerInfo+49095
14: 00007FF6E10BCF88 v8::internal::SharedFunctionInfo::SetScript+23528
15: 00007FF6E109FF33 v8::internal::JSReceiver::class_name+20595
16: 00007FF6E133B539 v8::internal::wasm::WasmCodeManager::LookupCode+15273
17: 00007FF6E1248BFC std::vector<v8::internal::compiler::MoveOperands * __ptr64,v8::internal::ZoneAllocator<v8::internal::compiler::MoveOperands * __ptr64> >::_Umove+57468
18: 000003DE8C2DC5C1

What result did you expect?

I was hoping for the file to upload. :)

auto-comment[bot] commented 4 years ago

Thank you for raising this issue! We will try to get back to you as soon as possible. Please make sure you have formatted it properly, followed our code of conduct, and given us as much context as possible. /cc @tunnckoCore @GrosSacASac

tunnckoCore commented 4 years ago

Sorry about that. Stack traces don't help me much. :disappointed: Seems like some memory problem, which I think we had in the past, in v1 specifically.

Someone noticed an issue when they tried to upload a 65mb file but I am noticing that when I go over 30mb I get errors in the console of VS code.

Is it failing randomly only some of the time, or is it constant?

Also, could you please try the v2 canary? I see it's on a production system, but it's worth trying; there are no big fundamental changes behind the scenes. v1 has a ton of problems, and similar issues were posted in the past too.

Are they specific files, or a specific format?

shellac85 commented 4 years ago

Thanks for the quick reply. Yeah, stack traces aren't great at all!

It's consistent when it comes to files over 30mb. I zipped up one file that was over 30mb, which brought it down to 27mb, and it worked fine then.

I have a test environment, so I can try out the v2 canary and let you know how I get on. I have been testing pdfs so far.

tunnckoCore commented 4 years ago

when it comes to files over 30mb. I zipped up one file that was over 30mb, which brought it down to 27mb, and it worked fine then.

Good catch that it's around the 30mb mark, haha. Strange. Sounds like some system limit? Maybe the firewall or the network configuration? :thinking: Is your testing env the same as prod, or as close as possible?

so I can try out the v2 canary and let you know how I get on.

Okay, great. :)

shellac85 commented 4 years ago

Same story with v2 canary, unfortunately.

Sounds like some system limit? Probably the firewalls, the network configurations?

This was my initial thought, but I couldn't find any setting relating to this. I will have to look more into that area, I think!

GrosSacASac commented 3 years ago

I just tried uploading 10 GB of files and it works fine. Can you reproduce the issue using Node's native http module? If not, it means the issue is inside FoalTS.
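
For reference, a minimal repro along those lines might look like the sketch below, using only Node's http module and formidable's v1-style IncomingForm API; the port and response shape are illustrative assumptions, not code from this thread.

import http from 'http';
import formidable from 'formidable';

// Standalone upload server with no framework in between: if large files
// parse fine here, the memory blow-up lives in the surrounding app code,
// not in formidable itself.
http
  .createServer((req, res) => {
    if (req.method === 'POST') {
      const form = new formidable.IncomingForm();
      form.parse(req, (err, fields, files) => {
        if (err) {
          res.writeHead(500);
          res.end(String(err));
          return;
        }
        res.writeHead(200, { 'Content-Type': 'application/json' });
        res.end(JSON.stringify(Object.keys(files)));
      });
      return;
    }
    res.writeHead(404);
    res.end();
  })
  .listen(3000);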

shellac85 commented 3 years ago

I don't think it's an issue with Formidable. In our code there is a section that encrypts the file, and I think it's running out of memory with large files:

import { writeFileSync } from 'fs';
import { readFile } from 'fs/promises';
import CryptoJS from 'crypto-js';

export const encryptFile: (
  fsPath: string,
  encryptionKey: string,
) => Promise<void> = async (fsPath, encryptionKey) => {
  if (!encryptionKey) throw new Error('Customer encryption key not found');

  // The whole file is buffered in memory, base64-encoded (~33% larger),
  // and then encrypted as one giant string, so several copies of a large
  // file sit on the heap at once; this is what exhausts it.
  const fileData = await readFile(fsPath);
  const dataBase64 = fileData.toString('base64');
  const encryptedFile = CryptoJS.AES.encrypt(dataBase64, encryptionKey);
  const buffer = Buffer.from(encryptedFile.toString(), 'base64');
  writeFileSync(fsPath, buffer);
};

I was trying to achieve the same with streams, but no joy yet.

GrosSacASac commented 3 years ago

Use streams to encrypt files without memory overflow. Closing, as this is not an issue in Formidable.
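
A minimal sketch of what that could look like with Node's built-in crypto module; the aes-256-ctr choice, scrypt key derivation, fixed salt, and temp-file swap are all illustrative assumptions, not code from this thread.

import { createReadStream, createWriteStream } from 'fs';
import { rename } from 'fs/promises';
import { createCipheriv, randomBytes, scryptSync } from 'crypto';
import { pipeline } from 'stream/promises';

export const encryptFileStreaming = async (
  fsPath: string,
  encryptionKey: string,
): Promise<void> => {
  if (!encryptionKey) throw new Error('Customer encryption key not found');

  // Derive a 256-bit key; a fixed salt keeps the sketch short, real code
  // should store a random salt alongside the file.
  const key = scryptSync(encryptionKey, 'salt', 32);
  const iv = randomBytes(16);
  const cipher = createCipheriv('aes-256-ctr', key, iv);

  // Write to a temp file first, prefixing the IV so decryption is possible.
  const tmpPath = `${fsPath}.enc`;
  const out = createWriteStream(tmpPath);
  out.write(iv);

  // pipeline() streams the file through the cipher one chunk at a time,
  // so heap usage stays flat no matter how large the file is.
  await pipeline(createReadStream(fsPath), cipher, out);
  await rename(tmpPath, fsPath);
};

(stream/promises needs Node 15+; on older Node, the callback-style pipeline from the stream module works the same way.)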