perwendel / spark

A simple, expressive web framework for Java. Spark has a Kotlin DSL: https://github.com/perwendel/spark-kotlin
Apache License 2.0
9.63k stars 1.56k forks

Uploading large-size file = Out of Memory #1249

Closed lepe closed 2 years ago

lepe commented 2 years ago

I found this question on Stack Overflow from 2016 about a similar issue.

I'm uploading a 2GB file using relatively simple code to upload files to my server. I noticed that the memory limit is maxed out before the request reaches my code. I was wondering what Spark does during the upload: is the stream stored in memory before being written to a file? That sounds unfeasible to me, but I haven't located the part of the code that does it.
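For context, the two strategies in question can be sketched with plain JDK streams (the class and method names below are illustrative, not Spark's internals): buffering the whole body in memory makes heap use grow with the upload size, while copying the stream straight to a file keeps memory use bounded by the copy buffer.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class UploadStrategies {

    // Strategy 1: buffer the entire stream in memory.
    // Heap use grows linearly with the upload size -- a 2GB upload
    // needs (at least) 2GB of heap, hence the OutOfMemoryError.
    static byte[] bufferInMemory(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[8192];
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
        }
        return out.toByteArray();
    }

    // Strategy 2: stream directly to disk.
    // Only a small fixed-size copy buffer lives on the heap,
    // regardless of how large the upload is.
    static long streamToFile(InputStream in, Path target) throws IOException {
        return Files.copy(in, target, StandardCopyOption.REPLACE_EXISTING);
    }

    public static void main(String[] args) throws IOException {
        byte[] payload = new byte[1 << 20]; // 1 MB stand-in for a large upload
        Path tmp = Files.createTempFile("upload", ".bin");
        long written = streamToFile(new ByteArrayInputStream(payload), tmp);
        System.out.println(written == payload.length);
        Files.deleteIfExists(tmp);
    }
}
```

If the framework (or the servlet container under it) uses something like strategy 1 for multipart bodies, the OOM on a 2GB upload would be expected behavior rather than a leak.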

Is this a known limitation, a bug, or something that should never happen (and thus a mistake on my part)?

Can someone tell me where that code is implemented (I would like to take a look)? Thanks.

lepe commented 2 years ago

Following the code, it seems the issue is not related to Spark's implementation. I believe org.eclipse.jetty.util.MultiPartInputStreamParser reads the stream and stores it in a MultiMap<Part> object held in memory. Memory use increases when I call spark.Request.getParts(), which just calls javax.servlet.http.HttpServletRequest.getParts(), which in turn calls org.eclipse.jetty.server.Request.getParts(), finally reaching MultiPartInputStreamParser somewhere during the upload.
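Worth noting for anyone landing here: servlet multipart parsing is configured through javax.servlet.MultipartConfigElement, whose fileSizeThreshold controls how large a part may grow in memory before the container spills it to a temp file on disk. Below is a stdlib-only sketch of that spill-at-threshold idea; ThresholdBuffer is a made-up name for illustration, not Jetty's actual class.

```java
import java.io.BufferedOutputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;

// Hypothetical sketch of the "fileSizeThreshold" idea behind servlet
// multipart parsing: keep small parts in memory, spill large parts to disk.
public class ThresholdBuffer {
    private final int threshold;
    private ByteArrayOutputStream memory = new ByteArrayOutputStream();
    private Path spillFile;   // non-null once data has been spilled to disk
    private OutputStream sink;

    public ThresholdBuffer(int threshold) {
        this.threshold = threshold;
        this.sink = memory;   // start by buffering in memory
    }

    public void write(byte[] data) throws IOException {
        if (spillFile == null && memory.size() + data.length > threshold) {
            // Crossing the threshold: move the in-memory bytes to a temp
            // file and direct all further writes there.
            spillFile = Files.createTempFile("part", ".tmp");
            OutputStream file =
                new BufferedOutputStream(Files.newOutputStream(spillFile));
            memory.writeTo(file);
            memory = null;    // release the in-memory copy
            sink = file;
        }
        sink.write(data);
    }

    public boolean spilledToDisk() throws IOException {
        sink.flush();
        return spillFile != null;
    }
}
```

If the container keeps whole parts in memory (i.e., an effectively unbounded threshold), a 2GB part exhausts the heap exactly as described above; with a small threshold and a spill directory configured, only the threshold-sized prefix ever sits in memory.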

In short, I'm closing this issue as it seems it has nothing to do with Spark.