Deadwood-ai / deadwood-api

Main FastAPI application for the deadwood backend
GNU General Public License v3.0

Implemented Chunked upload mechanism #65

Closed JesJehle closed 1 month ago

JesJehle commented 1 month ago

The previous upload mechanism was unable to handle large files because the entire image had to be loaded into memory at several stages of the upload process, on both the client and server sides.

This PR introduces a chunked upload mechanism through an additional API route, enabling the frontend application to upload files of any size. With this approach, the server no longer loads the entire file into memory. Instead, the upload is processed in chunks, and the hash is computed incrementally.
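A minimal sketch of what such a chunk route could look like (the route path `/datasets/chunk`, the form fields `upload_id`, `chunk_index`, `chunks_total`, and the `tmp` layout are illustrative assumptions, not taken from this PR):

```python
from pathlib import Path

from fastapi import FastAPI, File, Form, UploadFile

app = FastAPI()
TMP_DIR = Path("tmp")  # temporary storage for incoming chunks


@app.post("/datasets/chunk")
async def upload_chunk(
    file: UploadFile = File(...),
    upload_id: str = Form(...),
    chunk_index: int = Form(...),
    chunks_total: int = Form(...),
) -> dict:
    """Receive a single chunk and stream it to disk without buffering the whole file."""
    chunk_dir = TMP_DIR / upload_id
    chunk_dir.mkdir(parents=True, exist_ok=True)
    chunk_path = chunk_dir / f"{chunk_index:06d}.part"

    # Stream the chunk to its own file in small pieces; at no point is the
    # full upload held in server memory.
    with chunk_path.open("wb") as out:
        while True:
            piece = await file.read(1024 * 1024)  # 1 MiB at a time
            if not piece:
                break
            out.write(piece)

    is_last = chunk_index == chunks_total - 1
    return {
        "status": "complete" if is_last else "chunk received",
        "chunk_index": chunk_index,
    }
```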

Chunks are temporarily stored in a new tmp folder and are combined once the final chunk is uploaded. After successful assembly, the chunks are removed. Currently, chunks are not deleted when an error occurs, which leaves room for a future implementation of upload resumption; for now, manual cleanup after a failure is acceptable. I have tested this feature extensively in a local environment, and it has proven to be reliable.
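A sketch of the assembly step under the same assumptions as above (the helper name `assemble_chunks` and the `.part` naming scheme are hypothetical), showing how the hash can be updated incrementally while the chunks are concatenated and then removed only after success:

```python
import hashlib
import shutil
from pathlib import Path


def assemble_chunks(chunk_dir: Path, target_path: Path) -> str:
    """Concatenate ordered chunk files into the final file and return its SHA-256.

    The hash is updated incrementally as each chunk is copied, so the
    assembled file is never loaded into memory in one piece.
    """
    sha256 = hashlib.sha256()
    with target_path.open("wb") as target:
        for chunk_path in sorted(chunk_dir.glob("*.part")):
            with chunk_path.open("rb") as chunk:
                while True:
                    piece = chunk.read(1024 * 1024)
                    if not piece:
                        break
                    sha256.update(piece)
                    target.write(piece)

    # Chunks are removed only after successful assembly; if an error is raised
    # earlier, they stay on disk (matching the note about possible resumption).
    shutil.rmtree(chunk_dir)
    return sha256.hexdigest()
```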

Next Steps: This PR will be followed by another with the same name, implementing the chunked upload mechanism on the frontend. The old upload routes remain functional for now. Once the route is online, I can create a preview channel of the site so you can test the upload.