
Uploading Large Files to CTFd #1379

Open danielgarner64 opened 4 years ago

danielgarner64 commented 4 years ago

This doesn't necessarily have to be fixed, but I think it is useful to have it attached to the project for people who run into this issue.

What happened? Uploading a file greater than 100 MB causes the upload to hang indefinitely. This was done by selecting the challenge in the admin view, selecting Files, and uploading. I also tested it through an API upload using the Python requests package.

Cause: The cause of the problem was that the Gunicorn worker was killed during the upload process, because I was using the default synchronous worker class for Gunicorn.

What did you expect to happen? If the upload times out, the UI should display an error message indicating that the connection to the server was lost.

How to reproduce your issue Attempt to upload a file greater than 100MB to a CTFd server with the default configuration.

Fix I applied: In docker-entrypoint.sh I added the following 3 lines:

    --worker-class "gthread" \
    --keep-alive 10 \
    --threads 5

This fix does hurt the server's operation, as any time it looks for a missing resource (there are a few static files the server currently requests that don't exist) it waits a full 60 seconds before responding.

As such, to upload large files I suggest setting a longer timeout for your workers, e.g. --timeout 60, although this will cause workers to wait longer for resources as well.

As a minimum, I suggest documenting that CTFd limits file size in practice because Gunicorn kills workers that take longer than 30 seconds to upload a file.
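For reference, a minimal sketch of what the resulting gunicorn invocation in docker-entrypoint.sh could look like with these flags plus the longer timeout suggested above; the surrounding command and the $WORKERS/$ACCESS_LOG/$ERROR_LOG variables are assumptions based on the stock CTFd entrypoint and may differ between versions:

    # Sketch only: stock-style entrypoint command with the extra flags appended.
    # gthread workers handle requests in threads, so one slow upload does not
    # block the whole worker; --timeout 60 doubles the default 30-second limit
    # before Gunicorn treats a worker as unresponsive and kills it.
    exec gunicorn 'CTFd:create_app()' \
        --bind '0.0.0.0:8000' \
        --workers $WORKERS \
        --access-logfile "$ACCESS_LOG" \
        --error-logfile "$ERROR_LOG" \
        --worker-class "gthread" \
        --keep-alive 10 \
        --threads 5 \
        --timeout 60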

Environment:

* CTFd Version/Commit:

* Operating System: Ubuntu 18.04

* Web Browser and Version: Firefox 75.0

Michael1337 commented 7 months ago

This still seems to be a problem, doesn't it? I use Nginx Proxy Manager and also have Nextcloud running on my server. I tried different Nginx settings in NPM for CTFd, but I cannot upload a file of about 2 GB to CTFd, even though I can upload it to Nextcloud. A 127 MB file uploads to CTFd just fine. I also tried the Gunicorn settings above in CTFd, but it still doesn't work: the upload just hangs once the progress bar in the web UI reaches the end. Is there a way to upload the file to my server manually and then create a link to it in the CTFd database?

danielgarner64 commented 7 months ago

So when I did these uploads I primarily used the REST API (using Python code to encode and upload the file). For very large files (2 GB+) I used a very large timeout (in the hours range); it was something like --timeout 6000.
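For anyone scripting uploads, here is an illustrative curl equivalent of that REST API upload (Daniel used Python requests; the /api/v1/files endpoint, the challenge_id/type form fields, the token header format, and the host/token values are my assumptions about the CTFd API and may vary between versions):

    # Illustrative sketch: multipart upload of a large challenge file to CTFd.
    # Endpoint, form field names, and auth header format are assumptions here.
    curl "https://ctf.example.com/api/v1/files" \
      -H "Authorization: Token <admin-api-token>" \
      -F "file=@./huge_challenge_archive.zip" \
      -F "challenge_id=1" \
      -F "type=challenge"

curl itself applies no overall timeout by default, so the limiting factor is still the Gunicorn --timeout on the server (plus any body-size or timeout limits on a reverse proxy in front of CTFd).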

pl4nty commented 7 months ago

Manually creating a link would be a great workaround. I have large files that are much faster to upload/download directly, so we currently add download links to challenge descriptions.

1podracer1 commented 7 months ago


The fix above worked for me. I was able to upload a 2 GB .zip file after adding the following lines at the end of my docker-entrypoint.sh file:

At the end of the file (after "# Start CTFd"):

    --error-logfile "$ERROR_LOG" \
    --timeout 6000 \
    --keep-alive 10 \
    --threads 5
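Assuming the stock entrypoint ends with the usual exec gunicorn command under the # Start CTFd comment (the exact stock flags and variable names may differ between CTFd versions), the edited tail of docker-entrypoint.sh would then look roughly like this, with only the last three flags being the addition:

    # Start CTFd
    exec gunicorn 'CTFd:create_app()' \
        --bind '0.0.0.0:8000' \
        --workers $WORKERS \
        --access-logfile "$ACCESS_LOG" \
        --error-logfile "$ERROR_LOG" \
        --timeout 6000 \
        --keep-alive 10 \
        --threads 5

Note that setting --threads above 1 makes Gunicorn use the gthread worker class even without an explicit --worker-class flag, which lines up with the worker-class change Daniel described.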

After saving the file, restarting the server, and launching docker-compose, I was able to successfully upload a 2 GB file using the 'Upload' button in the admin panel's Challenges section. Thank you Daniel for this workaround, as I am using this for a dissertation project at university. This has saved me a lot of hassle!

iimog commented 6 months ago

I had a similar problem when creating backups on an instance with many larger files. There were timeouts when creating the zip file (file size 12 GB). Increasing the gunicorn timeout fixed it for me.