Try a higher gateway timeout :) Creating a ZIP file of around 100 GB takes a while. I don't know if that is something we can fix. The right way would be to queue such a task, create the archive, and send a notification that the download is available. What is your use case? I mean, there are better ways (like using the sync clients) to transfer so much data.
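For context, the 504 here means nginx gave up waiting on PHP-FPM. A minimal sketch of where that timeout lives, assuming an nginx + PHP-FPM vhost (socket path taken from the error log later in this issue; the values are illustrative, not a recommendation):

```
# Illustrative nginx PHP location block; the fastcgi_* timeouts are what
# turn a slow ZIP assembly into a "504 Gateway Time-out".
location ~ \.php$ {
    include fastcgi_params;
    fastcgi_pass unix:/var/run/php-fpm/php-fpm.sock;
    fastcgi_read_timeout 3600s;  # how long nginx waits for output from PHP-FPM
    fastcgi_send_timeout 3600s;  # how long nginx waits while writing to PHP-FPM
    send_timeout 3600s;          # timeout for sending the response to the client
}
```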
The use case is providing the folder contents to business clients, and they are not supposed/expected to get an account and install the sync client.
I found a lot of old bug reports while researching the problem, and I thought ZipStreamer was included in Nextcloud to solve this problem by zipping the folder while it's being downloaded.
I guess a solution could be to use a different way of transfer altogether, though we would prefer handling everything through the very nice and intuitive sharing features of Nextcloud.
Works fine for me. When you say thousands of files, what are you talking about here? Ten thousand tiny 8 KB files?
One example: 105 GB, ~17k files, ranging from small text files to bigger PDFs to large files of up to 14 GB. What hardware are you running on, and how long does it take for you? What kind of folder did you test?
306 directories, 11103 files, 372 GB
It takes ~4 seconds to start the download at over 30 MB/s.
Your issue is likely a configuration issue.
I am on NC19, and I see you're two versions behind on the v18 release, which is now at 18.0.6. Maybe try updating?
Okay, the ~4 seconds you achieve really made me wonder... Downloading first-level subdirectories is as quick as it is for you, but the folder itself does not work at all. I have one 80 GB file sitting at the root of the folder in question; the subdirectories are up to 20 GB in size. So I guess there is something fishy going on after all. I guess I need to try more stuff.
Even when selecting all subfolders and downloading them together, it works in a reasonable amount of time (~30-60 s). If I add the file to the selection, I wait for all eternity again.
Is the ZipStreamer mechanism maybe working in a way where it starts with a file and streams back a ZIP as soon as this first file is zipped, and if the first file happens to be huge, it just gets stuck on it?
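For intuition only: a streaming ZIP does not have to finish the first file before bytes reach the client; each chunk can be flushed as it is read, so a stall on a huge first file points at buffering somewhere in the chain. A minimal PHP sketch of the principle (not Nextcloud's actual ZipStreamer code; paths and names are made up):

```
<?php
// Conceptual sketch of a streamed download. A real streaming ZIP writes a
// local file header per entry and a central directory at the end; here we
// only show the chunk-and-flush loop that makes the download start
// immediately instead of after the whole archive is built.
header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename="folder.zip"');

$files = ['/data/huge-80gb.bin', '/data/report.pdf']; // hypothetical paths

foreach ($files as $path) {
    // (A real implementation writes the ZIP local file header here.)
    $in = fopen($path, 'rb');
    while (!feof($in)) {
        echo fread($in, 8192); // send the file in small chunks
        flush();               // hand each chunk to the web server right away
    }
    fclose($in);
}
// (A real implementation appends the ZIP central directory here.)
```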
Have you upgraded to 18.0.6 or 19? I would do that first before looking into this further.
Then you need to look into your nginx and PHP configurations... lots of variables here.
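On the PHP side, a few settings can independently kill or stall a long streaming download; a sketch of the usual suspects, with illustrative values (not a tested recommendation):

```
; php.ini
max_execution_time = 3600   ; hard cap on script runtime
output_buffering = Off      ; buffering delays the first streamed bytes

; PHP-FPM pool config (e.g. www.conf)
request_terminate_timeout = 3600   ; FPM kills workers that run longer than this
```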
Is this issue still valid in NC21.0.2? If not, please close this issue. Thanks! :)
This issue has been automatically marked as stale because it has not had recent activity and seems to be missing some essential information. It will be closed if no further activity occurs. Thank you for your contributions.
Steps to reproduce
1. In the Files app, select a folder containing a large amount of data (here: ~100 GB across thousands of files, including one 80 GB file at the root).
2. Start a download of the folder as a ZIP.
Expected behaviour
A download prompt should pop up, and ZipStreamer should build the ZIP while it is being downloaded.
Actual behaviour
Nothing; the circle keeps spinning. I have tested this with timeouts of up to 3600 s (1 h). After that, I get a 504 Gateway Timeout.
Server configuration
Operating system: Arch Linux
Web server: Nginx 1.19.0-1
Database: MariaDB 10.4.13-1
PHP version: 7.4.6-1
Nextcloud version: 18.0.4
Updated from an older Nextcloud/ownCloud or fresh install: Updated Nextcloud
Where did you install Nextcloud from: Initially Arch Linux package manager, then update wizard
Signing status: No errors have been found.
List of activated apps:
Nextcloud configuration:
```
{
    "system": {
        "config_is_read_only": true,
        "instanceid": "REMOVED SENSITIVE VALUE",
        "passwordsalt": "REMOVED SENSITIVE VALUE",
        "secret": "REMOVED SENSITIVE VALUE",
        "trusted_domains": [
            "removed"
        ],
        "datadirectory": "REMOVED SENSITIVE VALUE",
        "dbtype": "mysql",
        "version": "18.0.4.2",
        "overwrite.cli.url": "removed",
        "dbname": "REMOVED SENSITIVE VALUE",
        "dbhost": "REMOVED SENSITIVE VALUE",
        "dbport": "",
        "dbtableprefix": "oc",
        "mysql.utf8mb4": true,
        "dbuser": "REMOVED SENSITIVE VALUE",
        "dbpassword": "REMOVED SENSITIVE VALUE",
        "installed": true,
        "memcache.local": "\\OC\\Memcache\\APCu",
        "mail_smtpmode": "smtp",
        "mail_smtpsecure": "ssl",
        "mail_sendmailmode": "smtp",
        "mail_from_address": "REMOVED SENSITIVE VALUE",
        "mail_domain": "REMOVED SENSITIVE VALUE",
        "mail_smtpauthtype": "LOGIN",
        "mail_smtpauth": 1,
        "mail_smtphost": "REMOVED SENSITIVE VALUE",
        "mail_smtpport": "465",
        "mail_smtpname": "REMOVED SENSITIVE VALUE",
        "mail_smtppassword": "REMOVED SENSITIVE VALUE",
        "maintenance": false,
        "theme": "",
        "loglevel": 0
    }
}
```
Are you using external storage, if yes which one: Local
Are you using encryption: No
Are you using an external user-backend, if yes which one: No
Client configuration
Browser: Firefox/Chrome/Safari
Operating system: Linux/Windows/MacOS
Logs
Web server error log
Nginx:
```
2020/06/18 13:09:37 [error] 178609#178609: *417522 upstream timed out (110: Connection timed out) while reading response header from upstream, client: x.x.x.x, server: xxx, request: "GET /index.php/apps/files/ajax/download.php?dir=%2F&files=xxx&downloadStartSecret=blfn0pca8qb HTTP/1.1", upstream: "fastcgi://unix:/var/run/php-fpm/php-fpm.sock", host: "xxx"
```
No errors in PHP-FPM.
Nextcloud log (data/nextcloud.log)
There are recurring background jobs being logged, two messages from bugs that are to be fixed in NC 19, and this:
```
"message":"Memcache \\OC\\Memcache\\APCu not available for local cache","userAgent":"--","version":"18.0.4.2"}
```
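As an aside, that APCu message typically comes from cron/CLI runs, where APCu is disabled by default; assuming APCu is installed, the common fix is to enable it for the CLI in php.ini:

```
; php.ini: let occ/cron (CLI) use the APCu local cache
apc.enable_cli = 1
```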
Browser log
Nothing of interest.