nextcloud / desktop

💻 Desktop sync client for Nextcloud
https://nextcloud.com/install/#install-clients
GNU General Public License v2.0

"Connection closed" message when syncing files larger than ~100 MB #4278

Open Jhutjens92 opened 2 years ago

Jhutjens92 commented 2 years ago

I have my Nextcloud installation running in a Docker container. It's connected to a MySQL DB (another Docker container) and exposed to the web using SWAG. All the relevant php.ini/config files have the following settings:

Uploading via the web interface is no problem, but whenever I try to sync the same file using the Windows sync client I receive a "Connection closed" error.

Expected behaviour

Files should just upload to the Nextcloud server.

Actual behaviour

Files aren't being uploaded and client throws an error:

Steps to reproduce

  1. Run Nextcloud in a Docker container.
  2. Use SWAG (Docker container) to link a (sub)domain to Nextcloud.
  3. Upload a large file (larger than 100 MB; in my case an .mp4).
  4. See the response in the client log.

Client configuration

Client version: 3.4.2

Operating system: Microsoft Windows 10 Pro (10.0.19041 Build 19041)

OS language: Dutch

Installation path of client: C:\Program Files\Nextcloud

Nextcloud version: Nextcloud Hub II (23.0.0)

Storage backend: Local server storage

Logs

  1. Client logfile: Client_20220213_2039_owncloud.log.0.txt

  2. Web server error log: N.A.

  3. Server logfile: nextcloud log (data/nextcloud.log): nextcloud.log

(ignore the failed login)

metheis commented 2 years ago

are you using Cloudflare in your setup? I'm running into the same issue because of Cloudflare's 100 MB upload limit and the client not seeming to chunk. https://github.com/nextcloud/desktop/issues/4271

Jhutjens92 commented 2 years ago

are you using Cloudflare in your setup? I'm running into the same issue because of Cloudflare's 100 MB upload limit and the client not seeming to chunk. #4271

I am using Cloudflare for SSL on SWAG with my domain. I didn't even stop to think about it. Did you figure out a way around it, or a fix? Nextcloud is chunking the file (correct me if I'm wrong), so something is not working as intended. I could turn off Cloudflare for my subdomain, but that would defeat the purpose of using it in the first place.

metheis commented 2 years ago

Right, agreed, I prefer to keep Cloudflare on. What I found works is if I uninstall the client (I had version 3.4.1) and reinstall version 3.3.6 from October, the issue goes away. I think the issue must be with how the newer client chunks (or the lack thereof).

Jhutjens92 commented 2 years ago

Version 3.3.6 is confirmed to be working. Just tested it out. So something in 3.4.1 broke the chunking.

metheis commented 2 years ago

@allexzander is it possible you could weigh in on this? Thank you for your help

xt0x1c commented 2 years ago

same here

tusharraval102 commented 2 years ago

I can confirm that I have the same issue. I recently upgraded to 3.4.2 and my client app would not sync files larger than 100 MB. I just downgraded to 3.3.6 and it works without any issues. I am using Cloudflare, macOS Monterey, and the latest Nextcloud server.

metheis commented 2 years ago

@Valdnet or @er-vin it seems like a number of users are running into this issue starting with 3.4.1 (and still present in 3.4.2). Any thoughts on what change led to this?

Valdnet commented 2 years ago

@nextcloud/desktop Is it possible to check the error described in this issue? Users still have a problem with this. Thanks.

ThatTallGuy21 commented 2 years ago

I'm on 3.4.1 as well and utilize SSL via CF with Nginx. Running into the same issue as described.

Vixavius commented 2 years ago

I was experiencing the same issue with 3.4.2.

v3.3.6 working well.

ghost commented 2 years ago

This is happening to me. I am using Cloudflare, and the 100 MB limit doesn't affect my uploads via the web, but the client is bringing up errors when uploading files. I'm trying 3.3.6 now to see if this is just a bug.

ghost commented 2 years ago

Yep, I can confirm on my side as well: version 3.3.6 works fine, so the latest client broke chunking and possibly keep-alive ("closing connection").

mitohund commented 2 years ago

Can confirm that 3.4.3 does not work for me, but 3.3.6 does.

Rami-Pastrami commented 2 years ago

Confirmed happening here too

Jhutjens92 commented 2 years ago

I'm surprised no one from the Nextcloud desktop team has picked this up yet. No mention of it in the new 3.5.0 RC1.

Valdnet commented 2 years ago

ping @mgallien @camilasan @claucambra

Kn-ut99 commented 2 years ago

Same issue here with client v3.4.3 on Linux (Manjaro) + cloudflare.

relains commented 2 years ago

Same here on Android. If in Cloudflare I select "DNS only", it works; switching back to "Proxied" doesn't work.

WasabiCarpet commented 2 years ago

Dropping a comment here to say I'm experiencing the same issue as above and am also using cloudflare.

Edit: I'll add that reverting to 3.3.6 also fixed the issue for me, and all my large files upload just fine now.

PaperMemo commented 2 years ago

I randomly put this thing into nextcloud.cfg (in [General] section) and somehow it works for me. (I don't know if it works for other people or not)

chunkSize=10000000
minChunkSize=1000000
maxChunkSize=50000000
targetChunkUploadDuration=6000

Note: I use version 3.4.3 on Manjaro Linux.
Note 2: I read from https://docs.nextcloud.com/desktop/3.0/advancedusage.html because, in the documentation, I didn't see the [General] section in 3.4 🤔

jospoortvliet commented 2 years ago

that is interesting, if that works for others perhaps we can update the documentation at least.

weeix commented 2 years ago

I randomly put this thing into nextcloud.cfg (in [General] section) and somehow it works for me. (I don't know if it works for other people or not)

chunkSize=10000000
minChunkSize=1000000
maxChunkSize=50000000
targetChunkUploadDuration=6000

Note: I use version 3.4.3 on Manjaro Linux.
Note 2: I read from https://docs.nextcloud.com/desktop/3.0/advancedusage.html because, in the documentation, I didn't see the [General] section in 3.4 🤔

I confirm that this works for Nextcloud 3.4.4 (Windows).

Kn-ut99 commented 2 years ago

I randomly put this thing into nextcloud.cfg (in [General] section) and somehow it works for me. (I don't know if it works for other people or not)

chunkSize=10000000
minChunkSize=1000000
maxChunkSize=50000000
targetChunkUploadDuration=6000

Note: I use version 3.4.3 on Manjaro Linux.
Note 2: I read from https://docs.nextcloud.com/desktop/3.0/advancedusage.html because, in the documentation, I didn't see the [General] section in 3.4 🤔

I too can confirm that this solved the issue (Nextcloud client 3.4.3, Manjaro). Only setting "chunkSize" does not work; I had to set all 4 settings in the config file. I don't really understand why, but hey, it works. Thank you ❤️

Gwindalmir commented 2 years ago

I can confirm this fixes it with Nextcloud client 3.4.4 on Windows 10.

The only setting I needed was this: targetChunkUploadDuration=6000

That matches what I noticed empirically, which was that the syncing failed and restarted about every 6 seconds.

Iirc from the log, the default was like 20 or 28 seconds.

jospoortvliet commented 2 years ago

review/feedback on the above welcome!

jospoortvliet commented 2 years ago

The only setting I needed was this: targetChunkUploadDuration=6000

What surprises me is that according to the docs, this is the default...

Jhutjens92 commented 2 years ago

targetChunkUploadDuration=6000

I'm confused as to where you would put this setting when running Nextcloud in Docker, since the aforementioned .cfg file does not seem to exist.

Gwindalmir commented 2 years ago

I was mistaken, I reverted the change to check, and the default is 60000...

2022-04-14 11:09:41:146 [ info nextcloud.sync.propagator.upload.ng C:\Users\sysadmin\AppData\Local\Temp\2\windows-9586\client-building\desktop\src\libsync\propagateuploadng.cpp:418 ]: Chunked upload of 10000000 bytes took 2552 ms, desired is 60000 ms, expected good chunk size is 235109717 bytes and nudged next chunk size to 122554858 bytes
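For reference, the sizing arithmetic implied by that log line can be reproduced exactly. The following is a sketch inferred purely from the logged numbers, not code taken from the client source:

```python
# Sketch of the chunk-size adaptation implied by the log line above.
# Inferred from the logged numbers only, not taken from the client's source.

def next_chunk_size(current_size: int, elapsed_ms: int, target_ms: int = 60_000) -> int:
    """Scale the chunk size toward the target upload duration, then
    average with the current size to get the "nudged" next size."""
    # 10000000 bytes took 2552 ms; scaled to 60000 ms that is 235109717 bytes
    expected_good = current_size * target_ms // elapsed_ms
    # halfway between the current size and the "expected good" size
    return (current_size + expected_good) // 2

print(next_chunk_size(10_000_000, 2552))  # 122554858, matching the log
```

Both intermediate values match the log line byte for byte, which suggests the client simply scales linearly toward the target duration and then averages, with no awareness of any proxy request limit.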

Gwindalmir commented 2 years ago

targetChunkUploadDuration=6000

I'm confused as to where you would put this setting when running Nextcloud in Docker, since the aforementioned .cfg file does not seem to exist.

It took me a while to figure that out too, it goes on the client, not the server. I don't know if it's already possible, but it would be nice to set the preferred chunking settings on the server-side, so it propagates to all clients.

metheis commented 2 years ago

Hi everybody, thanks for finding this. I found simply setting the maximum chunk size to 50 MB (half of Cloudflare's 100 MB upload size limit) worked to resolve this issue.

I put together a short guide to fix this issue with the latest stable release (3.4.4, but should work on any client v3.4+). I tried to make it as accessible as possible to follow.

Windows Fix

Press Win+R on your keyboard to open the Run application. Paste the following in the dialog box:

%APPDATA%\Nextcloud\nextcloud.cfg

This will either ask you to pick an application to open nextcloud.cfg or will open in your default text editor (unless you have something else set to open .cfg files). If it asks you to pick an application, feel free to use Notepad or any other editor.

Add the following line under the [General] section:

maxChunkSize=50000000

Save the file, quit Nextcloud desktop, and start it again.

macOS Fix

Open a Finder window and press Command+Shift+G on your keyboard. This will bring up a 'Go to folder' window. Paste the following in the dialog box:

$HOME/Library/Preferences/Nextcloud

Open the nextcloud.cfg file. If you do not have a default editor for .cfg files, feel free to open the file with TextEdit.

Add the following line under the [General] section:

maxChunkSize=50000000

Save the file, quit Nextcloud desktop, and start it again.

Linux Fix

Open a terminal window and edit the following file:

nano $HOME/.config/Nextcloud/nextcloud.cfg

Add the following line under the [General] section:

maxChunkSize=50000000

Save the file (Ctrl+O, Ctrl+X), then quit Nextcloud desktop, and start it again.
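The manual edit above can also be scripted. Here is a hedged sketch using Python's standard-library configparser, assuming the Linux config path from the guide (note that configparser may reformat the file's spacing when it rewrites it):

```python
# Scripted version of the manual edit described above (a sketch, not an
# official tool). Assumes the Linux config path from the guide.
import configparser
from pathlib import Path

cfg_path = Path.home() / ".config" / "Nextcloud" / "nextcloud.cfg"
cfg_path.parent.mkdir(parents=True, exist_ok=True)  # in case it doesn't exist yet

cp = configparser.ConfigParser()
cp.optionxform = str          # preserve camelCase keys like maxChunkSize
cp.read(cfg_path)             # silently skips a missing file

if not cp.has_section("General"):
    cp.add_section("General")
# 50 MB: half of Cloudflare's 100 MB per-request limit, per the guide above
cp.set("General", "maxChunkSize", "50000000")

with open(cfg_path, "w") as f:
    cp.write(f)
```

As with the manual steps, quit the Nextcloud desktop client before running this, so it doesn't overwrite the file with its in-memory settings on exit.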

Jhutjens92 commented 2 years ago

Hi everybody, thanks for finding this. I found simply setting the maximum chunk size to 50 MB (half of Cloudflare's 100 MB upload size limit) worked to resolve this issue.

Can confirm that this works like a charm

jospoortvliet commented 2 years ago

It took me a while to figure that out too, it goes on the client, not the server. I don't know if it's already possible, but it would be nice to set the preferred chunking settings on the server-side, so it propagates to all clients.

Sorry for being unclear. But yeah, it's a client setting that changes client behavior - changing it from the server could cause issues for the clients. That Cloudflare limits the upload size is the problem here and the client would ideally discover that and set the maximum to whatever maximum it managed to transfer.

My understanding is that the way the client is designed right now, it tries to increase the size of the chunks over time, as that increases transfer speed and efficiency. So it starts with 10 MB, then goes to, I dunno, 20, 30, something like that. It should be smart about scaling only as far as it works, but that seems to not work.
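A toy simulation illustrates how a purely time-driven algorithm sails past a proxy's request cap on a reasonably fast link. The numbers here are hypothetical, and the scale-then-average rule is inferred from a log line posted earlier in this thread, not taken from the client source:

```python
# Toy simulation of time-driven chunk growth (hypothetical, not client code).
TARGET_MS = 60_000            # target upload duration per chunk
PROXY_CAP = 100_000_000       # Cloudflare's ~100 MB per-request limit

def simulate(throughput_bytes_per_ms: float, start: int = 10_000_000, rounds: int = 4):
    """Return the successive "nudged" chunk sizes for a constant-speed link."""
    sizes, size = [], start
    for _ in range(rounds):
        elapsed_ms = size / throughput_bytes_per_ms
        expected = size * TARGET_MS / elapsed_ms   # scale toward target duration
        size = int((size + expected) / 2)          # nudge halfway toward it
        sizes.append(size)
    return sizes

for s in simulate(4_000):     # hypothetical ~4 MB/s uplink
    flag = "  <-- over the proxy cap" if s > PROXY_CAP else ""
    print(f"{s}{flag}")
```

On this hypothetical 4 MB/s link the very first adapted chunk (125 MB) already exceeds the 100 MB cap, so every subsequent chunk upload would fail with "Connection closed", matching the behavior reported above.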

Gwindalmir commented 2 years ago

I feel the opposite. If the server is operating behind cloudflare, the server operator knows this, and should be able to tell the clients what settings to use to connect to it. Clients shouldn't be expected to know this, or know how to configure it.

I have this same problem on my Android client, and I don't even know if I can fix it so it works properly.

IOW, clients shouldn't be doing guesswork (the user, or the software). It should operate how the server tells it to. Or even better, perform a negotiation on configuration.

CalebFenton commented 2 years ago

Possibly related: I noticed a big difference just unchecking "Ask for confirmation before synchronizing folders larger than X"

I was never prompted for confirmation. It would just silently fail with "Connection closed" after a few seconds.

thomashup commented 2 years ago

I experience the same problems on macOS (11.6.5) with Nextcloud client 3.4.4.

Adding a nextcloud.cfg doesn't make any difference. The only way I can get it to work is by disabling proxied DNS at Cloudflare. By the way, I didn't have a Nextcloud folder under Library → Preferences.

A nice feature would be a setting in the Nextcloud app where it's possible to change the chunk size to a more Cloudflare-friendly value.

jospoortvliet commented 2 years ago

I feel the opposite. If the server is operating behind cloudflare, the server operator knows this, and should be able to tell the clients what settings to use to connect to it. Clients shouldn't be expected to know this, or know how to configure it.

I have this same problem on my Android client, and I don't even know if I can fix it so it works properly.

IOW, clients shouldn't be doing guesswork (the user, or the software). It should operate how the server tells it to. Or even better, perform a negotiation on configuration.

If we're talking about what SHOULD be, then Cloudflare shouldn't be blocking uploads >100 MB, as that breaks 'the internet', or at least a part of it. Designing the clients to work around random limitations that don't follow how the internet is designed is of course very hard. The real solution here is "tell Cloudflare to fix their system", or "don't use Cloudflare, it is broken". There's a workaround in the desktop client but, as you point out, the mobile clients also have issues... It'd be best if this was reported to Cloudflare so they would allow things to work normally.

At the same time, it'd be great if the client was a bit smarter and would change its behavior when things don't work. It does that based on time, not on broken/disappearing uploads, and I'm hoping that is relatively easy to fix.

Of course the mobile clients would also need to learn this, and I don't know how hard it is. Better solution, for now and most users, is probably really to not use Cloudflare until they fix it.

metheis commented 2 years ago

@jospoortvliet There's a number of reasons why different providers provide certain parameters and limits. But what's important is that each application should follow its documentation. Cloudflare specifies the upload size limit per POST request and their network behaves according to the documentation.

Nextcloud documentation specifies the client uploads large files in much smaller chunks than 100 MB, which worked properly by default through client version 3.3.x. The documentation continues to specify the client chunks large files to small sizes by default in the latest version. The application should behave accordingly or the documentation should be updated.

jospoortvliet commented 2 years ago

@metheis yeah the documentation should be fixed - I created a PR for that, once merged it should properly reflect how the client works. But that would still not be compatible with the limitations from Cloudflare ;-)

ThatTallGuy21 commented 2 years ago

Just want to make sure I'm following. So if your domain for nextcloud is leveraging Cloudflare, then the above workaround on setting the maximum chunk size to 50MB doesn't work (or does?), and the documentation is going to be updated to tell users to not use Cloudflare until they fix their chunking limits?

metheis commented 2 years ago

Just want to make sure I'm following. So if your domain for nextcloud is leveraging Cloudflare, then the above workaround on setting the maximum chunk size to 50MB doesn't work (or does?), and the documentation is going to be updated to tell users to not use Cloudflare until they fix their chunking limits?

Yes, the above workaround fixes this issue. The discussion we're having now is whether the Nextcloud client should have this set by default. In versions before 3.4.x, it did, but now it does not.

mitohund commented 2 years ago

I've been following this issue for a while, but there are two things I'm not sure I understand:

  1. I was sure there was a way to adjust chunk size on the Nextcloud server side, but I believe somebody mentioned this is not possible after all. Wouldn't this make more sense, since the admin could set this once for all desktop and mobile devices (even if this is only to "fix" Cloudflare's issue - unfortunately CF is a widely used service)?
  2. Somebody mentioned the issue may also be that the timeout time-span is set too short, so that if a server is not powerful enough to re-join all chunks in the given time, the process is aborted. Is this a related or separate issue?
jospoortvliet commented 2 years ago

Yes, the above workaround fixes this issue. The discussion we're having now is whether the Nextcloud client should have this set by default. In versions before 3.4.x, it did, but now it does not.

Luckily the feature is there - the documentation was simply broken due to a missing space in a table or something silly like that... A markdown table seems very sensitive ;-)

But one could argue that the algorithm that increases the chunk size should be a little smarter. Right now, it stops increasing chunk size if it hits the maximum time per chunk that is configured, so it tries to make the chunks so big that it takes about 1 minute per chunk to upload. But it would be nice if it ALSO stops increasing chunk size when the chunks start causing errors and time-outs, like with CloudFlare. This is a feature request and simply not done yet.
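The configured bounds discussed in this thread act as a simple clamp on that adaptive value. A minimal sketch follows; this is an assumption about the behavior based on the settings' names and effects reported here, not the actual client code:

```python
# Minimal sketch of how the configured bounds would cap the adaptive chunk
# size (an assumption based on this thread, not the actual client code).

def clamp_chunk_size(nudged: int,
                     min_chunk: int = 1_000_000,        # minChunkSize
                     max_chunk: int = 50_000_000) -> int:  # maxChunkSize
    """Keep the adapted size within [min_chunk, max_chunk]."""
    return max(min_chunk, min(nudged, max_chunk))

# The 122554858-byte "nudged" size from the log earlier in the thread
# gets capped to 50 MB, safely under Cloudflare's 100 MB limit.
print(clamp_chunk_size(122_554_858))  # 50000000
```

This is why setting maxChunkSize=50000000 alone is enough for most commenters: it caps the otherwise unbounded growth before it crosses the proxy limit.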

jospoortvliet commented 2 years ago

I've been following this issue for a while, but there are two things I'm not sure I understand:

1. I was sure there was a way to adjust chunk size on the Nextcloud server side, but I believe somebody mentioned this is not possible after all. Wouldn't this make more sense, since the admin could set this once for all desktop and mobile devices (even if this is only to "fix" Cloudflare's issue - unfortunately CF is a widely used service)?

See my comment above - right now, it's a client setting, not a server setting. Ideally we'd make the clients a bit smarter in handling this, it would make them more robust also in case of other problems with the chunking. But that simply requires somebody having time to do the work.

2. Somebody mentioned the issue may also be that the timeout time-span is set too short, so that if a server is not powerful enough to re-join all chunks in the given time, the process is aborted. Is this a related or separate issue?

That is separate, and would be a server issue the client can't do anything about.

mitohund commented 2 years ago

right now, it's a client setting, not a server setting.

So what is this setting for? -> https://docs.nextcloud.com/server/stable/admin_manual/configuration_files/big_file_upload_configuration.html#adjust-chunk-size-on-nextcloud-side

Ideally we'd make the clients a bit smarter in handling this, it would make them more robust also in case of other problems with the chunking.

That makes sense.

That is separate, and would be a server issue the client can't do anything about.

Cool, I understand. Do you know if this issue actually exists or if it was just one user's interpretation of what was happening to them?

github-actions[bot] commented 2 years ago

This bug report did not receive an update in the last 4 weeks. Please take a look again and update the issue with new details, otherwise the issue will be automatically closed in 2 weeks. Thank you!

BonzTM commented 2 years ago

I randomly put this thing into nextcloud.cfg (in [General] section) and somehow it works for me. (I don't know if it works for other people or not)

chunkSize=10000000
minChunkSize=1000000
maxChunkSize=50000000
targetChunkUploadDuration=6000

Note: I use version 3.4.3 on Manjaro Linux Note 2: I read from https://docs.nextcloud.com/desktop/3.0/advancedusage.html because, in the documentation, I didn't see [General] section in 3.4 🤔

3.5.1 on multiple machines here. Still not fixed, I guess? maxChunkSize=50000000 solved it for me.

Commenting to keep issue open

Ventilgummi commented 2 years ago

Hi everybody, thanks for finding this. I found simply setting the maximum chunk size to 50 MB (half of Cloudflare's 100 MB upload size limit) worked to resolve this issue.


Thanks! This worked. Can confirm that it's still an issue in the Windows app. I'm running NC as a Docker container, with Traefik as reverse proxy and Cloudflare DNS with proxy.

Lightning2X commented 2 years ago

Chiming in to say that maxChunkSize also resolved the issue for me

tanfwc commented 2 years ago

+1, fixed on my side as well. maxChunkSize solved the problem!