nextcloud / docker

⛴ Docker image of Nextcloud
https://hub.docker.com/_/nextcloud/
GNU Affero General Public License v3.0

Cannot upload large files when using chunking #2295

Closed AndreiArdei closed 2 months ago

AndreiArdei commented 2 months ago

Hi! I've recently set up Nextcloud using Docker. The OS that Docker runs on is Ubuntu 22.04.4 LTS. I've provided the docker compose file below. My Docker volume directory is mounted on a separate NFS share, therefore all volumes actually live on the NFS share.

version: '3.8'

services:
  postgres:
    image: postgres:alpine
    container_name: nextcloud-postgres
    restart: unless-stopped
    environment:
      - POSTGRES_PASSWORD=nextcloud
      - POSTGRES_DB=nextcloud
      - POSTGRES_USER=nextcloud
    volumes:
      - nextcloud_db:/var/lib/postgresql/data

  redis:
    image: redis:alpine
    container_name: nextcloud-redis
    restart: unless-stopped

  app:
    image: nextcloud:apache
    restart: unless-stopped
    container_name: nextcloud
    volumes:
      - nextcloud_app_data:/var/www/html/data
      - nextcloud_app_opt:/var/www/html
    environment:
      - POSTGRES_HOST=postgres
      - REDIS_HOST=redis
      - POSTGRES_PASSWORD=nextcloud
      - POSTGRES_DB=nextcloud
      - POSTGRES_USER=nextcloud
      - PHP_UPLOAD_LIMIT=20G
      - APACHE_BODY_LIMIT=0
    depends_on:
      - postgres
      - redis
    ports:
      - 7575:80

volumes:
  nextcloud_db:
  nextcloud_app_data:
  nextcloud_app_opt:
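The upload limits configured through the environment variables above can be sanity-checked from inside the running container; a quick sketch (container name `nextcloud` as set in this Compose, and assuming the official image's behaviour of mapping `PHP_UPLOAD_LIMIT` to both PHP upload settings):

```shell
# Print the values PHP actually picked up from PHP_UPLOAD_LIMIT
# (the official image writes both upload_max_filesize and post_max_size):
docker exec nextcloud php -r 'echo ini_get("upload_max_filesize"), " / ", ini_get("post_max_size"), PHP_EOL;'
```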

The problem I run into is that when I try to upload larger files, this fails with the error: [PHP] Error: rmdir(/mnt/ncdata/admin/uploads/web-file-upload-15252b578fdf95d6740fb72e4e6c488a-1687753536511): Directory not empty at /var/www/html/lib/private/Files/Storage/Local.php#147

I managed to find the following link to the problem; however, there it is specified that the error is in the alpine version, whereas I am using the nextcloud:apache tag. I was able to work around it by turning off chunking with the following command:

sudo -u www-data php occ config:app:set files max_chunk_size --value 0

This seems to allow larger files now; however, it has slowed down upload time considerably. Is there any known fix, workaround, or other image that I could use to get around this problem?
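For reference, the chunk-size override can be inspected and reverted later with `occ` as well; a sketch, assuming the container is named `nextcloud` as in the Compose above:

```shell
# Show the current override (no output means the default, 10 MiB, is in effect):
docker exec -u www-data nextcloud php occ config:app:get files max_chunk_size
# Restore the default 10 MiB chunk size explicitly (10485760 bytes):
docker exec -u www-data nextcloud php occ config:app:set files max_chunk_size --value 10485760
# Or remove the override entirely to fall back to the default:
docker exec -u www-data nextcloud php occ config:app:delete files max_chunk_size
```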

tzerber commented 2 months ago

Are you uploading the files via the web or the NC client? I just finished uploading a 38 GB file and had zero issues. Over the last two years I've uploaded terabytes of data via the NC app and had only one issue, and that was a datacenter outage that caused some weird database issues; after fixing it I still have no problems with large files. The max I've tried is approximately a 500 GB file and it went okay.

AndreiArdei commented 2 months ago

Unfortunately with both the web and the NC client. The NC client seems to be a tad faster than the web but still slow. It does work if I turn off chunking (i.e. set max_chunk_size to 0), but then the upload speed is next to nothing for a 1.5GB file. If I set it to the default 10MB or even 20MB the upload speed picks up, but the upload fails.

tzerber commented 2 months ago

Using the web for large (>5GB) files is always a pain due to PHP limits. The NC desktop client should not have any issues with chunked upload. Can you revert the max_chunk_size change and try with the desktop app?

AndreiArdei commented 2 months ago

Just for my peace of mind, this is also how I attach the NFS share.

192.168.1.139:/mnt/pool01/docker /var/lib/docker/volumes nfs x-systemd.automount,rw,soft,sync 0 0

Basically I mount the entire docker volumes directory to the nfs share. Meaning all volumes are actually created on the share not on the machine where I run docker (and nextcloud) from.
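For comparison, the kind of fstab line often recommended when Docker volumes live on NFS looks something like the following. This is a sketch, not from the thread: `hard` avoids silent corruption when the server is briefly unreachable, dropping `sync` restores normal write performance, and `vers=4.1` is an assumed protocol version:

```
192.168.1.139:/mnt/pool01/docker /var/lib/docker/volumes nfs x-systemd.automount,rw,hard,noatime,vers=4.1 0 0
```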

I did two tests with the NC Client:

  1. set max_chunk_size to 0 -> Completed the upload though it took a very long time
  2. set max_chunk_size to 20MB -> Failed (I got multiple errors but can't find them anymore)

I am now retrying with the 20MB chunking from the desktop and then I will also set it to the default 10MB chunking and try again.

AndreiArdei commented 2 months ago

Just tried with 10MB chunking size. Still fails:

[screenshot]

[screenshot]

This time however I don't get any sort of errors in the server logs.

joshtrichards commented 2 months ago
[PHP] Error: rmdir(/mnt/ncdata/admin/uploads/web-file-upload-15252b578fdf95d6740fb72e4e6c488a-1687753536511): Directory not empty at /var/www/html/lib/private/Files/Storage/Local.php#147 

What is /mnt/ncdata? That's not indicated in your provided Compose.

AndreiArdei commented 2 months ago
[PHP] Error: rmdir(/mnt/ncdata/admin/uploads/web-file-upload-15252b578fdf95d6740fb72e4e6c488a-1687753536511): Directory not empty at /var/www/html/lib/private/Files/Storage/Local.php#147 

What is /mnt/ncdata? That's not indicated in your provided Compose.

That is a really good question and I am not sure. I've looked in the container, on the machine running Docker, and on the machine that hosts the volumes, and it doesn't exist? I can't find it anywhere. Also, it's not something I'd set up, since it doesn't follow my naming format for volumes or anything else.

https://help.nextcloud.com/t/what-is-mnt-ncdata-for/189789

Edit: I did come across the above, however I still cannot find it on the host machine as a real directory.

joshtrichards commented 2 months ago

Did you previously use the All-In-One image? I believe it's the default datadirectory for that image.

Is it referenced under occ config:list system --private | grep ncdata? Or perhaps are there some symbolic links in your data directory?
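Those checks can also be run from the host; a sketch, assuming the container name `nextcloud` from the Compose file:

```shell
# Where does the instance think its data directory is?
docker exec -u www-data nextcloud php occ config:system:get datadirectory
# Are there any symlinks lurking under the data directory?
docker exec nextcloud find /var/www/html/data -maxdepth 3 -type l
```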

AndreiArdei commented 2 months ago

I did indeed try an all-in-one image months ago; however, that did not work for me, so I removed the container, image, and volumes related to it. Running the command you provided returns no data. Even if I don't grep for ncdata, the directory isn't referenced anyway.

I also cannot seem to find any symbolic links to it.

joshtrichards commented 2 months ago

The way this image works and based on your Compose:

/mnt/ncdata/admin/uploads/web-file-upload-15252b578fdf95d6740fb72e4e6c488a-1687753536511

should be:

/var/www/html/data/admin/uploads/web-file-upload-15252b578fdf95d6740fb72e4e6c488a-1687753536511

Do you have any data in your user's home directory within the container - e.g. /var/www/html/data/admin should contain files you've created or uploaded?
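A quick way to answer that from the host (container name `nextcloud` assumed, as before):

```shell
# The user's home directory in this image's layout should contain real data:
docker exec nextcloud ls -la /var/www/html/data/admin
# And the path from the error should not exist in this image at all:
docker exec nextcloud ls -la /mnt/ncdata 2>/dev/null || echo "/mnt/ncdata does not exist in the container"
```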

AndreiArdei commented 2 months ago

So, I did a bit more testing. To answer some questions:

  1. There is no admin directory in /var/www/html/data inside the container.
  2. If I try uploading something via the web, I can follow the PUT requests in the logs. Since the chunk size is 10MB, it keeps creating parts up to some point, after which it seems as if the upload finished; I then get a DELETE request and the error that it cannot delete the data. I attached a screenshot of the logs:

[screenshot]

The directory is there, but it is empty; running ls -A in there returns nothing.
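The PUT/MOVE/DELETE sequence described above is Nextcloud's chunked-upload WebDAV flow. It can be driven by hand to reproduce the problem outside the browser; a sketch with placeholder credentials and chunk files (none of these values are from the thread):

```shell
# Nextcloud chunked-upload (WebDAV v2) flow against the Compose port above.
NC=http://localhost:7575
AUTH=admin:password          # placeholder credentials
ID=manual-chunk-test         # arbitrary transfer id

# 1. Create the upload directory:
curl -u "$AUTH" -X MKCOL "$NC/remote.php/dav/uploads/admin/$ID"
# 2. PUT the chunks (chunk1/chunk2 are local files split from the big one):
curl -u "$AUTH" -T chunk1 "$NC/remote.php/dav/uploads/admin/$ID/000001"
curl -u "$AUTH" -T chunk2 "$NC/remote.php/dav/uploads/admin/$ID/000002"
# 3. MOVE assembles the chunks; afterwards the server rmdir()s the upload
#    directory -- the step failing with "Directory not empty" in this issue:
curl -u "$AUTH" -X MOVE \
  -H "Destination: $NC/remote.php/dav/files/admin/bigfile.bin" \
  "$NC/remote.php/dav/uploads/admin/$ID/.file"
```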

joshtrichards commented 2 months ago

There is no admin directory in /var/www/html/data inside the container

I'm skeptical this is something in the image. This really sounds like something in your local environment.

I'm also confused because earlier you had references to /mnt/ncdata but now your log entries say /var/www/html/data?

Do you have multiple instances running?

Can you try just bringing up a simple standalone container running the 29-apache image and/or the 30-apache image:

$ docker run -d \
-p 2980:80 \
-v test_29_nextcloud:/var/www/html \
nextcloud:29-apache
$ docker run -d \
-p 3080:80 \
-v test_30_nextcloud:/var/www/html \
nextcloud:30-apache

Just do an SQLite install so you don't need a separate DB.

Then access them at http://localhost:2980/ and http://localhost:3080/ (or the host's IP if not on localhost) and try to reproduce your situation against them.

AndreiArdei commented 2 months ago

Sure!

So a few things that I changed in the meantime and maybe forgot to mention:

  1. I do NOT have more than one nextcloud container.
  2. I used to run the alpine versions of redis and postgres; in an effort to not run into this issue again, I changed them in my main docker-compose to the default (non-alpine) images as well.

Now, I ran both of the images as you indicated.

This is the error I get on v29:

joshtrichards commented 2 months ago

Basically I mount the entire docker volumes directory to the nfs share. Meaning all volumes are actually created on the share not on the machine where I run docker (and nextcloud) from.

Now try it w/o using Docker managed volumes so that you can isolate whether it's your NFS /Docker setup or not. e.g.

$ docker run -d \
-p 2980:80 \
-v ./blah/test_29_nextcloud:/var/www/html \
nextcloud:29-apache

AndreiArdei commented 2 months ago

Just tried it. Tried the 2.5GB video file directly.

Off the bat, just starting the container is much, much faster. Before, I'd have to wait 4-5 minutes for the container to get past "Initializing nextcloud". Uploading went super fast and succeeded.

That would then indeed mean that the issue is with the NFS mount.
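One plausible low-level explanation (an assumption, not confirmed in the thread) is NFS "silly rename": deleting a file that a process still holds open leaves a hidden `.nfsXXXX` file behind on NFS mounts, so a subsequent `rmdir` fails with "Directory not empty" — exactly the symptom of the chunk-cleanup error above. A sketch of the sequence; on a local filesystem it succeeds cleanly:

```shell
# Simulate chunk cleanup: create a dir, delete a still-open file, rmdir.
dir=$(mktemp -d)/upload-test
mkdir -p "$dir"
(
  exec 3>"$dir/chunk-000"   # keep the chunk open on fd 3
  rm -f "$dir/chunk-000"    # on NFS this can turn into "$dir/.nfs<id>"
  ls -A "$dir"              # NFS: shows the .nfs file; local fs: empty
)                           # subshell exit closes fd 3
rmdir "$dir" && echo "rmdir ok" || echo "rmdir failed: directory not empty"
```

On a local filesystem this prints "rmdir ok"; on an NFS mount the `rmdir` can fail until the file handle is closed and the `.nfs` file disappears.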

AndreiArdei commented 2 months ago

After playing a little more with my NFS settings, I seem to have found something that works. It does indeed seem the issue was NFS-related and not in Nextcloud; in hindsight that makes sense, since the behaviour was quite all over the place before.

For anyone running into similar issues in the future, these settings finally seem to work for me with NC:

https://www.truenas.com/community/threads/recommended-options-for-mounting-nfs-shares.88519/