nextcloud / all-in-one

📦 The official Nextcloud installation method. Provides easy deployment and maintenance with most features included in this one Nextcloud instance.
https://hub.docker.com/r/nextcloud/all-in-one
GNU Affero General Public License v3.0

New preview images / thumbs (of large photos) are black & white only #4239

Closed · kce68 closed this issue 7 months ago

kce68 commented 7 months ago

Steps to reproduce

  1. Do a fresh install; it also happens after updating (in my case from AIO 7.9.1 to 7.12.1).
  2. Open the Photos folder in Files.
  3. Look at the standard "nextcloud group photo" or upload a new color photo.
  4. Click on it: it is shown in B&W.
  5. Click the editing pencil: the photo becomes colored.
  6. Edit something and save it as a new photo: previews and thumbnails are generated in B&W.

Expected behavior

Colored should stay colored :) As is, it works in all my installations with AIO 7.9.1.

Actual behavior

Thumbnails and previews are generated in B&W when JPEGs or PNGs are around 12 megapixels; only smaller photos seem to work. I checked both PNG and JPEG.

Any ideas? Let me know if you need more information.

Host OS

KVM VM in Proxmox with Ubuntu 22.04 LTS (all updates as of today), 6 GB RAM and 4 CPUs. Docker from the Docker apt repositories, deployed with docker compose via Portainer EE, running behind HAProxy.

Nextcloud AIO version

7.9.1 working and 7.12.1 not working, regardless of updating or a fresh install with NC 27 or NC 28. Containers: Apache (Running), Database (Running), Nextcloud (Running), Notify Push (Running), Redis (Running), Collabora (Running), Talk (Running), Imaginary (Running), Fulltextsearch (Running).

Current channel

Latest

Other valuable info

My docker-compose:

version: "3.8"

volumes:
  nextcloud_aio_mastercontainer:
    name: nextcloud_aio_mastercontainer # This line is not allowed to be changed as otherwise the built-in backup solution will not work

services:
  nextcloud:
    image: nextcloud/all-in-one:latest
    restart: always
    container_name: nextcloud-aio-mastercontainer # This line is not allowed to be changed as otherwise AIO will not work correctly
    volumes:
      - nextcloud_aio_mastercontainer:/mnt/docker-aio-config # This line is not allowed to be changed as otherwise the built-in backup solution will not work
      - /var/run/docker.sock:/var/run/docker.sock:ro # May be changed on macOS, Windows or docker rootless. See the applicable documentation. If adjusting, don't forget to also set 'WATCHTOWER_DOCKER_SOCKET_PATH'!
    ports:
      - 8080:8080
    #  - 8443:8443 # Can be removed when running behind a web server or reverse proxy (like Apache, Nginx and else). See https://github.com/nextcloud/all-in-one/blob/main/reverse-proxy.md
    environment: # Is needed when using any of the options below
      # See https://github.com/nextcloud/all-in-one#how-to-disable-collaboras-seccomp-feature
      - NEXTCLOUD_DATADIR=/smb/nxcdata # Allows to set the host directory for Nextcloud's datadir. See https://github.com/nextcloud/all-in-one#how-to-change-the-default-location-of-nextclouds-datadir
      - NEXTCLOUD_MOUNT=/mnt/ # Allows the Nextcloud container to access the chosen directory on the host. See https://github.com/nextcloud/all-in-one#how-to-allow-the-nextcloud-container-to-access-directories-on-the-host
      - NEXTCLOUD_UPLOAD_LIMIT=20G # Can be adjusted if you need more. See https://github.com/nextcloud/all-in-one#how-to-adjust-the-upload-limit-for-nextcloud
      - NEXTCLOUD_MAX_TIME=7200 # Can be adjusted if you need more. See https://github.com/nextcloud/all-in-one#how-to-adjust-the-max-execution-time-for-nextcloud
      - TALK_PORT=30678 # This allows to adjust the port that the talk container is using.
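
A quick way to confirm that previews are actually handled by imaginary (a minimal sketch, assuming the standard AIO container name and the preview_imaginary_url / enabledPreviewProviders settings that AIO normally configures):

# should print the imaginary endpoint Nextcloud uses for previews; empty output would mean imaginary is not configured
sudo docker exec --user www-data nextcloud-aio-nextcloud php occ config:system:get preview_imaginary_url

# should list the enabled preview providers, including OC\Preview\Imaginary
sudo docker exec --user www-data nextcloud-aio-nextcloud php occ config:system:get enabledPreviewProviders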

My haproxy.conf :

# Automaticaly generated, dont edit manually.
# Generated on: 2024-02-17 09:34
global
    maxconn         15000
    stats socket /tmp/haproxy.socket level admin  expose-fd listeners
    uid         80
    gid         80
    nbproc          1
    nbthread            1
    hard-stop-after     15m
    chroot              /tmp/haproxy_chroot
    daemon
    tune.ssl.default-dh-param   2048
    server-state-file /tmp/haproxy_server_state

listen HAProxyLocalStats
    bind 127.0.0.1:2200 name localstats
    mode http
    stats enable
    stats refresh 5
    stats admin if TRUE
    stats show-legends
    stats uri /haproxy/haproxy_stats.php?haproxystats=1
    timeout client 5000
    timeout connect 5000
    timeout server 5000

frontend Nextcloud-INGRES
    bind            136.243.179.76:443 name 136.243.179.76:443   ssl crt-list /var/etc/haproxy/Nextcloud-INGRES.crt_list  
    mode            http
    log         global
    option          http-keep-alive
    option          forwardfor
    acl https ssl_fc
    http-request set-header     X-Forwarded-Proto http if !https
    http-request set-header     X-Forwarded-Proto https if https
    maxconn         8000
    timeout client      30000
    acl         test-acl    var(txn.txnhost) -m sub -i nxctest.bindix
    acl         aclcrt_Nextcloud-INGRES var(txn.txnhost) -m reg -i ^nxctest\.bindix\.de(:([0-9]){1,5})?$
    http-request set-var(txn.txnhost) hdr(host)
    use_backend nxc-test-BE_ipvANY  if  test-acl aclcrt_Nextcloud-INGRES

backend nxc-test-BE_ipvANY
    mode            http
    id          107
    log         global
    timeout connect     300000
    timeout server      300000
    retries         3
    option          httpchk OPTIONS / 
    server          NXC-test-Dienst 192.168.130.36:11000 id 101 check inter 1000

One of my example photos to reproduce :)

IMG_9782

martin-nohava commented 7 months ago

Can confirm, I have the same issue after update to 7.12.1.

szaimen commented 7 months ago

Hi, I fear I cannot reproduce your issue on my test server :/

[screenshot]

szaimen commented 7 months ago

But the picture you've sent is also only 6.3MB big... maybe that is the reason? Do you have one example file for me with 12MB as outlined?

kce68 commented 7 months ago

But the picture you've sent is also only 6.3MB big... maybe that is the reason? Do you have one example file for me with 12MB as outlined?

Well, I meant 12 megapixels (3000x4000 pixels).

I can send you an email with credentials if you are interested in investigating my environment. It is a separate VM with no private data on it.

vposloncec commented 7 months ago

For me it happens for all image previews that were created since the update. I'm running nextcloud-aio in a Proxmox VM. For setup, I followed the reverse-proxy Cloudflare Tunnel guide (running with a docker run command). For me the pictures are actually B&W negatives. Example image: https://imgur.com/HKaAaFB Preview: https://imgur.com/xyGcAff

kce68 commented 7 months ago

Can confirm, I have the same issue after update to 7.12.1.

Thanks... I'm happy not to be alone with this issue. Now it's about finding out the reason.

PfannenHans commented 7 months ago

The issue is the imaginary container: I just tried to set it up with the standalone nextcloud Docker image and got the same behaviour. It can also be tested using curl -O "http://$IMAGINARY-IP:9000/crop?width=500&height=400&url=https://raw.githubusercontent.com/h2non/imaginary/master/testdata/large.jpg".
Using the older 20240201_120631-latest tag for the container instead of latest also resolves the issue, so there is a regression somewhere.
For me it also happened for any image size.
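
A slightly expanded version of that test as a sketch (port 9000 and the /crop endpoint are taken from the command above; the IP is a placeholder to replace with your imaginary container's address):

IMAGINARY_IP=192.168.1.10   # placeholder: replace with the IP of your imaginary container
curl -o /tmp/imaginary-crop-test.jpg \
  "http://$IMAGINARY_IP:9000/crop?width=500&height=400&url=https://raw.githubusercontent.com/h2non/imaginary/master/testdata/large.jpg"
# open /tmp/imaginary-crop-test.jpg: if it comes back grayscale or as a negative, imaginary itself is at fault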

jan315 commented 7 months ago

Same problem, happened with all new HEIC uploads since update.

szaimen commented 7 months ago

I fear I can still not reproduce this. Can you maybe post the imaginary container logs and nextcloud logs here?
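
For anyone who wants to provide these, a minimal sketch assuming the default AIO container names:

# imaginary container log
sudo docker logs nextcloud-aio-imaginary > imaginary.log 2>&1

# nextcloud container log
sudo docker logs nextcloud-aio-nextcloud > nextcloud-container.log 2>&1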

kce68 commented 7 months ago

Hi Simon, here we are ... :)

logs-as-requested.txt

szaimen commented 7 months ago

Hm, the logs do not show anything useful :/

Can you also share the logs that are available via https://yourdomain.com/settings/admin/logging ?

kce68 commented 7 months ago

@szaimen Your help is really welcome! Here's the log. But I fear this log is also quite boring ... nextcloud.log Do you want to get onto the machine to investigate further?

rodonile commented 7 months ago

Hi all, just wanted to share that I am experiencing the same issue. It happens only on newly updated images.

fireblade2534 commented 7 months ago

Same issue here

CyberCowboy commented 7 months ago

For what it's worth, I'm having the same issue, rebuilt AIO and still happening.

jan315 commented 7 months ago

My temporary fix was to change the master container version to the previous release '20240201_120631-latest' and move affected files to my PC and then back to NC so they are displayed correctly.

c0urier commented 7 months ago

20240201_120631-latest

I can confirm that downgrading to 20240201_120631-latest solved the problem for me too. Thank you!
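
A sketch of what that downgrade looks like with the compose file from the issue description (the tag is the one mentioned above; adjust to your own setup):

# in docker-compose.yml, change the mastercontainer image line from
#   image: nextcloud/all-in-one:latest
# to
#   image: nextcloud/all-in-one:20240201_120631-latest
# then recreate the mastercontainer
sudo docker compose pull
sudo docker compose up -d
# and finally stop and start the containers once from the AIO interface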

corincorvus commented 7 months ago

The downgrade worked now (image conflict) and solved the problem. Thanks.

Waiting for a fix.

But how can I fix already uploaded files? occ preview:generate-all didn't solve it. Thanks.

szaimen commented 7 months ago

Hi, can some of you for a test switch to 20240215_092413-latest and check if that works as well?

corincorvus commented 7 months ago

Hi, can some of you for a test switch to 20240215_092413-latest and check if that works as well?

On this version the error is back. [screenshots]

szaimen commented 7 months ago

I see. For everyone, can you run

sudo docker stop nextcloud-aio-imaginary
sudo docker rm nextcloud-aio-imaginary
sudo docker image prune -a

And then open the aio interface again and stop and start the containers again and check if that resolves things?

corincorvus commented 7 months ago

At the moment I can't test with a newer version. My instance is in production and I only have time slots in the morning and between 15:00 and 17:00.

jeantoulza commented 7 months ago

The downgrade worked now (image conflict) and solved the problem. Thanks.

Waiting for a fix.

But how can I fix already uploaded files? occ preview:generate-all didn't solve it. Thanks.

Just fixed it on my instance. Basically you remove all corrupted previews then rescan appdata preview folder. Removed previews will be regenerated on-the-fly when you request them (by opening the photos app for example). You may have to delete your browser cache to see the new previews.

  1. First, open the aio interface and create a backup and start your containers again after the backup is successful.

  2. Afterwards, run via CLI:

    # Go into the container
    sudo docker exec --user www-data -it nextcloud-aio-nextcloud bash
  3. Now inside the container:

    
    DAYS=7  # replace with the number of days since the previews were corrupted (earliest was 8th February)

    # delete bad previews
    find "/mnt/ncdata/appdata_*/preview" -ctime -$DAYS -type f -delete

    # rescan preview folder in appdata
    php occ files:scan-app-data preview

CyberCowboy commented 7 months ago

I see. For everyone, can you run

sudo docker stop nextcloud-aio-imaginary
sudo docker rm nextcloud-aio-imaginary
sudo docker image prune -a

And then open the aio interface again and stop and start the containers again and check if that resolves things?

No good for me, using AIO v7.12.1

usngrimsley commented 7 months ago

I see. For everyone, can you run


sudo docker stop nextcloud-aio-imaginary
sudo docker rm nextcloud-aio-imaginary
sudo docker image prune -a

And then open the aio interface again and stop and start the containers again and check if that resolves things?

No go as well

jonny-379 commented 7 months ago

I have the same problem on my AIO v7.21.1...

I see. For everyone, can you run

sudo docker stop nextcloud-aio-imaginary
sudo docker rm nextcloud-aio-imaginary
sudo docker image prune -a

And then open the aio interface again and stop and start the containers again and check if that resolves things?

Well, it kind of works: when I stop and remove it, previews have colour again (when opening an image shown in Nextcloud in a new tab via right click, the file is called preview). But when I then stop & start the containers in the AIO admin interface, the problem comes back.

So I'll stop and remove it, and just run like that.

szaimen commented 7 months ago

Hi, I just released a fix with v7.13.0 Beta. Testing and feedback is welcome! See https://github.com/nextcloud/all-in-one#how-to-switch-the-channel

⚠️ See how to remove broken previews: https://github.com/nextcloud/all-in-one/issues/4239#issuecomment-1954971592

CyberCowboy commented 7 months ago
  1. First, open the aio interface and create a backup and start your containers again after the backup is successful.
  2. Afterwards, run via CLI:

# Go into the container
sudo docker exec --user www-data -it nextcloud-aio-nextcloud bash

  3. Now inside the container:

DAYS=7  # replace with the number of days since the previews were corrupted (earliest was 8th february)

# delete bad previews
find "/mnt/ncdata/appdata_*/preview" -ctime -$DAYS -type f -delete

# rescan preview folder in appdata
php occ files:scan-app-data preview

I'm putting the above commands (with a modification for my appdata_* folder and DAYS replaced with 70) in a file called ncimagefix and then chmod +x it.

When I run it I'm getting: ./ncimagefix: Could not open input file: occ

If I try to sudo while in the bash instance, it asks for the www-data password, which I don't know/have.

I'm fairly new to NC, having started it about 2 weeks ago, so this problem has really been a hassle as I thought I was doing something wrong.
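
A likely explanation for the "Could not open input file: occ" error: occ only exists inside the Nextcloud container at /var/www/html/occ, so "php occ ..." fails when the script runs on the host (or from a different working directory). A sketch of a host-side variant that wraps each step in docker exec, assuming the standard container name and the /mnt/ncdata path from the steps above:

#!/bin/bash
DAYS=70  # number of days since the previews were corrupted

# delete bad previews inside the container (the glob is expanded by the container's shell)
sudo docker exec --user www-data nextcloud-aio-nextcloud bash -c \
  "find /mnt/ncdata/appdata_*/preview -ctime -$DAYS -type f -delete"

# rescan the preview folder in appdata (occ runs inside the container)
sudo docker exec --user www-data nextcloud-aio-nextcloud php occ files:scan-app-data preview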

CyberCowboy commented 7 months ago

BTW I did update to the Beta, so am trying to see if that fixed the issue

CyberCowboy commented 7 months ago

I modified the suggested way to regenerate previews, and I think it runs correctly, but if so no change.

I'm running from the docker host with this code:

DAYS=70  # replace with the number of days since the previews were corrupted
NEXTCLOUD_APPDATA=/mnt/storage/appdata_oc73brnah5op  # replace with your own appdata folder

# delete bad previews
find "$NEXTCLOUD_APPDATA/preview" -ctime -$DAYS -type f -delete

# rescan preview folder in appdata
docker exec --user www-data -it nextcloud-aio-nextcloud php occ files:scan-app-data preview

Can anyone confirm whether that should work? As I said, even with 7.13.0 beta it doesn't seem to change anything.
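
One way to sanity-check whether the delete step matches anything at all before relying on the rescan (a sketch using the same variables as above):

DAYS=70
NEXTCLOUD_APPDATA=/mnt/storage/appdata_oc73brnah5op  # same placeholder path as above

# count the preview files that fall inside the time window instead of deleting them
find "$NEXTCLOUD_APPDATA/preview" -ctime -$DAYS -type f | wc -l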

usngrimsley commented 7 months ago

Updated to v7.13.0 beta. Uploaded several pictures and it looks to have fixed the issue with new uploads. (Apple iOS 17.2.1 / app versions 5.0.1.0 and 5.1)

m4nges commented 7 months ago

Hi, I just released a fix with v7.13.0 Beta. Testing and feedback is welcome! See https://github.com/nextcloud/all-in-one#how-to-switch-the-channel

⚠️ See how to remove broken previews: #4239 (comment)

This worked like a charm!

zvarnes commented 7 months ago

Beta update worked for me. Thanks for the quick fix 🫂

matteoipri commented 7 months ago

I did not experience the issue before, but I switched my all-in-one test instance to the beta channel and the previews are generated correctly for photos taken with an iPhone.

PS: I am waiting for nextcloud/docker [28.0.2] and nextcloud/all-in-one [28.0.1] to be on the same version to do a migration, mostly because of the previews missing in my current deployment.

jonny-379 commented 7 months ago

Hi, I just released a fix with v7.13.0 Beta. Testing and feedback is welcome! See https://github.com/nextcloud/all-in-one#how-to-switch-the-channel

⚠️ See how to remove broken previews: #4239 (comment)

The beta fixed the problem for me too.

corincorvus commented 7 months ago

On the beta channel I can't use the fulltextsearch and clamav containers; both only show errors. Without them, it starts. The photo problem is fixed (50 test runs).

fulltextsearch can't start because of a problem with the db version; exception during the geoip databases update:

exception during geoip databases update | @timestamp=2024-02-22T13:27:42.530Z log.level=ERROR ecs.version=1.2.0 service.name=ES_ECS event.dataset=elasticsearch.server process.thread.name=elasticsearch[d45d52cf6edd][generic][T#4] log.logger=org.elasticsearch.ingest.geoip.GeoIpDownloader elasticsearch.cluster.uuid=rKwx2yHURCSG5KM3Ki17mw elasticsearch.node.id=Q7PmvilARy2_Imj_MtpsFA elasticsearch.node.name=d45d52cf6edd elasticsearch.cluster.name=nextcloud-aio error.type=org.elasticsearch.ElasticsearchException error.message=not all primary shards of [.geoip_databases] index are active error.stack_trace=org.elasticsearch.ElasticsearchException: not all primary shards of [.geoip_databases] index are active

clamav can't find its socket:

Starting Freshclamd
Starting ClamAV

Socket for clamd not found yet, retrying (0/90) ...ClamAV update process started at Thu Feb 22 13:24:45 2024
daily.cld database is up-to-date (version: 27193, sigs: 2053961, f-level: 90, builder: raynman)
main.cvd database is up-to-date (version: 62, sigs: 6647427, f-level: 90, builder: sigmgr)
bytecode.cvd database is up-to-date (version: 334, sigs: 91, f-level: 90, builder: anvilleg)

Socket for clamd not found yet, retrying (1/90) ...
[... the same retry message repeats for (2/90) through (89/90) ...]
Socket for clamd not found yet, retrying (90/90) ...
Failed to start clamd
Starting Freshclamd
Starting ClamAV

Socket for clamd not found yet, retrying (0/90) ...ClamAV update process started at Thu Feb 22 13:26:24 2024
daily.cld database is up-to-date (version: 27193, sigs: 2053961, f-level: 90, builder: raynman)
main.cvd database is up-to-date (version: 62, sigs: 6647427, f-level: 90, builder: sigmgr)
bytecode.cvd database is up-to-date (version: 334, sigs: 91, f-level: 90, builder: anvilleg)

hmarti54 commented 7 months ago

I updated to beta and this did not fix the issue for me for photos that are already on nextcloud. Haven't tried on newly uploaded pics but don't really want to have to re-upload a ton of photos..

The downgrade worked now (image conflict) and solved the problem. Thanks. Waiting for a fix. But how can I fix already uploaded files? occ preview:generate-all didn't solve it. Thanks.

Just fixed it on my instance. Basically you remove all corrupted previews then rescan appdata preview folder. Removed previews will be regenerated on-the-fly when you request them (by opening the photos app for example). You may have to delete your browser cache to see the new previews.

  1. First, open the aio interface and create a backup and start your containers again after the backup is successful.
  2. Afterwards, run via CLI:

# Go into the container
sudo docker exec --user www-data -it nextcloud-aio-nextcloud bash

  3. Now inside the container:

DAYS=7  # replace with the number of days since the previews were corrupted (earliest was 8th february)

# delete bad previews
find "/mnt/ncdata/appdata_*/preview" -ctime -$DAYS -type f -delete

# rescan preview folder in appdata
php occ files:scan-app-data preview

This isn't working for me; it says "no such file or directory" (I have tried several variants of the file path to see if it works).
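
A hedged guess at why the find step fails with "no such file or directory": the appdata glob in the quoted command is inside double quotes ("/mnt/ncdata/appdata_*/preview"), so the shell passes the literal * to find instead of expanding it to the real appdata_<instanceid> directory. A variant to try inside the Nextcloud container, with the glob left unquoted:

DAYS=7  # adjust as in the steps above

# let the shell expand appdata_* to the actual appdata_<instanceid> directory
find /mnt/ncdata/appdata_*/preview -ctime -$DAYS -type f -delete

# then rescan the preview folder in appdata
php occ files:scan-app-data preview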

szaimen commented 3 months ago

Hi, we upgraded Alpine Linux in the Imaginary container again and released that with v9.1.0 Beta. Please test uploading Photos from the iOS app and check if the previews look fine. If not, please log an issue. Testing and feedback is welcome! See https://github.com/nextcloud/all-in-one#how-to-switch-the-channel