haugene / docker-transmission-openvpn

Docker container running Transmission torrent client with WebUI over an OpenVPN tunnel
GNU General Public License v3.0

High passive memory usage (leak?) since version 4.3 (Upgrade to Ubuntu 22.04 and Transmission 3.00) #2469

Closed ameinild closed 1 year ago

ameinild commented 1 year ago

Is there a pinned issue for this?

Is there an existing or similar issue/discussion for this?

Is there any comment in the documentation for this?

Is this related to a provider?

Are you using the latest release?

Have you tried using the dev branch latest?

Docker run config used

version: '3'

services:

  transmission:
    image: haugene/transmission-openvpn:latest
    container_name: transmission-openvpn
    healthcheck:
      disable: true 
    environment:
      - TRANSMISSION_DOWNLOAD_DIR=/data/download
      - CREATE_TUN_DEVICE=true
      - OPENVPN_PROVIDER=CUSTOM
      - OPENVPN_USERNAME=****
      - OPENVPN_PASSWORD=****
      - WEBPROXY_ENABLED=false
      - TRANSMISSION_PEER_PORT=49115
      - TRANSMISSION_PORT_FORWARDING_ENABLED=false
      - LOCAL_NETWORK=10.10.0.0/22
      - DROP_DEFAULT_ROUTE=true
      - TRANSMISSION_WATCH_DIR_ENABLED=false
      - TRANSMISSION_DHT_ENABLED=true
      - TRANSMISSION_PEX_ENABLED=true
      - PUID=1000
      - PGID=1000
      - NO_LOGS=true
    ports:
      - '8333:9091'
    volumes:
      - /mnt/zfs/postern/torrents:/data
      - /mnt/docker-data/transmission/str-ams102_a309011.ovpn:/etc/openvpn/custom/default.ovpn
      - /mnt/docker-data/transmission:/config
      - /etc/localtime:/etc/localtime:ro
    cap_add:
      - NET_ADMIN
    logging:
      driver: json-file
      options:
        max-size: 10m
    restart: unless-stopped

Current Behavior

The Transmission container uses over 10 GB of memory after running for 10 days with around 25 torrents.

[Screenshot: container memory usage graph showing over 10 GB used]
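
A quick way to confirm this kind of measurement is to sample the container's memory with `docker stats` (container name taken from the compose file above; adjust if yours differs):

```shell
# One-shot snapshot of the container's current memory usage
docker stats --no-stream --format "{{.Name}}: {{.MemUsage}}" transmission-openvpn

# Sample every 5 minutes to distinguish a steady climb (leak)
# from a plateau (cache warming up)
while true; do
  docker stats --no-stream --format "{{.Name}}: {{.MemUsage}}" transmission-openvpn >> /tmp/transmission-mem.log
  sleep 300
done
```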

Expected Behavior

I expect the container to not use over 10 GB of memory when only seeding a couple of torrents at a time.

How have you tried to solve the problem?

Works without issue on version 4.2 (Ubuntu 20.04 and Transmission 2.X).

Log output

2022-12-11T01:00:39.554835090Z Starting container with revision: b33d0fe4c938259a0d4eb844e55468f387456121
2022-12-11T01:00:39.659198076Z Creating TUN device /dev/net/tun
2022-12-11T01:00:39.664931765Z Using OpenVPN provider: CUSTOM
2022-12-11T01:00:39.664976101Z Running with VPN_CONFIG_SOURCE auto
2022-12-11T01:00:39.664985258Z CUSTOM provider specified but not using default.ovpn, will try to find a valid config mounted to /etc/openvpn/custom
2022-12-11T01:00:39.671329834Z No VPN configuration provided. Using default.
2022-12-11T01:00:39.671385155Z Modifying /etc/openvpn/custom/default.ovpn for best behaviour in this container
2022-12-11T01:00:39.672167391Z Modification: Point auth-user-pass option to the username/password file
2022-12-11T01:00:39.695956777Z sed: cannot rename /etc/openvpn/custom/sedPEe4Fh: Device or resource busy
2022-12-11T01:00:39.696593127Z Modification: Change ca certificate path
2022-12-11T01:00:39.755163243Z Modification: Change ping options
2022-12-11T01:00:39.755246763Z sed: cannot rename /etc/openvpn/custom/sedL2NPCH: Device or resource busy
2022-12-11T01:00:39.775036205Z sed: cannot rename /etc/openvpn/custom/sedUOYd6Q: Device or resource busy
2022-12-11T01:00:39.798124953Z sed: cannot rename /etc/openvpn/custom/sed0MejDr: Device or resource busy
2022-12-11T01:00:39.841101873Z Modification: Update/set resolv-retry to 15 seconds
2022-12-11T01:00:39.931314660Z Modification: Change tls-crypt keyfile path
2022-12-11T01:00:39.931325205Z Modification: Set output verbosity to 3
2022-12-11T01:00:39.931341936Z sed: cannot rename /etc/openvpn/custom/sedYAEKv9: Device or resource busy
2022-12-11T01:00:39.931352007Z sed: cannot rename /etc/openvpn/custom/sedpmpJtK: Device or resource busy
2022-12-11T01:00:39.931360395Z sed: cannot rename /etc/openvpn/custom/sedunTnJi: Device or resource busy
2022-12-11T01:00:39.931368830Z sed: cannot rename /etc/openvpn/custom/sednSwLKg: Device or resource busy
2022-12-11T01:00:39.931377274Z sed: cannot rename /etc/openvpn/custom/sedMBlNI2: Device or resource busy
2022-12-11T01:00:39.931385634Z sed: cannot rename /etc/openvpn/custom/sed3qQXNV: Device or resource busy
2022-12-11T01:00:39.941248316Z sed: cannot rename /etc/openvpn/custom/sedP9qI2j: Device or resource busy
2022-12-11T01:00:39.950407893Z Modification: Remap SIGUSR1 signal to SIGTERM, avoid OpenVPN restart loop
2022-12-11T01:00:40.019717774Z sed: cannot rename /etc/openvpn/custom/sedTeXOgM: Device or resource busy
2022-12-11T01:00:40.033250085Z Modification: Updating status for config failure detection
2022-12-11T01:00:40.033315038Z sed: cannot rename /etc/openvpn/custom/sedzWkYUw: Device or resource busy
2022-12-11T01:00:40.067269476Z sed: cannot rename /etc/openvpn/custom/sedNeVS5M: Device or resource busy
2022-12-11T01:00:40.078484365Z sed: cannot rename /etc/openvpn/custom/seddiPlVF: Device or resource busy
2022-12-11T01:00:40.103030778Z Setting OpenVPN credentials...
2022-12-11T01:00:40.264497533Z adding route to local network 10.10.0.0/22 via 172.23.0.1 dev eth0
2022-12-11T01:00:40.264546724Z 2022-12-11 02:00:40 WARNING: Compression for receiving enabled. Compression has been used in the past to break encryption. Sent packets are not compressed unless "allow-compression yes" is also set.
2022-12-11T01:00:40.319442490Z 2022-12-11 02:00:40 DEPRECATED OPTION: --cipher set to 'AES-256-CBC' but missing in --data-ciphers (AES-256-GCM:AES-128-GCM). Future OpenVPN version will ignore --cipher for cipher negotiations. Add 'AES-256-CBC' to --data-ciphers or change --cipher 'AES-256-CBC' to --data-ciphers-fallback 'AES-256-CBC' to silence this warning.
2022-12-11T01:00:40.346038041Z 2022-12-11 02:00:40 OpenVPN 2.5.5 x86_64-pc-linux-gnu [SSL (OpenSSL)] [LZO] [LZ4] [EPOLL] [PKCS11] [MH/PKTINFO] [AEAD] built on Jul 14 2022
2022-12-11T01:00:40.346090577Z 2022-12-11 02:00:40 library versions: OpenSSL 3.0.2 15 Mar 2022, LZO 2.10
2022-12-11T01:00:40.346100072Z 2022-12-11 02:00:40 WARNING: --ns-cert-type is DEPRECATED.  Use --remote-cert-tls instead.
2022-12-11T01:00:40.346109059Z 2022-12-11 02:00:40 NOTE: the current --script-security setting may allow this configuration to call user-defined scripts
2022-12-11T01:00:40.346117744Z 2022-12-11 02:00:40 Outgoing Control Channel Authentication: Using 256 bit message hash 'SHA256' for HMAC authentication
2022-12-11T01:00:40.346126644Z 2022-12-11 02:00:40 Incoming Control Channel Authentication: Using 256 bit message hash 'SHA256' for HMAC authentication
2022-12-11T01:00:40.346135291Z 2022-12-11 02:00:40 WARNING: normally if you use --mssfix and/or --fragment, you should also set --tun-mtu 1500 (currently it is 1400)
2022-12-11T01:00:40.608963329Z 2022-12-11 02:00:40 TCP/UDP: Preserving recently used remote address: [AF_INET]176.67.80.9:1194
2022-12-11T01:00:40.609027167Z 2022-12-11 02:00:40 Socket Buffers: R=[131072->131072] S=[16384->16384]
2022-12-11T01:00:40.609038286Z 2022-12-11 02:00:40 Attempting to establish TCP connection with [AF_INET]176.67.80.9:1194 [nonblock]
2022-12-11T01:00:40.627760642Z 2022-12-11 02:00:40 TCP connection established with [AF_INET]176.67.80.9:1194
2022-12-11T01:00:40.627803799Z 2022-12-11 02:00:40 TCP_CLIENT link local: (not bound)
2022-12-11T01:00:40.627812954Z 2022-12-11 02:00:40 TCP_CLIENT link remote: [AF_INET]176.67.80.9:1194
2022-12-11T01:00:40.642535460Z 2022-12-11 02:00:40 TLS: Initial packet from [AF_INET]176.67.80.9:1194, sid=9f40cbc3 86e2f62d
2022-12-11T01:00:40.676952982Z 2022-12-11 02:00:40 VERIFY OK: depth=1, C=US, ST=TX, L=Dallas, O=strongtechnology.net, CN=strongtechnology.net CA, emailAddress=lecerts@strongtechnology.net
2022-12-11T01:00:40.676999175Z 2022-12-11 02:00:40 VERIFY OK: nsCertType=SERVER
2022-12-11T01:00:40.677008417Z 2022-12-11 02:00:40 NOTE: --mute triggered...
2022-12-11T01:00:40.716204565Z 2022-12-11 02:00:40 2 variation(s) on previous 3 message(s) suppressed by --mute
2022-12-11T01:00:40.716254334Z 2022-12-11 02:00:40 [openvpn] Peer Connection Initiated with [AF_INET]176.67.80.9:1194
2022-12-11T01:00:41.723682262Z 2022-12-11 02:00:41 SENT CONTROL [openvpn]: 'PUSH_REQUEST' (status=1)
2022-12-11T01:00:41.745246746Z 2022-12-11 02:00:41 PUSH: Received control message: 'PUSH_REPLY,dhcp-option DNS 198.18.0.1,dhcp-option DNS 198.18.0.2,ping 1,ping-restart 60,comp-lzo no,route-gateway 100.64.32.1,topology subnet,socket-flags TCP_NODELAY,ifconfig 100.64.32.8 255.255.254.0,peer-id 0,cipher AES-256-GCM'
2022-12-11T01:00:41.745296865Z 2022-12-11 02:00:41 OPTIONS IMPORT: timers and/or timeouts modified
2022-12-11T01:00:41.745306305Z 2022-12-11 02:00:41 NOTE: --mute triggered...
2022-12-11T01:00:41.745314835Z 2022-12-11 02:00:41 2 variation(s) on previous 3 message(s) suppressed by --mute
2022-12-11T01:00:41.745323254Z 2022-12-11 02:00:41 Socket flags: TCP_NODELAY=1 succeeded
2022-12-11T01:00:41.745331612Z 2022-12-11 02:00:41 OPTIONS IMPORT: --ifconfig/up options modified
2022-12-11T01:00:41.745340021Z 2022-12-11 02:00:41 OPTIONS IMPORT: route-related options modified
2022-12-11T01:00:41.745348382Z 2022-12-11 02:00:41 OPTIONS IMPORT: --ip-win32 and/or --dhcp-option options modified
2022-12-11T01:00:41.745356890Z 2022-12-11 02:00:41 NOTE: --mute triggered...
2022-12-11T01:00:41.745365192Z 2022-12-11 02:00:41 3 variation(s) on previous 3 message(s) suppressed by --mute
2022-12-11T01:00:41.745373568Z 2022-12-11 02:00:41 Data Channel: using negotiated cipher 'AES-256-GCM'
2022-12-11T01:00:41.745382004Z 2022-12-11 02:00:41 Outgoing Data Channel: Cipher 'AES-256-GCM' initialized with 256 bit key
2022-12-11T01:00:41.745390287Z 2022-12-11 02:00:41 Incoming Data Channel: Cipher 'AES-256-GCM' initialized with 256 bit key
2022-12-11T01:00:41.745398588Z 2022-12-11 02:00:41 net_route_v4_best_gw query: dst 0.0.0.0
2022-12-11T01:00:41.745407018Z 2022-12-11 02:00:41 net_route_v4_best_gw result: via 172.23.0.1 dev eth0
2022-12-11T01:00:41.758771221Z 2022-12-11 02:00:41 ROUTE_GATEWAY 172.23.0.1/255.255.0.0 IFACE=eth0 HWADDR=02:42:ac:17:00:02
2022-12-11T01:00:41.772554152Z 2022-12-11 02:00:41 TUN/TAP device tun0 opened
2022-12-11T01:00:41.772606012Z 2022-12-11 02:00:41 net_iface_mtu_set: mtu 1400 for tun0
2022-12-11T01:00:41.775118954Z 2022-12-11 02:00:41 net_iface_up: set tun0 up
2022-12-11T01:00:41.775161323Z 2022-12-11 02:00:41 net_addr_v4_add: 100.64.32.8/23 dev tun0
2022-12-11T01:00:43.826150917Z 2022-12-11 02:00:43 net_route_v4_add: 176.67.80.9/32 via 172.23.0.1 dev [NULL] table 0 metric -1
2022-12-11T01:00:43.826223721Z 2022-12-11 02:00:43 net_route_v4_add: 0.0.0.0/1 via 100.64.32.1 dev [NULL] table 0 metric -1
2022-12-11T01:00:43.826234178Z 2022-12-11 02:00:43 net_route_v4_add: 128.0.0.0/1 via 100.64.32.1 dev [NULL] table 0 metric -1
2022-12-11T01:00:43.899186309Z sed: cannot rename /etc/openvpn/custom/sedWasA9G: Device or resource busy
2022-12-11T01:00:43.913757854Z sed: cannot rename /etc/openvpn/custom/sedoLg3bb: Device or resource busy
2022-12-11T01:00:43.958914937Z Up script executed with device=tun0 ifconfig_local=100.64.32.8
2022-12-11T01:00:43.958962694Z Updating TRANSMISSION_BIND_ADDRESS_IPV4 to the ip of tun0 : 100.64.32.8
2022-12-11T01:00:43.965486737Z TRANSMISSION_HOME is currently set to: /config/transmission-home
2022-12-11T01:00:43.965570702Z WARNING: Deprecated. Found old default transmission-home folder at /data/transmission-home, setting this as TRANSMISSION_HOME. This might break in future versions.
2022-12-11T01:00:43.965613573Z We will fallback to this directory as long as the folder exists. Please consider moving it to /config/<transmission-home>
2022-12-11T01:00:43.974559387Z Enforcing ownership on transmission config directory
2022-12-11T01:00:43.983238151Z Applying permissions to transmission config directory
2022-12-11T01:00:43.983291867Z Setting owner for transmission paths to 1000:1000
2022-12-11T01:00:43.994004629Z Setting permissions for download and incomplete directories
2022-12-11T01:00:44.142834012Z Mask: 002
2022-12-11T01:00:44.142884610Z Directories: 775
2022-12-11T01:00:44.142893883Z Files: 664
2022-12-11T01:00:44.175889217Z Setting permission for watch directory (775) and its files (664)
2022-12-11T01:00:44.207608299Z 
2022-12-11T01:00:44.207668538Z -------------------------------------
2022-12-11T01:00:44.207678341Z Transmission will run as
2022-12-11T01:00:44.207686675Z -------------------------------------
2022-12-11T01:00:44.207694852Z User name:   abc
2022-12-11T01:00:44.207702811Z User uid:    1000
2022-12-11T01:00:44.207710850Z User gid:    1000
2022-12-11T01:00:44.207719004Z -------------------------------------
2022-12-11T01:00:44.207727200Z 
2022-12-11T01:00:44.207735196Z Updating Transmission settings.json with values from env variables
2022-12-11T01:00:44.370908924Z Attempting to use existing settings.json for Transmission
2022-12-11T01:00:44.370959457Z Successfully used existing settings.json /data/transmission-home/settings.json
2022-12-11T01:00:44.370969000Z Overriding bind-address-ipv4 because TRANSMISSION_BIND_ADDRESS_IPV4 is set to 100.64.32.8
2022-12-11T01:00:44.370977670Z Overriding dht-enabled because TRANSMISSION_DHT_ENABLED is set to true
2022-12-11T01:00:44.370996490Z Overriding download-dir because TRANSMISSION_DOWNLOAD_DIR is set to /data/download
2022-12-11T01:00:44.371027272Z Overriding incomplete-dir because TRANSMISSION_INCOMPLETE_DIR is set to /data/incomplete
2022-12-11T01:00:44.371036866Z Overriding peer-port because TRANSMISSION_PEER_PORT is set to 49115
2022-12-11T01:00:44.371045267Z Overriding pex-enabled because TRANSMISSION_PEX_ENABLED is set to true
2022-12-11T01:00:44.371256010Z Overriding port-forwarding-enabled because TRANSMISSION_PORT_FORWARDING_ENABLED is set to false
2022-12-11T01:00:44.371265355Z Overriding rpc-password because TRANSMISSION_RPC_PASSWORD is set to [REDACTED]
2022-12-11T01:00:44.371274214Z Overriding rpc-port because TRANSMISSION_RPC_PORT is set to 9091
2022-12-11T01:00:44.371282825Z Overriding rpc-username because TRANSMISSION_RPC_USERNAME is set to 
2022-12-11T01:00:44.371291386Z Overriding watch-dir because TRANSMISSION_WATCH_DIR is set to /data/watch
2022-12-11T01:00:44.371299668Z Overriding watch-dir-enabled because TRANSMISSION_WATCH_DIR_ENABLED is set to false
2022-12-11T01:00:44.399837435Z sed'ing True to true
2022-12-11T01:00:44.401112807Z DROPPING DEFAULT ROUTE
2022-12-11T01:00:44.414481264Z STARTING TRANSMISSION
2022-12-11T01:00:44.414530300Z Transmission startup script complete.
2022-12-11T01:00:44.468961129Z 2022-12-11 02:00:44 WARNING: this configuration may cache passwords in memory -- use the auth-nocache option to prevent this
2022-12-11T01:00:44.471995265Z 2022-12-11 02:00:44 Initialization Sequence Completed

HW/SW Environment

No LSB modules are available.
Distributor ID: Ubuntu
Description:    Ubuntu 22.04.1 LTS
Release:        22.04
Codename:       jammy

Client: Docker Engine - Community
 Version:           20.10.21
 API version:       1.41
 Go version:        go1.18.7
 Git commit:        baeda1f
 Built:             Tue Oct 25 18:01:58 2022
 OS/Arch:           linux/amd64
 Context:           default
 Experimental:      true

Server: Docker Engine - Community
 Engine:
  Version:          20.10.21
  API version:      1.41 (minimum version 1.12)
  Go version:       go1.18.7
  Git commit:       3056208
  Built:            Tue Oct 25 17:59:49 2022
  OS/Arch:          linux/amd64
  Experimental:     false
 containerd:
  Version:          1.6.12
  GitCommit:        a05d175400b1145e5e6a735a6710579d181e7fb0
 runc:
  Version:          1.1.4
  GitCommit:        v1.1.4-0-g5fd4c4d
 docker-init:
  Version:          0.19.0
  GitCommit:        de40ad0

Anything else?

There appears to be a fix in a later version of Transmission 3, mentioned on the Transmission Github:

https://github.com/transmission/transmission/issues/3077

It appears this Nightly build fixes the issue.

kjwill555 commented 1 year ago

+1

For me, with 28 torrents, the container has only been up for 2 days and is currently using 2.97 GB of my system's 4.00 GB of RAM.

onthecliff commented 1 year ago

I also have this problem.

jucor commented 1 year ago

Same problem. Is there a way to update the version of transmission used, please?

ameinild commented 1 year ago

Use the beta branch with Transmission 4.0.0 beta-2 - this works for me.

This image: https://hub.docker.com/layers/haugene/transmission-openvpn/beta/images/sha256-e7193f482b62412b7ded7a244cf3f857c8ed08961dcc18b660ad1b55cb96efe5?context=explore

And this command to pull: docker pull haugene/transmission-openvpn:beta

jucor commented 1 year ago

Thanks a lot @ameinild, ~that sorted it!~ Edit: I may have spoken too fast. RAM usage is rising again, with only a single torrent. I'll let it grow overnight and report back. I'm running on a low-RAM NAS, so this could be enough of a problem to require switching to an alternate client -- but it's rare to find one as nicely packaged for VPN as this (thanks @haugene!)

onthecliff commented 1 year ago

Use the beta branch with Transmission 4.0.0 beta-2 - this works for me.

This image: https://hub.docker.com/layers/haugene/transmission-openvpn/beta/images/sha256-e7193f482b62412b7ded7a244cf3f857c8ed08961dcc18b660ad1b55cb96efe5?context=explore

And this command to pull: docker pull haugene/transmission-openvpn:beta

Unfortunately, Transmission 4.0.0 is banned on about a third of the private trackers I use, so this isn't an option - but thank you for the suggestion.

ameinild commented 1 year ago

Yeah, I know it's an issue that the Transmission beta is banned on private trackers. In that case, I would suggest reverting to an earlier Docker image based on Ubuntu 20.04 and Transmission 2.9X, like 4.2 or 4.1. Or possibly the dev branch - I don't know if that fixes the issue yet. But there should be other possibilities. 😎

There are actually several Docker tags to try out:

https://hub.docker.com/r/haugene/transmission-openvpn/tags
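
Trying one of those is just a matter of pulling a pinned tag and pointing the `image:` line of the compose file at it; for example (tag names as discussed above):

```shell
# Roll back to the Ubuntu 20.04 / Transmission 2.9X era of the image
docker pull haugene/transmission-openvpn:4.2

# Or try the dev branch build
docker pull haugene/transmission-openvpn:dev
```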

jucor commented 1 year ago

@ameinild I'm still getting the memory issue :( I'm now at 5.43 GB after running for 2 days with a single torrent :/ Weirdly enough, the host system (a Synology DSM 6) only reports 809 MB of total memory used, so I'm really puzzled. Any idea what's going on, please?

ameinild commented 1 year ago

I have no idea - the Beta version works perfectly for me on Ubuntu. You could try rolling back to an earlier release. Else, wait for a stable version of Transmission where they fix the memory leaks that are clearly present. 😬

jucor commented 1 year ago

Weirdly, this morning RAM is back to 51MB. Go figure... 🤔

joe-eklund commented 1 year ago

I am stopping in to say I am having memory leaks even on the new beta. I have 128 GB of RAM, so I didn't really notice this until recently, when it also filled up all my swap.

Here is the memory after running for a few hours with < 40 torrents.

[Screenshot: memory usage graph]

jucor commented 1 year ago

Thanks for sharing! I wonder whether we should move this conversation to an upstream issue (i.e., on the Transmission repo)?

ameinild commented 1 year ago

It's strange. The memory leak issue seems to hit randomly across different versions of Transmission and different OSes. On Ubuntu 22.04 I had no issue with Trans 2.94, a huge memory leak on Trans 3.00, and no problem again on Trans 4.0-beta. This would make it very difficult to troubleshoot, I guess... 😬

joe-eklund commented 1 year ago

I am also using Ubuntu 22.04.

After it quickly jumped back up to over 20 GB, I instituted a memory limit through Portainer, and that has helped. Now it doesn't go above whatever limit I set. I am not sure if it will affect the functionality of the container, though. Guess we'll see.

I also switched back from beta to latest since that didn't fix it anyway and I would rather run a stable version.
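
For anyone wanting the same cap without Portainer and without recreating the container, this can presumably be applied to the running container with `docker update` (container name assumed from the compose file at the top of the thread; 4 GB is just an example value):

```shell
# Cap the running container at 4 GB of memory.
# --memory-swap is set equal to --memory so the container cannot
# spill the excess into swap instead.
docker update --memory 4g --memory-swap 4g transmission-openvpn
```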

DaveLMSP commented 1 year ago

I'm running Linux Mint 20.3 with v4.3.2 of the container. I haven't tried alternate versions of Transmission, but I became aware of this issue when I noticed high swap usage on the host. After running for about a week with 17 torrents, the container was using 31.5 GB of memory and another 6 GB of swap. I've been using a limit through Portainer for the last several days without any major issues. I have seen its status listed as 'unhealthy' a couple of times, but it resumed running normally after restarting via Portainer.

ivalkenburg commented 1 year ago

Same issue here. I'm not sure what changed; it started doing this recently. The image I'm using was pulled 2 months ago. Either I didn't notice it until now, or something changed...

seanmuth commented 1 year ago

Same here. Capped the container at 12 GB (64 GB system) and it ramps up to 12 GB super quickly. I restart the container nightly as well.

[Screenshot: memory usage graph]

haugene commented 1 year ago

I was hoping that Transmission 4.0.0 would be our way out of this; troubled to hear that some are still experiencing issues with it :disappointed:

The release is now out :tada: https://github.com/transmission/transmission/releases/tag/4.0.0 :tada: and is already starting to get whitelisted on private trackers. But if there are still memory issues then we might have to consider doing something else.

If this could get fixed upstream, or we could narrow it down to a server OS and then report it, that would be the best long-term solution, I guess. If not, the only thing that comes to mind is changing the distro of the base image to see if that has an effect, before we start automatically restarting Transmission within the image or other hackery :grimacing:

The beta tag of the image was updated with the 4.0.0 release version so :crossed_fingers:

DaveLMSP commented 1 year ago

A couple of weeks ago I noticed the web interface becoming unresponsive whenever the container was listed as unhealthy, and set up a cron job to restart the container every 4 hours. Initially I tried longer intervals, but the container would go unhealthy as soon as the 1 GB memory limit was reached and essentially stop working. With a 4-hour restart window I'm able to catch the container before it goes unresponsive, and it's been working great. If it would be helpful, I can adjust the restart interval and post container logs.
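
For anyone replicating that workaround, a host crontab entry along these lines should do it (container name assumed from the compose file at the top of the thread):

```shell
# Added via `crontab -e` on the host: restart the container every
# 4 hours, before the memory limit makes it go unhealthy
0 */4 * * * docker restart transmission-openvpn
```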

ameinild commented 1 year ago

The latest version with the Transmission 4.0.0 release still works well for me on Ubuntu 22.04 server. 👍

theythem9973 commented 1 year ago

Saw this thread on the Transmission repo itself about high memory usage, even with 4.0; it may be pertinent: https://github.com/transmission/transmission/issues/4786

haugene commented 1 year ago

Very curious to follow that thread @theythem9973. Hopefully they'll find something that can help us here as well :crossed_fingers:

But this issue was reported here when upgrading to 4.3 of this image, which uses a much older build of Transmission, and we also previously ran v3.00 of Transmission under Alpine without issues (tag 3.7.1). So I'm starting to doubt the choice of Ubuntu 22.04 as a stable option :disappointed: We moved from Alpine for a reason as well, so I'm not sure if we want to go back there or go Debian if Ubuntu doesn't pan out.

Before we change the base image again, I'm curious whether there's a possibility of solving this by rolling forward instead. I've created a new branch for the 22.10 (kinetic) release. Anyone up for pulling the kinetic tag of the image and seeing if that works any better?

haugene commented 1 year ago

In addition to the kinetic tag, I've now also tried rolling back to focal as the base image and installing Transmission via the PPA, so that we still stay on Transmission 3.00. So you can also try the focal tag and see if that's better.

CurtC2 commented 1 year ago

Found this was clobbering me as well. I pulled kinetic; I'll let you know how it goes. Pre-kinetic this was all Transmission:

[Screenshot: memory usage graph]

Hmm, kinetic isn't working for me:

Checking port...
transmission-vpn | Error: portTested: http error 400: Bad Request
transmission-vpn | #######################
transmission-vpn | SUCCESS
transmission-vpn | #######################
transmission-vpn | Port:
transmission-vpn | Expiration Fri Mar 3 00:00:00 EST 2023
transmission-vpn | #######################
transmission-vpn | Entering infinite while loop
transmission-vpn | Every 15 minutes, check port status
transmission-vpn | 60 day port reservation reached
transmission-vpn | Getting a new one
transmission-vpn | curl: (3) URL using bad/illegal format or missing URL
transmission-vpn | Fri Mar 3 23:28:02 EST 2023: getSignature error
transmission-vpn | the has been a fatal_error
transmission-vpn | curl: (3) URL using bad/illegal format or missing URL
transmission-vpn | Fri Mar 3 23:28:02 EST 2023: bindPort error
transmission-vpn | the has been a fatal_error

Trying focal. Update: focal has been running download & seed for 12+ hours with zero memory increase.

pkishino commented 1 year ago

I see that 4.0.1 has been released with a possible fix in Transmission. I'll make a new build with it on the beta branch.

ameinild commented 1 year ago

The latest beta with Transmission 4.0.1 is (still) working fine for me on Ubuntu 22.04 and Docker 23.0.1. 👍

pkishino commented 1 year ago

I've been running the latest Transmission, 4.0.1, since last week on two containers with 5/10 GB limits. The larger container is fairly constant with 50-ish seeding torrents, always around 7-8 GB used over the last 4 months of stats I have (since older versions as well). The second one is my main download container, carrying between 0 and 10-ish torrents, and it seldom goes above 2 GB. Running on an old Mac mini, Monterey, with the latest Docker for Mac.

Salamafet commented 1 year ago

Same problem here with the 4.0.1 version.

After restarting the container, everything went back to normal. I have modified my docker-compose file to set a RAM limit, just in case.
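
For anyone wanting the same safety net in compose, a minimal sketch of a service-level memory cap (the 4G value is just an example; under the older v2 compose file format the equivalent key is `mem_limit`):

```yaml
services:
  transmission:
    # Hard cap on container memory: processes inside get OOM-killed
    # instead of exhausting the host. 4G is an example value.
    deploy:
      resources:
        limits:
          memory: 4G
    restart: unless-stopped   # bring the container back if it is OOM-killed
```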

timothe commented 1 year ago

Same for me on Synology DSM 7.1.1 and the latest Docker image. It always maxes out the available memory; I have to limit the usage in Docker.

sgtnips commented 1 year ago

Been running focal for almost 3 weeks and it's looking good. Below you can see my memory settle down halfway through week 9.

However, bizarrely, the container thinks it's chewing up 10 GB of memory when the total system is barely using 2 GB. Maybe it's all cache and I've just never looked too closely before.

Anyway, Focal looks good for me on Ubuntu 22.04 and Docker 23.0.1.

[Screenshot: container memory usage]

[Screenshot: system memory over the last month]

Moving4407 commented 1 year ago

Switched to the focal branch; no more issues with Ubuntu 20.04.6 LTS on ARMv7l with 4 GB of RAM and Docker 23.0.2.

onthecliff commented 1 year ago

Focal has been running great for the last few weeks for me.

pkishino commented 1 year ago

I tried updating the focal branch with the latest updates but cannot get the GitHub Action to build. It builds fine locally. Any suggestions?

ilike2burnthing commented 1 year ago

CMake errors from the build log:

list sub-command REMOVE_ITEM requires two or more arguments

Could NOT find DEFLATE: Found unsuitable version "1.5", but required is at least "1.7" (found /usr/include)

Could NOT find CURL (missing: CURL_LIBRARY) (found suitable version "7.68.0", minimum required is "7.28.0")

Outdated build dependencies? https://github.com/transmission/transmission/issues/4142 https://github.com/microsoft/vcpkg/issues/18412

Sabb0 commented 1 year ago

Hi everyone, I also had this problem. I pulled the focal branch and memory usage overnight appears stable at around 250 MB with 50-ish seeding torrents.

pkishino commented 1 year ago

Is anyone actually seeing a leak on 5.0.2? That uses 22.04 as the base but runs Transmission 4.0.3.

Sabb0 commented 1 year ago

The extent of my knowledge is... I pulled the latest branch a week or so ago and the memory usage crashed the container within an hour. I've only just got round to looking into it, so I pulled the focal branch (no change in seed count etc.) and it's been fine overnight.

I'll pull the latest branch again and see what happens.

pkishino commented 1 year ago

I don't recall what version :latest is, but I've been running :5.0.2 and it works great. When you pulled latest, what version of Transmission was it using?

Sabb0 commented 1 year ago

Unfortunately, I don't have that info - not very useful, I know! I will report back later once it's been running for a few hours.

I don’t recall what version :latest is but I’ve been running :5.0.2 and it works great. When you pulled latest, what version of transmission was it using?

ilike2burnthing commented 1 year ago

:latest and :5.0.2 are the same.

ameinild commented 1 year ago

I'm now running :latest (:5.0.2), and still no issues for me. Was previously running :beta (with Transmission 4.0.2), which also worked fine.

Sabb0 commented 1 year ago

After a day, the latest branch is running at around 600 MB, so not crazy.

For some comparison, I have a version 3.3 container at 350 MB with the same seeds etc. I assume the difference is due to 3.3 running Alpine.

theythem9973 commented 1 year ago

I've been using focal for a couple of weeks now; previously I was using latest (I don't know the exact version). focal has been great - it's sitting pretty at ~700 MB, when previously it'd grow to upwards of 18 GB until it hit swap / crashed.

haugene commented 1 year ago

Glad to hear it @theythem9973 :ok_hand: We're on to something :smile: Are you also up for testing with the newest 5.x release, the 5.0.2 tag?

timothe commented 1 year ago

The latest version drastically reduces memory usage. I'm running at 88 MB after 2 days... Case closed, imo.

joe-eklund commented 1 year ago

FWIW, I am running latest and still have this issue. Hitting my 4 GB limit I have set through Portainer.

haugene commented 1 year ago

latest is a bit of a floating reference. Do you have logs showing the revision? Or have you double-checked that you've pulled lately? Can you change to tag 5.0.2 just to be sure?

joe-eklund commented 1 year ago

I am using the tag haugene/transmission-openvpn:latest. I just tried to pull again and nothing changed; Portainer reports that the image is up to date. I poked around the logs but didn't see anything that jumped out at me to confirm I was using latest. But I am fairly confident I am using https://hub.docker.com/layers/haugene/transmission-openvpn/latest/images/sha256-df0b4b4c640004ff48103d8405d0e26b42b0d3631e35399d9f9ebdde03b6837e, given that Portainer says the image the container is using is the most up to date.

I swapped to 5.0.2, and now Portainer shows the same image tagged as both 5.0.2 and latest, so it's the same image whether I use 5.0.2 or latest. I will leave it on 5.0.2 and monitor, but I suspect it will exhibit the same behavior since the actual image in use didn't change. Right now it's at ~600 MB, and every few seconds it goes up by ~30 MB.

EDIT: I looked at the logs and see Starting container with revision: 1103172c3288b7de681e2fb7f1378314f17f66cf.
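
For others wanting to check the same thing, the revision and the image actually in use can be pulled out with something like this (container name assumed from the compose file at the top of the thread):

```shell
# Show which build revision the container started with
docker logs transmission-openvpn 2>&1 | grep "Starting container with revision"

# Show the ID of the image the container is actually running,
# to compare against the digest on Docker Hub
docker inspect --format '{{.Image}}' transmission-openvpn
```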

haugene commented 1 year ago

Sounds like you had the correct image all along then, so it will probably use a lot of memory now as well. What OS and version are you running, and which Docker version?

joe-eklund commented 1 year ago

OS is Ubuntu 22.04 LTS, Docker 23.0.3. And after restarting the container a couple of hours ago, it's back up to my Portainer memory limit (4 GB).