debridmediamanager / zurg-testing

A self-hosted Real-Debrid WebDAV server you can use with Infuse. Together with rclone, it can mount your Real-Debrid torrent library into your filesystem and load it into Plex or Jellyfin.

Regularly crashing with message `runtime: out of memory: cannot allocate 4194304-byte block (125960192 in use)` #25

Closed by westsurname 8 months ago

westsurname commented 8 months ago

Actual Behavior Regular crashes

Operating System Linux

Environment Setup I'm running the amd64 binary. Built At: 2023-12-18T01:53:52 Commit: 274d60f5ed9b1544faee36f76a714500fe1fb637 Version: v0.9.2-hotfix.4

Logs zurg.log

Rclone Configuration

[zurg]
type = webdav
url = http://localhost:9999/dav/
vendor = other
pacer_min_sleep = 0

[zurghttp]
type = http
url = http://localhost:9999/http/
no_head = false
no_slash = false
no_auth = false

Zurg Configuration

# Zurg configuration version
zurg: v1
token: [removed] # https://real-debrid.com/apitoken

# basic functionality
host: "[::]" # do not change this if you are running it inside a docker container
port: 9999 # do not change this if you are running it inside a docker container
concurrent_workers: 20
check_for_changes_every_secs: 5

# misc configs
retain_folder_name_extension: true # if true, zurg won't modify the filenames from real-debrid
retain_rd_torrent_name: false # if true, it will strictly follow the RD API torrent name property, which should make this more compatible with rdt-client
auto_delete_rar_torrents: true # if true, zurg will delete unstreamable rar files (these torrents will always be compressed in a rar archive no matter what files you select)
use_download_cache: true # if true, during zurg initialization, it will fetch all downloads to unrestrict links faster
enable_repair: true # BEWARE! THERE CAN ONLY BE 1 INSTANCE OF ZURG THAT SHOULD REPAIR YOUR TORRENTS
# on_library_update: sh plex_update.sh "$@"
# on_library_update: |
#   for arg in "$@"
#   do
#       echo "detected update on: $arg"
#   done

# network configs
network_buffer_size: 1048576 # 1 MiB
serve_from_rclone: false # serve file data from rclone, not from zurg (zurg will only provide rclone the link to download)
verify_download_link: true # if true, zurg will check if the link is truly streamable; only relevant if serve_from_rclone is set to true (as it already does this all the time if serve_from_rclone is false)
force_ipv6: false # force connect to real-debrid ipv6 addresses
rate_limit_sleep_secs: 6 # wait time after getting a 429 from Real-Debrid API
realdebrid_timeout_secs: 60 # api timeout
retries_until_failed: 5 # api failures until considered failed
# preferred_hosts: # Run ./zurg network-test
#   - 20.download.real-debrid.com
#   - 21.download.real-debrid.com
#   - 22.download.real-debrid.com
#   - 23.download.real-debrid.com
#   - 30.download.real-debrid.com
#   - 31.download.real-debrid.com
#   - 32.download.real-debrid.com
#   - 34.download.real-debrid.com
#   - 40.download.real-debrid.com

# List of directory definitions and their filtering rules
directories:
  # Configuration for anime shows
  # anime:
  #   group: media # directories on different groups have duplicates of the same torrent
  #   group_order: 10 # group order = priority, it defines who eats first on a group
  #   filters:
  #     - and: # you can use nested 'and' & 'or' conditions
  #       - has_episodes: true # intelligent detection of episode files inside a torrent
  #       - any_file_inside_regex: /^\[/ # usually anime starts with [ e.g. [SubsPlease]
  #       - any_file_inside_not_regex: /s\d\de\d\d/i # and usually anime doesn't use SxxExx

  # shows:
  #   group: media
  #   group_order: 20
  #   filters:
  #     - has_episodes: true  # intelligent detection of episode files inside a torrent

  torrents:
    # group: media  # because anime, shows and movies are in the same group,
    # group_order: 30 # anime and shows have lower group_order numbers than movies, so all torrents that don't fall into the previous 2 will fall into movies
    # only_show_the_biggest_file: true # let's not show the other files besides the movie itself
    filters:
      - regex: /.*/ # you cannot leave a directory without filters because it will not have any torrents in it
yowmamasita commented 8 months ago

try with use_download_cache: false

also since you have a default __all__ directory you don't really need to set up any directories, as that also consumes memory

westsurname commented 8 months ago

I'll try this and get back with an update.

Currently I'm attempting to use this as a plug-and-play replacement for parts of my current setup, so I'd like to change as little as possible. It's definitely good to know that it's more efficient to stick with the __all__ directory, though I'm curious how likely it is that the directories config consumes enough memory to cause this issue.

yowmamasita commented 8 months ago

125960192 bytes doesn't sound that much honestly
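
For scale, a quick conversion of the figures from the error message into MiB (my arithmetic, not from the thread):

```python
# Convert the byte counts from the runtime OOM error to MiB.
block_bytes = 4194304      # size of the single allocation that failed
in_use_bytes = 125960192   # heap reported in use at crash time

print(block_bytes / 2**20)   # 4.0 (a 4 MiB block)
print(in_use_bytes / 2**20)  # 120.125 (~120 MiB in use)
```

So the process was holding roughly 120 MiB when a 4 MiB allocation failed, which is indeed far below the machine's 503 GiB of RAM.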

westsurname commented 8 months ago

Same issue with use_download_cache: false, though this time it's 190709760 in use.

yowmamasita commented 8 months ago

I think you have set a very low limit somewhere

westsurname commented 8 months ago

I'm on a shared server with a large amount of RAM and afaik no limit, but I'll run some tests and see if I come across one.

              total        used        free      shared  buff/cache   available
Mem:          503Gi       184Gi       5.7Gi       841Mi       313Gi       313Gi
Swap:            0B          0B          0B
westsurname commented 8 months ago

I just ran a simple script to consume 3 GB of ram and it did so without issue:

Allocating 100.0 MB of RAM
Available RAM: 313.2041702270508 GB, Used RAM: 184.20080947875977 GB
...
Available RAM: 310.5371856689453 GB, Used RAM: 186.91444778442383 GB
Allocating 3000.0 MB of RAM
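
A minimal sketch of an allocation script like the one described above (the chunk size matches the log; the loop count and reporting format are assumptions):

```python
# Allocate RAM in 100 MB chunks, keeping references so the memory
# stays resident, and print progress after each chunk.
chunk_mb = 100
chunks = []
for _ in range(3):  # 300 MB here; raise the count to stress toward 3 GB
    chunks.append(bytearray(chunk_mb * 1024 * 1024))
    print(f"Allocating {chunk_mb} MB of RAM (total {len(chunks) * chunk_mb} MB)")
```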
yowmamasita commented 8 months ago

this is the first time I've seen the error, so unfortunately I cannot help. Maybe best to ask in the plex_debrid Discord

westsurname commented 8 months ago

Not running in docker.

Is this being developed hand-in-hand with plex_debrid?

yowmamasita commented 8 months ago

No but I use their discord a lot because a lot of them also use zurg

westsurname commented 7 months ago

Good news and bad news. The good news is that this appears to be working nicely; the bad news is that the issue was probably pebcak and I was running the arm64 version instead of the amd64 one.

yowmamasita commented 7 months ago

@westsurname ohh 🤦🏼

westsurname commented 1 month ago

So it turns out that this issue never really went away and was just masked by a `pthread_create` error instead. It happens because my user has a virtual memory limit of 10 GB, and as RAM usage increases past 100 MB, the process's virtual memory allocation grows beyond that 10 GB.

$ ulimit -a
core file size          (blocks, -c) 0
data seg size           (kbytes, -d) unlimited
scheduling priority             (-e) 0
file size               (blocks, -f) unlimited
pending signals                 (-i) 2062595
max locked memory       (kbytes, -l) 65536
max memory size         (kbytes, -m) unlimited
open files                      (-n) 200000
pipe size            (512 bytes, -p) 8
POSIX message queues     (bytes, -q) 819200
real-time priority              (-r) 0
stack size              (kbytes, -s) 8192
cpu time               (seconds, -t) unlimited
max user processes              (-u) 2000
virtual memory          (kbytes, -v) 10000000
file locks                      (-x) unlimited
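
Given that diagnosis, one workaround is to raise the soft virtual-memory limit in the shell that launches zurg, so the Go runtime's address-space reservations fit under it. A sketch, assuming the hard limit permits it (the binary path is assumed):

```shell
# Raise the soft virtual-memory limit for this shell before launching zurg.
# "unlimited" works only if the hard limit allows it; otherwise pick a
# larger value in KiB, e.g. 33554432 for 32 GiB.
ulimit -v unlimited
ulimit -v            # verify: should print "unlimited"
# ./zurg             # then start zurg from this same shell so it inherits the limit
```

If zurg runs as a systemd service, the equivalent setting is `LimitAS=` in the unit file rather than `ulimit`.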