Thefrank / jellyfin-server-freebsd

jellyfin-server component for FreeBSD

Memory use spiraling out of control after upgrade to 10.9 #84

Open nx6 opened 2 months ago

nx6 commented 2 months ago

I was running Jellyfin 10.8.13 in a 13.2 jail, under TrueNAS 13.0, and got stuck with that issue where jellyfin 10.9 would not work on a jail that did not match the kernel of the underlying system. So I rolled back my jail and stayed on 10.8.13.

I waited for TrueNAS 13.3 to come out and then updated my jail and Jellyfin to 10.9 again (I'm using the pkg install), so I went directly to 10.9.6.

Within a couple of days of bringing things back online I noticed the RAM usage on my system seemed much higher than it had been under the old TrueNAS 13.0 setup. But it wasn't the OS upgrade itself that started the change in RAM usage: the graphs only began to skew later that day, when I reactivated and upgraded my Jellyfin jail/server app. I checked htop and found this:

(htop screenshot showing the jellyfin process's memory usage)

Note: Plex is running with essentially the same media files as jellyfin here.

I shut down the jail and rebooted the whole system to clear any OS cache and get my RAM back, and I didn't start Jellyfin back up after that. Two days ago I booted the jail back up (so it could scan and update libraries after a bunch of media file changes/renames I've been doing). Here is where we are now:

(system memory usage graph after bringing the jail back up)

Thefrank commented 2 months ago

Do you have "Enable real time monitoring" turned on in Jellyfin? It usually exhausts fds long before RAM, but if it is on, it might be worth testing with it off.
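
If you want to check whether descriptors are piling up, something like this should show it (a rough sketch; the process name match is an assumption about how the pkg names the service):

```sh
# Count open file descriptors held by the Jellyfin process
# (the pgrep pattern is an assumption; adjust to match your install)
pid=$(pgrep -f jellyfin | head -n1)
procstat -f "$pid" | wc -l

# System-wide totals for comparison
sysctl kern.openfiles kern.maxfiles
```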

nx6 commented 2 months ago

Looks like that is active by default on library creation. I thought it didn't do anything on BSD. I'll disable it on all the libraries and relaunch the jail.

badrpc commented 1 month ago

I don't know if it helps, but here are some graphs. Unfortunately I only have stats for virtual memory. It does look exceptionally high, but to me it looks more like a one-off jump, with further jumps possibly corresponding to me adding more media files. I'm not immediately sure it's a leak rather than something like memory-mapped files, for example.

90 days, with the jump clearly visible: (graph)

Shorter range with more detail: (graph)

Resident memory usage remains under 4G, which seems to match the virtual memory usage before I upgraded from one of the older versions.
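
For comparison, a quick sketch of pulling both numbers per process on FreeBSD (the jellyfin name match is an assumption):

```sh
# rss = resident memory, vsz = virtual size, both reported in kilobytes
ps -ax -o pid,rss,vsz,comm | grep -i jellyfin
```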

badrpc commented 1 month ago

I would say the number of open files didn't change with the Jellyfin upgrade: (graph)

nx6 commented 1 month ago

It was showing in the "Inactive" memory usage for me.

(memory usage graph with the growing "Inactive" region labeled)

Thefrank commented 1 month ago

@nx6 or @badrpc are either of you seeing memory pressure elsewhere because of this? Is swap maxing out? Is the OS killing off processes due to low memory?

EDIT: The RES value will give a better picture of what is actually being used. 8G seems rather high if Jellyfin is idle.
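
A couple of quick checks that could answer this on FreeBSD (just a sketch; the exact kernel message wording varies between releases):

```sh
# Is swap being used or maxed out?
swapinfo -h

# Has the kernel killed anything for memory?
# (look for wording along the lines of "was killed: out of swap space")
dmesg | grep -i 'killed'
grep -i 'out of swap' /var/log/messages
```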

nx6 commented 1 month ago

My system doesn't really use its swap space, so I can't say I've seen it. Nothing is being dropped. It's just that my overall system memory usage kept going up and up the longer I left Jellyfin running. I didn't wait to see what happened when it ran out.

Thefrank commented 1 month ago

The only critical value is RES, and that should not constantly go up over time if idle. Up and down is fine, and it will grow if being actively used. Using 8 of the 32G your system has does seem high, especially if it has been idle the entire time. Does Jellyfin still use that same amount after disabling real time monitoring?

Note on some of the behavior: versions of Jellyfin before 10.9 were built using .NET 6. With .NET 8 there were a number of fixes to how dotnet-based programs determine available memory and run GC under FreeBSD. This likely accounts for some of the behavior changes, as the program now has a better idea of how much memory is available to it.
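
If anyone wants to experiment with capping the GC, the .NET runtime reads a hard-limit environment variable; a minimal sketch, assuming you can inject environment into whatever starts Jellyfin inside the jail (that part is install-specific and not shown here):

```sh
# DOTNET_GCHeapHardLimit takes a hex byte count; 0x80000000 is 2 GiB.
# How this reaches the jail's service startup is install-specific and assumed.
export DOTNET_GCHeapHardLimit=0x80000000
# ...then start Jellyfin from this environment so the runtime sees the limit
```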

nx6 commented 1 month ago

Does Jellyfin still use that same amount after disabling real time monitoring?

I disabled the setting on all libraries and rebooted the whole system yesterday. Currently things look like this:

(system memory usage graph after disabling real time monitoring)

badrpc commented 1 month ago

@Thefrank I don't see any evidence of memory pressure on this machine. I only went to check my stats because I saw the discussion in this issue. I still have a decent amount of free RAM (64G total, 14G free according to top), swap is not used, and there are no traces of processes being killed due to an out-of-memory situation. I was also trying to suggest (probably too vaguely) that the amount of virtual memory reported for a process is usually not a problem, and that RES is a better signal.

@nx6 I don't see the overall size of the process (the 265G in my case) reflected in the inactive memory stats for the system. My system reports about 29.79G of inactive memory: vm.stats.vm.v_inactive_count: 7808987 (in 4k pages). I also don't think the amount of inactive memory shown on the graph in https://github.com/Thefrank/jellyfin-server-freebsd/issues/84#issuecomment-2327538721 matches the virtual size of the jellyfin process on your machine, so I wouldn't link the two. And generally I wouldn't worry about the inactive memory growth: inactive memory can be quickly reclaimed by the OS when needed, it's just not freed proactively.
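
For reference, converting that counter to gigabytes is just pages times the 4k page size; a one-liner sketch:

```sh
# Inactive memory in GiB: inactive page count * 4096 bytes per page / 2^30
pages=$(sysctl -n vm.stats.vm.v_inactive_count)
echo "scale=2; $pages * 4096 / 1073741824" | bc
```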

Thefrank commented 1 month ago

How is it going after a few weeks? Is it better with file monitoring off?

nx6 commented 1 month ago

How is it going after a few weeks? Is it better with file monitoring off?

It definitely seems to have slowed down after I fixed that. Currently at 17% of total system memory:

(system memory usage graph several weeks after disabling real time monitoring)