@jypma can you try the latest beta please? https://github.com/sblantipodi/firefly_luciferin/wiki/How-to-install-a-BETA
What Linux distro are you using? KDE, GNOME, or something else?
I'll have a go with the beta, thanks!
I'm using Arch Linux, with neither KDE nor GNOME, just i3 as the window manager on plain X11.
I've tried with a beta version, and I'm still seeing the same behavior. The memory usage increases by about 2 MB per second while streaming is active.
@jypma I was able to reproduce the issue, I'll keep you posted. Thanks for reporting :+1:
@jypma I stand corrected: I'm not able to reproduce it. Memory usage tops out a bit higher than Xmx since Luciferin uses off-heap memory for native calls to the OS through JNA, but then it stops growing.
Using -Xmx256M is not recommended since it's not enough for all the capture and post-processing work.
Using the default -Xmx1024M, my RAM usage tops out at 1.1 GiB.
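To illustrate what I mean by off-heap (a minimal sketch, not Luciferin's actual code): memory allocated through direct buffers or native calls doesn't count against -Xmx, so RES can legitimately sit above it:

```java
import java.nio.ByteBuffer;

// Minimal sketch: a direct buffer lives outside the Java heap, so the
// process's resident memory can exceed -Xmx while the heap stays small.
// Only -XX:MaxDirectMemorySize bounds this kind of allocation.
public class OffHeapDemo {
    public static void main(String[] args) {
        ByteBuffer direct = ByteBuffer.allocateDirect(64 * 1024 * 1024); // 64 MB off-heap
        System.out.println("off-heap capacity: " + direct.capacity() + " bytes");
    }
}
```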
Have you tried waiting to see what the maximum memory used is? What tool do you use to monitor RAM usage?
I'm monitoring the heap usage using `jconsole`, and the process usage using `top`. With `-Xmx256`, GC seems to be keeping up. Note that it's not the heap usage that's increasing, it's non-heap usage (memory allocated by the Java process, but not heap). So while I'm running the application:

- heap usage stays flat (per `jconsole`);
- the process's resident memory keeps climbing (per `top`).

Do you know which native libraries might be in use that stream the video frames into Java? Perhaps it's something outside of Firefly that has the leak.
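For reference, the same numbers can be sampled from inside the JVM without `jconsole` (a minimal sketch; note that the JMX "non-heap" figure only covers metaspace and code cache, so native allocations made through JNA won't show up here, only in `top`/`pmap`):

```java
import java.lang.management.BufferPoolMXBean;
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;

// Periodically print heap, JMX non-heap, and NIO buffer-pool usage.
public class MemoryProbe {
    public static void main(String[] args) throws InterruptedException {
        MemoryMXBean mem = ManagementFactory.getMemoryMXBean();
        while (true) {
            System.out.printf("heap=%d MB, non-heap=%d MB%n",
                    mem.getHeapMemoryUsage().getUsed() >> 20,
                    mem.getNonHeapMemoryUsage().getUsed() >> 20);
            // The "direct" pool tracks ByteBuffer.allocateDirect; raw native
            // allocations made by JNA/GStreamer are invisible to these beans.
            for (BufferPoolMXBean pool : ManagementFactory.getPlatformMXBeans(BufferPoolMXBean.class)) {
                System.out.printf("  buffer pool %s: %d MB%n", pool.getName(), pool.getMemoryUsed() >> 20);
            }
            Thread.sleep(5_000);
        }
    }
}
```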
I'll also revert to 2.9.2 (the last version I was running before the upgrade), to see what behavior we get there.
@jypma GStreamer is a library that uses native APIs. Which version are you using?
You can check it via `gst-launch-1.0 --version`.
GStreamer is not bundled in Luciferin's deb or rpm package, so this depends on your installation.
Reverting to a previous version of GStreamer may be worth a try, but be sure to use the package manager correctly so as not to compromise your installation.
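If you want to double-check which GStreamer the Java binding actually loads (it can differ from what's first on your PATH), here is a minimal sketch using gst1-java-core, the `org.freedesktop.gstreamer` binding Luciferin uses; I'm assuming its `Gst.getVersionString()` helper here:

```java
import org.freedesktop.gstreamer.Gst;

// Minimal sketch: report the native GStreamer version the Java binding
// actually linked against, which may differ from gst-launch-1.0 on the PATH.
public class GstVersionCheck {
    public static void main(String[] args) {
        Gst.init("version-check", args);
        System.out.println(Gst.getVersionString());
    }
}
```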
It's currently:

```
gst-launch-1.0 version 1.22.6
GStreamer 1.22.6
```

and I seem to have upgraded from `gstreamer-1.20.3` during the upgrade. I'll try those permutations as well, once I have some time :)
ok please keep me posted @jypma :)
Here's what I've been able to test so far. Reverting to 2.9.2 logs `MQTTManager - Can't send MQTT msg` and crashes. Memory usage does not seem to rise in the same way as 2.12.5 does in those seconds, though. On 2.12.5, I also keep seeing this warning:
```
WARNING: JNA: Callback org.freedesktop.gstreamer.elements.AppSink$2@705b6b8c threw the following exception
java.lang.IndexOutOfBoundsException: Index -634 out of bounds for length 57600
    at java.base/jdk.internal.util.Preconditions.outOfBounds(Unknown Source)
    at java.base/jdk.internal.util.Preconditions.outOfBoundsCheckIndex(Unknown Source)
    at java.base/jdk.internal.util.Preconditions.checkIndex(Unknown Source)
    at java.base/java.util.Objects.checkIndex(Unknown Source)
    at java.base/java.nio.Buffer.checkIndex(Unknown Source)
    at java.base/java.nio.DirectIntBufferU.get(Unknown Source)
    at org.dpsoftware.grabber.ImageProcessor.calculateBlackPixels(ImageProcessor.java:282)
    at org.dpsoftware.grabber.ImageProcessor.autodetectBlackBars(ImageProcessor.java:225)
    at org.dpsoftware.grabber.GStreamerGrabber$AppSinkListener.rgbFrame(GStreamerGrabber.java:188)
    at org.dpsoftware.grabber.GStreamerGrabber$AppSinkListener.newSample(GStreamerGrabber.java:322)
    at org.freedesktop.gstreamer.elements.AppSink$2.callback(AppSink.java:232)
    at java.base/jdk.internal.reflect.DirectMethodHandleAccessor.invoke(Unknown Source)
    at java.base/java.lang.reflect.Method.invoke(Unknown Source)
    at com.sun.jna.CallbackReference$DefaultCallbackProxy.invokeCallback(CallbackReference.java:585)
    at com.sun.jna.CallbackReference$DefaultCallbackProxy.callback(CallbackReference.java:616)
```
and on the latest beta (slightly different):
```
WARNING: JNA: Callback org.freedesktop.gstreamer.elements.AppSink$2@157b24bd threw the following exception
java.lang.IndexOutOfBoundsException
    at java.base/java.nio.Buffer$1.apply(Buffer.java:757)
    at java.base/java.nio.Buffer$1.apply(Buffer.java:754)
    at java.base/jdk.internal.util.Preconditions$4.apply(Preconditions.java:213)
    at java.base/jdk.internal.util.Preconditions$4.apply(Preconditions.java:210)
    at java.base/jdk.internal.util.Preconditions.outOfBounds(Preconditions.java:98)
    at java.base/jdk.internal.util.Preconditions.outOfBoundsCheckIndex(Preconditions.java:106)
    at java.base/jdk.internal.util.Preconditions.checkIndex(Preconditions.java:302)
    at java.base/java.nio.Buffer.checkIndex(Buffer.java:768)
    at java.base/java.nio.DirectIntBufferU.get(DirectIntBufferU.java:358)
    at org.dpsoftware.grabber.ImageProcessor.calculateBlackPixels(ImageProcessor.java:290)
    at org.dpsoftware.grabber.ImageProcessor.autodetectBlackBars(ImageProcessor.java:233)
    at org.dpsoftware.grabber.GStreamerGrabber$AppSinkListener.rgbFrame(GStreamerGrabber.java:187)
    at org.dpsoftware.grabber.GStreamerGrabber$AppSinkListener.newSample(GStreamerGrabber.java:323)
    at org.freedesktop.gstreamer.elements.AppSink$2.callback(AppSink.java:232)
    at java.base/jdk.internal.reflect.DirectMethodHandleAccessor.invoke(DirectMethodHandleAccessor.java:103)
    at java.base/java.lang.reflect.Method.invoke(Method.java:580)
    at com.sun.jna.CallbackReference$DefaultCallbackProxy.invokeCallback(CallbackReference.java:585)
    at com.sun.jna.CallbackReference$DefaultCallbackProxy.callback(CallbackReference.java:616)
```
These are occurring continually, possibly on every frame. I didn't pay attention to them before, figuring it might be a different issue. I don't know JNA that well, but an exception inside a callback there might be a reason for some memory to disappear.
Now that you mention `gstreamer`, it might be related? Still, it's strange... looking at `ImageProcessor`, it sensibly assumes the `IntBuffer` to have `width * height` size.
I think the above exception is the biggest smoking gun.
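To make both suspicions concrete, here's a hypothetical sketch (not Luciferin's actual code): if `offsetY` can go negative, the computed index is negative and `IntBuffer.get()` throws; and if that exception escapes the app-sink callback before the native sample is released, the frame's native memory would never be freed, leaking at roughly frame rate:

```java
import org.freedesktop.gstreamer.FlowReturn;
import org.freedesktop.gstreamer.Sample;
import org.freedesktop.gstreamer.elements.AppSink;

import java.nio.IntBuffer;

public class LeakSketch {

    // A negative offsetY makes the index negative, and IntBuffer.get(int)
    // throws IndexOutOfBoundsException, matching the traces above.
    static int pixelAt(IntBuffer rgb, int width, int x, int offsetY) {
        return rgb.get((offsetY * width) + x);
    }

    // Hypothetical listener shape: releasing the sample in finally ensures the
    // native frame is freed even when per-frame processing throws.
    static FlowReturn onNewSample(AppSink sink) {
        Sample sample = sink.pullSample();
        try {
            // ... map the buffer, run black-bar detection, etc. ...
            return FlowReturn.OK;
        } finally {
            sample.dispose(); // without this, each failing frame leaks native memory
        }
    }
}
```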
mmm... this is interesting... but do the LEDs work correctly when the exception occurs?
Can you attach the complete FireflyLuciferin.yaml you are using, please? Please use the latest beta with the latest firmware, and save settings just to overwrite the configuration file before sending it to me.
The LEDs work fine when the exception occurs. Which is strange :smile:
Current config is here: FireflyLuciferin.yaml.gz (I actually have 180 LEDs but I haven't updated the config for that yet)
As mentioned above, I'm using firmware 5.11.8, which is the latest release I found on GitHub. Should I use a different one? I don't think the firmware is likely to be causing a memory leak in the client, though; it's displaying LED data from both Luciferin versions just fine.
I decided to dive a little deeper, and a bit of extra logging showed `offsetY` becoming negative, resulting in the buffer index being negative as well. The root cause was `calculateBorders` diving below zero for low resolutions. Blame my still-not-dead low-res plasma TV :-)
PR is here: https://github.com/sblantipodi/firefly_luciferin/pull/150
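In short, the fix boils down to clamping the computed border so a low-resolution source can never drive it, and therefore `offsetY`, negative. A hypothetical one-liner just to illustrate the shape of it; the actual change is in the PR above:

```java
public class BorderClamp {
    // Hypothetical illustration of the fix's shape; see the linked PR for the real change.
    static int clampBorder(int computedBorder) {
        return Math.max(0, computedBorder); // never below zero, so offsetY stays non-negative
    }
}
```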
I love this kind of issue that ends with a pull request :D thank you very much @jypma, this is much appreciated!!!
PS: A new release is coming soon and it will obviously include your fix.
Firefly Luciferin version
2.12.5
Glow Worm Luciferin version
5.11.8
Firmware type
FULL
What is the stream method?
MQTT Stream
Firefly Luciferin config file
Relevant log output
How to reproduce
Start the video grabber, everything with default options, except for streaming through MQTT.
I've manually started the application (on JDK 21) to rule out heap usage. The heap and metaspace both look fine, but the RES memory usage of the process increases by ~100 MB every few seconds. I've looked at some hints to diagnose, but nothing stands out there. NIO ByteBuffers aren't much in use. In `pmap` I did find a lot of 64 MB entries, so something is allocating memory in 64 MB chunks (and it's probably not heap).
That's as far as I got... any hints? I'm unsure which version introduced this; it's been a while since I've attempted to upgrade :-)
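One diagnostic avenue I haven't tried yet, so take it as a suggestion only: JDK Native Memory Tracking attributes the JVM's own native allocations. It can't see raw malloc calls made inside GStreamer/JNA, so if NMT's total stays flat while RES keeps growing, the leak is on the native side. Assuming the jar is launched directly (jar name assumed):

```
java -XX:NativeMemoryTracking=summary -jar FireflyLuciferin.jar
jcmd <pid> VM.native_memory summary
```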