SneaksAndData / arcane-operator

Kubernetes-native data streaming service based on Akka.NET
Apache License 2.0

[BUG] Potential Memory Leak #120

Open jrbentzon opened 1 month ago

jrbentzon commented 1 month ago

Description

We've seen our Arcane operator's memory usage increase until it hits our configured limits, at which point the pod is restarted by Kubernetes. There are only 5 streams attached to the operator, so it looks like it could be a memory leak (memory usage graph attached).

Steps to reproduce the issue

Monitor Arcane Operator Memory
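
To help tell managed-heap growth apart from native/unmanaged growth while reproducing this, a small in-process probe along the following lines could be run next to the pod-level metrics. This is a hedged sketch for diagnosis only; `MemoryProbe` is a hypothetical helper and not part of Arcane.Operator.

```csharp
using System;
using System.Diagnostics;
using System.Threading;
using System.Threading.Tasks;

public static class MemoryProbe
{
    // Periodically log managed heap size, process working set, and Gen 2
    // collection count so the growth pattern can be correlated with the
    // Kubernetes memory graph.
    public static async Task RunAsync(CancellationToken token)
    {
        while (!token.IsCancellationRequested)
        {
            var managedBytes = GC.GetTotalMemory(forceFullCollection: false);
            var workingSetBytes = Process.GetCurrentProcess().WorkingSet64;

            Console.WriteLine(
                $"{DateTime.UtcNow:O} managed={managedBytes / 1024 / 1024} MiB " +
                $"workingSet={workingSetBytes / 1024 / 1024} MiB " +
                $"gen2Collections={GC.CollectionCount(2)}");

            await Task.Delay(TimeSpan.FromSeconds(30), token);
        }
    }
}
```

If the working set grows while the managed heap stays flat, the leak is more likely in native resources (sockets, buffers) than in retained .NET objects.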

Describe the results you expected

Memory usage should stay more or less constant.

System information

v0.0.10

s-vitaliy commented 1 month ago

Need to investigate. I don't see this behavior on our cluster, but we have not performed a massive migration of our streams. I will add more streams and check how much memory Arcane.Operator consumes. The problem may also lie in the cache and event deduplication logic.
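
If the cache/event-deduplication path does turn out to be the culprit, one mitigation would be to bound it explicitly. Below is a minimal sketch using `Microsoft.Extensions.Caching.Memory`; the class name, size limit, and expiration are illustrative assumptions, not the operator's actual implementation.

```csharp
using System;
using Microsoft.Extensions.Caching.Memory;

// Hypothetical size-bounded deduplication cache: entries are capped and expire,
// so the set of "seen" events cannot grow without bound.
public sealed class BoundedEventDeduplicator
{
    private readonly MemoryCache _seenEvents = new(new MemoryCacheOptions
    {
        // Cap the number of tracked entries.
        SizeLimit = 10_000
    });

    // Returns true the first time a given event key is seen, false for duplicates.
    public bool TryMarkSeen(string eventKey)
    {
        if (_seenEvents.TryGetValue(eventKey, out _))
        {
            return false; // duplicate event, skip processing
        }

        _seenEvents.Set(eventKey, true, new MemoryCacheEntryOptions
        {
            Size = 1,
            // Entries for events that stop recurring are eventually evicted.
            SlidingExpiration = TimeSpan.FromMinutes(30)
        });

        return true;
    }
}
```

A bounded cache trades a small chance of reprocessing an old event for a hard ceiling on memory, which seems like the right trade-off for an operator that should run indefinitely.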