Closed GoogleCodeExporter closed 8 years ago
Here is the memory usage for version 1.4; the previous was from 1.3. Also, here
is a Java dump from the past. Anything else you need, just yell. Thanks
Original comment by abign...@gmail.com
on 21 Feb 2013 at 3:27
Attachments:
And a little bit more. It's a bit more detailed; it's from the last run of the
example code, and none of my code is used.
Original comment by abign...@gmail.com
on 21 Feb 2013 at 4:01
Attachments:
Thanks for all of the info. The only thing really jumping out at me is the
AbstractQueuedSynchronizer, which suggests that I messed up something in the
single-thread executor in Java. I'll set up the demo app to run with jvisualvm
and see if I can duplicate your results. If you care to take a look, the code
in question is located in GlobalScreen.dispatchEvent(final NativeInputEvent e).
Original comment by a...@1stleg.com
on 21 Feb 2013 at 6:03
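For context, the single-thread-executor dispatch pattern being described can be sketched as below. This is a minimal, hypothetical illustration of how such a dispatcher might leak, not the library's actual code: the Event class and the task body are stand-ins, and the key detail is that Executors.newSingleThreadExecutor() is backed by an unbounded queue, so if events are submitted faster than they are processed, the queue (and the heap) grows without limit.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class DispatchSketch {
    // Hypothetical stand-in for NativeInputEvent.
    static final class Event {
        final String name;
        Event(String name) { this.name = name; }
    }

    // Single worker thread; tasks wait in an unbounded internal queue.
    private final ExecutorService dispatcher = Executors.newSingleThreadExecutor();

    // Each event becomes a queued task. If producers outpace the single
    // consumer, queued tasks accumulate and heap usage climbs.
    public void dispatchEvent(final Event e) {
        dispatcher.execute(new Runnable() {
            public void run() {
                System.out.println("dispatched " + e.name);
            }
        });
    }

    public void shutdown() {
        // Lets already-queued tasks finish, then stops the worker thread.
        dispatcher.shutdown();
    }

    public static void main(String[] args) {
        DispatchSketch s = new DispatchSketch();
        s.dispatchEvent(new Event("key press"));
        s.shutdown();
    }
}
```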
Ok, I was able to produce a crash after about 1:52, although it looks like it
may have been caused by a different problem. I did notice that the heap size
continued to balloon up until the crash occurred, although the amount of memory
actually used by the VM appeared to be garbage collected normally. I was
unable to reproduce the large AbstractQueuedSynchronizer usage you were
experiencing. I will try again with a version from the trunk with debugging
symbols to try to get a reasonable back-trace. Can you let me know what you
were profiling when you observed the memory leak? Was it the example
application, or was this after you integrated the library into a project of
your own?
Thanks
Original comment by a...@1stleg.com
on 21 Feb 2013 at 9:23
Hi, thanks for your continued efforts with this issue. The profiles in the
second and third of my posts are from your example code, "Global Keyboard
Listener". I have attached an image of the settings I use in Netbeans. I am
using Java 1.7.0_09 on the Win7 machine. I will update to the latest version,
re-run the test, and post my results. Thanks
Original comment by abign...@gmail.com
on 21 Feb 2013 at 11:48
Oops, I forgot to attach my image.
Original comment by abign...@gmail.com
on 21 Feb 2013 at 11:49
Attachments:
Hi, any headway with this issue? I have not had a chance to look into it
myself, and won't until I have most other features of my program implemented.
Thanks for your support.
Original comment by abign...@gmail.com
on 25 Feb 2013 at 1:19
Hi, I used the 1.1 trunk to test under Linux with -Xms set to 20M, and I was
unable to reproduce the problem with the built-in example application. The only
code I changed was clearing the output JTextArea every 100 lines or so.
Memory, heap size, and garbage collection all appear to be stable. I have
attached the working build I am using. Give it a try and see if it's still
leaking memory for you.
If it is still leaking, there are a few things I would like to try. Have you
tested outside of Netbeans? I believe the Oracle JDK (not sure about
IcedTea/OpenJDK) ships with jvisualvm. Give it a try and see if it reports the
same memory usage. Also, try starting the app from Netbeans and from the
command line to see if the results are the same. If you are still
experiencing an issue, can you attach the .java file you are using to produce
the problem?
Thanks.
Original comment by a...@1stleg.com
on 25 Feb 2013 at 5:24
Attachments:
That should have been -Xmx not -Xms*
Original comment by a...@1stleg.com
on 25 Feb 2013 at 6:48
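For anyone re-running this test: the heap ceiling that -Xmx actually imposed can be read back from inside the JVM with the standard Runtime API. This is a generic sketch, not part of the library; note that the reported value is often slightly below the flag's value and varies by JVM.

```java
public class MaxHeap {
    public static void main(String[] args) {
        // Runtime.maxMemory() reports the maximum heap the JVM will attempt
        // to use, i.e. (approximately) the -Xmx limit when one is set.
        long maxMb = Runtime.getRuntime().maxMemory() / (1024 * 1024);
        System.out.println("max heap ~" + maxMb + " MB");
    }
}
```

Running with `java -Xmx20m MaxHeap` should print a figure in the neighborhood of 20 MB, confirming which flag took effect.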
Thanks, I'll be trying this today, I'll post the results.
Original comment by abign...@gmail.com
on 27 Feb 2013 at 9:53
Hey,
So I have now replicated the leak on Linux, XP, and Win7 machines. This latest
run was done with jvisualvm attached to the jar file run from the cmd prompt.
As you can see from the screenshot, the leak is still occurring; however, the
memory is no longer assigned to the concurrent lock classes but to the
primitive array types byte[] and char[], and also a TreeMap object. I have
attached my findings and the code I have been testing, found on your wiki.
Thanks.
Original comment by abign...@gmail.com
on 27 Feb 2013 at 11:54
Attachments:
I'll let it run for a couple of days on my system to see if it leaks. It seems
to go up to about 5 MB and back down to 1.9 MB over and over consistently until
I enable the profiler. At that point I see the memory spike, followed by some
GC, followed by a gradual increase in memory usage until the profiler
terminates. After that, memory looks like it is back to normal.
Original comment by a...@1stleg.com
on 28 Feb 2013 at 12:59
Attachments:
I am pretty sure that if there was a leak, it is definitely fixed at this
point. The attached screenshot shows the sample program running with the latest
library. The max heap size was set to 6 MB. It starts out around 1 MB and
appears to "leak" a bit of memory (~400 KB) each minute. This continued until
the JVM performed GC (automatically) at about 2:06:45, bringing the heap usage
back down to below the 1 MB starting point. This cycle appears to continue
indefinitely. I believe this is a performance "feature" of the JVM, as doing
complete garbage collection all the time would be very expensive. The only
other thing I noticed is that memory usage does appear to increase when the
profiler is running, as I mentioned in my previous post. That is probably also
normal behavior.
Original comment by a...@1stleg.com
on 28 Feb 2013 at 10:26
Attachments:
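The sawtooth pattern described above (gradual growth until a GC drops usage back near the baseline) can also be watched without a profiler, by sampling the heap from inside the process. This is a generic sketch using the standard Runtime API, not the issue's test code; the allocation loop is only there so successive samples have something to show:

```java
public class HeapWatch {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        for (int i = 0; i < 3; i++) {
            // Allocate ~1 MB of garbage so successive samples can differ.
            byte[] junk = new byte[1 << 20];
            junk[0] = 1;
            // used = committed heap minus free heap; sampled periodically
            // in a long-running app, this traces the GC sawtooth.
            long usedKb = (rt.totalMemory() - rt.freeMemory()) / 1024;
            System.out.println("used heap: " + usedKb + " KB");
        }
    }
}
```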
Ok, I'll run it overnight and see if it crashes or reclaims the memory. In
the past the JVM crashed instead of performing the massive GC. I'll post the
results tomorrow.
Original comment by abign...@gmail.com
on 1 Mar 2013 at 12:41
I am going to call this fixed for now. If you notice again, please reopen.
Original comment by a...@1stleg.com
on 6 Mar 2013 at 11:10
Original issue reported on code.google.com by
abign...@gmail.com
on 20 Feb 2013 at 9:48
Attachments: