dmuatdbh opened 5 years ago
How big is the source file? If it's "small", can you post it? Or at least have a small program that demonstrates the problem?
Is there anything special about the code? For example, do you have an enum with 5 billion values?
Thanks for your answer. We want to compile *.java sources with a total size of 98 MB. That is just our source dir. Additionally, we use external libraries with a total size of 190 MB.
There is nothing special about our source code. The biggest Java file has a size of 116 KB. I can't post our source files, but I will build a dummy project that comes as close to the original as possible (amount of files and external dependencies) and post it, if the OOME is reproducible.
Hello @rspilker,
I managed to reproduce the OOME in a small program. You can clone the repo from here. Just open the pom.xml in your IDE and run the main from ClassGenerator.java. That will generate a bunch of files with Apache FreeMarker. The files use the @Getter and @Setter annotations from Lombok. If you compile all these newly generated files, you'll get the error.
If I write the getters and setters myself without using Lombok, I don't get the OutOfMemoryError while compiling all the files.
I hope it'll help you to understand the problem.
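For reference, here is a minimal sketch of the kind of file involved (the class and field names are made up, not taken from the linked repro project). With Lombok, the body would be just the annotated field; the handwritten equivalent below compiles without Lombok and was reported not to trigger the OOME:

```java
// Hypothetical example of one generated file. The Lombok variant would be:
//
//     @lombok.Getter @lombok.Setter
//     private String name;
//
// The handwritten equivalent, which avoids the annotation processor entirely:
public class GeneratedDto {
    private String name;

    public String getName() { return name; }

    public void setName(String name) { this.name = name; }

    public static void main(String[] args) {
        GeneratedDto dto = new GeneratedDto();
        dto.setName("test");
        System.out.println(dto.getName());
    }
}
```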
Thank you.
Has anybody got an idea how to handle this problem, or does anyone have the same problem? The stakeholders that use our framework can't compile our sources without running into the OutOfMemoryError.
I will look into it; I already have an idea what might cause this problem.
Small update:

- After looking at some heap dumps I think there is a bug in javac 8: it creates millions of additional ImportScope objects for every annotation processing round. This is gone in Java 14, so most likely they have already fixed that problem. Simply switching to a newer compiler version might solve your problem.
- asts in JavacTransformer can be removed completely; it does not make sense to keep all ASTs in memory.
- WeakHashMaps in JavacAugments: even though these maps are weak, they never get cleaned up, because the keys are parts of the compilation unit. That means that these maps require additional memory just by keeping millions of references. It might be possible to clean them up by hand at some point, but this requires some bigger refactoring.

Can you try to compile it using Java 14? I will create a PR to fix the first issue I found; that might be useful too.
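The WeakHashMap point can be illustrated with a small standalone sketch (the names here are hypothetical stand-ins, not actual Lombok internals): as long as the owner of the keys remains strongly reachable, a GC reclaims nothing from the weak map, so the entries pile up for the lifetime of the compilation unit.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.WeakHashMap;

public class WeakMapDemo {
    // Fills a WeakHashMap with keys that stay strongly reachable from
    // 'owner' (a stand-in for the compilation unit holding its AST nodes)
    // and reports how many entries survive a GC.
    static int survivingEntries(int n) {
        Map<Object, String> augments = new WeakHashMap<>();
        List<Object> owner = new ArrayList<>();
        for (int i = 0; i < n; i++) {
            Object astNode = new Object();
            owner.add(astNode);              // key stays strongly reachable
            augments.put(astNode, "augment-" + i);
        }
        System.gc();
        // Nothing was collected: weak keys are only cleared once no strong
        // reference to them remains anywhere.
        return augments.size();
    }

    public static void main(String[] args) {
        System.out.println(survivingEntries(100_000)); // prints 100000
    }
}
```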
Nice work :)
When compiling with Java 14, the OutOfMemoryError is gone. Thank you for that hint. As we will be compiling with Java 8 for a while, your fix might help us in the meantime.
I am not sure if this change will really help in this case; the javac problem consumes far more memory. Is there a specific reason you have to use an old compiler? IIRC you can compile for an older Java version using a modern compiler without any problems.
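For anyone following along: a newer javac can still emit Java 8 class files via its release flag. A minimal maven-compiler-plugin sketch of that setup (hedged: not tested against this project, and the plugin version is omitted on purpose):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <configuration>
    <!-- Run the newer JDK's javac but compile against the Java 8 API
         and emit Java 8 class files. -->
    <release>8</release>
  </configuration>
</plugin>
```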
@dmuatdbh Could you switch to using the Java 14 compiler to generate Java 8 class files? Or will the code or some dependencies not even compile using a more modern version of Java?
Thank you both for your fast replies. This is exactly the problem. We are still using dependencies on the com.sun packages, which won't compile in that case. Anyhow, we should get rid of them and then compile with a Java 14 compiler.
Just a short feedback: with a (locally built) lombok.jar containing the fixes in the JavacTransformer (master branch), the OutOfMemoryError remains. The solution is to compile with Java 14.
@dmuatdbh - Is Java 14 the minimum version needed, or will Java 11 also work?
Yes, only the Java 14 compiler in combination with the fix in Lombok (see previous post) solved our problem.
Checking if this problem was addressed in Java 8. If not, does this work in Java 11?
Yes, when compiling a huge number of files with Java 8 and Lombok, you might get this error. Compiling with Java 14, the error is gone. I don't know if it is gone with Java 11.
It was not working with Java 11 when I last tried, about a year back.
When compiling the Java sources of one of our products, we get an OutOfMemoryError after a while. We compile with up to 4 GB of RAM.
We can reliably reproduce the OOME when compiling the sources. We tried Lombok 1.18.4 and upgraded to 1.18.9, but the result remains the same. The heap dump (up to 3 GB) shows that Lombok holds 1.5 GB of memory.
Here is the stack trace:
Our expectation was that no OOME occurs.