Open scottkurz opened 1 year ago
2023-11-15 DXDI UPDATE:
Received data from internal users; HC data shows a large amount of time spent in compilation.
Adding another idea: what if we introduced a behavior of only compiling Java files when they are newer than their corresponding class files?
Though perhaps this would be too complicated to compute: even if every BB.class is newer than the corresponding BB.java, I might still need to recompile BB if its dependency AA has been updated.
But suppose every .class file were newer than every .java file? Then we wouldn't need to compile at all in that circumstance, right? Could that case actually be typical enough to make a difference? Might it only work with a larger compileWait parameter setting (than the default 0.5s)?
Something to consider as we prototype?
Are there any correctness concerns here? Is there any type of race condition such that, for classes A and B with A dependent on B, the pair could be compiled so that the A.class file in target/classes ends up having been compiled against the old version of B?
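The "every .class newer than every .java" check could be prototyped with a simple mtime scan over the source and output trees. A minimal sketch of that idea (the class and helper names here are hypothetical illustrations, not liberty-maven-plugin API):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.stream.Stream;

// Hypothetical helper sketching the "skip compile" heuristic:
// compilation can be skipped when the oldest .class file is newer
// than the newest .java file (so no source changed since any compile).
public class StalenessCheck {

    static long newestMtime(Path root, String suffix) throws IOException {
        try (Stream<Path> s = Files.walk(root)) {
            return s.filter(p -> p.toString().endsWith(suffix))
                    .mapToLong(p -> p.toFile().lastModified())
                    .max().orElse(Long.MIN_VALUE);  // sentinel: no matching files
        }
    }

    static long oldestMtime(Path root, String suffix) throws IOException {
        try (Stream<Path> s = Files.walk(root)) {
            return s.filter(p -> p.toString().endsWith(suffix))
                    .mapToLong(p -> p.toFile().lastModified())
                    .min().orElse(Long.MAX_VALUE);  // sentinel: no matching files
        }
    }

    /** True when every .class under classesDir is newer than every .java under srcDir. */
    static boolean canSkipCompile(Path srcDir, Path classesDir) throws IOException {
        long oldestClass = oldestMtime(classesDir, ".class");
        // No compiled output yet: must compile (the MAX_VALUE sentinel would
        // otherwise wrongly report "skip").
        if (oldestClass == Long.MAX_VALUE) {
            return false;
        }
        return oldestClass > newestMtime(srcDir, ".java");
    }
}
```

Note this coarse whole-tree comparison sidesteps the per-file dependency problem above (AA changing behind an up-to-date BB) precisely because it only skips when *no* source is newer than *any* class file; filesystem timestamp granularity and the compileWait window would still need care in a real prototype.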
Though I don't have a public recreate yet, I have an internal report of a dev mode slowdown in a project with a large number (~7K) of Java source files.
Tools like Process Explorer show high CPU usage, and the overall system becomes unresponsive and slow.
I have experienced this with both Semeru and Temurin Java 17 JDKs.
One idea to consider: in a Liberty Tools IDE use case, could it make sense to take advantage of the fact that the IDE is already watching for certain file changes and building upon detection (e.g. running javac upon .java file changes), and to have a "lighter" dev mode which, say, doesn't bother to watch for Java changes?
What about something like filtered web resources? I'm not sure whether, in Eclipse for example, the m2e integration would call the appropriate mvn goal if a change to a filtered web resource were detected.
We need to start by profiling and understanding where the performance slowdown is coming from in more detail.