Open CarlosMOGoncalves opened 2 years ago
Hi @CarlosMOGoncalves,
Thank you for the very detailed analysis, compiled results and reproducer - It's much appreciated! I agree there is definitely not a hardware limitation here as I was able to observe the same performance hit on an i5-12600k and very fast NVMe storage.
I have been able to validate your claims using Zulu JDK 8 / 11 / 17, observing 60-70% increased deployment times on JDK 17 over JDK 8 on Windows 10.
When using Ubuntu 20.04.3 LTS, I observed almost no increase in deployment time on JDK 17 over JDK 8, where a few deployments were slightly faster on JDK 17.
We agree this should be looked into and have therefore raised this issue with the development team as JIRA FISH-6432.
Thank you again for the detailed report, James
Hello @JamesHillyard ,
Thanks for checking it out. I never got to try it with a Linux distribution, as all our development environments are Windows. Clearly there is something fishy here when running on Windows, which I am now very curious about.
However, I now have a delicate question for you: since we are fast approaching the final Payara 5 Community release, is it possible that a fix for this matter could make it into it?
Because otherwise it would only make it into Payara 6 and, although I plan on upgrading ASAP, it would also leave JDK 17 support on Payara 5 a bit troublesome.
Thanks
Hi @CarlosMOGoncalves,
We can't guarantee the fix will be ready in time for the final community release, however we would like to include it in Payara Community.
If the fix is ready after the final release, it will be available in Payara 5 Enterprise, which would be recommended if you, or anyone else, wishes to continue using Payara 5 after the final Community release.
We hope you understand, James
Hello James,
I perfectly understand, and I know this issue probably came a bit "at the last minute". I really hope you make it, so that I can avoid the overhead of moving the whole codebase to the jakarta namespace as well as to JDK 17 in one go on Payara 6.
Thanks all
@JamesHillyard Any news for us? :-)
Hi @CarlosMOGoncalves and @mkarg,
Our development team has been looking into this recently, and it appears this issue is significantly deeper than first meets the eye. We are continuing to work on this, and your patience in the meantime is much appreciated.
Thanks, James
Hello @JamesHillyard ,
Thanks for the update. I have to admit that I am very much looking forward to this being worked on, because I really want to jump to JDK 17, but I suspected it could be something tricky as soon as it became clear the behaviour differed from OS to OS.
I would be very glad if you could keep us updated on this matter.
Hello @JamesHillyard ,
I have just noticed that the latest release for Payara Community seems to have included a fix for this matter, although I haven't tested it myself.
Can you provide some information on what was happening and why? And why did creating that file to read the canonical name bottleneck the whole deployment business on JDK 12 and above? This is merely for my own curiosity's sake 😄
Also, I noticed it was merged into Payara 5... however, we are past the final release for Payara 5 Community. Is there any chance, even if only as an extraordinary release, that you could consider shipping it as a final treat for the users, given the impact this has on Windows?
Regardless of that, thank you for this fix!
We are seeing a similar issue with Payara 6.2024.2 (Full) and Zulu OpenJDK 17. Deployment with 6.2023.2 and Zulu OpenJDK 11 used to take around 90 seconds, but with Payara 6.2024.2 and JDK 17 it now takes around 300 seconds. Are there any configurations to tweak to get the deployment faster?
Description
Hello everyone.
I'm not quite sure this is a bug per se, but it certainly needs looking at.
I am trying out JDK 17 on Payara Platform, both Server and Micro. My goal is to upgrade several applications from JDK 8 straight to JDK 17.
The moment I made the (small) changes in the pom.xml to compile the code for JDK 17 and actually ran it as a Payara Micro uberJAR, I found that the JDK 17 app starts a lot slower.
In my particular case, it will take somewhere from 50% up to almost 100% more time. Let's say 10 seconds on JDK 8 up to 18s on JDK 17. And it is not even a particularly large or complex app.
This happens only with JDK 17 and not with JDK 11 (I also tried it out in order to check whether this was 17 specific).
The issue seems to be connected with a large increase in the Class Scanning phase of deployment and, I suspect, with the ASM library.
Expected Outcome
Well, I expected JDK 17 apps to actually start faster than with JDK 8, but at the very least they should be as fast as with JDK 8.
Current Outcome
Currently, any application run on JDK 17 takes longer to deploy. This increase in time is directly proportional to the number of dependencies the app has, as well as the number of classes those dependencies package.
In order to get some more info I have used the deployment trace feature that @pdudits has written about in an unrelated post on your forum (thanks Patrick, I needed it for something else, but it was really useful here).
This is what I have found out:
In JDK 8
This is a trimmed-down actual example of an app being deployed on Payara Micro. You are likely familiar with this; it shows the phases of deployment. Here, for example, CREATE_DEPLOYMENT_CONTEXT Full takes around 6 seconds.
In JDK 11
The very same app takes roughly the same time (the extra second here could be caused by anything; subsequent deployments took around the same as JDK 8).
In JDK 17
This is where it gets rough: the total deployment time went up by a full 15 seconds over JDK 8 for the very same app. However, it clearly shows where the lion's share of the difference goes.
We can now see that creating the deployment context takes almost 3 times longer (6147 ms vs 16554 ms).
In order to have a clearer picture of what is happening, I have increased the logging detail and dug into the code. This is what I have found:
In DeployCommand.class there is a line that scans for deployable types (DeployCommand::515).
That builds a Parser at ApplicationLifecycle::721, which in turn scans the entire archive in Parser.parse(). This parser uses an Executor Service, which on my machine runs 3 threads (I haven't checked where it is created, so it might be sized differently on other machines). This executor is in charge of running the tasks, which call doJob().
doJob() defines the action each task takes when scanning the archives. When parsing, it calls either handleJar or handleEntry (see ReadableArchiveScannerAdapter.onSelectedEntries()): handleJar tries to find a sub-archive, while handleEntry actually parses each class it finds. And here is where the issue seems to lie: when visiting the class using the ClassReader.
Somehow, on JDK 17, this parsing step for each class takes a lot longer than on JDK 8 or even JDK 11. I suspect it has something to do with ASM's ClassReader taking longer to visit a JDK 17-compiled class than a JDK 8 one, though I am still not quite sure. To dig further I tried replacing the ASM jars with more recent ones (9.2 and 9.3) to see if that would improve performance, but it had no effect; all in all, it took roughly the same time.
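To isolate that cost outside Payara, something along these lines can be run against the same JAR on both JDKs. This is just a minimal sketch of mine, not Payara's actual scanner code, and it assumes an ASM 9.x jar (org.objectweb.asm) is on the classpath:

```java
import org.objectweb.asm.ClassReader;
import org.objectweb.asm.ClassVisitor;
import org.objectweb.asm.Opcodes;

import java.io.InputStream;
import java.util.Enumeration;
import java.util.jar.JarEntry;
import java.util.jar.JarFile;

// Rough sketch: feed every .class entry of one JAR through ASM's ClassReader,
// roughly the kind of per-class work the archive scanner does, and time it.
public class ClassReaderBench {
    public static void main(String[] args) throws Exception {
        String jarPath = args[0]; // e.g. a Guava jar from the local Maven repo
        int classes = 0;
        long start = System.nanoTime();
        try (JarFile jar = new JarFile(jarPath)) {
            Enumeration<JarEntry> entries = jar.entries();
            while (entries.hasMoreElements()) {
                JarEntry entry = entries.nextElement();
                if (!entry.getName().endsWith(".class")) {
                    continue;
                }
                try (InputStream in = jar.getInputStream(entry)) {
                    // Empty visitor: we only care about the parsing/visiting cost.
                    new ClassReader(in).accept(new ClassVisitor(Opcodes.ASM9) { }, 0);
                    classes++;
                }
            }
        }
        long elapsedMs = (System.nanoTime() - start) / 1_000_000;
        System.out.println(classes + " classes parsed in " + elapsedMs + " ms");
    }
}
```

Running it against, say, the Guava jar under JDK 8 and then JDK 17 should show whether the raw ClassReader pass itself regresses, or whether the slowdown comes from somewhere else in the deployment pipeline.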
Here you can see a comparison of class-scanning timings between JDK 8 and JDK 17. This was taken straight from the logs, by increasing the detail of javax.enterprise.system.tools.deployment.common.level=FINE. I then filtered the logs myself to keep only the actual messages with the timings.
In JDK 8
Notice that some smaller JARs are quickly parsed, but larger ones like Guava (637 ms) or POI (500ms) take way longer.
In JDK 17
All dependencies are taking longer, with the difference being even worse on the larger JARs. Here Guava (2740 ms) and POI (2355 ms) really paint the picture.
When we add up the times of each dependency - even with multiple threads parsing in parallel - we can see how much longer this takes on JDK 17. This particular app has around 117 dependencies, which is a fairly reasonable number. Larger apps will take longer on the same machine.
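For anyone who wants to repeat that log filtering, a throwaway helper along these lines does the job. Note that the message format in the regex below is only a placeholder; it needs to be adapted to whatever the FINE messages actually look like in your log:

```java
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Throwaway helper: keep only the per-archive timing messages from a FINE
// deployment log and sum them up. The pattern below is a placeholder, not
// the real message format -- adapt the regex to your own log lines.
public class ScanTimingSummary {
    private static final Pattern TIMING = Pattern.compile("scanning of (\\S+).*?(\\d+) ms");

    public static void main(String[] args) throws Exception {
        long totalMs = 0;
        for (String line : Files.readAllLines(Paths.get(args[0]))) {
            Matcher m = TIMING.matcher(line);
            if (m.find()) {
                System.out.printf("%-60s %6s ms%n", m.group(1), m.group(2));
                totalMs += Long.parseLong(m.group(2));
            }
        }
        System.out.println("Sum of per-archive scan times: " + totalMs + " ms");
    }
}
```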
Steps to reproduce
I have prepared a sample application that runs in Payara Micro. It can be found here
This app does nothing. It has a Jakarta REST endpoint but that is just for show. What it does have are some dependencies. Not too many, just enough to test the point.
The repository has three branches: master, jdk11 and jdk8. master has JDK 17 as default.
Run mvn clean install with a matching JDK version on your system, then run mvn payara-micro:start. This will start the microbundle.
Final notes
That app has a rather simple setup. All the other applications I have (and which my company uses in production) actually use a permanent domain dir. This means that every time I deploy a new version of the app in my environment, it first redeploys the older version kept in the domain dir (there should probably be a config to avoid this) and only then deploys the current one. This doubles the deployment time, which makes things even worse on JDK 17.
My laptop is a fairly powerful one, so I don't expect hardware to make a difference here. I have included its specs in the Environment section anyway.
I am not entirely sure that the CLASS_SCANNING span in deployment tracing is registering the right value. I think it is registering it too soon, because it should be closer to the total time it spends on CREATE_DEPLOYMENT_CONTEXT, since this is the part that is taking too long. Or I might have just missed the point by far :D
It might be possible to speed this up just by adding more threads to the Executor Service pool. If the issue really lies in ASM, then just adding more threads might do the trick.
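To sanity-check that hypothesis outside Payara, the single-JAR sketch from above can be spread over a configurable thread pool. Again, this is only an illustration of the idea, not the executor Payara actually uses:

```java
import org.objectweb.asm.ClassReader;
import org.objectweb.asm.ClassVisitor;
import org.objectweb.asm.Opcodes;

import java.io.InputStream;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.jar.JarEntry;
import java.util.jar.JarFile;

// Variant of the sketch above: the same ClassReader work, but spread over a
// configurable number of threads, to see how much wall-clock time a bigger
// pool buys back. Illustration only -- not the executor Payara actually uses.
public class ParallelScanBench {
    public static void main(String[] args) throws Exception {
        String jarPath = args[0];
        int threads = Integer.parseInt(args[1]); // e.g. compare 3 vs 8
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        List<Future<?>> tasks = new ArrayList<>();
        long start = System.nanoTime();
        try (JarFile jar = new JarFile(jarPath)) {
            for (JarEntry entry : Collections.list(jar.entries())) {
                if (!entry.getName().endsWith(".class")) {
                    continue;
                }
                // Read the bytes on the submitting thread (Java 9+), so the
                // pooled tasks measure only the ASM parsing itself.
                byte[] bytes;
                try (InputStream in = jar.getInputStream(entry)) {
                    bytes = in.readAllBytes();
                }
                tasks.add(pool.submit(() ->
                        new ClassReader(bytes).accept(new ClassVisitor(Opcodes.ASM9) { }, 0)));
            }
            for (Future<?> task : tasks) {
                task.get();
            }
        } finally {
            pool.shutdown();
        }
        long elapsedMs = (System.nanoTime() - start) / 1_000_000;
        System.out.println(tasks.size() + " classes parsed with " + threads
                + " threads in " + elapsedMs + " ms");
    }
}
```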
I have here, in attachment, the full deployment traces of the app using JDK 8, 11 and 17. It might be useful to you. Maintenance_Corretto_JDK8.txt Maintenance_Corretto_JDK11.txt Maintenance_Corretto_JDK17.txt
Environment