Closed: mwangcaibmcom closed this issue 1 year ago.
The first deadlock reported in the description relates to OpenJ9 issue https://github.com/eclipse-openj9/openj9/issues/14037. In Liberty I think we can work around this, and I have started PR #24660 to provide a workaround.
This issue was originally reported in issue #12704.
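The actual change in PR #24660 is not shown here. As a purely generic illustration of the kind of workaround that prevents a lock-ordering deadlock, the sketch below imposes a single global acquisition order so two threads can never hold the locks in opposite orders:

```java
// Generic illustration only; this is NOT necessarily what PR #24660 does.
// A common workaround for a lock-ordering deadlock: acquire the locks in
// one fixed global order everywhere, so a circular wait cannot form.
public class OrderedLocking {
    private static final Object FIRST = new Object();
    private static final Object SECOND = new Object();

    // Every code path takes FIRST before SECOND. Two threads can still
    // contend, but neither can hold SECOND while waiting for FIRST.
    static void doWork(Runnable criticalSection) {
        synchronized (FIRST) {
            synchronized (SECOND) {
                criticalSection.run();
            }
        }
    }

    public static void main(String[] args) {
        doWork(() -> System.out.println("no deadlock possible"));
    }
}
```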
@jhanders34 can you please let us know in which versions the issue has been fixed? We are currently on version 21.0.0.10, so we know which version we need to upgrade to.
Presently, Liberty 23.0.0.3 is the next release that will be made available. There is no guarantee that the pull request I merged will be included in 23.0.0.3. When the build for 23.0.0.3 is chosen, and if the pull request is in that build, this issue will be marked with a release:23003 label. I anticipate that is what will happen, but I cannot guarantee or promise it.
Describe the bug
The Liberty server hang happened randomly within Docker containers deployed in a k8s cluster in an integration pipeline. Javacore dumps were collected to analyze the issue. In one javacore, a deadlock was identified, and all operational threads were blocked by it.
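For readers unfamiliar with how such a hang appears in a javacore, OpenJ9 reports detected deadlocks in the LOCKS section (flagged with a 1LKDEADLOCK line). A minimal, purely illustrative Java sketch of the classic shape, not the actual Liberty code, looks like this:

```java
// Illustrative only: two threads acquire the same pair of locks in
// opposite orders, producing the circular wait a javacore reports.
public class DeadlockExample {
    private static final Object lockA = new Object();
    private static final Object lockB = new Object();

    public static void main(String[] args) {
        new Thread(() -> {
            synchronized (lockA) {
                sleep(100);                // give the other thread time to take lockB
                synchronized (lockB) { }   // blocks forever: lockB is held by Thread-B
            }
        }, "Thread-A").start();

        new Thread(() -> {
            synchronized (lockB) {
                sleep(100);
                synchronized (lockA) { }   // blocks forever: lockA is held by Thread-A
            }
        }, "Thread-B").start();
    }

    private static void sleep(long ms) {
        try { Thread.sleep(ms); } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
    }
}
```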
In another javacore, generated at a different time when the server hung, there were thousands of JIT-QueueSlotMonitor-&lt;number&gt; locks, which is very abnormal.
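To quantify that abnormal growth, one quick way (a hypothetical helper, not part of any Liberty tooling) is to count the JIT-QueueSlotMonitor entries directly in the javacore file:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;

// Hypothetical helper: counts how many JIT-QueueSlotMonitor monitors
// appear in a javacore, to quantify the abnormal growth described above.
public class CountQueueSlotMonitors {
    public static void main(String[] args) throws IOException {
        try (var lines = Files.lines(Paths.get(args[0]))) {
            long count = lines.filter(l -> l.contains("JIT-QueueSlotMonitor")).count();
            System.out.println("JIT-QueueSlotMonitor entries: " + count);
        }
    }
}
```

Javacores like these can also be triggered on demand while the hang is in progress, for example with Liberty's `server javadump <serverName>` command or by sending the JVM a SIGQUIT (`kill -3 <pid>`).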
Steps to Reproduce
The issue happened randomly in a secured environment, and there are no steps to reproduce it.
Expected behavior
The Liberty server should not hang; requests should continue to be processed without deadlocks.
Diagnostic information:
$WLP_OUTPUT_DIR/messages.log