OpenLiberty / open-liberty

Open Liberty is a highly composable, fast to start, dynamic application server runtime environment
https://openliberty.io
Eclipse Public License 2.0

Test Failure: io.openliberty.http.monitor.fat.ContainerServletApplicationTest.cs1_simplePathPost #29001

Open · fmhwong opened 4 months ago

fmhwong commented 4 months ago

RTC 300818

cs1_simplePathPost:junit.framework.AssertionFailedError: 2024-07-03-23:55:23:105 null
    at io.openliberty.http.monitor.fat.ContainerServletApplicationTest.cs1_simplePathPost(ContainerServletApplicationTest.java:119)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at componenttest.custom.junit.runner.FATRunner$1.evaluate(FATRunner.java:204)
    at org.testcontainers.containers.FailureDetectingExternalResource$1.evaluate(FailureDetectingExternalResource.java:29)
    at componenttest.custom.junit.runner.FATRunner$2.evaluate(FATRunner.java:364)
    at componenttest.custom.junit.runner.FATRunner.run(FATRunner.java:178)

Channyboy commented 2 months ago

This appears to be a delay between when metrics are sent from Open Liberty to the OTel collector and when the test queries for the metric data. The fix would be to increase the wait time (and also decrease the export interval) to ensure the metric data has been sent before the test queries it.
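
For illustration only, here is a minimal sketch of what a polling wait on the test side could look like. The class name, the `Supplier`-based metric query, and the one-second back-off are assumptions, not the actual FAT test API:

```java
import java.time.Duration;
import java.util.function.Supplier;

public class MetricPollUtil {

    // Retries the metric query until the expected value shows up or the
    // deadline expires, instead of relying on a single fixed sleep.
    public static long pollForMetric(Supplier<Long> getMetricValue,
                                     long expected,
                                     Duration deadline) throws InterruptedException {
        long end = System.nanoTime() + deadline.toNanos();
        long last = -1;
        while (System.nanoTime() < end) {
            last = getMetricValue.get();   // e.g. scrape the collector's metrics endpoint
            if (last >= expected) {
                return last;               // metric arrived; stop waiting
            }
            Thread.sleep(1000);            // back off before retrying
        }
        return last;                       // caller asserts on this value
    }
}
```

Polling up to a deadline tolerates a slow export interval without hard-coding one long sleep, and on failure the assertion can report the last value seen rather than `null`.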

Channyboy commented 2 months ago

From evaluating the recent builds linked here, the problem appears to be that the exporter is having trouble connecting to the OpenTelemetry collector. For example:

"[8/23/24, 5:51:52:176 UTC] 0000004e io.opentelemetry.exporter.internal.grpc.GrpcExporter         W Failed to export metrics. Server responded with gRPC status code 2. Error message: timeout"