DataDog / dd-trace-java

Datadog APM client for Java
https://docs.datadoghq.com/tracing/languages/java
Apache License 2.0

-Djava.library.path=WARNING: Unable to get VM args through reflection... #7413

Closed tvp-dfine closed 2 weeks ago

tvp-dfine commented 1 month ago

With tracer version v1.37.1 we have experienced startup issues.

A server runs with Datadog Agent 7.55.3, and Java applications on that server do not start up properly because their Java arguments contain

-Djava.library.path=WARNING: Unable to get VM args through reflection. A custom java.util.logging.LogManager may not work correctly

so that "Unable" is interpreted as the Java main class (the warning is defined in https://github.com/DataDog/dd-trace-java/blob/master/dd-java-agent/src/main/java/datadog/trace/bootstrap/AgentBootstrap.java).
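For context, the warning text itself suggests that the agent bootstrap tries to read the VM arguments through reflection (so as not to initialize java.util.logging too early) and warns when that fails. A rough sketch of that pattern follows; it is not the actual AgentBootstrap code, and the internal JDK class and method names used via reflection are assumptions for illustration only:

import java.lang.management.ManagementFactory;
import java.lang.reflect.Method;
import java.util.List;

public class VmArgsSketch {

  @SuppressWarnings("unchecked")
  static List<String> getVmArgs() {
    try {
      // Reflective lookup of JDK internals (names assumed for this sketch) to
      // avoid touching the platform MXBean machinery too early.
      Class<?> helper = Class.forName("sun.management.ManagementFactoryHelper");
      Object vmManagement = helper.getDeclaredMethod("getVMManagement").invoke(null);
      Method getVmArguments = vmManagement.getClass().getMethod("getVmArguments");
      return (List<String>) getVmArguments.invoke(vmManagement);
    } catch (Throwable t) {
      // If reflection fails (for example because the internals changed or the
      // package is not accessible), warn and fall back to the standard MXBean.
      // The warning text matches the one quoted in this report.
      System.out.println(
          "WARNING: Unable to get VM args through reflection. "
              + "A custom java.util.logging.LogManager may not work correctly");
      return ManagementFactory.getRuntimeMXBean().getInputArguments();
    }
  }

  public static void main(String[] args) {
    System.out.println(getVmArgs());
  }
}

The detail that matters for this report is only that the warning is printed to stdout rather than stderr.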

As a side note: where can I find the dd-trace-java version that is bundled with the Datadog Agent (https://github.com/DataDog/datadog-agent/)?

PerfectSlayer commented 4 weeks ago

Hello @tvp-dfine

Thanks for reporting an issue. Could you please reach out to support and open a case so we can follow up on the issue? Please attach the server logs or the tracer debug logs (enabled using -Ddd.trace.debug=true); that would help us identify the issue quickly.

Thank you in advance

tvp-dfine commented 4 weeks ago

@PerfectSlayer Thank you, I will do that. In the meantime, I was able to pin the issue down a bit more.

The issue arises when trying to start HBase. The code being executed can be found in https://github.com/apache/hbase/blob/master/bin/hbase; on lines 351 and 352 you find:

HADOOP_JAVA_LIBRARY_PATH=$(HADOOP_CLASSPATH="$CLASSPATH${temporary_cp}" "${HADOOP_IN_PATH}" \
                             org.apache.hadoop.hbase.util.GetJavaProperty java.library.path)

The variable HADOOP_JAVA_LIBRARY_PATH is then later used to set -Djava.library.path.
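For context, such a helper only needs to echo the requested system property so that the shell can capture it via command substitution. A minimal, hypothetical stand-in for org.apache.hadoop.hbase.util.GetJavaProperty (a simplified sketch, not the actual HBase source):

public class GetJavaPropertySketch {
  public static void main(String[] args) {
    // Expected contract: exactly one line per requested property on stdout.
    for (String name : args) {
      System.out.println(System.getProperty(name, ""));
    }
  }
}

Because $(...) captures everything the child JVM writes to stdout, any extra line printed during startup gets concatenated into HADOOP_JAVA_LIBRARY_PATH.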

Now, during execution I can see that the Datadog APM injection is happening:

/opt/datadog-packages/datadog-apm-inject/0.16.0-1/inject/process inject /usr/lib/jvm/java-17-amazon-corretto.x86_64/bin/java /usr/lib/jvm/jre-17/bin/java -Dproc_org.apache.hadoop.hbase.util.GetJavaProperty -Djava.net.preferIPv4Stack=true -server -XX:+ExitOnOutOfMemoryError --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED --add-exports=java.base/sun.security.x509=ALL-UNNAMED --add-exports=java.base/sun.security.util=ALL-UNNAMED --add-opens=java.base/java.io=ALL-UNNAMED --add-opens=java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.math=ALL-UNNAMED --add-opens=java.base/java.util=ALL-UNNAMED --add-opens=java.base/java.util.concurrent=ALL-UNNAMED -XX:+IgnoreUnrecognizedVMOptions --add-opens=java.base/java.net=ALL-UNNAMED --add-opens=java.base/java.text=ALL-UNNAMED --add-opens=java.base/java.util.concurrent.atomic=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/sun.nio.ch=ALL-UNNAMED --add-opens=java.base/sun.nio.cs=ALL-UNNAMED --add-opens=java.base/java.time=ALL-UNNAMED --add-opens=java.base/java.util.regex=ALL-UNNAMED --add-opens=java.base/jdk.internal=ALL-UNNAMED --add-opens=java.base/jdk.internal.ref=ALL-UNNAMED --add-opens=java.base/jdk.internal.reflect=ALL-UNNAMED --add-opens=java.sql/java.sql=ALL-UNNAMED --add-opens=java.base/jdk.internal.util=ALL-UNNAMED --add-opens=java.base/jdk.internal.util.random=ALL-UNNAMED --add-opens=java.base/java.lang.invoke=ALL-UNNAMED --add-opens=java.base/sun.util.calendar=ALL-UNNAMED --add-opens=jdk.management/com.sun.management.internal=ALL-UNNAMED -Dyarn.log.dir=/usr/lib/hadoop/logs -Dyarn.log.file=hadoop.log -Dyarn.home.dir=/usr/lib/hadoop-yarn -Dyarn.root.logger=INFO,console -Djava.library.path=:/usr/lib/hadoop-lzo/lib/native:/usr/lib/hadoop/lib/native -Dhadoop.log.dir=/usr/lib/hadoop/logs -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/usr/lib/hadoop -Dhadoop.id.str=hbase -Dhadoop.root.logger=INFO,console -Dhadoop.policy.file=hadoop-policy.xml -Dhadoop.security.logger=INFO,NullAppender -Dsun.net.inetaddr.ttl=30 org.apache.hadoop.hbase.util.GetJavaProperty java.library.path

The warning defined at https://github.com/DataDog/dd-trace-java/blob/master/dd-java-agent/src/main/java/datadog/trace/bootstrap/AgentBootstrap.java#L399 is written to stdout and therefore ends up in HADOOP_JAVA_LIBRARY_PATH, causing the issue downstream.

Is there a way to disable the output of AgentBootstrap to stdout?

PerfectSlayer commented 4 weeks ago

Thanks for your investigation. It helped to diagnose what happens and to reproduce the issue so it could be fixed properly. I set up HBase and made sure to have both Hadoop in my path and a JVM that triggers the warning.

Is there a way to disable the output of AgentBootstrap to stdout?

Sadly, there is not. This is a bug: AgentBootstrap should not write to stdout. I made a fix for it (#7432) and backported it for the upcoming fix release (#7433).
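For reference, the general direction of such a fix is simply to keep bootstrap diagnostics off stdout, for example by writing them to stderr instead. A minimal sketch of that idea (not the actual change in #7432):

public final class BootstrapLogSketch {
  private BootstrapLogSketch() {}

  // Hypothetical helper: routing bootstrap warnings to stderr prevents them
  // from leaking into command substitutions that capture the launched JVM's
  // stdout, as HBase's bin script does.
  static void warn(String message) {
    System.err.println("WARNING: " + message);
  }

  public static void main(String[] args) {
    warn("Unable to get VM args through reflection. "
        + "A custom java.util.logging.LogManager may not work correctly");
  }
}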

With this change, HBase can properly retrieve the Hadoop setup and reuse it. It should also fix any possible issue with the HBase CLI tool org.apache.hadoop.hbase.util.HBaseConfTool.