Noticed while diagnosing a broken Spark job.
Not sure whether we should fix the environment, remove the loader if it isn't used, or, if this is expected, change this to an INFO log without the stack trace.
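For the first option (fixing the environment), the minimal fix is just making sure the file Spark is configured to load actually exists. A rough sketch, assuming the path is taken from `spark.metrics.conf` (or defaults to `metrics.properties` under `$SPARK_CONF_DIR`); the directory below is hypothetical:

```shell
# Hypothetical conf dir; substitute the cluster's actual SPARK_CONF_DIR.
CONF_DIR=/tmp/demo-spark-conf
mkdir -p "$CONF_DIR"
# An empty metrics.properties is valid: Spark then falls back to its
# built-in metrics defaults, and the ERROR + stack trace goes away.
touch "$CONF_DIR/metrics.properties"
ls "$CONF_DIR/metrics.properties"
```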
23/09/19 18:36:46 ERROR MetricsConfig: Error loading configuration file /home/crap/config/metrics.properties
java.io.FileNotFoundException: /home/crap/config/metrics.properties (No such file or directory)
at java.io.FileInputStream.open0(Native Method)
at java.io.FileInputStream.open(FileInputStream.java:195)
at java.io.FileInputStream.<init>(FileInputStream.java:138)
at java.io.FileInputStream.<init>(FileInputStream.java:93)
at org.apache.spark.metrics.MetricsConfig.loadPropertiesFromFile(MetricsConfig.scala:132)
at org.apache.spark.metrics.MetricsConfig.initialize(MetricsConfig.scala:55)
at org.apache.spark.metrics.MetricsSystem.<init>(MetricsSystem.scala:95)
at org.apache.spark.metrics.MetricsSystem$.createMetricsSystem(MetricsSystem.scala:233)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:357)
at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:175)
at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:256)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:423)
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
at org.apache.beam.runners.spark.translation.SparkContextFactory.createSparkContext(SparkContextFactory.java:101)
at org.apache.beam.runners.spark.translation.SparkContextFactory.getSparkContext(SparkContextFactory.java:67)
at org.apache.beam.runners.spark.SparkRunner.run(SparkRunner.java:220)
at org.apache.beam.runners.spark.SparkRunner.run(SparkRunner.java:96)
at org.apache.beam.sdk.Pipeline.run(Pipeline.java:323)
at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
at org.gbif.pipelines.ingest.pipelines.VerbatimToEventPipeline.run(VerbatimToEventPipeline.java:216)
at org.gbif.pipelines.ingest.pipelines.VerbatimToEventPipeline.run(VerbatimToEventPipeline.java:87)
at org.gbif.pipelines.ingest.pipelines.VerbatimToEventPipeline.main(VerbatimToEventPipeline.java:83)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:688)
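If we go with the third option, the behavior could look roughly like this. A minimal sketch in plain Java (Spark's actual `MetricsConfig.loadPropertiesFromFile` is Scala, and the class/method names below are hypothetical): treat a missing file as the expected case and log it at INFO without a stack trace, while still surfacing real read errors.

```java
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.util.Properties;

public class MetricsConfigSketch {

    // Hypothetical stand-in for MetricsConfig.loadPropertiesFromFile.
    // A missing file is expected (defaults are used instead), so it is
    // logged at INFO with no stack trace; other IO errors still get one.
    static Properties loadPropertiesFromFile(String path) {
        Properties props = new Properties();
        try (FileInputStream in = new FileInputStream(path)) {
            props.load(in);
        } catch (FileNotFoundException e) {
            // Expected case: no stack trace, just an informational line.
            System.out.println("INFO MetricsConfig: no metrics config at "
                    + path + ", using defaults");
        } catch (IOException e) {
            // Unexpected read failure: keep the full diagnostic output.
            e.printStackTrace();
        }
        return props;
    }

    public static void main(String[] args) {
        Properties p = loadPropertiesFromFile("/nonexistent/metrics.properties");
        System.out.println("loaded " + p.size() + " properties");
    }
}
```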