cerndb / SparkPlugins

Code and examples of how to write and deploy Apache Spark Plugins. Spark plugins allow running custom code on the executors as they are initialized. This also allows extending the Spark metrics system with user-provided monitoring probes.
Apache License 2.0

cloud storage on grafana always show zero #6

Open cometta opened 2 months ago

cometta commented 2 months ago

[screenshot: Grafana dashboard where the cloud storage metrics read zero]

Did I miss any other config? I'm using s3a to read/write files.

My config:

    "spark.jars.packages": "ch.cern.sparkmeasure:spark-measure_2.12:0.24,ch.cern.sparkmeasure:spark-plugins_2.12:0.3",
    "spark.plugins": "ch.cern.HDFSMetrics,ch.cern.CloudFSMetrics",
    "spark.cernSparkPlugin.cloudFsName": "s3a",
    "spark.executor.metrics.fileSystemSchemes": "file"
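One thing worth sanity-checking in the config above: `spark.executor.metrics.fileSystemSchemes` lists only `file`, while the plugin is pointed at `s3a`. The thread does not establish whether this is the cause of the zero readings, but a small hypothetical consistency check (the `conf` dict simply mirrors the settings quoted above) could look like this:

```python
# Hypothetical sanity check: is the cloud filesystem scheme also listed
# among the executor filesystem-metrics schemes? Mirrors the config above.
conf = {
    "spark.plugins": "ch.cern.HDFSMetrics,ch.cern.CloudFSMetrics",
    "spark.cernSparkPlugin.cloudFsName": "s3a",
    "spark.executor.metrics.fileSystemSchemes": "file",
}

cloud_fs = conf["spark.cernSparkPlugin.cloudFsName"]
schemes = [s.strip() for s in conf["spark.executor.metrics.fileSystemSchemes"].split(",")]

# With the config as posted, "s3a" is not in the schemes list.
print(f"{cloud_fs!r} covered by fileSystemSchemes: {cloud_fs in schemes}")
```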
LucaCanali commented 2 months ago

The config seems OK. Can you double-check that you are using Spark 3.x with Scala 2.12, and that you are using s3a? That is, your file paths should look like "s3a://...".
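The path-scheme check suggested above can be done mechanically; a quick sketch using Python's standard `urllib.parse` (the example paths are made up for illustration):

```python
from urllib.parse import urlparse

# Verify that read/write paths actually use the s3a scheme,
# as suggested above. Example paths are hypothetical.
paths = [
    "s3a://my-bucket/data/input.parquet",   # s3a scheme: what the plugin expects
    "s3://my-bucket/data/other.parquet",    # plain s3 scheme: would not match
]

for p in paths:
    scheme = urlparse(p).scheme
    print(p, "->", scheme, "(OK)" if scheme == "s3a" else "(not s3a)")
```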

cometta commented 2 months ago

@LucaCanali that's correct. Confirmed: I'm using Spark 3.5.1, Scala 2.12, and s3a with a custom endpoint.