Open zak-hassan opened 5 years ago
@elmiko @tmckayus Let me know what you think?
my first thought is that this seems like a reasonable idea.
second thought, since this image is based on radanalyticsio/openshift-spark
it should already have the jar file in it. this might be a simple matter of locating it and adding the necessary option to start the metrics on the command line like you suggest.
That's really good. I think the PR with the metrics config just got merged. Did you get a chance to cut a new image for that? I'd like to test drive this.
the metrics config PR you posted did get merged into master, but we have not cut a new release from that.
there is an autobuild that gets generated at quay.io/radanalyticsio/openshift-spark:master
and quay.io/radanalyticsio/openshift-spark-py36:master
for that repo, but it looks like the transitive dependencies (i.e. this repo) have not been rebuilt.
if you want to play around with metrics and see what you can do with the s2i you will need to generate a new s2i image locally.
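For anyone following along, building a local s2i image generally looks something like the sketch below. The repository URL, builder image name, and output tag here are illustrative assumptions, not names confirmed in this thread:

```shell
# Pull the freshly autobuilt base image (tag from the comment above)
docker pull quay.io/radanalyticsio/openshift-spark:master

# Build an application image with the s2i CLI:
#   s2i build <source> <builder-image> <output-tag>
# The source repo and image names below are placeholders.
s2i build https://github.com/example/my-spark-app \
    my-local-s2i-builder:latest \
    my-spark-app:metrics-test
```
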
hope that helps!
It's in there. Perfect. Thanks @elmiko
Background
We currently have an environment variable; when it is set to --metrics='prometheus', the Spark master and driver are instrumented. However, we may be missing out on some good metrics coming from the running driver application.
Proposal
Why don't we include agent-bond.jar and agent-config.yaml in the s2i images, and when the same environment variable is set (--metrics='prometheus'), have s2i automatically set up the Java agent to instrument the application?
Details:
The driver would need to be passed the following as Spark options:
SPARK_MASTER_URL= # spark://10.230.8.242:7077
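One way the agent wiring could look from the driver side is sketched below. The jar/config paths, and the use of spark.driver.extraJavaOptions (a standard Spark configuration property), are assumptions for illustration; the actual locations inside the s2i image would need to be confirmed:

```shell
# Hypothetical sketch: attach agent-bond to the driver JVM at submit time.
# /opt/agent-bond.jar and /opt/agent-config.yaml are assumed paths.
spark-submit \
  --master "$SPARK_MASTER_URL" \
  --conf "spark.driver.extraJavaOptions=-javaagent:/opt/agent-bond.jar=/opt/agent-config.yaml" \
  app.py
```

In the proposal above, the s2i start script would inject this option itself whenever --metrics='prometheus' is set, so users would not pass it by hand.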