I'm currently using this library to send metrics by passing the Influx configuration through application.conf, something like this:

/usr/lib/spark/bin/spark-submit --files /home/ec2-user/application.conf --conf spark.executor.extraClassPath=./

This restricts me to creating a file on the local machine. We already have a default config that holds this information. Could you let me know how I can supply this configuration from our default config in code rather than from application.conf? Is there a way to programmatically set the configuration this library requires?
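Ideally I'd like to do something along the lines of the sketch below. This is only a rough illustration of what I'm hoping for: the `influx.host` / `influx.port` keys are placeholders (not the library's actual keys), and I'm assuming the library could pick up overrides from JVM system properties via Typesafe Config, which I haven't verified.

```scala
import org.apache.spark.sql.SparkSession

object MetricsJob {
  def main(args: Array[String]): Unit = {
    // Pull the Influx settings from our own default config; environment
    // variables are used here purely for illustration.
    val influxHost = sys.env.getOrElse("INFLUX_HOST", "localhost")
    val influxPort = sys.env.getOrElse("INFLUX_PORT", "8086")

    // Override on the driver JVM. Assumption: the library resolves its
    // settings through Typesafe Config, so system-property overrides would
    // take precedence over application.conf. Key names are placeholders.
    System.setProperty("influx.host", influxHost)
    System.setProperty("influx.port", influxPort)

    val spark = SparkSession.builder()
      .appName("metrics-example")
      // Executors run in separate JVMs, so pass the same overrides to them
      // as JVM options; this has to be set before the session is created.
      .config("spark.executor.extraJavaOptions",
        s"-Dinflux.host=$influxHost -Dinflux.port=$influxPort")
      .getOrCreate()

    // ... job logic ...

    spark.stop()
  }
}
```

If something like this worked, I could drop the --files distribution of application.conf entirely.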