YoavNordmann opened 4 years ago
Hey,
We are also facing the same issue. Did you solve it by any chance?
Actually, we did...
We are using Typesafe's Config library.
Our configuration is written in conf files.
We take all properties that start with "spark" and create key-value pairs in a map, which is then "fed" into the Spark session on creation.
It turns out that Config treats each key as a path, so whenever there was a "/" in the key, Config would wrap that path element in quotation marks to keep it "path" compliant. The problem is that this is not how a properties key is handled, and Spark has a fit over it.
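To illustrate with a concrete key of the shape the operator uses (the exact property name and value here are just an example):

```
# What we write in our .conf file:
spark.kubernetes.executor.annotation.sparkoperator.k8s.io/submission-id = some-id

# What Config reports as the key when iterating entries — the path
# element containing "/" gets wrapped in quotation marks:
spark.kubernetes.executor.annotation.sparkoperator.k8s."io/submission-id"
```

Spark then sees the quoted form as a literal part of the property name, which is why it chokes.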
We added a small function to handle the keys:
def unwrapKey(key: String): String = String.join(".", ConfigUtil.splitPath(key))
Hope this helps
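For context, here is a sketch of how the pieces above fit together. `sparkConfMap` and `buildSession` are names I made up for illustration; it assumes `com.typesafe.config` and Spark on the classpath, and uses `JavaConverters` as was current for the Scala versions Spark 2.4 shipped with:

```scala
import scala.collection.JavaConverters._
import com.typesafe.config.{Config, ConfigUtil}
import org.apache.spark.sql.SparkSession

// Re-join the path elements with plain dots, dropping the quotes
// Config adds around elements such as "io/submission-id".
def unwrapKey(key: String): String =
  String.join(".", ConfigUtil.splitPath(key))

// Collect every property starting with "spark" as plain key-value pairs.
def sparkConfMap(config: Config): Map[String, String] =
  config.entrySet().asScala
    .filter(_.getKey.startsWith("spark"))
    .map(e => unwrapKey(e.getKey) -> e.getValue.unwrapped().toString)
    .toMap

// Feed the map into the Spark session on creation.
def buildSession(config: Config): SparkSession = {
  val builder = SparkSession.builder()
  sparkConfMap(config).foreach { case (k, v) => builder.config(k, v) }
  builder.getOrCreate()
}
```

The key point is running every key through `unwrapKey` before handing it to the session builder, so Spark never sees the quoted path form.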
We have the same issue here. @YoavNordmann, your suggestion works. Thanks a lot, I spent a lot of time on this. @liyinan926, is this a bug or something? Hope it can be solved.
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
I am running SparkOperator v1beta2-1.1.0-2.4.5 on K3s with the webhook turned on.
There is a big difference between running the SparkJob via kubectl apply and extracting the spark-submit command from the sparkoperator log file and running it from the sparkoperator pod. In my case, the executors that were supposed to be launched failed immediately on an application exception. The difference is the following: using the sparkoperator, a strange exception was raised in the driver.
Only after about 20 seconds was the real exception shown in the driver log file.
On the other hand, when running the very same spark-submit from the sparkoperator pod, I did not receive this strange exception; instead, the real exception was shown in the driver log file right away.
As you can see, this happens only in the executor, not in the driver. To try this manually, I retrieved the "spark-submit" command from the SparkOperator and ran it from the SparkOperator pod, and the same thing happened there as well.
Needless to say, this exception threw me off completely: I spent my time trying to understand why "sparkoperator.k8s.\"io/submission-id\"" was being distorted, thinking that this was my problem rather than the actual application exception.