Closed yeruoforever closed 2 years ago
https://github.com/dfdx/Spark.jl/issues/99#issue-1007636213 In some cases, a config file can look as follows:

```
spark.metrics.conf.driver.source.jvm.class=org.apache.spark.metrics.source.JvmSource
spark.metrics.conf.executor.source.jvm.class=org.apache.spark.metrics.source.JvmSource
```
The function `load_spark_defaults` doesn't take this `=`-separated form into account. https://github.com/dfdx/Spark.jl/blob/d870fa6d742ba884bc51386635f96323b722f03c/src/init.jl#L69 So when I ran my first `Spark.init()`, an error was thrown:
```julia
julia> Spark.init()
ERROR: BoundsError: attempt to access 1-element Vector{SubString{String}} at index [2]
Stacktrace:
 [1] getindex
   @ ./array.jl:805 [inlined]
 [2] load_spark_defaults(d::Dict{Any, Any})
   @ Spark ~/.julia/packages/Spark/9bsuG/src/init.jl:70
 [3] init()
   @ Spark ~/.julia/packages/Spark/9bsuG/src/init.jl:5
 [4] top-level scope
   @ REPL[4]:1
```
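The config above separates key and value with `=`, while the parsing in `load_spark_defaults` appears to split each line only on whitespace, producing a 1-element vector and the `BoundsError` at index 2. A minimal sketch of a more tolerant line parser (the function name here is illustrative, not the package's actual API) that accepts either `=` or whitespace as the separator:

```julia
# Parse one line of spark-defaults.conf into a (key, value) pair.
# Spark accepts both "key value" and "key=value" forms, so split on
# the first '=' or run of whitespace; skip blanks and '#' comments.
function parse_spark_property(line::AbstractString)
    line = strip(line)
    (isempty(line) || startswith(line, "#")) && return nothing
    m = match(r"^([^\s=]+)\s*(?:=|\s)\s*(.*)$", line)
    m === nothing && return nothing
    return (String(m.captures[1]), String(m.captures[2]))
end
```

For example, `parse_spark_property("spark.master local[2]")` and `parse_spark_property("spark.master=local[2]")` would both yield `("spark.master", "local[2]")`.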
Thanks for fixing it!