databricks / spark-xml

XML data source for Spark SQL and DataFrames
Apache License 2.0

Getting error on latest cluster version (java.lang.NoClassDefFoundError: scala/$less$colon$less) #607

Closed: joe-chewning closed this issue 1 year ago

joe-chewning commented 1 year ago

Hi,

Just configured a new cluster:

```json
{
  "autoscale": { "min_workers": 2, "max_workers": 8 },
  "cluster_name": "Joe Chewning's Cluster",
  "spark_version": "11.2.x-scala2.12",
  "spark_conf": { "spark.databricks.delta.preview.enabled": "true" },
  "azure_attributes": {
    "first_on_demand": 1,
    "availability": "ON_DEMAND_AZURE",
    "spot_bid_max_price": -1
  },
  "node_type_id": "Standard_DS5_v2",
  "driver_node_type_id": "Standard_DS5_v2",
  "ssh_public_keys": [],
  "custom_tags": {},
  "spark_env_vars": {},
  "autotermination_minutes": 120,
  "enable_elastic_disk": true,
  "cluster_source": "UI",
  "init_scripts": [],
  "single_user_name": "joe.chewning@hlramerica.com",
  "data_security_mode": "LEGACY_SINGLE_USER_STANDARD",
  "runtime_engine": "STANDARD",
  "cluster_id": "------------"
}
```

I have the latest version, com.databricks:spark-xml_2.13:0.15.0, installed and confirmed that all dependencies loaded. When I run the following command, I get the error above.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

df = spark.read.format('xml').options(rowTag='TXLifeRequest').load('dbfs:/mnt/Ingest/myfile_0.xml')
display(df)
```

Full error:

```
Py4JJavaError                             Traceback (most recent call last)
<command> in <module>
      2 spark = SparkSession.builder.getOrCreate()
      3
----> 4 df = spark.read.format('xml').options(rowTag='TXLifeRequest').load('dbfs:/mnt/Ingest/part-00000-706c0098-e795-4a23-af94-eccd816251a1-c000_1664819398681_0.xml')
      5 display(df)

/databricks/spark/python/pyspark/instrumentation_utils.py in wrapper(*args, **kwargs)
     46         start = time.perf_counter()
     47         try:
---> 48             res = func(*args, **kwargs)
     49             logger.log_success(
     50                 module_name, class_name, function_name, time.perf_counter() - start, signature

/databricks/spark/python/pyspark/sql/readwriter.py in load(self, path, format, schema, **options)
    175         self.options(**options)
    176         if isinstance(path, str):
--> 177             return self._df(self._jreader.load(path))
    178         elif path is not None:
    179             if type(path) != list:

/databricks/spark/python/lib/py4j-0.10.9.5-src.zip/py4j/java_gateway.py in __call__(self, *args)
   1319
   1320         answer = self.gateway_client.send_command(command)
-> 1321         return_value = get_return_value(
   1322             answer, self.gateway_client, self.target_id, self.name)
   1323

/databricks/spark/python/pyspark/sql/utils.py in deco(*a, **kw)
    194     def deco(*a: Any, **kw: Any) -> Any:
    195         try:
--> 196             return f(*a, **kw)
    197         except Py4JJavaError as e:
    198             converted = convert_exception(e.java_exception)

/databricks/spark/python/lib/py4j-0.10.9.5-src.zip/py4j/protocol.py in get_return_value(answer, gateway_client, target_id, name)
    324             value = OUTPUT_CONVERTER[type](answer[2:], gateway_client)
    325             if answer[1] == REFERENCE_TYPE:
--> 326                 raise Py4JJavaError(
    327                     "An error occurred while calling {0}{1}{2}.\n".
    328                     format(target_id, ".", name), value)

Py4JJavaError: An error occurred while calling o371.load.
: java.lang.NoClassDefFoundError: scala/$less$colon$less
	at com.databricks.spark.xml.XmlOptions$.apply(XmlOptions.scala:79)
	at com.databricks.spark.xml.DefaultSource.createRelation(DefaultSource.scala:66)
	at com.databricks.spark.xml.DefaultSource.createRelation(DefaultSource.scala:52)
	at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:385)
	at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:368)
	at org.apache.spark.sql.DataFrameReader.$anonfun$load$2(DataFrameReader.scala:324)
	at scala.Option.getOrElse(Option.scala:189)
	at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:324)
	at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:237)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:380)
	at py4j.Gateway.invoke(Gateway.java:306)
	at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
	at py4j.commands.CallCommand.execute(CallCommand.java:79)
	at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:195)
	at py4j.ClientServerConnection.run(ClientServerConnection.java:115)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassNotFoundException: scala.$less$colon$less
	at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:419)
	at com.databricks.backend.daemon.driver.ClassLoaders$LibraryClassLoader.loadClass(ClassLoaders.scala:151)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:352)
	... 21 more
```

Please help!
srowen commented 1 year ago

It's a Scala version mismatch. Your cluster ("spark_version": "11.2.x-scala2.12") runs Spark built against Scala 2.12, but you installed the library's _2.13 build. Use the _2.12 build instead: com.databricks:spark-xml_2.12:0.15.0.
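As a sanity check before installing, the cluster's Scala version can be read from the running SparkSession via `spark.sparkContext._jvm.scala.util.Properties.versionNumberString()` and matched to the artifact suffix. The helper below is a minimal sketch of that matching rule; the function name `spark_xml_coordinate` is hypothetical, not part of spark-xml:

```python
def spark_xml_coordinate(scala_version: str, lib_version: str = "0.15.0") -> str:
    """Return the spark-xml Maven coordinate whose Scala suffix matches the cluster.

    scala_version is the full Scala version string, e.g. "2.12.14"
    (on Databricks, readable as
    spark.sparkContext._jvm.scala.util.Properties.versionNumberString()).
    """
    # Only the binary version (major.minor) appears in the artifact suffix:
    # Scala 2.12.x -> "_2.12", Scala 2.13.x -> "_2.13".
    binary = ".".join(scala_version.split(".")[:2])
    return f"com.databricks:spark-xml_{binary}:{lib_version}"

# A Scala 2.12 cluster (like 11.2.x-scala2.12 above) needs the _2.12 build:
print(spark_xml_coordinate("2.12.14"))  # com.databricks:spark-xml_2.12:0.15.0
```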

joe-chewning commented 1 year ago

Thank you! That was it.