Closed — aldoorozco closed this 4 months ago
cc - @rashtao
Can you please share the pyspark version? Make sure this matches your Spark version (3.3.2).
Yup, same version (3.3.2)
scala.runtime.ScalaRunTime$.wrapRefArray() is a method present in Scala 2.13, but not in Scala 2.12. Can you please make sure you are using arangodb-spark-datasource-3.3_2.12 and not arangodb-spark-datasource-3.3_2.13?
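The fix above boils down to picking the connector artifact whose Scala suffix matches the Scala binary version your Spark distribution was built against. A minimal sketch of that rule as a helper function — the connector version `1.4.0` and the function name are illustrative placeholders, not taken from the thread:

```python
def arango_connector_artifact(spark_scala_version: str,
                              connector_version: str = "1.4.0") -> str:
    """Build the Maven coordinate for arangodb-spark-datasource (Spark 3.3.x)
    from the full Scala version reported by the Spark runtime, e.g. "2.12.15".
    The connector version here is a placeholder; check the project's releases.
    """
    # "2.12.15" -> "2.12": only the binary version appears in the artifact name
    binary = ".".join(spark_scala_version.split(".")[:2])
    if binary not in ("2.12", "2.13"):
        raise ValueError(f"unsupported Scala binary version: {binary}")
    return f"com.arangodb:arangodb-spark-datasource-3.3_{binary}:{connector_version}"

# The default Spark 3.3.2 distribution is built against Scala 2.12,
# so the _2.12 artifact is the one to use:
print(arango_connector_artifact("2.12.15"))
# com.arangodb:arangodb-spark-datasource-3.3_2.12:1.4.0
```

Mixing a `_2.13` artifact with a Scala 2.12 Spark build (or vice versa) is exactly what produces `NoSuchMethodError`-style failures on methods like `wrapRefArray`, because the two Scala binary versions are not compatible with each other.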
Closing as resolved, please reopen in case of further questions.
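For completeness, the suggestion above amounts to passing the `_2.12` artifact to spark-submit. A hedged sketch of what that invocation might look like — the script name `my_job.py` and connector version `1.4.0` are placeholders, not from the thread:

```python
# Build the spark-submit command as an argument list (e.g. for subprocess.run).
# scala_binary must match the Scala build of your Spark 3.3.2 distribution.
scala_binary = "2.12"
package = f"com.arangodb:arangodb-spark-datasource-3.3_{scala_binary}:1.4.0"

cmd = [
    "spark-submit",
    "--packages", package,  # resolves the connector and its deps from Maven
    "my_job.py",            # placeholder for the actual pyspark job script
]
print(" ".join(cmd))
# spark-submit --packages com.arangodb:arangodb-spark-datasource-3.3_2.12:1.4.0 my_job.py
```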
Setup:
Description:
I'm reading data from Google BigQuery and want to load it into ArangoDB using Apache Spark. I confirmed that the data can be read properly from BigQuery, and I followed the instructions in this document. I'm using the ArangoDB Spark datasource connector, but I'm getting the following error while writing:
The DataFrame has the following schema and contains 61K rows:
The sample code I'm using:
And the way that I'm submitting my job is as follows:
Wondering if anyone has seen this error. I've tried using previous versions, but to no avail.
Happy to share more details if needed.
Thanks, Aldo