Closed: michTalebzadeh closed this issue 3 years ago
Hi,
I built the jar file following the instructions given in
https://github.com/RedisLabs/spark-redis/issues/193

mvn -P scala-2.12 clean package -DskipTests

and it worked. The following test code runs correctly:
dept = [("Finance", 10), ("Marketing", 20), ("Sales", 30), ("IT", 40)]
deptColumns = ["dept_name", "dept_id"]
deptDF = self.spark.createDataFrame(data=dept, schema=deptColumns)
deptDF.printSchema()
deptDF.show()

deptDF.write.format("org.apache.spark.sql.redis") \
    .option("table", "testme") \
    .option("key.column", "dept_id") \
    .mode("overwrite") \
    .save()

loadedDf = self.spark.read.format("org.apache.spark.sql.redis") \
    .option("table", "testme") \
    .option("key.column", "dept_id") \
    .option("infer.schema", True) \
    .load()
loadedDf.show()
Output:
root
|-- dept_name: string (nullable = true)
|-- dept_id: long (nullable = true)
+---------+-------+
|dept_name|dept_id|
+---------+-------+
| Finance| 10|
|Marketing| 20|
| Sales| 30|
| IT| 40|
+---------+-------+
+---------+-------+
|dept_name|dept_id|
+---------+-------+
| IT| 40|
| Sales| 30|
|Marketing| 20|
| Finance| 10|
+---------+-------+
Sorted, closing it.
Hi,
I am new to Redis and am testing the Redis-Spark connector. I have put the jar file that I built into the directory $SPARK_HOME/jars.
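As an aside, instead of copying the jar into $SPARK_HOME/jars, it can also be attached per session. A minimal configuration sketch, assuming a hypothetical jar path; spark.redis.host and spark.redis.port are the connector's connection options:

```python
from pyspark.sql import SparkSession

# Hedged sketch: "/path/to/spark-redis-jar-with-dependencies.jar" is a
# hypothetical placeholder for the jar built from the spark-redis sources.
spark = (SparkSession.builder
    .appName("redis-test")
    .config("spark.jars", "/path/to/spark-redis-jar-with-dependencies.jar")
    .config("spark.redis.host", "localhost")  # Redis server host
    .config("spark.redis.port", "6379")       # default Redis port
    .getOrCreate())
```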
The redis version is
I wrote a basic Python test in PyCharm as follows:
I get this output and error:
Is this because the jar file was created with Scala 2.11 whereas Spark 3.1.1 uses Scala 2.12?
Thanks