drei34 closed this issue 1 year ago
Might be related to the issue in #842
edit: oh definitely. You just made another issue for it, I got you 😄
Ha right it is related to #842 I guess but more related to #442. #442 was closed, but I am unsure why because the answer is not there. So I figure I can directly address that here ...
This was resolved after we installed Java 11 and used the latest mleap. We used mleap-databricks-runtime-fat-assembly-0.22.0. You also need to change JAVA_HOME inside of Spark. The commands below might work.
```
sudo apt-get -y install openjdk-11-jdk
export JAVA_HOME=/usr/lib/jvm/java-11-openjdk-amd64
pip install mleap
export PYSPARK_PYTHON=/opt/conda/miniconda3/bin/python
export PYSPARK_DRIVER_PYTHON=/opt/conda/miniconda3/bin/python
cp mleap-databricks-runtime-fat-assembly-0.22.0.jar $SPARK_HOME/jars
echo "export JAVA_HOME=\"/usr/lib/jvm/java-11-openjdk-amd64\"" >> $SPARK_HOME/conf/spark-env.sh
```
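Since the whole fix hinges on Spark actually picking up Java 11 rather than a leftover Java 8, it can help to check the JVM version programmatically before retrying mleap. Below is a minimal sketch; the `java_major_version` helper is hypothetical (not part of mleap or pyspark) and just parses the banner line that `java -version` prints to stderr:

```python
import re

def java_major_version(version_line: str) -> int:
    """Parse the major Java version from a `java -version` banner line.

    Handles both the modern scheme ('openjdk version "11.0.20"' -> 11)
    and the legacy scheme ('java version "1.8.0_392"' -> 8).
    """
    m = re.search(r'"(\d+)\.(\d+)', version_line)
    if not m:
        raise ValueError(f"unrecognized version string: {version_line!r}")
    major, minor = int(m.group(1)), int(m.group(2))
    # Legacy versioning: "1.8.0" really means Java 8
    return minor if major == 1 else major

# The banner line would come from e.g.
#   subprocess.run(["java", "-version"], capture_output=True, text=True).stderr
print(java_major_version('openjdk version "11.0.20" 2023-07-18'))  # -> 11
print(java_major_version('java version "1.8.0_392"'))              # -> 8
```

If this still reports 8 after exporting JAVA_HOME, the export likely didn't reach the shell (or the Spark worker environment) that launches the JVM.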
I'm not sure if this issue has a resolution. Why would this error happen and how can you fix it?
I'm on Java 8, mleap 0.20.0, Scala 2.12 and pyspark 3.1.3.
https://github.com/combust/mleap/issues/442