Closed project-defiant closed 2 years ago
Hey @PROJECT-DEFIANT, the error

TypeError: 'JavaPackage' object is not callable

means that the Spark driver and executors cannot find the JARs on your classpath. My understanding is that you have to explicitly add spark.driver.extraClassPath and spark.executor.extraClassPath to the JAVA_OPTS to resolve the issue.
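For illustration, one common way to pass these two options is as Spark configuration at launch time; this is only a sketch, and the jar path below is a placeholder, not the actual location in the Glow container:

```shell
# Sketch: substitute the real path to the Glow assembly jar.
spark-submit \
  --conf spark.driver.extraClassPath=/path/to/glow-assembly.jar \
  --conf spark.executor.extraClassPath=/path/to/glow-assembly.jar \
  your_script.py
```
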
This is what we did for the Glow Docker container on Databricks.
You can also refer to this thread: https://github.com/JohnSnowLabs/spark-nlp/issues/232#issuecomment-458888900
What environment are you installing Glow in? Please share more details.
Hey @williambrandler, many thanks for the feedback. Since I am setting up my environment within a Python module, with a local Spark session created in that module, I was able to solve the above issue by following JohnSnowLabs/spark-nlp#232 (comment). It turned out that I needed to provide the jar file at the end.
Many thanks yet again for resolving this issue. Keep up the great work.
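For anyone hitting the same problem with a local session: passing the jar when the SparkSession is built is one way to get it onto both the driver and executor classpath. This is a minimal sketch, assuming a locally downloaded Glow assembly jar whose path is a placeholder:

```python
from pyspark.sql import SparkSession

# Sketch: /path/to/glow-assembly.jar is a placeholder for the real jar location.
spark = (
    SparkSession.builder
    .appName("glow-local")
    # spark.jars ships the listed jars to the driver and all executors.
    .config("spark.jars", "/path/to/glow-assembly.jar")
    .getOrCreate()
)
```
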
Hey, I am trying to run tests with pytest for functions that utilize
io.projectglow
and deltalake
from Databricks. I have created a Spark session object with the following code.
When I try to run the Python script with the preceding code, I get the following issue.
It turns out that I cannot call the register method on the SparkSession. Any idea what could resolve this issue? Should I provide the projectglow package in a different way?
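In case it helps narrow things down, here is a minimal sketch of how a local test session is often configured so that glow.register succeeds: the Scala package is resolved via spark.jars.packages rather than installed into the venv. The Maven coordinates and version below are assumptions, not taken from this thread, so check them against the Glow release you use:

```python
import glow
from pyspark.sql import SparkSession

# Sketch: the coordinates/version are placeholders -- match them to your
# Spark/Scala versions (e.g. glow-spark3_2.12 for Spark 3 with Scala 2.12).
spark = (
    SparkSession.builder
    .appName("glow-pytest")
    # Pulls the Glow jar (and its dependencies) from Maven at session startup.
    .config("spark.jars.packages", "io.projectglow:glow-spark3_2.12:1.2.1")
    .getOrCreate()
)

# register only works once the jar is actually on the session's classpath;
# if it is not, this is where 'JavaPackage' object is not callable appears.
spark = glow.register(spark)
```
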
My default test environment is a venv with the following packages: