aman-solanki-kr opened 1 year ago
BASE IMAGE - `databricksruntime/python:10.4-LTS`

I successfully installed the Python dependencies, and the tasks in the workflow that depend on Python run fine, but I am struggling to install the Maven and JAR dependencies.

The JAR files are in the Docker image (under `/databricks/jars`) and are visible on the Spark environment path when the cluster starts, but when I trigger the workflow the script cannot use the classes from the JARs and fails with a "TypeError: 'JavaPackage' object is not callable" error.
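For reference, this is how the failure typically surfaces from PySpark. A minimal sketch; `com.example.MyClass` is a hypothetical placeholder for a class inside the custom JAR:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical class from the custom JAR; substitute your own.
# When the JAR is missing from the JVM classpath, py4j hands back a
# JavaPackage placeholder instead of a JavaClass, so calling it raises:
#   TypeError: 'JavaPackage' object is not callable
cls = spark._jvm.com.example.MyClass
print(type(cls))  # py4j.java_gateway.JavaClass only if the JAR was loaded

# Inspect what the driver JVM actually has on its classpath:
print(spark._jvm.java.lang.System.getProperty("java.class.path"))
```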
@evanye

@aman-solanki-kr Try filing a support ticket with your support rep. Unfortunately I don't know the answer to this.

Just add your JARs to `/databricks/python3/lib/python3.10/site-packages/pyspark/jars`; this is the location pyspark loads its bundled JARs from.
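A quick way to confirm the JARs actually landed where pyspark looks for them, run from a notebook or job on the cluster (a sketch; `my-custom-lib.jar` is a hypothetical name, substitute your own):

```python
import os
import pyspark

# Hypothetical JAR name; substitute the JAR you baked into the image.
jar_name = "my-custom-lib.jar"

# pyspark picks up JARs bundled under <pyspark install>/jars,
# e.g. /databricks/python3/lib/python3.10/site-packages/pyspark/jars
jars_dir = os.path.join(os.path.dirname(pyspark.__file__), "jars")
print(jar_name in os.listdir(jars_dir))
```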