vpacik opened 7 months ago
@vpacik I've had the same issue. It's because the Spark Connect client uses gRPC, not HTTP, so to resolve the SSL error you need to set `GRPC_DEFAULT_SSL_ROOTS_FILE_PATH`.
You may run into more issues though, details here: https://community.databricks.com/t5/administration-architecture/proxy-zscaler-amp-databricks-spark-connect-quot-cannot-check/m-p/94737#M2115
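A minimal sketch of that workaround, assuming the Zscaler root CA has been exported to a local PEM file (the path below is a placeholder, not from the issue). The variable must be set before the gRPC channel is created, i.e. before building the Spark session:

```python
import os

# Point the gRPC client (used by Spark Connect / Databricks Connect) at the
# exported Zscaler root CA. Placeholder path -- adjust to where you exported
# the certificate. Must be set BEFORE any Spark session is created.
os.environ["GRPC_DEFAULT_SSL_ROOTS_FILE_PATH"] = "/etc/ssl/certs/zscaler-root-ca.pem"
```

Setting it in `~/.bashrc` inside WSL2 (`export GRPC_DEFAULT_SSL_ROOTS_FILE_PATH=...`) has the same effect for every process in the shell.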
Describe the bug: The SSL handshake fails when running PySpark code locally via Databricks Connect on a WSL2 machine behind a corporate VPN (Zscaler).
To Reproduce: Steps to reproduce the behavior:
System information:
- Version: 1.88.1 (user setup)
- Commit: e170252f762678dec6ca2cc69aba1570769a5d39
- Date: 2024-04-10T17:41:02.734Z
- Electron: 28.2.8
- ElectronBuildId: 27744544
- Chromium: 120.0.6099.291
- Node.js: 18.18.2
- V8: 12.0.267.19-electron.0
- OS: Windows_NT x64 10.0.22621
Additional context: We are running WSL2 on a machine behind a corporate VPN (Zscaler), with the exported Zscaler root CA in use. Connecting to the domain with this certificate via openssl works fine (e.g. `openssl s_client -connect {servername}:443`). The Databricks CLI works fine on the same machine, and file synchronization via Databricks Connect also works as expected. EDIT: Authentication is done via a PAT from Databricks.