PrestonGiorgianni closed this issue 7 months ago
The default location https://cdm-schema.microsoft.com/logical/ for resolving cdm: URIs has an outdated TLS cert.
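A quick way to confirm this from a cluster node is a small Python check like the sketch below (not part of the connector; it assumes outbound HTTPS access to that host). If the cert is expired or otherwise invalid, the handshake itself fails with the same kind of verification error the connector hits while resolving cdm: URIs.

```python
import socket
import ssl
import time

# Host the connector's default cdm: URI resolution goes through.
HOST = "cdm-schema.microsoft.com"

context = ssl.create_default_context()
try:
    # Open a plain TCP connection, then negotiate TLS with normal verification.
    with socket.create_connection((HOST, 443), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=HOST) as tls:
            cert = tls.getpeercert()
    # Report when the presented certificate expires.
    expires = ssl.cert_time_to_seconds(cert["notAfter"])
    print(f"cert for {HOST} expires {time.ctime(expires)}")
    print("already expired:", expires < time.time())
except ssl.SSLCertVerificationError as exc:
    # An outdated/invalid cert shows up here rather than in getpeercert().
    print(f"TLS verification failed for {HOST}: {exc}")
```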
The TLS cert has been updated and the spark-cdm connector is working just fine now.
Closing this as resolved
This issue is back.
Opened support case 2404010010003788 with Azure as well.
Maybe related? I saw this on the CDM store repo:
The CDM Schema Store will be shut down by end of March '24, and any services still using the older CDM SDK releases may start failing due to unavailability of the store.
Hello @carlo-quinonez @PrestonGiorgianni, if you are using an older cdm connector version, please upgrade to the latest. Please check out the releases and issue #162.
Thank you for the fix
Did you read the pinned issues and search the error message?
Yes, but I didn't find the answer.
Summary of issue
I have been writing to an Azure storage account with this connector from Databricks for several months using the following config.
Twice now the connector has started to throw errors about foundations.cdm.json. While throwing these errors it will consistently fail for several days, and then seemingly recover on its own. Are there any ideas on why this failure is popping up and fixing itself seemingly at random?
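If it helps narrow it down, here is a rough sketch for probing the schema document directly from the cluster during a failure window. The URL is an assumption (the default location mentioned above plus the document name); adjust it to whatever path actually appears in the stack trace.

```python
import urllib.request

# Assumed URL: default schema location + the document named in the error.
URL = "https://cdm-schema.microsoft.com/logical/foundations.cdm.json"

try:
    with urllib.request.urlopen(URL, timeout=15) as resp:
        body = resp.read()
    print(f"HTTP {resp.status}, fetched {len(body)} bytes")
except Exception as exc:
    # Report whatever failure mode shows up (TLS, DNS, HTTP error, timeout)
    # so it can be compared against the connector's stack trace.
    print(f"fetch failed: {type(exc).__name__}: {exc}")
```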
Error stack trace
Platform name
Databricks
Spark version
3.3.0
CDM jar version
spark3.3-1.19.5
What is the format of the data you are trying to read/write?
.csv