-
### Configuration
We have a staged pipeline that creates a Metastore together with a data_access_storage_credential, first in a preprod environment and then in prod.
Therefore, the same TF code is …
-
I am using Azure Databricks. When I attempt to modify the `display_name` of an existing `databricks_service_principal` resource at either the account level or the workspace level, the databricks provi…
-
If I use Postman, I receive the same error when calling the Databricks API. If I change `get-status?path=%5CShared` to `get-status?path=/Shared` or `get-status?path=%2fShared`, the API call is successful. Notebook …
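For reference, a minimal sketch of the call in question, assuming a workspace URL in `DATABRICKS_HOST` and a personal access token in `DATABRICKS_TOKEN` (both placeholder names); it only contrasts the backslash-encoded and forward-slash forms of the `path` parameter:
```python
import os
import requests

host = os.environ["DATABRICKS_HOST"]  # e.g. https://adb-1234567890123456.7.azuredatabricks.net
headers = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

# Fails: the path separator is percent-encoded as a backslash (%5C)
bad = requests.get(f"{host}/api/2.0/workspace/get-status?path=%5CShared", headers=headers)

# Succeeds: a literal forward slash (an %2F-encoded slash behaves the same)
ok = requests.get(f"{host}/api/2.0/workspace/get-status?path=/Shared", headers=headers)

print(bad.status_code, ok.status_code)
```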
-
## Summary
I am attempting to dynamically trigger pyspark ops to be run on a Databricks cluster. However, it seems that when doing so, dagster-databricks' `databricks_pyspark_step_launcher` uplo…
-
# Overview
When I attempt to use a datepart function with `text('YEAR')`, the compile step for the databricks dialect throws an error (see sample code below as it is trying to treat the text part as…
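The original sample code is cut off above; as a stand-in, here is a minimal repro sketch of the pattern described, using an illustrative `events` table and `event_date` column (the generic `date_part` function name is also an assumption). The error would surface when compiling the statement with the databricks dialect:
```python
from sqlalchemy import Column, Date, MetaData, Table, func, select, text

metadata = MetaData()
events = Table("events", metadata, Column("event_date", Date))

# A text() literal is passed as the date part; compiling this statement
# against the databricks dialect is where the error is reported.
stmt = select(func.date_part(text("YEAR"), events.c.event_date))
print(stmt)  # the default dialect renders it as date_part(YEAR, events.event_date)
```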
-
### Configuration
```hcl
# Copy-paste your Terraform configuration here
resource "databricks_permissions" "user_insights_use_cluster" {
  provider   = databricks.workspace
  cluster_id = da…
-
### Description of the bug
`TypedDict` classes get a griffe warning "Parameter does not appear in the function signature" even when the parameters are correctly included in the docstring. (Note: `TypedDict`…
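A minimal sketch of the kind of class described, with an illustrative name and a Google-style docstring; the fields are documented under `Parameters:`, yet griffe reports them as not appearing in the function signature:
```python
from typing import TypedDict


class Point(TypedDict):
    """A two-dimensional point.

    Parameters:
        x: Horizontal coordinate.
        y: Vertical coordinate.
    """

    x: int
    y: int
```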
-
### Configuration
```hcl
provider "databricks" {
alias = "account"
host = "https://accounts.azuredatabricks.net"
account_id = "XXX-XXXXXXXXX"
}
data "databricks_group" "adm…
-
Hi @roberth-k,
Thanks for maintaining this. This docset has been very helpful to me so far.
It looks like the current value *index.html* of `dashIndexFilePath` is invalid, as the file cannot be …
-
https://github.com/databricks/spark-redshift/blob/master/src/main/scala/com/databricks/spark/redshift/TableName.scala#L42
Shouldn't this follow the rules of Redshift and respect the search_path of …