dbt-labs / terraform-provider-dbtcloud

dbt Cloud Terraform Provider
https://registry.terraform.io/providers/dbt-labs/dbtcloud
MIT License

Creating `dbt_cloud_databricks_credential` fails #136

Closed: lassebenni closed this issue 1 year ago

lassebenni commented 1 year ago

Hi,

I am getting "The request was invalid. Please double check the provided data and try again":

╷
│ Error: POST url: https://cloud.getdbt.com/api/v3/accounts/x/projects/x/credentials/x/, status: 400, body: {"status":{"code":400,"is_success":false,"user_message":"The request was invalid. Please double check the provided data and try again.","developer_message":""},"data":{"credential_details":"{'fields': None, 'field_order': None} is not valid under any of the given schemas"}}
│ 
│   with dbt_cloud_databricks_credential.databricks,
│   on main.tf line 52, in resource "dbt_cloud_databricks_credential" "databricks":
│   52: resource "dbt_cloud_databricks_credential" "databricks" {

when trying to create


resource "dbt_cloud_databricks_credential" "databricks" {
  project_id  = local.project_id
  adapter_id  = dbt_cloud_connection.databricks.adapter_id
  num_threads = 16
  target_name = "default"
  token       = "<my-databricks-token>"
}

It seems the POST endpoint does not like the data being sent. I tried looking at the API docs at https://docs.getdbt.com/dbt-cloud/api-v2 but couldn't find the right endpoint for creating credentials. Not sure how to proceed.
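For what it's worth, the endpoint the provider is calling can be read straight off the POST URL in the error output above. A small Python sketch (the account and project ids are placeholders, not real values):

```python
# Endpoint path taken from the POST URL in the 400 error above.
# account_id and project_id are placeholders for illustration.
BASE = "https://cloud.getdbt.com/api/v3"

def credentials_endpoint(account_id: int, project_id: int) -> str:
    """Build the v3 credentials endpoint URL the provider appears to call."""
    return f"{BASE}/accounts/{account_id}/projects/{project_id}/credentials/"

print(credentials_endpoint(123, 456))
```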

b-per commented 1 year ago

Which version of the provider are you using?

The doc for the endpoint is on the page listing the different v3 endpoints.

grindheim commented 1 year ago

We get a slightly different error message using this version of the provider:

dbt = {
  source  = "GtheSheep/dbt-cloud"
  version = "0.1.9"
}

Though it's a bit confusing, as the error message also says "is valid under each of ..." in reference to the schemas used:

│ Error: POST url: https://cloud.getdbt.com/api/v3/accounts/x/projects/x/credentials/, status: 400, body: {"status":{"code":400,"is_success":false,"user_message":"The request was invalid. Please double check the provided data and try again.","developer_message":""},"data":{"credential_details":"{'fields': {'token': {'metadata': {'label': 'Token', 'description': 'Personalized user token.', 'field_type': 'text', 'encrypt': True, 'overrideable': False, 'validation': {'required': False}}, 'value': 'x'}}, 'field_order': []} is valid under each of {'$ref': '#/definitions/DatabricksCredentialsSchema'}, {'$ref': '#/definitions/BaseSparkAdapterCredentialsSchema'}"}}
│ 
│   with dbt_cloud_databricks_credential.databricks_credential,
│   on dbt_cloud_environment.tf line 10, in resource "dbt_cloud_databricks_credential" "databricks_credential":
│   10: resource "dbt_cloud_databricks_credential" "databricks_credential" {
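For reference, the `credential_details` object the provider sends can be reconstructed from the error body above. A minimal Python sketch (the nested field names and metadata are copied from that error output; whether the API also requires extra top-level keys is left open here):

```python
import json

# Reconstructed from the 400 error body above: the provider sends a
# "credential_details" object with "fields" and "field_order" keys.
# The "token" metadata below is copied verbatim from the error output;
# the token value itself is a placeholder.
credential_details = {
    "fields": {
        "token": {
            "metadata": {
                "label": "Token",
                "description": "Personalized user token.",
                "field_type": "text",
                "encrypt": True,
                "overrideable": False,
                "validation": {"required": False},
            },
            "value": "<my-databricks-token>",  # placeholder
        }
    },
    "field_order": [],
}

print(json.dumps(credential_details, indent=2))
```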

b-per commented 1 year ago

I will try to reproduce it if more people have the same issue.

b-per commented 1 year ago

I could reproduce the issue.

The fix is not too complex, but it is not a one-liner either, so it might take me a couple of days to get it all working and to make sure we are not breaking anything else along the way.

The complexity with Databricks is that different dbt adapters support different parameters (e.g. `catalog` works with dbt-databricks but not dbt-spark).
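As an illustration of that last point, adapter-specific parameters could be gated on the adapter type. This is a hypothetical Python sketch, not the provider's actual code (the real provider is written in Go, and all names here are made up for illustration):

```python
# Hypothetical sketch: build credential fields depending on the adapter type.
# "catalog" is supported by dbt-databricks but not dbt-spark, per the comment above.
def build_credential_fields(adapter_type, token, catalog=None):
    fields = {"token": {"value": token}}
    if catalog is not None:
        if adapter_type != "databricks":
            # dbt-spark (and other adapters) do not accept a catalog parameter.
            raise ValueError(f"'catalog' is not supported by the {adapter_type} adapter")
        fields["catalog"] = {"value": catalog}
    return fields

print(build_credential_fields("databricks", "tok", catalog="main"))
print(build_credential_fields("spark", "tok"))
```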