Describe the bug
In my opinion Brickflow shouldn't rely on Databricks secrets, because Brickflow shouldn't make this decision for the user.
If I want a single key vault in my cloud environment, I should be able to use it across different technologies.
For example, if I'm on AWS and want to use AWS Secrets Manager instead of Databricks secrets, I should be able to do so.
The same holds for Secret Manager on GCP. It is also true for Azure Key Vault, even though on Azure you can have an Azure Key Vault-backed Databricks secret scope.
To Reproduce
Expected behavior
Brickflow should simply accept the token as an input argument and treat it as a pydantic SecretStr.
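A minimal sketch of what this could look like, assuming a hypothetical BrickflowConfig model (not actual Brickflow API). The caller fetches the token from whatever store they prefer (AWS Secrets Manager, GCP Secret Manager, Azure Key Vault, ...) and passes it in; pydantic's SecretStr keeps the value masked in reprs and logs:

```python
from pydantic import BaseModel, SecretStr


class BrickflowConfig(BaseModel):
    # Hypothetical config model: the token is supplied by the caller,
    # who is free to source it from any secret backend.
    databricks_token: SecretStr


# The caller would resolve the token however they like, e.g. from
# AWS Secrets Manager via boto3, then hand it over as a plain string.
config = BrickflowConfig(databricks_token="dapi-example-token")

print(config.databricks_token)  # masked: **********
print(config.databricks_token.get_secret_value())  # real value on explicit access only
```

This way the library stays agnostic about where the secret lives, while still avoiding accidental leakage of the raw token in logs or tracebacks.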
Screenshots
Cloud Information
[x] AWS
[x] Azure
[x] GCP
[ ] Other
Desktop (please complete the following information):
Additional context