fritz-astronomer opened 4 days ago (status: Open)
I can get this working with the client directly - the problem is 100% in the `get_fs` method:

```python
from azure.identity.aio import ClientSecretCredential
from adlfs import AzureBlobFileSystem

print(
    AzureBlobFileSystem(
        account_name="...",
        credential=ClientSecretCredential(
            tenant_id="...",
            client_id="...",
            client_secret="...",
        ),
    ).ls("/")
)
```
Apache Airflow version
2.9.2
If "Other Airflow 2 version" selected, which one?
No response
What happened?
Connection parsing seems buggy in the Azure implementation for `ObjectStoragePath` - it requires specific extras in specific places that don't really make sense. This is also inconsistent with the `AzureDataLakeStorageV2Hook` connection parsing.

Additionally, there is no documentation at all about an Azure implementation for `ObjectStoragePath`, so we should make sure a doc is associated with the provider.

Furthermore, this is a Microsoft problem - but why there are three solutions for the same thing, each with different terminology, in varying degrees of supported or deprecated, is wickedly confusing.
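To make the expected behavior concrete, here is a minimal sketch of the kind of connection-field mapping the Hook appears to do and that `get_fs` arguably should do as well. This is illustrative only - `conn_to_adlfs_kwargs` is a hypothetical helper, not Airflow's actual code, and the assumption that `login`/`password` map to `client_id`/`client_secret` and that `account_name` can be derived from `host` is mine:

```python
def conn_to_adlfs_kwargs(conn: dict) -> dict:
    """Hypothetical mapping from an Airflow-style connection dict to
    adlfs AzureBlobFileSystem kwargs. A sketch, not Airflow's real code.
    """
    extras = conn.get("extra") or {}
    host = conn.get("host") or ""

    # adlfs always needs account_name; prefer an explicit extra, otherwise
    # derive it from a host like "<account>.blob.core.windows.net".
    account_name = extras.get("account_name") or host.split(".")[0]
    kwargs = {"account_name": account_name}

    # Service-principal style: login=client_id, password=client_secret,
    # plus tenant_id in extras (mirroring AzureDataLakeStorageV2Hook usage).
    if conn.get("login") and conn.get("password") and extras.get("tenant_id"):
        kwargs.update(
            client_id=conn["login"],
            client_secret=conn["password"],
            tenant_id=extras["tenant_id"],
        )
    return kwargs
```

With a mapping like this, the `host` + `login` + `password` + `extras.tenant_id` combination that already works for the Hook would also produce a complete set of adlfs kwargs, instead of failing in `do_connect` for lack of an `account_name`.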
What you think should happen instead?
No response
How to reproduce
1) ✅ `extras.connection_string` - works for both the Hook and Object Storage, without issue.

2) ❌ `host` + `login` + `password` + `extras.tenant_id` - ✅ works for the Hook, ❌ DOES NOT WORK for Object Storage. Error from `adlfs.spec@do_connect`.

3) ❌ `host` + `login` + `password` + `extras.tenant_id` + `extras.account_name` (not documented). Works for both (edit: I initially thought this was working, as `get_fs` returns successfully, but as soon as I attempt to use it it fails. I've tried a number of other combinations, such as including `account_url` and `client_secret_auth_config` in `extra` - none of them work).

Operating System
Astronomer/Docker
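For concreteness, the three connection variants from the "How to reproduce" section above might be defined via environment variables roughly as follows. This is a sketch: the conn IDs, `conn_type`, and all values are placeholders/assumptions, not a verified working configuration:

```shell
# 1) connection_string only - works for both the Hook and Object Storage
export AIRFLOW_CONN_AZURE_CASE1='{
  "conn_type": "wasb",
  "extra": {"connection_string": "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...;"}
}'

# 2) host + login + password + extras.tenant_id - Hook works, Object Storage fails
export AIRFLOW_CONN_AZURE_CASE2='{
  "conn_type": "wasb",
  "host": "myaccount.blob.core.windows.net",
  "login": "<client_id>",
  "password": "<client_secret>",
  "extra": {"tenant_id": "<tenant_id>"}
}'

# 3) as above plus extras.account_name (undocumented) - get_fs returns,
#    but the filesystem fails on first use
export AIRFLOW_CONN_AZURE_CASE3='{
  "conn_type": "wasb",
  "host": "myaccount.blob.core.windows.net",
  "login": "<client_id>",
  "password": "<client_secret>",
  "extra": {"tenant_id": "<tenant_id>", "account_name": "myaccount"}
}'
```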
Versions of Apache Airflow Providers
No response
Deployment
Astronomer
Deployment details
No response
Anything else?
No response
Are you willing to submit PR?
Code of Conduct