dlt-hub / verified-sources


dlt raises errors on certain SQL Server data types #491

Open ankit48365 opened 2 months ago

ankit48365 commented 2 months ago

dlt version

dlt 0.4.8

Describe the problem

dlt throws errors when reading data from SQL Server tables if those tables contain any of the data types mentioned below.

Below are some columns from the Microsoft-provided AdventureWorks2019 OLTP database (HumanResources schema):


1. [OrganizationNode] [hierarchyid] NULL,
2. [rowguid] [uniqueidentifier] ROWGUIDCOL  NOT NULL,
3. [Rate] [money] NOT NULL

There could be more such types, but I have hit errors on the three above so far. The error details are below:

2024-04-24 15:37:45,591|[WARNING              ]|1394170|139724520918592|dlt|schema_types.py|sqla_col_to_column_schema:79|A column with name **rowguid** contains unknown data type **UNIQUEIDENTIFIER** which cannot be mapped to `dlt` data type. When using sqlalchemy backend such data will be passed to the normalizer. In case of `pyarrow` backend such data will be ignored. In case of other backends, the behavior is backend-specific.

2024-04-24 15:37:45,458|[WARNING              ]|1394170|139723820918592|dlt|schema_types.py|sqla_col_to_column_schema:79|A column with name **Rate** contains unknown data type **MONEY** which cannot be mapped to `dlt` data type. When using sqlalchemy backend such data will be passed to the normalizer. In case of `pyarrow` backend such data will be ignored. In case of other backends, the behavior is backend-specific.

__init__.py:76: SAWarning: Did not recognize type '**hierarchyid**' of column 'OrganizationNode'
  metadata.reflect(bind=engine)

Finally, this was at the bottom of the error output:

<class 'dlt.extract.exceptions.ResourceExtractionError'>
In processing pipe Employee: extraction of resource Employee in generator table_rows caused an exception: (pyodbc.ProgrammingError) ('**ODBC SQL type -151 is not yet supported**.  column-index=3  type=-151', 'HY106')
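
For reference, ODBC type -151 is how the driver reports SQL Server-specific types such as hierarchyid, which pyodbc cannot map to a standard ODBC type. Below is a minimal sketch (separate from the pipeline code further down, with a placeholder connection string) showing that the same Employee rows can be read once the unsupported columns are cast to portable types in the query itself:

import sqlalchemy as sa

# Placeholder connection string -- adjust driver, host and credentials to your setup.
engine = sa.create_engine(
    "mssql+pyodbc://user:password@localhost/AdventureWorks2019"
    "?driver=ODBC+Driver+17+for+SQL+Server"
)

# Cast the SQL Server-specific types to ones the ODBC driver understands:
#   hierarchyid      -> nvarchar via .ToString()
#   uniqueidentifier -> nvarchar(36)
# (money columns such as EmployeePayHistory.Rate can be cast to decimal(19, 4) the same way)
query = sa.text(
    """
    SELECT
        BusinessEntityID,
        OrganizationNode.ToString() AS OrganizationNode,
        CAST(rowguid AS nvarchar(36)) AS rowguid
    FROM HumanResources.Employee
    """
)

with engine.connect() as conn:
    for row in conn.execute(query):
        print(row)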

Expected behavior

No response

Steps to reproduce

I was running the function below (credentials were passed from secrets.toml). The source tables come from the Microsoft-provided AdventureWorks2019 OLTP database (HumanResources schema).

import dlt
import humanize

# Assumes the sql_database verified source module is available locally,
# as set up by the verified-sources walkthrough
from sql_database import sql_database


def load_entire_database() -> None:
    """Use the sql_database source to completely load all tables in a database"""
    pipeline = dlt.pipeline(
        pipeline_name="AW_MoveItAll_pipeline", destination='postgres', dataset_name="stg_advwrks"
    )

    # By default the sql_database source reflects all tables in the schema
    # The database credentials are sourced from the `.dlt/secrets.toml` configuration
    # source = sql_database()
    source = sql_database(schema='HumanResources')

    # Run the pipeline. For a large db this may take a while
    info = pipeline.run(source, write_disposition="replace")
    print(
        humanize.precisedelta(
            pipeline.last_trace.finished_at - pipeline.last_trace.started_at
        )
    )
    print(info)
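
As an interim workaround sketch (not what I originally ran), the load can be restricted to tables that avoid those types, assuming the sql_database source accepts a table_names argument; Department and Shift are examples of HumanResources tables that, as far as I can tell, use none of the problematic columns:

def load_selected_tables() -> None:
    """Workaround sketch: load only tables without the problematic types.

    Assumes the verified sql_database source exposes a `table_names` argument;
    Department and Shift are HumanResources tables that do not appear to use
    hierarchyid, uniqueidentifier or money columns (verify against your schema).
    """
    pipeline = dlt.pipeline(
        pipeline_name="AW_MoveItAll_pipeline",
        destination="postgres",
        dataset_name="stg_advwrks",
    )
    source = sql_database(
        schema="HumanResources",
        table_names=["Department", "Shift"],
    )
    info = pipeline.run(source, write_disposition="replace")
    print(info)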

Operating system

Linux

Runtime environment

Local

Python version

3.9

dlt data source

Microsoft AdventureWorks2019 OLTP (public sample database)

dlt destination

Postgres

Other deployment details

No response

Additional information

No response