rdagumampan / yuniql

Free and open source schema versioning and database migration made natively with .NET 6. NEW THIS MAY 2022! v1.3.15 released!
https://yuniql.io
Apache License 2.0
417 stars 63 forks

CSV import towards SQLServer with a uniqueidentifier column #239

Closed: automada closed this issue 2 years ago

automada commented 2 years ago

Hi, great project!

I have a CSV import towards SQL Server where one column is of the uniqueidentifier (GUID) type. It fails with the error:

"The given ColumnMapping does not match up with any column in the source or destination".

Do you know if this type is supported through any formatting convention (or similar rule)? If I force the type (via the header name) in SqlServerBulkImportService.cs, it works fine:

foreach (string csvColumn in csvColumns)
{
    var columnName = csvColumn;
    Type columnDataType = null;

    // Workaround: a "{guid}" marker in the CSV header forces the column type
    if (columnName.Contains("{guid}"))
    {
        columnName = columnName.Replace("{guid}", string.Empty);
        columnDataType = typeof(Guid);
    }

    var dataColumn = new DataColumn(columnName);
    dataColumn.AllowDBNull = true;

    if (columnDataType != null)
    {
        dataColumn.DataType = columnDataType;
    }

    csvDatatable.Columns.Add(dataColumn);
}

Thanks in advance!

rdagumampan commented 2 years ago

@automada Thanks for reaching out. I will look into it sometime tonight. I'm surprised this doesn't work, as for bulk insert I think I disabled the type checking.

Perhaps we also need test coverage for datatype checks.

automada commented 2 years ago

Thanks for the (really) fast response! It may be an issue that only affects uniqueidentifier (based on Google), but SQL Server might have some other type surprises as well :)

rdagumampan commented 2 years ago

> Thanks for the (really) fast response! It may be an issue that only affects uniqueidentifier (based on Google), but SQL Server might have some other type surprises as well :)

I have looked at it, and I think the best approach is to adopt the solution we have for PostgreSqlBulkImportService. There we extract the destination schema and prepare each column based on the destination column's data type, using a statement like this: `SELECT column_name, data_type FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_SCHEMA = 'dbo' AND TABLE_NAME = '__yuniql_schema_version'`. You are very much welcome to PR this.

https://github.com/rdagumampan/yuniql/blob/master/yuniql-platforms/postgresql/PostgreSqlBulkImportService.cs
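For SQL Server, that approach could look roughly like the sketch below: query `INFORMATION_SCHEMA.COLUMNS` for the destination table, map each SQL type to a CLR type, and use that when building the `DataTable`. This is just an illustration of the idea, not yuniql's actual implementation; the method name, the type map, and the surrounding wiring are hypothetical.

```csharp
using System;
using System.Collections.Generic;
using System.Data;

// Hypothetical helper: fetch the destination table's column types so the
// DataTable columns can be typed to match (e.g. uniqueidentifier -> Guid).
private Dictionary<string, Type> GetDestinationSchema(IDbConnection connection, string schemaName, string tableName)
{
    var columnTypes = new Dictionary<string, Type>(StringComparer.OrdinalIgnoreCase);
    using (var command = connection.CreateCommand())
    {
        // Sketch only: a real implementation should parameterize these values
        command.CommandText =
            "SELECT column_name, data_type FROM INFORMATION_SCHEMA.COLUMNS " +
            $"WHERE TABLE_SCHEMA = '{schemaName}' AND TABLE_NAME = '{tableName}'";
        using (var reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                var columnName = reader.GetString(0);
                var sqlType = reader.GetString(1);

                // Minimal SQL-to-CLR type map; extend for the remaining types
                columnTypes[columnName] = sqlType switch
                {
                    "uniqueidentifier" => typeof(Guid),
                    "int" => typeof(int),
                    "bigint" => typeof(long),
                    "bit" => typeof(bool),
                    "datetime" or "datetime2" => typeof(DateTime),
                    _ => typeof(string)
                };
            }
        }
    }
    return columnTypes;
}
```

The CSV loop would then look up each header in this dictionary and set `dataColumn.DataType` from the destination schema instead of relying on a `{guid}` marker in the header name:

```csharp
// destinationSchema = GetDestinationSchema(connection, "dbo", tableName);
foreach (var csvColumn in csvColumns)
{
    var dataColumn = new DataColumn(csvColumn) { AllowDBNull = true };
    if (destinationSchema.TryGetValue(csvColumn, out var clrType))
    {
        dataColumn.DataType = clrType;
    }
    csvDatatable.Columns.Add(dataColumn);
}
```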

P.S. I'd appreciate it if you'd give the repo a Star and a Fork for better stats. I was hoping to promote this project more in 2022 :) Thanks!

automada commented 2 years ago

Sounds like the best long-term solution. I will have a look to see if I can get a PR in for this. The repo is starred and forked; thanks for the effort. I've been using DbUp for ages, but you've added some of the features I usually customize it with. Many thanks :)