jonwagner / Insight.Database

Fast, lightweight .NET micro-ORM

BulkCopy a list of dynamic objects / dictionaries? (Postgres) #436

Closed elexisvenator closed 4 years ago

elexisvenator commented 4 years ago

I am trying to create some code that takes a list of IDictionary<string,object> and bulk inserts them into Postgres. Each dictionary's keys map to the table's columns, and the values' data types all match the columns' data types. Unfortunately I have not been successful so far. I have also tried converting to FastExpando/dynamic, but that hasn't worked either.

The error is a null reference exception on the object returned from Insight.Database.CodeGenerator.ObjectListDbDataReader.GetValue, which appears to happen because in its internal _objectReader (an Insight.Database.CodeGenerator.ObjectReader) the _accessors are all null.

I am not even sure whether what I am doing is meant to be possible. Based on the error, it seems the reader expects a concrete type rather than a dictionary/dynamic and is failing to map the keys to the table's columns.

Is a BulkCopy of FastExpando or dictionary objects intended to be supported? How would I go about doing this?

The error messages I am getting look very similar to #342: I get Npgsql.PostgresException: 22P04: unterminated CSV quoted field, but that error occurs after the null reference exception, because the stream dies with a half-written file.

- Postgres 10.3
- Windows 10
- .NET Standard 2.0 (called from .NET Core 3.1)
- Insight.Database 6.2.11
- Insight.Database.Providers.PostgreSQL 6.2.11
- Npgsql 4.0.3

jonwagner commented 4 years ago

The BulkCopy functionality doesn't support dynamic objects directly.

We would have to create a mapping from the dynamic object to the table, and the caching system would then bind it to the first dynamic object you sent in. This would cause issues when the first object was not representative of the entire set, or if you ever sent a different set of objects to the same table.

One option is to use the signature for BulkCopy that takes an IDataReader.

        public static Task<int> BulkCopyAsync(
            this IDbConnection connection,
            string tableName,
            IDataReader source,
            …

If you implement an IDataReader that can safely read dynamic objects, then we could bulk copy them. I suppose it would need to scan the list of objects for all of the possible properties and define the columns. If we had a reader like that, it might be possible to integrate it into the library directly.
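As a rough sketch of that idea (this is not part of Insight.Database, and the DictionaryDataReader name is made up here), such a reader might look like the following. It scans every row up front so the column set is not bound to the first object, and whether a particular provider also needs GetSchemaTable or other members would have to be verified:

    using System;
    using System.Collections.Generic;
    using System.Data;
    using System.Linq;

    // Hypothetical helper, not part of Insight.Database: exposes a list of
    // IDictionary<string, object> as a forward-only IDataReader.
    public class DictionaryDataReader : IDataReader
    {
        private readonly IReadOnlyList<IDictionary<string, object>> _rows;
        private readonly string[] _columns;
        private int _index = -1;

        public DictionaryDataReader(IEnumerable<IDictionary<string, object>> rows)
        {
            _rows = rows.ToList();
            // scan all rows so the column set covers every key, not just the first row's
            _columns = _rows.SelectMany(r => r.Keys).Distinct().ToArray();
        }

        public int FieldCount => _columns.Length;
        public string GetName(int i) => _columns[i];
        public int GetOrdinal(string name) => Array.IndexOf(_columns, name);

        // guess the CLR type of a column from its first non-null value
        public Type GetFieldType(int i) =>
            _rows.Select(r => r.TryGetValue(_columns[i], out var v) ? v : null)
                 .FirstOrDefault(v => v != null)?.GetType() ?? typeof(object);

        public bool Read() => ++_index < _rows.Count;

        public object GetValue(int i) =>
            _rows[_index].TryGetValue(_columns[i], out var value) && value != null
                ? value
                : DBNull.Value;

        public bool IsDBNull(int i) => GetValue(i) == DBNull.Value;
        public object this[int i] => GetValue(i);
        public object this[string name] => GetValue(GetOrdinal(name));

        public int GetValues(object[] values)
        {
            var count = Math.Min(values.Length, FieldCount);
            for (var i = 0; i < count; i++) values[i] = GetValue(i);
            return count;
        }

        // typed getters simply cast the untyped value
        public bool GetBoolean(int i) => (bool)GetValue(i);
        public byte GetByte(int i) => (byte)GetValue(i);
        public char GetChar(int i) => (char)GetValue(i);
        public DateTime GetDateTime(int i) => (DateTime)GetValue(i);
        public decimal GetDecimal(int i) => (decimal)GetValue(i);
        public double GetDouble(int i) => (double)GetValue(i);
        public float GetFloat(int i) => (float)GetValue(i);
        public Guid GetGuid(int i) => (Guid)GetValue(i);
        public short GetInt16(int i) => (short)GetValue(i);
        public int GetInt32(int i) => (int)GetValue(i);
        public long GetInt64(int i) => (long)GetValue(i);
        public string GetString(int i) => (string)GetValue(i);
        public string GetDataTypeName(int i) => GetFieldType(i).Name;

        // members a forward-only bulk copy should not need
        public int Depth => 0;
        public bool IsClosed => false;
        public int RecordsAffected => -1;
        public bool NextResult() => false;
        public DataTable GetSchemaTable() => throw new NotSupportedException();
        public long GetBytes(int i, long fieldOffset, byte[] buffer, int bufferoffset, int length) => throw new NotSupportedException();
        public long GetChars(int i, long fieldoffset, char[] buffer, int bufferoffset, int length) => throw new NotSupportedException();
        public IDataReader GetData(int i) => throw new NotSupportedException();
        public void Close() { }
        public void Dispose() { }
    }

With something like that, the IDataReader overload above could be called as connection.BulkCopyAsync("my_table", new DictionaryDataReader(list)) (table name hypothetical).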

Another, simpler option is to just project things onto an anonymous type:

     .BulkCopy (list.Select (item => new { prop = item.Prop, … 
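For instance, assuming the target table has Id and Name columns (hypothetical names here) and every dictionary carries those keys, the projection might look like this:

    // hypothetical column names; cast each dictionary entry to the column's CLR type
    var rows = list.Select(item => new
    {
        Id = (int)item["Id"],
        Name = (string)item["Name"],
    });

    connection.BulkCopy("my_table", rows);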
elexisvenator commented 4 years ago

Thanks for the response :)

I ended up calling BeginBinaryImport directly on the NpgsqlConnection, which got me what I needed. At runtime I knew that every dictionary/object had the same fields; I just couldn't figure out how to tell the BulkCopy command that.
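For anyone landing here later, a rough sketch of that approach with Npgsql 4.x (not the poster's exact code; the table name and the column handling are assumptions):

    using System.Collections.Generic;
    using System.Linq;
    using Npgsql;

    public static void BulkInsert(NpgsqlConnection connection, IReadOnlyList<IDictionary<string, object>> rows)
    {
        if (rows.Count == 0) return;

        // every row is assumed to have the same keys, so take the column list from the first one;
        // column names are interpolated into the COPY command, so they must come from a trusted source
        var columns = rows[0].Keys.ToArray();
        var copy = $"COPY my_table ({string.Join(", ", columns)}) FROM STDIN (FORMAT BINARY)";

        using (var writer = connection.BeginBinaryImport(copy))
        {
            foreach (var row in rows)
            {
                writer.StartRow();
                foreach (var column in columns)
                {
                    var value = row[column];
                    if (value == null)
                        writer.WriteNull();
                    else
                        writer.Write(value); // pass an NpgsqlDbType explicitly if the CLR type alone is ambiguous
                }
            }

            // commits the COPY; without this call the import is rolled back on dispose
            writer.Complete();
        }
    }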

Jaxelr commented 4 years ago

Great, closing then.