zzzprojects / Bulk-Operations

C# SQL Bulk Operations | High-performance C# bulk insert, update, delete and merge for SQL Server, SQL Azure, SQL Compact, MySQL, and SQLite.
https://bulk-operations.net

Problem with BulkInserting an IList of dynamic/IDictionary<string, object> on Oracle 12.2 #41

Closed driekus77 closed 5 years ago

driekus77 commented 5 years ago

I'm new to ZZZ Projects and Bulk-Operations, and I'm trying to copy data from one Oracle environment to another in batches of 200 records.

The "normal" Dapper insert (Execute) works as expected but is somewhat slow (3 to 4 seconds for 200 records). I query with the dynamic type and transform the results right after the Query into an IDictionary<string, object>.

But when I try to use BulkInsert, I get stuck no matter which settings I try. Are dynamics not supported for BulkInsert?

Code I have so far is:

    DapperPlusManager.Entity<IDictionary<string, object>>("mapperKey_001")
        .Table(table.Value.Meta.TargetTableName)
        .Map(x => x["ID"]);

    transaction.BulkInsert("mapperKey_001", data.Rows);

Am I doing something wrong?

Thanks for any help and suggestions.

Kind regards, Henry Roeland

JonathanMagnan commented 5 years ago

Hello @driekus77 ,

Do you think you could provide a project reproducing this issue? We just want to make sure we investigate the same issue as yours.

We will be happy to look at why it's slow and check your dynamic scenario.

You can send private code here: info@zzzprojects.com

Best Regards,

Jonathan

driekus77 commented 5 years ago

Dear Jonathan,

I've sent you an e-mail with a test console project. See that mail for details.

Kind regards, Henry Roeland

JonathanMagnan commented 5 years ago

Got it ;)

We will look at it by the end of the day.

JonathanMagnan commented 5 years ago

This issue has been answered via e-mail.

One solution was to build an ExpandoObject from each dictionary:

var expandoList = rows.Select(x => x.ToExpando()).ToList();
bop.BulkInsert(expandoList);

using System.Collections;
using System.Collections.Generic;
using System.Dynamic;

public static class Extensions
{
    /// <summary>
    /// Extension method that turns a dictionary of string and object to an ExpandoObject
    /// </summary>
    public static ExpandoObject ToExpando(this IDictionary<string, object> dictionary)
    {
        var expando = new ExpandoObject();
        var expandoDic = (IDictionary<string, object>)expando;

        // go through the items in the dictionary and copy over the key/value pairs
        foreach (var kvp in dictionary)
        {
            // if the value can also be turned into an ExpandoObject, then do it!
            if (kvp.Value is IDictionary<string, object>)
            {
                var expandoValue = ((IDictionary<string, object>)kvp.Value).ToExpando();
                expandoDic.Add(kvp.Key, expandoValue);
            }
            else if (kvp.Value is ICollection)
            {
                // iterate through the collection and convert any string-object dictionaries
                // along the way into expando objects
                var itemList = new List<object>();
                foreach (var item in (ICollection)kvp.Value)
                {
                    if (item is IDictionary<string, object>)
                    {
                        var expandoItem = ((IDictionary<string, object>)item).ToExpando();
                        itemList.Add(expandoItem);
                    }
                    else
                    {
                        itemList.Add(item);
                    }
                }

                expandoDic.Add(kvp.Key, itemList);
            }
            else
            {
                expandoDic.Add(kvp);
            }
        }

        return expando;
    }
}
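
For context, the conversion works because ExpandoObject exposes its members through IDictionary<string, object>, so the bulk library can discover column names from it, whereas a plain Dictionary<string, object> is just an ordinary object with no mappable properties. A minimal, hypothetical usage of the ToExpando() extension above (the column names are illustrative, not from the issue):

```csharp
using System;
using System.Collections.Generic;

// Build a row as a dictionary, as returned from the Dapper query.
var row = new Dictionary<string, object>
{
    ["ID"] = 1,
    ["NAME"] = "Henry"
};

// After conversion, each dictionary key is a dynamic property.
dynamic expando = row.ToExpando();
Console.WriteLine(expando.ID);
Console.WriteLine(expando.NAME);
```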
driekus77 commented 5 years ago

Sorry, I didn't test it because we were already using Oracle's array binding! Kind regards, Henry
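
For reference, the array-binding approach mentioned here relies on ODP.NET's OracleCommand.ArrayBindCount, which executes one parameterized INSERT with arrays of values instead of one round trip per row. A hedged sketch, assuming Oracle.ManagedDataAccess is referenced; the table and column names are made up for illustration:

```csharp
using Oracle.ManagedDataAccess.Client;

static void BulkInsertWithArrayBinding(string connectionString, int[] ids, string[] names)
{
    using (var conn = new OracleConnection(connectionString))
    using (var cmd = conn.CreateCommand())
    {
        conn.Open();

        // TARGET_TABLE and its columns are hypothetical.
        cmd.CommandText = "INSERT INTO TARGET_TABLE (ID, NAME) VALUES (:id, :name)";

        // One execution covers all rows: each parameter's Value is an array.
        cmd.ArrayBindCount = ids.Length;
        cmd.Parameters.Add(new OracleParameter("id", OracleDbType.Int32) { Value = ids });
        cmd.Parameters.Add(new OracleParameter("name", OracleDbType.Varchar2) { Value = names });

        cmd.ExecuteNonQuery();
    }
}
```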

JonathanMagnan commented 5 years ago

Oh no problem ;)

Thanks for letting us know.