zzzprojects / Dapper-Plus

Dapper Plus - High-Efficient Bulk Actions (Insert, Update, Delete, and Merge) for .NET
https://dapper-plus.net/

Performance issue with bulk insert #85

Closed ElenaShlykova closed 3 years ago

ElenaShlykova commented 3 years ago

Hello, I have a strange performance issue with the BulkInsert operation. I have a program list that keeps a collection of programs:

   public class ProgramList
   {
      public IList<Program> Programs { get; set; }
...
   }

   public class Program
   {
      public long ProgramListId { get; set; }
...
    }

And I try to insert the program list with its programs like this:

   actionSet.BulkInsert(programList, x => x.Programs).CurrentItem

Sometimes this operation takes too long or even fails with a timeout error.

I checked SQL Profiler and saw that it always hangs while bulk inserting the programs (profiler screenshot attached). There are 15 programs in one program list, and we do not have fields containing a lot of data, so it's not because of the amount of data.

Do you have any thoughts about why this problem might arise?

Stack trace: (screenshot attached)

SQL Server 2016, isolation level: Read Committed Snapshot.

JonathanMagnan commented 3 years ago

Hello @ElenaShlykova ,

One common problem is that the table has a trigger that causes the performance issue. For example, the table has a trigger that, on insert, runs some custom logic for every row, such as calling a stored procedure. So the timeout doesn't actually happen because of our library but because of all the custom logic around the insert that we do not control.

Could it be the cause?

If you don't think that's the cause:

Let me know more about your issue and we will try to help you the best we can.

Best Regards,

Jon

ElenaShlykova commented 3 years ago

We have a trigger, but we disabled it and still receive this error. We have nothing special. The error occurs randomly, but more often on slow machines.

ElenaShlykova commented 3 years ago

Is there any setting to not use SqlBulkCopy, but use SQL variables instead?

JonathanMagnan commented 3 years ago

Sure,

You need to set the Resolution to WorkingTable. Make sure to also specify a batch size; otherwise, you might end up getting an error caused by the limit on the number of parameters allowed per command (2100 maximum in SQL Server).

You can do it for a specific operation:

connection.UseBulkOptions(options =>
{
    options.BatchSize = 50;
    options.Resolution = ResolutionType.WorkingTable;
}).BulkInsert(list);

or you can enable it globally for every bulk operation:

BulkOperationManager.BulkOperationBuilder = options =>
{
    options.BatchSize = 50;
    options.Resolution = ResolutionType.WorkingTable;
};

If the error happens randomly and most of the time you get very good performance, that would probably mean it's not related to our library (the same code is executed every time, after all!). It might be caused by insufficient memory, the table being locked for a few seconds, or another problem on the database side.

Let me know if you still get the problem after you stop using SqlBulkCopy.

JonathanMagnan commented 3 years ago

Hello @ElenaShlykova

Since our last conversation, we haven't heard from you.

Did you get the chance to try one of the provided solutions?

Don't hesitate to contact us for further assistance,

Jon

ElenaShlykova commented 3 years ago

Hello! I have increased the value of the TemporaryTableMinRecord property to 50. After that, we could no longer reproduce this problem. Thanks :)
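For future readers, a minimal sketch of that fix, assuming the global BulkOperationManager.BulkOperationBuilder hook shown earlier in this thread (exact option names should be verified against the Dapper Plus / Z.BulkOperations version in use):

```csharp
// Sketch only: applied globally to every bulk operation.
// TemporaryTableMinRecord controls the minimum number of records required
// before the library switches to a temporary-table strategy; raising it
// keeps small batches (like the 15-row lists above) on the simpler path.
BulkOperationManager.BulkOperationBuilder = options =>
{
    options.TemporaryTableMinRecord = 50;
};
```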

JonathanMagnan commented 3 years ago

Awesome @ElenaShlykova !

We are glad to hear that the issue is fixed!

Don't hesitate to contact us with any questions or further assistance,

Jon