zzzprojects / Bulk-Operations

C# SQL Bulk Operations | High-performance C# bulk insert, update, delete and merge for SQL Server, SQL Azure, SQL Compact, MySQL, and SQLite.
https://bulk-operations.net

BulkUpdate with PostgreSql sets wrong IDs in list #64

Closed isenmann closed 3 years ago

isenmann commented 3 years ago

Hi,

we are seeing some very strange behaviour again with DapperPlus and Npgsql. I will try to explain it here, and I have also linked a repo that reproduces the issue.

How to reproduce:

  1. Clone the project
  2. Adjust the connection string in Startup.cs to your needs
  3. Compile and start the demo project
  4. Open swagger on http://localhost:5000/swagger
  5. Execute GET /IssueDemo (this adds 50 properties to the database; this step works fine and exists only to seed the 50 properties)
  6. Execute GET /IssueDemo/UpdateProperties (this adds 10 new properties, picks 10 random existing properties from the DB, and changes their names)

If you set a breakpoint on line 87 of IssueDemoController.cs before executing step 6 and then execute it, you will see that "insertOrUpdate" contains 10 new properties to insert and up to 10 properties to update. Before BulkUpdate runs, the IDs of the properties to be updated are correct (in sync with the DB). After BulkUpdate, those IDs have been replaced with seemingly random IDs from the database and are no longer in sync with the rows they belong to. BulkUpdate performs everything correctly on the DB side; only the IDs of the entities in the list are wrong. This is a problem for us because the list is returned to the caller, which then works with completely wrong IDs.
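For reference, the symptom can be sketched roughly like this. The `Property` entity, the open connection, and the `insertOrUpdate` list mirror the linked demo repo, and the assertion is hypothetical: with the versions listed below it fails against PostgreSQL but passes against MS SQL Server.

```csharp
// Sketch only: assumes the demo project's Property entity and an open
// NpgsqlConnection; DapperPlus supplies the BulkUpdate extension method.
var toUpdate = insertOrUpdate.Where(p => p.Id != 0).ToList();

// At this point the IDs are in sync with the database.
var idsBefore = toUpdate.ToDictionary(p => p.Name, p => p.Id);

connection.BulkUpdate(toUpdate);

// Expected: each entity still carries its own ID.
// Actual (PostgreSQL): the IDs have been swapped to other rows' IDs.
foreach (var p in toUpdate)
    Debug.Assert(idsBefore[p.Name] == p.Id);
```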

Another problem: the BulkMerge call that is commented out on line 89 does not work. If you uncomment the BulkMerge and comment out the BulkUpdate and BulkInsert, you will see that BulkMerge throws an exception. But that is a separate problem.

Hopefully you have some answers to these problems. Thanks in advance!

Cheers Daniel

https://github.com/isenmann/DapperPlus_IssueDemo

Versions I used:

  - PostgreSQL 13.2, compiled by Visual C++ build 1914, 64-bit
  - Npgsql.EntityFrameworkCore.PostgreSQL 5.0.2
  - DapperPlus 3.0.36
  - EntityFrameworkCore 5.0.3

JonathanMagnan commented 3 years ago

Hello @isenmann ,

Thank you for reporting,

We will look at it.

Best Regards,

Jon

JonathanMagnan commented 3 years ago

Hello @isenmann ,

Just to let you know that we did indeed find an issue in our library: we currently output the identity when we should not (it is an UPDATE statement). We are still investigating, so I will provide a more detailed answer in a few days.
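To illustrate why outputting the identity on an UPDATE can scramble the list (my own reading of the mechanics, not confirmed above): PostgreSQL does not guarantee that rows from an `UPDATE ... RETURNING` come back in the order the entities were sent, so mapping the returned identity values back onto the entities by position can give each entity another row's ID. A self-contained sketch of that positional-mapping pitfall, with hypothetical data:

```csharp
using System;

class Entity { public int Id; public string Name = ""; }

class Demo
{
    static void Main()
    {
        // Three tracked entities whose IDs are in sync with the database.
        var entities = new[]
        {
            new Entity { Id = 1, Name = "a" },
            new Entity { Id = 2, Name = "b" },
            new Entity { Id = 3, Name = "c" },
        };

        // Hypothetical identity values returned by the UPDATE statement;
        // PostgreSQL does not guarantee they arrive in send order.
        var returnedIds = new[] { 3, 1, 2 };

        // Mapping the output back by position overwrites each entity's ID
        // with some other row's ID -- the symptom described in this issue.
        for (int i = 0; i < entities.Length; i++)
            entities[i].Id = returnedIds[i];

        Console.WriteLine(entities[0].Id); // prints 3, but "a" has ID 1 in the DB
    }
}
```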

We hope that a fix will be available next Monday.

Best Regards,

Jon

isenmann commented 3 years ago

Thanks for the fast reply! Just so you know, this code works fine with MS SQL Server.

Do you also have any information about the exception from the BulkMerge command (which likewise works fine with MS SQL Server)?

JonathanMagnan commented 3 years ago

We believe it will be the same underlying error. I will have more information soon.

isenmann commented 3 years ago

Any news about the issue?

JonathanMagnan commented 3 years ago

Hello @isenmann ,

Sorry for the delay; the fix introduced a breaking change, so we wanted to be sure before releasing a new major version.

The v4.0.0 contains the fix for the BulkUpdate and BulkMerge.

Could you try it and let us know if now everything is working correctly?

Best Regards,

Jon

isenmann commented 3 years ago

Hello @JonathanMagnan,

Thanks for the information. I will test it tomorrow (Thursday) and will give feedback then.

Cheers Daniel

isenmann commented 3 years ago

Thanks @JonathanMagnan, the v4.0.0 fixes the mentioned problems!

I'm impressed by your product team's very fast response and fix turnaround!

Cheers Daniel

JonathanMagnan commented 3 years ago

Awesome @isenmann !

Thank you for your kind words!

Don't hesitate to contact us with any questions or for further assistance,

Jon