Hi @FranVillaVainilla, I am familiar with this part of the code. Have a look at https://github.com/Mimetis/Dotmim.Sync/blob/master/Projects/Dotmim.Sync.Core/CoreProvider.ApplyChanges.cs. As you guessed, deletes are executed before updates and inserts. It may be as simple as reordering deletes to be last; I can't think of any side effects. It would be a very good idea to add some tests exploring a few scenarios, to be sure everything is good before making the change.
gb0o
Hi gb0o, I'm happy to see the project is alive and has active contributors.
Ok, I need to resolve my sync scenario quickly... a customer is waiting! :) So I'm not sure I'll use Dotmim.Sync for this first project.
Reordering deletes doesn't seem complicated, but in my opinion (maybe I'm wrong) they should execute after updates but before inserts, to avoid possible unique constraint violations.
Regards
Hi @FranVillaVainilla, no harm in forking the project and making the change. If you wish to turn it into a PR, also add new tests. The project is still in its infancy, so be careful when using it with real customers. Overall it is a pretty good sync solution.
Hello @FranVillaVainilla
Yes, ordering the tables is very important, because Dotmim.Sync needs to insert / delete / update rows in the correct order.
In your 2nd scenario, basically, you have:
- An update of the Product rows (basically, the foreign key FK_Category_Id set to null)
- A delete of the Category item.

Basically, the code, as @gb0o said, is something like:
```
For Each table.Reverse()
{
    Delete rows
}
For Each table
{
    Add new Rows
    Update new Rows
}
```
And as you mentioned, it failed in your scenario.
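To make that concrete, here is the statement order for that scenario; the table and column names are just assumptions for illustration:

```csharp
// Current order: deletes first, then updates. The Category row is deleted while a
// Product row still references it, so the foreign key check fails.
const string deleteFirstFails = @"
DELETE FROM Category WHERE Category_Id = 'X';                        -- FK violation: a Product still references X
UPDATE Product SET FK_Category_Id = NULL WHERE FK_Category_Id = 'X';";

// Running the update before the delete would make this batch succeed.
const string updateFirstWorks = @"
UPDATE Product SET FK_Category_Id = NULL WHERE FK_Category_Id = 'X';
DELETE FROM Category WHERE Category_Id = 'X';";
```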
So I think it could be a good solution to change things to:
```
For Each table
{
    Update existing rows
}
For Each table.Reverse()
{
    Delete old rows
}
For Each table
{
    Insert new rows
}
```
But we have a problem here. Imagine this scenario:
- You delete a Category row with a key like CAT_001
- You update another Category row, replacing its old key CAT_002 with CAT_001
I know this is a weird scenario because updating a primary key is not a good thing, but it's still possible.
In this scenario, making Update, Delete, Insert won't work...
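As a sketch, with assumed column names, the statement order produced by Update, Delete, Insert for that batch would be:

```csharp
// The update runs before the delete, so it collides with the CAT_001 row that is only
// removed in the later delete phase.
const string updateFirstFails = @"
UPDATE Category SET PK_CategoryID = 'CAT_001' WHERE PK_CategoryID = 'CAT_002';  -- duplicate key: CAT_001 still exists
DELETE FROM Category WHERE PK_CategoryID = 'CAT_001';";

// Deleting first would make this particular batch succeed.
const string deleteFirstWorks = @"
DELETE FROM Category WHERE PK_CategoryID = 'CAT_001';
UPDATE Category SET PK_CategoryID = 'CAT_001' WHERE PK_CategoryID = 'CAT_002';";
```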
We will always find a scenario where the order is not the right one...
Anyway, your scenario is pretty common, and makes sense.
I will update the ApplyChanges method to have basically something like Update, Delete, Insert.
Seb
Happy to hear about plans to change the ApplyChanges behavior. No problem with primary key updates because, as you said, it's a weird scenario, and it's not in my catalog of bad practices.
Regards
@Mimetis Can we make it like:
```
For Each Table
{
    Insert new rows
    Update existing rows
}
For Each Table.Reverse()
{
    Delete old rows
}
```
This is almost the same as the current code, except deletions are last. It would cover my scenario and @FranVillaVainilla's. Doing all updates first would break my scenario, where I create a new row on the remote client and update a FK on an existing row to link it up.
If too tricky to cover everyone perhaps the operation order could be made configurable?
Configurable operation order would be great, but when you have a lot of tables to sync, it's possible that even with a configurable operation order we'd get into trouble with some scenarios.
Besides, I'm wondering how Dotmim.Sync deals with self-related tables (i.e. multi-level categories, with some categories pointing to parent categories). I'll try to do some tests with this.
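By self-related tables I mean a schema like this (a made-up example, not from the Dotmim.Sync samples):

```csharp
// A self-referencing table: the foreign key points back at the same table, so ordering
// the *tables* cannot resolve the dependency; only the order of the *rows* (or temporarily
// disabled constraints) can.
const string createCategories = @"
CREATE TABLE Categories (
    CategoryId       NVARCHAR(20) NOT NULL PRIMARY KEY,
    ParentCategoryId NVARCHAR(20) NULL
        CONSTRAINT FK_Categories_Parent REFERENCES Categories (CategoryId)
);";

// Inserting a child before its parent fails, even though only one table is involved.
const string childBeforeParentFails = @"
INSERT INTO Categories (CategoryId, ParentCategoryId) VALUES ('CAT_CHILD', 'CAT_ROOT');  -- FK violation
INSERT INTO Categories (CategoryId, ParentCategoryId) VALUES ('CAT_ROOT', NULL);";
```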
Yes, a configurable operation order is not an option, imo.
@gb0o: if you're inserting new rows before updating existing ones, you may potentially insert a row that has a constraint on an existing row that needs to be updated first to have the correct key.
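For example (hypothetical names):

```csharp
// The inserted Product references a key that an existing Category only gets after its
// update, so Insert-before-Update fails for this batch.
const string insertFirstFails = @"
INSERT INTO Product (ProductID, FK_CategoryID) VALUES ('P2', 'CAT_NEW');        -- FK violation: no Category has key CAT_NEW yet
UPDATE Category SET PK_CategoryID = 'CAT_NEW' WHERE PK_CategoryID = 'CAT_OLD';";
```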
We will always find a way where the order is not correct .... :)
Seb
@FranVillaVainilla I have another scenario where my proposed order failed:
```
For Each table
{
    Update existing rows
}
For Each table.Reverse()
{
    Delete old rows
}
For Each table
{
    Insert new rows
}
```
This will fail if:
- You insert a new Category with PK_CategoryID set to CAT10
- You update a Product item with a FK_CategoryID set to CAT10

This scenario is working today, but will fail tomorrow if I reorder the ApplyChanges method...
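As a sketch (hypothetical names), the two statement orders for this batch:

```csharp
// Today's order (per table, parents first): the Category row exists before the Product update.
const string worksToday = @"
INSERT INTO Category (PK_CategoryID) VALUES ('CAT10');
UPDATE Product SET FK_CategoryID = 'CAT10' WHERE ProductID = 'P1';";

// Proposed order (all updates first, inserts last): the Product update runs before the
// Category insert and fails the foreign key check.
const string failsIfReordered = @"
UPDATE Product SET FK_CategoryID = 'CAT10' WHERE ProductID = 'P1';  -- FK violation: CAT10 does not exist yet
INSERT INTO Category (PK_CategoryID) VALUES ('CAT10');";
```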
Seb
It's a pity that SQL Server does not support deferrable constraints like Oracle and other DBMSs.
The idea is to check constraints just before committing a transaction, but not after each insert, update, or delete. That would probably be a great solution... but it's not available for SQL Server.
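For reference, this is roughly what it looks like in Oracle or PostgreSQL (made-up table and constraint names):

```csharp
// A deferrable foreign key is only checked at COMMIT, so the per-statement order inside
// the transaction stops mattering.
const string deferrableFk = @"
ALTER TABLE Product
    ADD CONSTRAINT FK_Product_Category
    FOREIGN KEY (FK_Category_Id) REFERENCES Category (Category_Id)
    DEFERRABLE INITIALLY DEFERRED;";

// Checking can also be switched explicitly inside a transaction:
const string deferAll = "SET CONSTRAINTS ALL DEFERRED;";   // check at commit
const string checkNow = "SET CONSTRAINTS ALL IMMEDIATE;";  // check per statement again
```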
One thing we could think about, but that I don't want to do inside Dotmim.Sync itself: disabling all the constraints before applying changes and re-enabling them once everything is applied.
But we can defer this solution to the application. Let's say: before the sync applies any changes, the application disables the constraints on the synced tables, and once all changes are applied, it re-enables them.
It could be a good solution, what do you think?
On my side, I will make some tests
Just made some tests.
In the coming version, you will be able to do this:
```csharp
var interceptor = new Interceptors();

// Before applying all changes, disable all constraints on all synced tables
interceptor.OnDatabaseChangesApplying(tca =>
{
    foreach (var table in agent.LocalProvider.Configuration.Schema.Tables)
    {
        var cmd = tca.Connection.CreateCommand();
        // ParserName will replace ObjectNameParser
        var tableName = ParserName.Parse(table).Schema().Quoted().ToString();
        cmd.CommandText = $"ALTER TABLE {tableName} NOCHECK CONSTRAINT ALL";
        cmd.Connection = tca.Connection;
        cmd.Transaction = tca.Transaction;
        cmd.ExecuteNonQuery();
    }
});

// After changes are applied, re-enable all constraints on all synced tables
interceptor.OnDatabaseChangesApplied(tca =>
{
    foreach (var table in agent.LocalProvider.Configuration.Schema.Tables)
    {
        var cmd = tca.Connection.CreateCommand();
        // ParserName will replace ObjectNameParser
        var tableName = ParserName.Parse(table).Schema().Quoted().ToString();
        cmd.CommandText = $"ALTER TABLE {tableName} CHECK CONSTRAINT ALL";
        cmd.Connection = tca.Connection;
        cmd.Transaction = tca.Transaction;
        cmd.ExecuteNonQuery();
    }
});

agent.SetInterceptor(interceptor);

// try sync
await agent.SynchronizeAsync();
```
The result, in SQL Server Profiler, is that the NOCHECK / CHECK CONSTRAINT statements wrap the sync statements.
Now the question is: should we do this by default, or leave it to the user through Interceptors?
If you're suggesting to do it by default, the next question is: how do we do it with MySql? ;)
First, I must say that I am not a database expert, so I do not know how to deal with disabling / re-enabling constraints in MySql.
At this moment, I am finishing a Point Of Sale system and I need to synchronize the information about products, families, prices... across several stores, while each one keeps its own information about sales, orders, etc. That's why I need something like Dotmim.Sync (or the old MS Sync Framework)... and, honestly, I like the look, capabilities and ease of use of Dotmim.Sync.
My apologies, right now I have no time to contribute code, but I'm happy to share impressions and ideas.
The solution looks easy with interceptors, a new concept for me (I took a break of several years from programming... a long story). But I think it would be better as a configurable option inside Dotmim, for two main reasons:
1.- It is a generalizable behavior; Dotmim can do this for the user.
2.- If you give that responsibility to the users, they must take care to enlist the disable/enable constraint commands in the sync process transaction, as you do in the posted code. Otherwise, they could leave the database with disabled constraints in case of failure. (Correct me if I'm wrong.)
Everything runs under a DbTransaction; so if a failure happens after the user has disabled constraints, everything is rolled back, including the disabled constraints.
I will take some time to evaluate integrating this feature as a configuration option.
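Regarding the MySql question above: a possible counterpart (just a sketch, not tested) would be the session-level FOREIGN_KEY_CHECKS switch, reusing the same interceptor callbacks as in the SQL Server example:

```csharp
// Same interceptor as in the SQL Server example above; MySql skips foreign key checks
// for the current session while the flag is 0.
interceptor.OnDatabaseChangesApplying(tca =>
{
    var cmd = tca.Connection.CreateCommand();
    cmd.CommandText = "SET FOREIGN_KEY_CHECKS = 0;";
    cmd.Transaction = tca.Transaction;
    cmd.ExecuteNonQuery();
});

// Re-enable the checks once all changes are applied.
interceptor.OnDatabaseChangesApplied(tca =>
{
    var cmd = tca.Connection.CreateCommand();
    cmd.CommandText = "SET FOREIGN_KEY_CHECKS = 1;";
    cmd.Transaction = tca.Transaction;
    cmd.ExecuteNonQuery();
});
```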
Thx for your feedback !
The last part of https://github.com/Mimetis/Dotmim.Sync/issues/143#issuecomment-463173881 is exactly the scenario I'm dealing with. Very interested in a solution that would disable constraints and enable the user to sync any database hierarchy. Would this also mean that the user could specify tables in any order? @Mimetis I have been following your implementation of interceptors and look forward to giving this a try. Btw, how do other sync libraries handle the same problem?
Just submitted an update on the master branch. Please @FranVillaVainilla, @gb0o, test it and give some feedback :)
@Mimetis I don't have time right now but keen to try it out soon.
I have no time at this moment, but will test soon.
Hi Sébastien, I believe I'm having problems syncing related tables. Suppose I have two related tables with a FK constraint: products (child) and categories (parent). In the tables parameter of the SyncAgent constructor, I pass { Categories, Products } and everything is OK for inserted categories and for products related to the new categories. But if I pass { Products, Categories } to the SyncAgent, I get a foreign key exception, so it seems that the order of the parameters is really important. All OK up to this point...
But... imagine that we update some products, changing their FKCategory=X to FKCategory=null, and then delete Category X. Then the sync process fails, because it seems like it is deleting the Category before updating the Products...
Hope I'm doing something wrong, because this issue could be a show stopper for me.
Regards