Open lucasmonstrox opened 8 years ago
Up
@shamasis ?? :D
Ah yes. Spotted your question. That's a very interesting question.
If I've read you correctly, you have an item in listA and you want to move it to listB, ensuring the move can never happen twice for the same item.
First, ensure all queries run inside a single transaction. That's a must. Double-check your flow to ensure there can never be a bug in it. A good practice is to verify that the transactionId field exists before you send the final commit. The adapter guarantees it, but it's good to be paranoid.
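For example, the skeleton of such a controller action could look roughly like this (a minimal sketch assuming the sails-mysql-transactions `Transaction` API; the action name and responses are placeholders):

```js
var Transaction = require('sails-mysql-transactions').Transaction;

module.exports = {
  moveItem: function (req, res) {
    // start a new DB transaction; every query below must go through it
    Transaction.start(function (err, transaction) {
      if (err) {
        transaction && transaction.rollback();
        return res.serverError(err);
      }

      // ... all listA / listB queries via Model.transact(transaction) go here ...
      // before committing, you can be paranoid and check that the rows created
      // inside the transaction actually carry a transactionId value

      transaction.commit(function (err) {
        if (err) { return res.serverError(err); }
        return res.ok();
      });
    });
  }
};
```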
Ensure that your listA has an auto-increment or randomly generated (I like UUID v4) primary key field which is set to unique.
Then, in your schema for listB, add a field that holds the primary key of the corresponding listA row as a reference ID. There is no need to create a Sails association for this, since the field is only there as a check, and by the time the data gets committed, listA will no longer have the item you're referencing.
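To illustrate, the two models could look something like this (a sketch only; the Sails 0.1x-style definitions, the file names, the attribute names and the uuid dependency are assumptions):

```js
// api/models/ListA.js -- the source list
var uuid = require('uuid');

module.exports = {
  autoPK: false,
  attributes: {
    id: {
      type: 'string',
      primaryKey: true,
      unique: true,
      // randomly generated UUID v4 primary key
      defaultsTo: function () { return uuid.v4(); }
    },
    itemID: { type: 'string', unique: true } // the Steam-provided item id
  }
};
```

```js
// api/models/ListB.js -- the destination list
module.exports = {
  attributes: {
    itemID: { type: 'string', unique: true },
    // pseudo foreign key: holds the primary key of the row that used to live in ListA.
    // deliberately not a Waterline association, since the ListA row is gone once this commits.
    referenceId: { type: 'string', unique: true, required: true }
  }
};
```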
Now the fun part! Make sure you add to listB as the last step and delete from listA as the second-to-last step. While adding to listB, provide the listA primary key as the value of the reference ID in listB.
How does this solve things? The transaction will never be committed if listB already has the same item in the reference ID column.
In theory, you're creating a pseudo foreign key that is useful for split-second race scenarios and utilises MySQL's uniqueness guarantee on a column. It can also be useful in the future if you choose to soft-delete your items in listA instead of really deleting them.
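Putting the steps together, the move itself might look something like this inside the transaction started above (a sketch only; the model and field names follow the placeholder models, `itemID` is the id being moved, and `handleError` stands in for whatever error response you use):

```js
ListA.transact(transaction).findOne({ itemID: itemID }).exec(function (err, item) {
  if (err || !item) {
    transaction.rollback();
    return handleError(err || new Error('item not found'));
  }

  // second-to-last step: delete from listA
  ListA.transact(transaction).destroy({ id: item.id }).exec(function (err) {
    if (err) { transaction.rollback(); return handleError(err); }

    // last step: insert into listB, carrying listA's primary key as the reference ID.
    // if another request already moved this item, the unique constraint on
    // referenceId makes this insert (and therefore the transaction) fail.
    ListB.transact(transaction).create({
      itemID: item.itemID,
      referenceId: item.id
    }).exec(function (err) {
      if (err) { transaction.rollback(); return handleError(err); }

      transaction.commit(function (err) {
        if (err) { return handleError(err); } // a duplicate move surfaces as an error here too
        // the item has been moved exactly once
      });
    });
  });
});
```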
You also have another shield: instead of depending on the autoTK feature of this adapter, manually create the transactionId field in your model and add the "unique" property to it.
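If you do that, it is just one more attribute on the model (again a sketch; normally the adapter adds this column for you when autoTK is on):

```js
// api/models/ListB.js (excerpt) -- declaring transactionId by hand and marking it unique
module.exports = {
  attributes: {
    itemID: { type: 'string', unique: true },
    referenceId: { type: 'string', unique: true, required: true },
    transactionId: { type: 'string', unique: true }
  }
};
```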
The trick here is to include the quantity field. For that, you'll need a hashing function that uses an incremental quantity and sale count. Instead of having a balance field that you subtract down to zero, have two fields that track sales and quantity, and track them until they equalise. Then include the sale number as part of the reference ID in listB.
And create a workflow that never allows decreasing the quantity field, or make it read-only, with new stock added as a new batch of items instead of directly destroying items in listA.
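One possible reading of that, as a sketch (the `sold`/`quantity` field names, `itemId`, `handleError`, and the way the sale number is folded into the reference ID are my own interpretation):

```js
// inside the same transaction as before
ListA.transact(transaction).findOne({ id: itemId }).exec(function (err, item) {
  if (err || !item) { transaction.rollback(); return handleError(err); }

  // stock is exhausted once the two counters equalise
  if (item.sold >= item.quantity) {
    transaction.rollback();
    return handleError(new Error('sold out'));
  }

  var saleNumber = item.sold + 1;

  // counters only ever go up; new stock arrives as a new batch, never as a decrease
  ListA.transact(transaction).update({ id: item.id }, { sold: saleNumber }).exec(function (err) {
    if (err) { transaction.rollback(); return handleError(err); }

    // the sale number is part of the reference ID, so two concurrent purchases
    // of the same sale slot collide on listB's unique index
    ListB.transact(transaction).create({
      itemID: item.itemID,
      referenceId: item.id + ':' + saleNumber
    }).exec(function (err) {
      if (err) { transaction.rollback(); return handleError(err); }

      transaction.commit(function (err) {
        if (err) { return handleError(err); }
      });
    });
  });
});
```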
Imagine that I'm using your adapter.
itemID is always "unique" in both tables because it is an ID (the item's game ID) provided by Steam, so it never repeats.
True
> If I've read you correctly, you have an item in listA and you want to move it to listB, ensuring the move can never happen twice for the same item.

True
> First, ensure all queries run inside a single transaction. That's a must. Double-check your flow to ensure there can never be a bug in it. A good practice is to verify that the transactionId field exists before you send the final commit. The adapter guarantees it, but it's good to be paranoid.

Why?
> Ensure that your listA has an auto-increment or randomly generated (I like UUID v4) primary key field which is set to unique.

I have itemID, which is unique on both sides, so I probably already have this part covered.
I have two tables, "Assets" and "AssetsHolding". Both have the same schema. This was the best way I found.
> Then, in your schema for listB, add a field that holds the primary key of the corresponding listA row as a reference ID. There is no need to create a Sails association for this, since the field is only there as a check, and by the time the data gets committed, listA will no longer have the item you're referencing.

The big question: when and where do I catch the error? What line of code? I'm a noob here.
> How does this solve things? The transaction will never be committed if listB already has the same item in the reference ID column.

I can't
> You also have another shield: instead of depending on the autoTK feature of this adapter, manually create the transactionId field in your model and add the "unique" property to it.

In the near future I will add a shopping cart :-P
Quantity is always 0.
> The trick here is to include the quantity field. For that, you'll need a hashing function that uses an incremental quantity and sale count. Instead of having a balance field that you subtract down to zero, have two fields that track sales and quantity, and track them until they equalise. Then include the sale number as part of the reference ID in listB.
> And create a workflow that never allows decreasing the quantity field, or make it read-only, with new stock added as a new batch of items instead of directly destroying items in listA.

When and where do I catch the error? What's the line of code? I'm a noob here.
If you have the referenceId field in listB, you'll get an error while doing `transaction.commit()`.
ahmmmmmmmmmmmmmm :-D
Can I use a try/catch around `transaction.commit()`? :-D
No no, `.commit` has a callback too!!! ;-) The first parameter is an error (if any).
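So instead of a try/catch, it's just this (continuing the earlier sketch; the responses are placeholders):

```js
transaction.commit(function (err) {
  if (err) {
    // the duplicate referenceId (or any other failure) shows up here
    return res.forbidden('This item was already moved.');
  }
  return res.ok();
});
```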
Nice ;)
How can I use transactions on tables in a one-to-many relationship?
Same way as the others. `.populate` works just fine: `Model.transact(transactionObj).populate(association).exec(…);`
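For example (a sketch using the same `Transaction` import as before; the model name, the `items` association and `uid` are made-up placeholders):

```js
Transaction.start(function (err, transaction) {
  if (err) { return res.serverError(err); }

  // one-to-many: read a user together with its items, inside the transaction
  User.transact(transaction).findOne({ id: uid }).populate('items').exec(function (err, user) {
    if (err || !user) {
      transaction.rollback();
      return res.serverError(err);
    }

    // ... further queries on user / user.items via .transact(transaction) ...

    transaction.commit(function (err) {
      if (err) { return res.serverError(err); }
      return res.json(user);
    });
  });
});
```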
Nice :)
Another question: how do I use this adapter to add or remove user balance?
Normally I do something like this: `UPDATE user SET balance = balance - X WHERE id = 1;`
Unless you're doing multiple queries, you'd do it the same way you'd do with the regular sails-MySQL adapter.
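For the single-statement case, one option is the raw `.query()` escape hatch that sails-mysql exposes on models (a sketch; the table/model name, `amount`, `userId` and the responses are placeholders, and the balance guard in the WHERE clause is an extra I added):

```js
// one atomic statement: let MySQL do the arithmetic itself
User.query(
  'UPDATE user SET balance = balance - ? WHERE id = ? AND balance >= ?',
  [amount, userId, amount],
  function (err, result) {
    if (err) { return res.serverError(err); }
    if (result.affectedRows === 0) {
      return res.badRequest('Insufficient balance');
    }
    return res.ok();
  }
);
```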
Hey, this thread is becoming too long to be comprehensible for others reading it. Can we close this and create new issues for further questions?
Cheers!
Hello again buddy, I have another question (yes, it's a technical question, not a problem).
How do I know if a transaction works correctly? How do I know the data won't be duplicated if two people deliberately make the same request at the same time? Supposedly the first request to enter the queue gets processed and the following ones have to return some kind of error, right?
Imagine that two people are trying to purchase a single item on my website. How do I write this code?
Here is an example:

```js
var uid = req.session.user.id;
```
How do I know the item has been added for only one user?