Closed: philipxyc closed this 1 year ago
Added the logic to remove existing duplicates before adding unique columns. Let me know if it works for you.
Looks good to me! My only concern is whether the logic is going to keep the latest options or just randomly keep one; I think Eloquent gets the latest rows by default? My database has already been manually migrated, so I can't test it out for now. Otherwise, I think this PR is good to go!
The problem
Every time an admin clicks the save button on the options page for a location, around 25 new rows are inserted into the location_options table.

Expected behavior
When updating the options for a location, the existing option records should be modified instead of new records being inserted.
Why it matters
I ended up with a location_options table of 500+ rows for a single location, which slowed my page response time to more than 10 seconds.

Root cause
The updateOptions function of LocationOption.php implements the logic for updating existing option records here: https://github.com/tastyigniter/TastyIgniter/blob/f49963529869512b42b4f831d1c904f955431be5/app/admin/models/LocationOption.php#L175-L192
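To make the index dependency concrete, here is a minimal sketch of the two upsert behaviors, using Python's sqlite3 as a stand-in for MySQL/Eloquent. The table is reduced to the two matching columns plus a value, and the column names ('location_id', 'item') are taken from the issue text, not the real schema:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Case 1: no unique index -- each "save" simply inserts a new row,
# which is the duplicate-row behavior described in this issue.
cur.execute("CREATE TABLE no_index (location_id INT, item TEXT, value TEXT)")
for v in ("a", "b"):
    cur.execute("INSERT INTO no_index VALUES (1, 'opening_hours', ?)", (v,))
print(cur.execute("SELECT COUNT(*) FROM no_index").fetchone()[0])  # 2

# Case 2: with a unique index, the upsert updates the existing row in place.
cur.execute("""CREATE TABLE with_index (
    location_id INT, item TEXT, value TEXT,
    UNIQUE (location_id, item))""")
for v in ("a", "b"):
    cur.execute("""INSERT INTO with_index VALUES (1, 'opening_hours', ?)
                   ON CONFLICT (location_id, item)
                   DO UPDATE SET value = excluded.value""", (v,))
print(cur.execute("SELECT COUNT(*), MAX(value) FROM with_index").fetchone())  # (1, 'b')
```

With the unique index in place, saving twice leaves a single, updated row instead of two.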
However, that code uses the self::upsert function, and the Laravel documentation notes that the columns used to match rows must have a primary or unique index. This means the upsert logic in the updateOptions function falls back to plain insert behavior, because these fields don't have a "unique" index.

Work in progress (help needed)
This migration works fine if there is no duplicate ('location_id', 'item') data in the current location_options table. However, as mentioned above, many duplicate rows very likely already exist, in which case the migration will fail with the error "Integrity constraint violation: 1062 Duplicate entry".

To properly solve the issue, logic for removing the existing duplicates in this table should be implemented before the schema change. I did some research but still could not find a good way to do it. I would appreciate any help with implementing this.
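One possible dedup step, sketched here with Python's sqlite3 rather than the project's real migration code, is to keep only the row with the highest primary key per ('location_id', 'item'): with an auto-increment key, that is the most recently inserted option, which would also address the review question about which duplicate survives. The schema below is a simplified assumption (location_option_id standing in for the auto-increment primary key):

```python
import sqlite3

# Hypothetical, simplified schema; real column names may differ.
con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("""CREATE TABLE location_options (
    location_option_id INTEGER PRIMARY KEY AUTOINCREMENT,
    location_id INT, item TEXT, value TEXT)""")
rows = [
    (1, "opening_hours", "old"),
    (1, "opening_hours", "newer"),
    (1, "opening_hours", "newest"),  # duplicates of one option
    (1, "delivery_areas", "only"),   # no duplicate -> must survive
]
cur.executemany(
    "INSERT INTO location_options (location_id, item, value) VALUES (?, ?, ?)",
    rows)

# Delete every row whose id is not the maximum of its (location_id, item)
# group, i.e. keep only the latest row per option.
cur.execute("""DELETE FROM location_options
               WHERE location_option_id NOT IN (
                   SELECT MAX(location_option_id)
                   FROM location_options
                   GROUP BY location_id, item)""")

print(sorted(r[0] for r in cur.execute("SELECT value FROM location_options")))
# ['newest', 'only']
```

On MySQL the same idea can be expressed as a single DELETE with a self-join (or the NOT IN subquery above) run before the unique index is added.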