This is a Craft problem, probably related to https://github.com/craftcms/cms/issues/11220
I had this "duplicate entry" error before too, when I was editing an entry while a queue job was saving an element for another plugin.
What Craft version are you running? Craft shipped a fix related to that issue in 3.7.43.
I'm using Craft 4.0.3. As you said, the error discussed in your linked issue was already fixed in 3.7.43.
Let me make sure I understand you correctly: Site Copy uses the queue to process all the copying. If I start the queue job that copies entry ID 123 and then, while Site Copy is still handling that copy job in the queue, I edit the very same entry, I trigger this error because the entry is essentially being saved twice at once? That would also explain another error we sometimes got, which said that the mutex lock for the queue failed.
My composer.json:
{
    "require": {
        "carlcs/craft-redactorcustomstyles": "4.0.3",
        "craftcms/cms": "4.3.10",
        "craftcms/feed-me": "5.0.5",
        "craftcms/redactor": "3.0.3",
        "goldinteractive/craft-sitecopy": "1.0.4",
        "nystudio107/craft-cookies": "4.0.0",
        "nystudio107/craft-seomatic": "4.0.20",
        "putyourlightson/craft-blitz": "4.4.0",
        "putyourlightson/craft-dashboard-begone": "2.0.0",
        "putyourlightson/craft-elements-panel": "2.0.0",
        "putyourlightson/craft-sprig": "2.5.1",
        "vaersaagod/geomate": "v2.1.0",
        "vaersaagod/matrixmate": "2.1.2",
        "vlucas/phpdotenv": "^3.4.0"
    },
    "autoload": {
        "psr-4": {
            "modules\\": "modules/"
        }
    },
    "config": {
        "sort-packages": true,
        "optimize-autoloader": true,
        "platform": {
            "php": "8.0.2"
        },
        "allow-plugins": {
            "craftcms/plugin-installer": true,
            "yiisoft/yii2-composer": true
        }
    },
    "scripts": {
        "post-root-package-install": [
            "@php -r \"file_exists('.env') || copy('.env.example', '.env');\""
        ]
    }
}
That's correct, yes. Under the hood, Site Copy calls Craft's standard saveElement() function for each site you select as a copy target. If you edit the entry at the same time, this can cause problems.
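To illustrate, the per-site copy boils down to something like the following (a simplified sketch, not the plugin's actual code; $entry and $targetSiteIds are placeholder variables):

use Craft;
use craft\elements\Entry;

// For each target site, load the entry in that site's context,
// copy over the configured values, and run a regular element save.
foreach ($targetSiteIds as $siteId) {
    $localized = Craft::$app->getElements()->getElementById($entry->id, Entry::class, $siteId);

    if ($localized === null) {
        continue;
    }

    // Copy whichever attributes/fields are configured to be synced...
    $localized->title = $entry->title;

    // ...then save, just like a normal save triggered from the control panel.
    Craft::$app->getElements()->saveElement($localized);
}

Because each of those saveElement() calls is an ordinary element save, a manual save of the same entry at the same moment can collide with the queue job.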
A mutex lock error is a bit different: it just means that the job wants to write something to the database while another process is already writing (and has locked it). This can happen when multiple queue jobs run at the same time. I have also seen it happen when multiple users were editing entries simultaneously.
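For illustration, this is roughly how such a lock behaves in Craft (a minimal sketch; the lock name and timeout are made up):

use Craft;

$mutex = Craft::$app->getMutex();
$lockName = 'example:save-entry-123'; // hypothetical lock name

// Wait up to 5 seconds for the lock. If another process (e.g. a queue job)
// still holds it, acquire() returns false and the caller has to give up or
// retry; that is what surfaces as a "mutex lock failed" style error.
if (!$mutex->acquire($lockName, 5)) {
    throw new \Exception("Could not acquire lock \"$lockName\".");
}

try {
    // ... perform the write that must not run concurrently ...
} finally {
    $mutex->release($lockName);
}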
Alrighty, thanks for your quick help and your explanations!
I have a Craft instance with 20+ sites for multilingual/multinational use. To distribute content across these sites we use Site Copy. Since we started using it, the queue frequently has failed jobs, the control panel gets blocked with an internal server error, and the queue backlog grows to as many as 800k jobs. We are 95% sure the problem is Site Copy, since the issues only started once we began using it, and most of the stack traces clearly point to Site Copy as the trigger.
Stack trace example: this hints at a duplicated entry ID and a duplicated slug. Is it possible that Site Copy somehow mixed up entry IDs? I can't find duplicated entry IDs or slugs in my control panel or my database.
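For reference, this is the kind of check I mean when I say I can't find duplicate slugs in the database (a sketch using Craft's query builder; table and column names follow Craft's default schema):

use craft\db\Query;

// Entry slugs are stored per site in the elements_sites table, so group by
// site and slug and keep only combinations that occur more than once.
$duplicateSlugs = (new Query())
    ->select(['siteId', 'slug', 'count' => 'COUNT(*)'])
    ->from('{{%elements_sites}}')
    ->where(['not', ['slug' => null]])
    ->groupBy(['siteId', 'slug'])
    ->having('COUNT(*) > 1')
    ->all();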