Closed · ymc-sise closed this issue 4 months ago
@ymc-sise I had the same issue once and found the same workaround.
If the unique key constraint raises an error, Directus-Sync should continue the process and retry on the next iteration. During the second iteration, we should no longer get the RECORD_NOT_UNIQUE error.
This is similar to the dependency issues between items, which are solved by the "retry" pattern.
I will add it to the roadmap.
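The retry pattern described above could look roughly like this (an illustrative TypeScript sketch; `pushWithRetry` and the error-matching logic are assumptions, not directus-sync's actual internals):

```typescript
// Hypothetical sketch of the "retry" pattern: items whose push fails with
// RECORD_NOT_UNIQUE are deferred to the next iteration, by which time the
// conflicting item has usually been updated or removed.
type Item = { id: string };

function pushWithRetry(
  items: Item[],
  pushOne: (item: Item) => void,
  maxIterations = 3,
): Item[] {
  let pending = items;
  for (let i = 0; i < maxIterations && pending.length > 0; i++) {
    const failed: Item[] = [];
    for (const item of pending) {
      try {
        pushOne(item);
      } catch (e) {
        // Defer only unique-constraint violations; rethrow anything else.
        if ((e as Error).message === 'RECORD_NOT_UNIQUE') failed.push(item);
        else throw e;
      }
    }
    pending = failed; // retry the leftovers on the next iteration
  }
  return pending; // items that still failed after all iterations
}
```

With this shape, a conflict that disappears once a sibling item has been processed resolves itself on the second pass.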
@ymc-sise I've just published a new version that fixes this issue. Check out the changelog: https://github.com/tractr/directus-sync/releases/tag/directus-sync%401.5.2
@EdouardDem Thx for the update. I tested it with 1.5.2 but unfortunately I still got the same error:
[18:01:52.789] DEBUG (143): [migration-client] Cache cleared
[18:01:52.789] INFO (143): ---- Push schema ----
[18:01:53.305] INFO (143): [snapshot] Changes applied
[18:01:53.310] INFO (143): ---- Clean up collections ----
[18:01:53.376] INFO (143): [dashboards] Deleted 0 dangling items
[18:01:54.263] INFO (143): [flows] Deleted 0 dangling items
[18:01:54.327] INFO (143): [folders] Deleted 0 dangling items
[18:01:57.516] INFO (143): [operations] Deleted 0 dangling items
[18:01:57.544] INFO (143): [panels] Deleted 0 dangling items
[18:01:59.659] INFO (143): [permissions] Deleted 0 dangling items
[18:02:00.030] INFO (143): [presets] Deleted 0 dangling items
[18:02:00.077] INFO (143): [roles] Deleted 0 dangling items
[18:02:00.112] INFO (143): [settings] Deleted 0 dangling items
[18:02:02.853] INFO (143): [translations] Deleted 0 dangling items
[18:02:02.865] INFO (143): [webhooks] Deleted 0 dangling items
[18:02:02.865] INFO (143): ---- Push: iteration 1 ----
[18:02:02.899] INFO (143): [dashboards] Created 0 items
[18:02:02.899] INFO (143): [dashboards] Updated 0 items
[18:02:02.900] INFO (143): [dashboards] Deleted 0 items
[18:02:03.458] INFO (143): [flows] Created 0 items
[18:02:03.458] INFO (143): [flows] Updated 0 items
[18:02:03.458] INFO (143): [flows] Deleted 0 items
[18:02:03.517] INFO (143): [folders] Created 0 items
[18:02:03.517] INFO (143): [folders] Updated 0 items
[18:02:03.517] INFO (143): [folders] Deleted 0 items
[18:02:06.253] ERROR (143):
errors: [
  {
    "message": "Value for field \"operations_resolve\" in collection \"directus\" has to be unique.",
    "extensions": {
      "code": "RECORD_NOT_UNIQUE",
      "collection": "directus",
      "field": "operations_resolve",
      "stack": "DirectusError: Value for field \"operations_resolve\" in collection \"directus\" has to be unique.\n at uniqueViolation (file:///directus/node_modules/.pnpm/@directus+api@18.2.1_@unhead+vue@1.8.10_pinia@2.1.7_vue@3.4.21/node_modules/@directus/api/dist/database/errors/dialects/mysql.js:62:16)\n at extractError (file:///directus/node_modules/.pnpm/@directus+api@18.2.1_@unhead+vue@1.8.10_pinia@2.1.7_vue@3.4.21/node_modules/@directus/api/dist/database/errors/dialects/mysql.js:15:20)\n at translateDatabaseError (file:///directus/node_modules/.pnpm/@directus+api@18.2.1_@unhead+vue@1.8.10_pinia@2.1.7_vue@3.4.21/node_modules/@directus/api/dist/database/errors/translate.js:22:28)\n at file:///directus/node_modules/.pnpm/@directus+api@18.2.1_@unhead+vue@1.8.10_pinia@2.1.7_vue@3.4.21/node_modules/@directus/api/dist/services/items.js:153:29\n at process.processTicksAndRejections (node:internal/process/task_queues:95:5)"
    }
  }
]
response: {}
I changed a flow by adding a run-script operation between two existing operations. In Directus I got the error (same screenshot as above). Then I saved the flow without attaching the run script, and afterwards added the run script again to get rid of the error in Directus.
After that, I pulled with directus-sync, reset my database to the previous state, and tried to push my sync.
@ymc-sise It should be resolved by now. Is there a dependency loop, or do two operations resolve to the same one in your operations.json file? Could you share the operations.json file? Maybe there is a use case I didn't see.
I attached the old operations.json and the new operations.json, with operations added in between other operations in one flow. Unfortunately, I get the same error as before.
@ymc-sise
My apologies, I had resolved the issue for operation updates, but not for operation creation. Upon inspecting your JSON files, it appears that the operations are being recreated, which is causing the bug. I will work on fixing this.
@ymc-sise I have published a new version: https://github.com/tractr/directus-sync/releases/tag/directus-sync%401.5.3
It should work now.
@EdouardDem Many thx! It works now, you saved me from recreating the flow manually.
If we exchange an operation with another operation in a flow and make a diff with
npx directus-sync pull
the JSON files look good. But if we try to push the diff to another environment, we get the error:
The same error we see in the frontend:
It's this known Directus issue: https://github.com/directus/directus/issues/14185
The workaround is to remove the flow and re-create it, instead of exchanging the operation.
But maybe you can implement a fix in directus-sync so that it does not trigger the error in Directus.
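One conceivable fix on the directus-sync side is a two-phase update: first detach every resolve link that is about to change, then write the new values, so no intermediate state has two operations pointing at the same target. Here is a minimal TypeScript simulation of the constraint and the workaround (the `OpStore` class and all names are invented for illustration; this is not directus-sync's or Directus' actual code):

```typescript
// Simulates the unique constraint on an operation's "resolve" link and shows
// how a two-phase update avoids tripping it.
type OpRow = { id: string; resolve: string | null };

class OpStore {
  private rows = new Map<string, OpRow>();

  // Insert or update a row, enforcing uniqueness of non-null "resolve" values,
  // like the database constraint behind the RECORD_NOT_UNIQUE error.
  upsert(row: OpRow): void {
    for (const other of this.rows.values()) {
      if (other.id !== row.id && other.resolve !== null && other.resolve === row.resolve) {
        throw new Error('RECORD_NOT_UNIQUE');
      }
    }
    this.rows.set(row.id, { ...row });
  }

  get(id: string): OpRow | undefined {
    return this.rows.get(id);
  }
}

// Re-link operations in two phases: detach every link that will change, then
// apply the new values, so no intermediate state violates the constraint.
function relink(store: OpStore, current: OpRow[], target: Map<string, string | null>): void {
  for (const op of current) {
    if (target.has(op.id)) store.upsert({ id: op.id, resolve: null });
  }
  for (const op of current) {
    if (target.has(op.id)) store.upsert({ id: op.id, resolve: target.get(op.id) ?? null });
  }
}
```

Swapping the resolve links of two operations directly fails at the first write, because one operation would momentarily duplicate the other's target; with the detach phase in between, both writes succeed.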
Anyway we love your syncing tool, it is a great help!