Closed fobin closed 3 years ago
I'm unfortunately not familiar with PostgreSQL well enough to debug this. Appreciating help with this one.
I'm not sure there is a way around this on our end. This is how Postgres works when a query error is encountered within a transaction, and the database cache works atomically by attempting to insert the record and, if that raises an exception, updating the record instead. This is similar to the approach Symfony takes.
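The insert-then-update flow described above looks roughly like this (a simplified sketch, not the exact code in Illuminate\Cache\DatabaseLock):

```php
use Illuminate\Database\QueryException;

public function acquire()
{
    try {
        // First attempt: insert a fresh lock row.
        $this->connection->table($this->table)->insert([
            'key'        => $this->name,
            'owner'      => $this->owner,
            'expiration' => $this->expiresAt(),
        ]);

        return true;
    } catch (QueryException $e) {
        // The insert failed, so a lock row already exists. Fall back to
        // stealing an expired lock or refreshing our own. On Postgres,
        // if this runs inside a transaction, the failed INSERT has
        // already aborted that transaction, so this UPDATE throws
        // SQLSTATE 25P02 ("current transaction is aborted").
        $updated = $this->connection->table($this->table)
            ->where('key', $this->name)
            ->where(function ($query) {
                $query->where('owner', $this->owner)
                      ->orWhere('expiration', '<=', $this->currentTime());
            })
            ->update([
                'owner'      => $this->owner,
                'expiration' => $this->expiresAt(),
            ]);

        return $updated >= 1;
    }
}
```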
So, my workaround suggestion is to define a separate connection in your Laravel configuration for the cache. Then, set your lock_connection in your cache configuration file:
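A sketch of that configuration (the connection name "cache" is an arbitrary choice here, and the pgsql settings should mirror your default connection):

```php
// config/database.php — a second connection that points at the same
// database but gets its own PDO instance, so lock queries run outside
// your application's transactions.
'connections' => [
    // ...
    'cache' => [
        'driver'   => 'pgsql',
        'host'     => env('DB_HOST', '127.0.0.1'),
        'port'     => env('DB_PORT', '5432'),
        'database' => env('DB_DATABASE', 'forge'),
        'username' => env('DB_USERNAME', 'forge'),
        'password' => env('DB_PASSWORD', ''),
    ],
],

// config/cache.php — point the database store's locks at it.
'stores' => [
    'database' => [
        'driver'          => 'database',
        'table'           => 'cache',
        'connection'      => null,
        'lock_connection' => 'cache',
    ],
],
```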
Description:
DatabaseLock.php raises unhandled QueryException inside catch.
SQLSTATE[25P02]: In failed sql transaction: 7 ERROR: current transaction is aborted, commands ignored until end of transaction block (SQL: update "cache_locks" set "owner" = CWrOKyyPnGGBZbfG, "expiration" = 1613560215 where "key" = laravel_cachelaravel_unique_job:App\Jobs\UniqueJob123 and ("owner" = CWrOKyyPnGGBZbfG or "expiration" <= 1613473815))
It seems that this can happen when a database lock is acquired within a transaction. I came across this because I have models that are saved with ->saveOrFail(), and after creation they dispatch the same job inside the observer's created method. That job uses ShouldBeUnique to ensure only one is dispatched in the end. The models are saved successfully, but the job dispatches fail after the first one is created.
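The failing scenario can be sketched like this (the model, observer, and job names here are hypothetical stand-ins):

```php
// App\Jobs\UniqueJob — marked unique, so dispatching it acquires a
// database cache lock.
class UniqueJob implements ShouldQueue, ShouldBeUnique
{
    use Dispatchable, InteractsWithQueue, Queueable;
}

// App\Observers\OrderObserver — dispatches the job whenever a model
// is created. Because saveOrFail() wraps the save in a transaction,
// this dispatch runs inside that transaction.
class OrderObserver
{
    public function created(Order $order): void
    {
        UniqueJob::dispatch();
    }
}

// Application code:
$first = new Order();
$first->saveOrFail();  // observer dispatches UniqueJob; lock row created

$second = new Order();
$second->saveOrFail(); // inside this transaction, the lock INSERT fails
                       // (the row already exists), aborting the
                       // transaction; the fallback UPDATE then raises
                       // SQLSTATE 25P02
```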
Steps To Reproduce:
This is the simplest reproduction I could come up with: a job implementing ShouldBeUnique, dispatched from a model observer inside a transaction. Try to run the test case:
I've also created project which reproduces this by running migrations and then running test TestJobs.php: https://github.com/fobin/laravel-unique-job