laravel / framework

The Laravel Framework.
https://laravel.com
MIT License

Database Lock fails on PostgreSQL if transaction in progress #36280

Closed fobin closed 3 years ago

fobin commented 3 years ago

Description:

DatabaseLock.php raises unhandled QueryException inside catch. SQLSTATE[25P02]: In failed sql transaction: 7 ERROR: current transaction is aborted, commands ignored until end of transaction block (SQL: update "cache_locks" set "owner" = CWrOKyyPnGGBZbfG, "expiration" = 1613560215 where "key" = laravel_cachelaravel_unique_job:App\Jobs\UniqueJob123 and ("owner" = CWrOKyyPnGGBZbfG or "expiration" <= 1613473815))

It seems this can happen when a database lock is acquired while a transaction is in progress. I ran into it because I have models that are saved with ->saveOrFail(), and after creation they dispatch the same job from the observer's created method. That job uses ShouldBeUnique to ensure only one instance ultimately runs.

The models are saved successfully, but the job dispatches fail after the first one is created.
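For context, here is a hedged sketch of the setup described above. The model and observer names are illustrative, not from the reporter's code; the key point is that saveOrFail() wraps the save in a transaction, so the observer's created hook dispatches the unique job while that transaction is still open.

```php
<?php

use App\Jobs\UniqueJob;
use App\Models\Order; // hypothetical model name

class OrderObserver
{
    // Fires while the transaction opened by saveOrFail() is still active,
    // so the unique-lock queries run inside that transaction.
    public function created(Order $order): void
    {
        UniqueJob::dispatch();
    }
}

// Elsewhere in application code:
// $order = new Order([/* ... */]);
// $order->saveOrFail(); // transaction in progress when created() fires
```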

Steps To Reproduce:

This is the simplest reproduction I could come up with.

  1. Setup Laravel with Postgresql
  2. Setup database cache
  3. Setup database cache_locks table
  4. Setup database queues
  5. In your phpunit configuration, set CACHE_DRIVER=database and QUEUE_CONNECTION=database
  6. Create UniqueJob which implements ShouldBeUnique
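Step 6 can be sketched roughly as follows (a minimal version, assuming the standard queued-job boilerplate; the handle body is omitted):

```php
<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldBeUnique;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;

// A queued job marked unique, so the framework takes a cache lock
// before allowing a second instance to be dispatched.
class UniqueJob implements ShouldQueue, ShouldBeUnique
{
    use Dispatchable, Queueable;

    public function handle(): void
    {
        // ...
    }
}
```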

Try to run test case:

    public function test_example()
    {
        $this->getConnection()->beginTransaction();
        // On this call the database lock uses an insert, and it works the first time
        UniqueJob::dispatch();
        // On this call the insert fails; it goes to the catch block, but the update fails with the error
        UniqueJob::dispatch();
    }

I've also created project which reproduces this by running migrations and then running test TestJobs.php: https://github.com/fobin/laravel-unique-job

driesvints commented 3 years ago

Unfortunately I'm not familiar enough with PostgreSQL to debug this. I'd appreciate help with this one.

taylorotwell commented 3 years ago

I'm not sure there is a way around this on our end. This is how Postgres works when a query error is encountered within a transaction: the transaction is aborted and all further commands are rejected until it ends. The database cache acquires locks atomically by attempting to insert the record and, if that throws an exception, updating the record instead. This is similar to the approach Symfony takes.
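The insert-then-update pattern described above can be sketched like this. It is a simplified illustration of the acquire logic in Illuminate\Cache\DatabaseLock, not the exact framework code:

```php
public function acquire()
{
    try {
        // First attempt: insert the lock row.
        $this->connection->table('cache_locks')->insert([
            'key' => $this->name,
            'owner' => $this->owner,
            'expiration' => $this->expiresAt(),
        ]);

        return true;
    } catch (QueryException $e) {
        // The insert failed (the row already exists), so fall back to an
        // update. On Postgres, if we are inside a now-aborted transaction,
        // this update is rejected with SQLSTATE 25P02 -- the error
        // reported in this issue.
        $updated = $this->connection->table('cache_locks')
            ->where('key', $this->name)
            ->where(function ($query) {
                $query->where('owner', $this->owner)
                      ->orWhere('expiration', '<=', time());
            })
            ->update([
                'owner' => $this->owner,
                'expiration' => $this->expiresAt(),
            ]);

        return $updated >= 1;
    }
}
```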

So, my suggested workaround is to define a separate connection in your Laravel database configuration for the cache.

[screenshot: a separate database connection defined in config/database.php]

Then, set your lock_connection in your cache configuration file:

[screenshot: the lock_connection option set in config/cache.php]
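Since the screenshots are not visible here, the workaround can be sketched as the following config fragments. The connection name 'cache' is illustrative; the point is that lock queries run on a connection separate from the one carrying your application's transaction:

```php
// config/database.php -- an extra connection pointing at the same database:
'connections' => [
    'pgsql' => [
        // ... your main connection ...
    ],

    'cache' => [
        'driver' => 'pgsql',
        // same credentials as 'pgsql'; being a distinct connection,
        // its queries are unaffected by the app's aborted transaction
    ],
],

// config/cache.php -- point the database store's locks at that connection:
'stores' => [
    'database' => [
        'driver' => 'database',
        'table' => 'cache',
        'connection' => 'cache',
        'lock_connection' => 'cache',
    ],
],
```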