procrastinate-org / procrastinate

PostgreSQL-based Task Queue for Python
https://procrastinate.readthedocs.io/
MIT License
834 stars 52 forks

Cannot execute create index concurrently in procrastinate task #1051

Closed paulzakin closed 3 months ago

paulzakin commented 3 months ago

Would you mind seeing if you can? I think if you can on the simplest possible set up then it’s clearly a problem on our end!

ewjoachim commented 3 months ago

In #1059, I add a task that creates an index concurrently in a Django setup.

from django.db import connection

@app.task
def create_index():
    c = connection.cursor()
    c.execute("CREATE INDEX CONCURRENTLY ON demo_book (indexed)")

I'm deferring it with:

$ procrastinate_demos/demo_django/manage.py procrastinate defer procrastinate_demos.demo_django.demo.tasks.create_index

Then launching a worker with:

$ procrastinate_demos/demo_django/manage.py procrastinate worker

And it seems to work:

2024-05-05 20:53:57,403 INFO    procrastinate.worker Starting job procrastinate_demos.demo_django.demo.tasks.create_index[1]() {'taskName': 'single_worker', 'action': 'start_job', 'worker': {'name': 'worker', 'id': 0, 'queues': None}, 'job': {'id': 1, 'status': 'doing', 'queue': 'default', 'lock': None, 'queueing_lock': None, 'task_name': 'procrastinate_demos.demo_django.demo.tasks.create_index', 'task_kwargs': {}, 'scheduled_at': None, 'attempts': 0, 'call_string': 'procrastinate_demos.demo_django.demo.tasks.create_index[1]()'}, 'start_timestamp': 1714942437.403614, 'duration': 8.821487426757812e-06}
2024-05-05 20:53:57,411 INFO    procrastinate.worker Job procrastinate_demos.demo_django.demo.tasks.create_index[1]() ended with status: Success, lasted 0.008 s {'taskName': 'single_worker', 'action': 'job_success', 'worker': {'name': 'worker', 'id': 0, 'queues': None}, 'job': {'id': 1, 'status': 'doing', 'queue': 'default', 'lock': None, 'queueing_lock': None, 'task_name': 'procrastinate_demos.demo_django.demo.tasks.create_index', 'task_kwargs': {}, 'scheduled_at': None, 'attempts': 0, 'call_string': 'procrastinate_demos.demo_django.demo.tasks.create_index[1]()'}, 'start_timestamp': 1714942437.403614, 'duration': 0.007569074630737305, 'end_timestamp': 1714942437.411183, 'result': None}

Index seems created:

procrastinate=# \d demo_book
demo_book             demo_book_id_seq      demo_book_indexed_idx demo_book_pkey
procrastinate=# \d demo_book
                                  Table "public.demo_book"
 Column  |          Type          | Collation | Nullable |             Default
---------+------------------------+-----------+----------+----------------------------------
 id      | bigint                 |           | not null | generated by default as identity
 title   | character varying(100) |           | not null |
 author  | character varying(100) |           | not null |
 indexed | boolean                |           | not null |
Indexes:
    "demo_book_pkey" PRIMARY KEY, btree (id)
    "demo_book_indexed_idx" btree (indexed)

From my point of view, it works :)

paulzakin commented 3 months ago

Ok! This is helpful - will need to explore - that is enough confirmation to know it is something on our end

paulzakin commented 3 months ago

https://www.pgcasts.com/episodes/the-skip-locked-feature-in-postgres-9-5

No luck on my end… Hmmmm… SELECT … FOR UPDATE SKIP LOCKED cannot be executed outside of a transaction (at least according to that article), but CREATE INDEX CONCURRENTLY can only be executed outside of a transaction block - they seem mutually exclusive, but I may be missing something.
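The tension described above can be sketched with a small predicate - illustrative only, the prefix list is not exhaustive and real drivers expose this differently:

```python
# Illustrative only: a few PostgreSQL statements that are rejected inside a
# transaction block. CREATE INDEX CONCURRENTLY is one of them, so it has to
# run with autocommit on, while SELECT ... FOR UPDATE SKIP LOCKED is only
# useful inside an open transaction.
NO_TRANSACTION_PREFIXES = (
    "CREATE INDEX CONCURRENTLY",
    "DROP INDEX CONCURRENTLY",
    "VACUUM",
)

def must_run_outside_transaction(sql: str) -> bool:
    """Rough prefix check: does this statement require autocommit?"""
    normalized = " ".join(sql.split()).upper()
    return normalized.startswith(NO_TRANSACTION_PREFIXES)
```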

I’m trying to create multiple indexes at the same time, maybe that is why? You are only creating one index - can you try creating two in the same statement?

It feels like it could be an auto-commit thing - I tried to turn it off in Django, but maybe Procrastinate is doing its own thing - the problem looks similar to the link below - I’ll give it a try tomorrow

https://github.com/golang-migrate/migrate/issues/284

paulzakin commented 3 months ago

Ok, that ended up being it - executing each statement separately rather than all at once solved the issue - closing it :)
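The fix can be sketched in plain Python (table, column, and index names below are hypothetical): each CREATE INDEX CONCURRENTLY goes out as its own execute() call, never as a semicolon-joined batch, because a multi-statement batch runs as one implicit transaction block and the CONCURRENTLY statements then fail.

```python
# Sketch of the fix: one CREATE INDEX CONCURRENTLY per execute() call.
# Table/column/index names are illustrative; real code should quote or
# validate identifiers rather than interpolating them directly.
def index_statements(table, columns):
    """Build one standalone CREATE INDEX CONCURRENTLY statement per column."""
    return [
        f"CREATE INDEX CONCURRENTLY {table}_{col}_idx ON {table} ({col})"
        for col in columns
    ]

def create_indexes(cursor, table, columns):
    # Joining these with ";" into a single execute() would wrap them in an
    # implicit transaction block and make PostgreSQL reject them.
    for statement in index_statements(table, columns):
        cursor.execute(statement)
```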