getsentry / sentry-python

The official Python SDK for Sentry.io
https://sentry.io/for/python/

arq integration: transaction name can be wrong for concurrent jobs #2608

Open andreasdri opened 11 months ago

andreasdri commented 11 months ago

How do you use Sentry?

Sentry Saas (sentry.io)

Version

1.39.0

Steps to Reproduce

We use the Sentry arq integration. When running many concurrent jobs at scale, we have noticed that the transaction name and the extra arq metadata can be wrong, i.e. mixed up between jobs. I have written a test case in test_arq.py that fails and reproduces the problem:

import asyncio
from datetime import timedelta

import pytest

# capture_events and init_arq are fixtures from the existing arq test suite.


@pytest.mark.asyncio
async def test_job_concurrency(capture_events, init_arq):
    """
    10 - division starts
    70 - sleepy starts
    110 - division raises error
    120 - sleepy finishes

    """

    async def sleepy(_):
        await asyncio.sleep(0.05)

    async def division(_):
        await asyncio.sleep(0.1)
        return 1 / 0

    sleepy.__qualname__ = sleepy.__name__
    division.__qualname__ = division.__name__

    pool, worker = init_arq([sleepy, division])

    events = capture_events()

    await pool.enqueue_job(
        "division", _job_id="123", _defer_by=timedelta(milliseconds=10)
    )
    await pool.enqueue_job(
        "sleepy", _job_id="456", _defer_by=timedelta(milliseconds=70)
    )

    loop = asyncio.get_running_loop()
    task = loop.create_task(worker.async_run())
    await asyncio.sleep(1)

    task.cancel()

    await worker.close()

    exception_event = events[1]
    assert exception_event["exception"]["values"][0]["type"] == "ZeroDivisionError"
    assert exception_event["transaction"] == "division" # fails, transaction is sleepy
    assert exception_event["extra"]["arq-job"]["task"] == "division" # fails, task is sleepy

Expected Result

The correct transaction name and arq extra metadata are always attached to the correct events and never mixed up between concurrent jobs.

Actual Result

It is not guaranteed that the correct transaction name and arq extra metadata are attached to an event when jobs run concurrently; in the test above, the ZeroDivisionError raised by division is reported with the transaction name sleepy and with sleepy in the arq-job extra.
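
For illustration, here is a minimal sketch of the suspected mechanism; this is not the arq integration code, and the job names and sleep times are made up. When two concurrent asyncio tasks share one Hub they also share one Scope, so data set by one job can end up on events from the other, whereas cloning the Hub per task isolates the scope. Only public 1.x SDK API (Hub, configure_scope, set_transaction_name, set_extra) is used:

import asyncio

import sentry_sdk
from sentry_sdk import Hub


async def job_shared_scope(name):
    # Both concurrent tasks write to the same scope: the last writer wins,
    # so an error raised by "division" can be reported under "sleepy".
    with sentry_sdk.configure_scope() as scope:
        scope.set_transaction_name(name)
        scope.set_extra("arq-job", {"task": name})
    await asyncio.sleep(0.1)
    # ... run the job, capture any exception ...


async def job_isolated_scope(name):
    # Cloning the current Hub gives this task its own scope, so concurrent
    # jobs cannot overwrite each other's transaction name or extras.
    with Hub(Hub.current):
        with sentry_sdk.configure_scope() as scope:
            scope.set_transaction_name(name)
            scope.set_extra("arq-job", {"task": name})
        await asyncio.sleep(0.1)
        # ... run the job, capture any exception ...


async def main():
    sentry_sdk.init(dsn="")  # empty DSN: nothing is actually sent
    # With job_shared_scope the two tasks race on one scope; swapping in
    # job_isolated_scope keeps each task's data separate.
    await asyncio.gather(job_shared_scope("division"), job_shared_scope("sleepy"))


if __name__ == "__main__":
    asyncio.run(main())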

antonpirker commented 11 months ago

Hey @andreasdri, thanks for reporting this and creating a test case! We will look into this.