Hi.
Each sync iteration is performed under a [RedisLock](https://redis-py.readthedocs.io/en/v4.1.2/lock.html) (a separate lock for each synced ClickhouseModel). This is done in order to reduce database load.

The lock has 2 parameters limiting its maximum blocking time:

- `timeout`: the maximum lifetime of the lock; Redis expires it automatically after this many seconds
- `blocking_timeout`: the maximum time `acquire()` will spend waiting for a lock held by someone else
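To make these two parameters concrete, here is a minimal sketch using redis-py directly (the lock name and the values are illustrative, not what the library uses internally):

```python
import redis
from redis.exceptions import LockError

client = redis.Redis()

# `timeout` caps how long the lock may be held before Redis expires it;
# `blocking_timeout` caps how long acquire() will wait for a busy lock.
lock = client.lock("clickhouse_sync.my_model", timeout=20, blocking_timeout=1)

if lock.acquire():
    try:
        pass  # the sync work for one ClickhouseModel would happen here
    finally:
        try:
            lock.release()
        except LockError:
            # If the sync took longer than `timeout`, the lock has already
            # expired, so release() fails with exactly this kind of error.
            pass
```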
The error you have provided tells us that the sync process takes more time than the timeout is set to. The timeout can be set in several ways:

- the `ClickHouseModel.sync_lock_timeout` attribute
- `sync_delay * 10` (the default); `sync_delay` can also be set as `ClickHouseModel.sync_delay` or via the global `config.SYNC_DELAY` (see the sketch below)
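For example, a per-model override could look roughly like this. This is a sketch modeled on a typical django-clickhouse model definition; the Django model, fields and engine are placeholders, and the settings name `CLICKHOUSE_SYNC_DELAY` assumes the usual `CLICKHOUSE_` prefix for global config values:

```python
from django_clickhouse.clickhouse_models import ClickHouseModel
from django_clickhouse.engines import MergeTree
from infi.clickhouse_orm import fields

from my_app.models import MyModel  # hypothetical Django model


class ClickHouseMyModel(ClickHouseModel):
    django_model = MyModel

    # Per-model overrides discussed above.
    sync_delay = 30           # seconds; overrides config.SYNC_DELAY for this model
    sync_lock_timeout = 600   # seconds; defaults to sync_delay * 10

    id = fields.UInt64Field()
    created = fields.DateField()

    engine = MergeTree('created', ('id',))


# Alternatively, set the delay globally in settings.py, e.g.:
# CLICKHOUSE_SYNC_DELAY = 30
```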
In order to solve the error, you can:

- enable the `DEBUG` logging level and try digging into the logs (a minimal logging sketch is at the end of this comment)
- increase `sync_lock_timeout` or `sync_delay` for the model you have problems with

> As a result, we're having some data loss issues.
I don't think it is connected. This error cannot cause data loss. It looks like some other error (not present in the stack trace) is causing it, but you can't see it because it causes the timeout.
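And if you go the `DEBUG` logging route, a minimal Django `LOGGING` sketch would be something like this (the logger name is an assumption; adjust it to whatever logger the library actually writes to):

```python
# settings.py
LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'handlers': {
        'console': {'class': 'logging.StreamHandler'},
    },
    'loggers': {
        # Assumed logger name; check the library source for the exact one.
        'django-clickhouse': {
            'handlers': ['console'],
            'level': 'DEBUG',
        },
    },
}
```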
Hi,
Thank you for the quick reply and for the helpful information.
Yes, our problem is that every Monday we perform a large partial update on one table/model in PostgreSQL, which then needs to be synced into ClickHouse, and this creates a lot of operations in Redis. We will try increasing `sync_lock_timeout` and monitor what happens.
For additional logging, we use Datadog, which has its own wrapper around statsd. I'm not sure if the current logging implementation will work with it out of the box, but it looks simple enough, so maybe I'll try to make it work with Datadog, although it may require some minor tweaks.
I'll keep you posted if this solves the problem next Monday.
Thanks again for your help!
This error occurs sporadically in production, and I'm not sure if it is simply an exception that needs to be handled properly. It occurs in the Celery task that syncs PG and CH, and in particular when Redis attempts to release the lock.
What is the best way to deal with this error? As a result, we're having some data loss issues.
Here is the complete stacktrace.