apache / hudi

Upserts, Deletes And Incremental Processing on Big Data.
https://hudi.apache.org/
Apache License 2.0

[SUPPORT]Unable to acquire lock #9015

Open c-f-cooper opened 1 year ago

c-f-cooper commented 1 year ago

Describe the problem you faced

When I write to a Hudi table in COW + insert mode with multiple writers, using the FileSystemBasedLockProvider, the following error occurs: org.apache.hudi.exception.HoodieLockException: Unable to acquire lock, lock object hdfs://hdfs-k8s/log/rtr/.hoodie/lock.
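
For context, a minimal sketch (Spark/Scala) of the kind of multi-writer setup being described; only the base path comes from the error message, while the table name, schema, and key/precombine fields are illustrative assumptions, not taken from the actual job:

```scala
// Minimal sketch, NOT the reporter's actual job: a Spark writer doing inserts
// into a COW table with optimistic concurrency control and the
// FileSystemBasedLockProvider. Schema and field names are placeholders.
import org.apache.spark.sql.{SaveMode, SparkSession}

val spark = SparkSession.builder().appName("hudi-multi-writer-sketch").getOrCreate()
import spark.implicits._

// toy data standing in for the real log records
val df = Seq((1, "a", 1000L), (2, "b", 1001L)).toDF("id", "data", "ts")

df.write.format("hudi")
  .option("hoodie.table.name", "rtr")                              // placeholder name
  .option("hoodie.datasource.write.table.type", "COPY_ON_WRITE")
  .option("hoodie.datasource.write.operation", "insert")
  .option("hoodie.datasource.write.recordkey.field", "id")         // placeholder field
  .option("hoodie.datasource.write.precombine.field", "ts")        // placeholder field
  // multi-writer (OCC) settings
  .option("hoodie.write.concurrency.mode", "optimistic_concurrency_control")
  .option("hoodie.cleaner.policy.failed.writes", "LAZY")
  .option("hoodie.write.lock.provider",
    "org.apache.hudi.client.transaction.lock.FileSystemBasedLockProvider")
  .mode(SaveMode.Append)
  .save("hdfs://hdfs-k8s/log/rtr")
```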

Environment Description

Running on Docker? (yes/no) : yes

Stacktrace

(Screenshot of the stacktrace attached as an image.)

danny0405 commented 1 year ago

Can you paste more stack trace details here?

c-f-cooper commented 1 year ago

Can you paste more stack trace details here?

(Screenshot with fuller stacktrace details attached as an image.)

danny0405 commented 1 year ago

Can you check whether there is already a lock file there? Maybe you can remove the file manually.
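
A sketch of how that check could be done with the Hadoop FileSystem API; the path is the one from the error message, and you should only delete the lock file if you are certain no writer is still holding it:

```scala
// Check for (and, if definitely stale, remove) the lock file left by
// FileSystemBasedLockProvider under the table's .hoodie directory.
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}

val lockPath = new Path("hdfs://hdfs-k8s/log/rtr/.hoodie/lock")
val fs = FileSystem.get(lockPath.toUri, new Configuration())

if (fs.exists(lockPath)) {
  val status = fs.getFileStatus(lockPath)
  println(s"Found lock file $lockPath, last modified at ${status.getModificationTime}")
  // remove the (presumably stale) lock so writers can acquire it again
  fs.delete(lockPath, false)
}
```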

c-f-cooper commented 1 year ago

Can you check whether there is already a lock file there? Maybe you can remove the file manually.

Yeah, there really is a lock file in the .hoodie directory. When I deleted it, the job ran correctly! But the error reappeared later, and I have three jobs writing to the table.

danny0405 commented 1 year ago

There are some parameters you can tune on the filesystem lock; it seems there is too much contention on lock acquisition.
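
For anyone landing here, these are the kinds of lock parameters being referred to, sketched as extra writer options. The exact key names and defaults depend on the Hudi version, so check the configuration reference for your release; the values below are only illustrative starting points:

```scala
// Lock-tuning options to add alongside the OCC settings on each writer.
val lockTuning = Map(
  // retry longer before failing with "Unable to acquire lock"
  "hoodie.write.lock.num_retries" -> "30",
  "hoodie.write.lock.wait_time_ms_between_retry" -> "5000",
  "hoodie.write.lock.client.num_retries" -> "30",
  "hoodie.write.lock.client.wait_time_ms_between_retry" -> "5000",
  // let the filesystem lock provider treat an old lock file as expired
  // (minutes, supported in recent releases)
  "hoodie.write.lock.filesystem.expire" -> "10"
)

// applied on each of the three writers, together with the OCC options, e.g.:
// df.write.format("hudi").options(lockTuning). ... .save("hdfs://hdfs-k8s/log/rtr")
```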