Open c-f-cooper opened 1 year ago
Can you paste more stack trace details here?
Can you check whether there is already a lock file there? If so, you may be able to remove the file manually.
Yeah, there is indeed a lock file in the .hoodie directory. When I delete it, the job runs correctly, but the error reappears later. I have three jobs writing to the table.
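As an ops sketch (assuming a stale lock left behind by a failed writer, and using the lock path from the error message), the manual check-and-remove step looks roughly like this; only delete the lock when you are sure no writer is currently active:

```sh
# Check whether a leftover lock file exists at the path from the error
hdfs dfs -ls hdfs://hdfs-k8s/log/rtr/.hoodie/lock

# If no writer is active, remove the stale lock manually
hdfs dfs -rm hdfs://hdfs-k8s/log/rtr/.hoodie/lock
```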
There are some parameters for the filesystem lock; it seems there is too much contention on lock acquisition.
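A minimal tuning sketch, assuming the standard `hoodie.write.lock.*` retry options: with three concurrent writers, raising the retry counts and wait times lets a writer back off and retry instead of failing fast when another job holds the lock. The values below are hypothetical starting points, not recommendations:

```
# Hypothetical values; tune for your workload
hoodie.write.lock.wait_time_ms=60000
hoodie.write.lock.num_retries=30
hoodie.write.lock.client.wait_time_ms=10000
hoodie.write.lock.client.num_retries=30
```

There is also a filesystem-lock expiration setting in recent Hudi releases that can clear stale locks automatically; check the lock configuration reference for the exact key and supported versions.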
Describe the problem you faced
When I write to a Hudi table in COW + insert mode with multiple writers, using the FileSystemBasedLockProvider, the following error occurs:
org.apache.hudi.exception.HoodieLockException: Unable to acquire lock, lock object hdfs://hdfs-k8s/log/rtr/.hoodie/lock
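For context, a multi-writer setup with this lock provider is typically configured along these lines (a sketch based on Hudi's write-concurrency options; adjust to your jobs):

```
hoodie.write.concurrency.mode=optimistic_concurrency_control
hoodie.cleaner.policy.failed.writes=LAZY
hoodie.write.lock.provider=org.apache.hudi.client.transaction.lock.FileSystemBasedLockProvider
```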
Environment Description
Hudi version : 0.13.0
Hive version : 3.1.2
Hadoop version : 3.3.2
Storage (HDFS/S3/GCS..) : hdfs on k8s
Running on Docker? (yes/no) : yes
Additional context
Stacktrace