Closed richman1000000 closed 1 year ago
I have an NFS server as the data backend, i.e. the underlying storage for SDFS. I've tried different settings but the issue still persists. These are the mount options I use:
10.19.10.1:/media/bacdedup /bacdedup nfs sync,nfsvers=3,local_lock=all,rw,rsize=32768,wsize=32768 0 0
And this is the error in the log files:
2020-06-03 09:53:16,481 [ERROR] [sdfs] [org.opendedup.sdfs.filestore.HashBlobArchive] [2170] [Thread-2] - unable to compact 7808310324588043628
java.nio.channels.ClosedChannelException
    at java.base/sun.nio.ch.FileChannelImpl.ensureOpen(FileChannelImpl.java:150)
    at java.base/sun.nio.ch.FileChannelImpl.size(FileChannelImpl.java:373)
    at org.opendedup.collections.SimpleByteArrayLongMap.next(SimpleByteArrayLongMap.java:142)
    at org.opendedup.sdfs.filestore.HashBlobArchive.compact(HashBlobArchive.java:2085)
    at org.opendedup.sdfs.filestore.HashBlobArchive.compactArchive(HashBlobArchive.java:973)
    at org.opendedup.sdfs.filestore.BatchFileChunkStore.run(BatchFileChunkStore.java:436)
    at java.base/java.lang.Thread.run(Thread.java:834)
2020-06-03 09:53:16,481 [INFO] [sdfs] [org.opendedup.sdfs.filestore.BatchFileChunkStore] [433] [Thread-2] - updating 4718667098453258399 sz=51
2020-06-03 09:53:16,482 [ERROR] [sdfs] [org.opendedup.sdfs.filestore.HashBlobArchive] [2170] [Thread-2] - unable to compact 4718667098453258399
java.nio.channels.ClosedChannelException
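For context, a `ClosedChannelException` is thrown by any `FileChannel` operation (such as the `size()` call in the trace above) once the channel has been closed, whether by the application itself, by thread interruption, or after an I/O failure on the underlying file. This is a minimal, self-contained sketch of that mechanism; it uses a local temp file rather than SDFS or NFS, so it only demonstrates the JDK behavior, not the NFS-specific failure:

```java
import java.nio.channels.ClosedChannelException;
import java.nio.channels.FileChannel;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

public class ClosedChannelDemo {
    // Opens a channel, closes it, then calls size() the way
    // SimpleByteArrayLongMap.next() does in the stack trace above.
    static String probe() throws Exception {
        Path tmp = Files.createTempFile("demo", ".bin");
        try {
            FileChannel ch = FileChannel.open(tmp, StandardOpenOption.READ);
            ch.close(); // channel invalidated before use
            try {
                ch.size(); // ensureOpen() fails here
                return "no exception";
            } catch (ClosedChannelException e) {
                return "ClosedChannelException";
            }
        } finally {
            Files.deleteIfExists(tmp);
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(probe());
    }
}
```

The point is that the exception reports the *state of the channel*, not the root cause: something closed (or invalidated) the handle before the compaction thread used it, which is consistent with the problem appearing only on the NFS mount.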
I searched the internet and found this article: https://lucene.apache.org/core/3_3_0/api/core/org/apache/lucene/store/NIOFSDirectory.html
Again, I've tested it on XFS directly and on an NFS-mounted folder. The issue exists on NFS only.
sooo... did the project die today?