Closed richman1000000 closed 5 years ago
Seeing the same; tested it on CentOS and Debian. Any chance you have either an RPM or Deb package for the version you are using that works?
Of course. I always keep a copy of working software and previous versions somewhere. https://yadi.sk/d/Z8Wm7j6y3anvUz
@opendedup, please close this issue; it is fixed in newer versions.
After upgrading from 3.7.6.0 to 3.7.7.4, I periodically get this error. The NFS service also gets stuck when the error occurs. Downgrading resolved it. I've upgraded and downgraded twice; the error happens only in the new version.
vdedup1tb2-volume-cfg.xml.zip
2018-08-22 11:47:29,977 [WARN] [sdfs] [org.opendedup.collections.RocksDBMap] [810] [pool-5-thread-6] - unable to get key [6523017e27bd3dcc1631924a9ca340ce] [5374436344340394516]
java.io.IOException: java.util.concurrent.ExecutionException: java.io.IOException: java.io.IOException: not able to fetch hashmap for 5374436344340394516
	at org.opendedup.sdfs.filestore.HashBlobArchive.getChunk(HashBlobArchive.java:1770)
	at org.opendedup.sdfs.filestore.HashBlobArchive.getBlock(HashBlobArchive.java:946)
	at org.opendedup.sdfs.filestore.BatchFileChunkStore.getChunk(BatchFileChunkStore.java:120)
	at org.opendedup.sdfs.filestore.ChunkData.getChunk(ChunkData.java:201)
	at org.opendedup.collections.RocksDBMap.getData(RocksDBMap.java:806)
	at org.opendedup.sdfs.filestore.HashStore.getHashChunk(HashStore.java:232)
	at org.opendedup.sdfs.servers.HashChunkService.fetchChunk(HashChunkService.java:144)
	at org.opendedup.sdfs.servers.HCServiceProxy.fetchChunk(HCServiceProxy.java:332)
	at org.opendedup.sdfs.io.WritableCacheBuffer$Shard.run(WritableCacheBuffer.java:1238)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)