What steps will reproduce the problem?
1. Copy more than 160 GB of .mov and picture files to the dedup disk.
2. The copy speed becomes very slow after about the fifth hour.
3. In the sixth or seventh hour, an out-of-memory exception occurs.
My laptop: Intel Core i5 CPU, 4 GB RAM, 500 GB HDD.
Microsoft Windows [Version 6.1.7601]
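For context on sizing: SDFS keeps a hash-table entry in memory for every unique chunk it stores (the FCByteArrayLongMap seen in the traces below), so heap demand grows with the amount of unique data on the volume. A rough back-of-envelope sketch, assuming a 4 KB chunk size and ~33 bytes per entry (both are illustrative guesses, not values taken from the SDFS source):

    // Rough heap estimate for the in-memory chunk-hash table.
    // Chunk size and per-entry cost are assumptions for illustration only.
    public class HeapEstimate {
        public static void main(String[] args) {
            long volumeBytes = 200L << 30; // --volume-capacity=200GB
            long chunkBytes  = 4L << 10;   // assumed 4 KB chunk size
            long entryBytes  = 33;         // assumed: 16-byte hash + 8-byte offset + overhead
            long entries = volumeBytes / chunkBytes; // ~52 million if the volume fills with unique data
            System.out.printf("entries=%,d  table=%,d MB%n",
                    entries, entries * entryBytes >> 20);
            // ~52M * 33 B is roughly 1.6 GB, which the default heap of a
            // 32-bit JVM on a 4 GB machine cannot hold.
        }
    }

If that arithmetic is in the right ballpark, 160 GB of mostly-unique video data would push the hash table past what the default heap can hold, which would match the slowdown around hour five.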
A:\>cd sdfs
A:\sdfs>mksdfs --volume-name=sdfs_vol1 --volume-capacity=200GB
Attempting to create SDFS volume ...
A:\sdfs>mountsdfs -v sdfs_vol1 -m z
Running SDFS Version 1.2.3
reading config file = C:\Program Files (x86)\sdfs\etc\sdfs_vol1-volume-cfg.xml
Loading Hashtable Entries
Loading Hashes |))))))))))))))))))))))))))))))))))))))))))))))))))| 100%
Loaded entries 1295
Running Consistancy Check on DSE, this may take a while
Scanning DSE |)))))))))))))))))))))))))))))))))))))))))))))))] | 94%
log4j:WARN No appenders could be found for logger (org.quartz.core.SchedulerSignalerImpl).
log4j:WARN Please initialize the log4j system properly.
-f
z
version = 600
driverVersion = 400
Exception in thread "Thread-32" java.lang.OutOfMemoryError: GC overhead limit
exceeded
at java.nio.ByteBuffer.wrap(ByteBuffer.java:369)
at java.nio.ByteBuffer.wrap(ByteBuffer.java:392)
at org.opendedup.collections.FCByteArrayLongMap.indexRehashed(FCByteArrayLongMap.java:396)
at org.opendedup.collections.FCByteArrayLongMap.index(FCByteArrayLongMap.java:367)
at org.opendedup.collections.FCByteArrayLongMap.containsKey(FCByteArrayL
ongMap.java:227)
at org.opendedup.collections.FileBasedCSMap.containsKey(FileBasedCSMap.j
ava:223)
at org.opendedup.sdfs.filestore.HashStore.addHashChunk(HashStore.java:24
2)
at org.opendedup.sdfs.servers.HashChunkService.writeChunk(HashChunkServi
ce.java:105)
at org.opendedup.sdfs.servers.HCServiceProxy.writeChunk(HCServiceProxy.j
ava:152)
at org.opendedup.sdfs.io.SparseDedupFile.writeCache(SparseDedupFile.java
:370)
at org.opendedup.sdfs.io.WritableCacheBuffer.close(WritableCacheBuffer.j
ava:348)
at org.opendedup.util.PoolThread.run(PoolThread.java:32)
Exception in thread "Thread-27" java.lang.OutOfMemoryError: GC overhead limit ex
ceeded
at java.nio.ByteBuffer.wrap(ByteBuffer.java:369)
at java.nio.ByteBuffer.wrap(ByteBuffer.java:392)
at org.opendedup.collections.FCByteArrayLongMap.indexRehashed(FCByteArra
yLongMap.java:396)
at org.opendedup.collections.FCByteArrayLongMap.index(FCByteArrayLongMap
.java:367)
at org.opendedup.collections.FCByteArrayLongMap.containsKey(FCByteArrayL
ongMap.java:227)
at org.opendedup.collections.FileBasedCSMap.containsKey(FileBasedCSMap.j
ava:223)
at org.opendedup.sdfs.filestore.HashStore.addHashChunk(HashStore.java:24
2)
at org.opendedup.sdfs.servers.HashChunkService.writeChunk(HashChunkServi
ce.java:105)
at org.opendedup.sdfs.servers.HCServiceProxy.writeChunk(HCServiceProxy.j
ava:152)
at org.opendedup.sdfs.io.SparseDedupFile.writeCache(SparseDedupFile.java
:370)
at org.opendedup.sdfs.io.WritableCacheBuffer.close(WritableCacheBuffer.j
ava:348)
at org.opendedup.util.PoolThread.run(PoolThread.java:32)
Exception in thread "Thread-36" java.lang.OutOfMemoryError: GC overhead limit ex
ceeded
Exception in thread "Thread-10" java.lang.OutOfMemoryError: GC overhead limit ex
ceeded
Exception in thread "Thread-35" java.lang.OutOfMemoryError: GC overhead limit ex
ceeded
Exception in thread "Thread-0" java.lang.OutOfMemoryError: GC overhead limit exc
eeded
Exception in thread "Thread-34" java.lang.OutOfMemoryError: GC overhead limit ex
ceeded
Exception in thread "Thread-31" java.lang.OutOfMemoryError: GC overhead limit ex
ceeded
Exception in thread "FileManager-0" java.lang.OutOfMemoryError: GC overhead limi
t exceeded
Exception in thread "Thread-25" java.lang.OutOfMemoryError: GC overhead limit ex
ceeded
Exception in thread "Thread-37" java.lang.OutOfMemoryError: GC overhead limit ex
ceeded
Exception in thread "ActionDistributor-1" Exception in thread "Thread-33" java.l
ang.OutOfMemoryError: GC overhead limit exceeded
java.lang.OutOfMemoryError: GC overhead limit exceeded
Exception in thread "QuartzScheduler_QuartzSchedulerThread" java.lang.OutOfMemor
yError: GC overhead limit exceeded
Exception in thread "Thread-28" java.lang.OutOfMemoryError: GC overhead limit ex
ceeded
Exception in thread "Thread-29" java.lang.OutOfMemoryError: GC overhead limit ex
ceeded
Exception in thread "Thread-24" Exception in thread "Thread-30" java.lang.OutOfM
emoryError: GC overhead limit exceeded
java.lang.OutOfMemoryError: GC overhead limit exceeded
Exception in thread "ActionDistributor-0" java.lang.OutOfMemoryError: GC overhea
d limit exceeded
Exception in thread "ActionDistributor-2" java.lang.OutOfMemoryError: GC overhea
d limit exceeded
Java HotSpot(TM) Server VM warning: Exception java.lang.OutOfMemoryError occurre
d dispatching signal UNKNOWN to handler- the VM may need to be forcibly terminat
ed
Please Wait while shutting down SDFS
Data Can be lost if this is interrupted
Exception in thread "Thread-11" java.lang.OutOfMemoryError: GC overhead limit ex
ceeded
Terminate batch job (Y/N)? n
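Every full trace above bottoms out in FCByteArrayLongMap.indexRehashed, i.e. the in-memory chunk-hash map was growing (rehashing) when the heap ran out. "GC overhead limit exceeded" is the HotSpot failure mode where the collector runs almost continuously but reclaims almost nothing. A minimal, SDFS-independent sketch that reproduces the same failure mode on a small heap:

    import java.util.HashMap;
    import java.util.Map;

    // Run with e.g. "java -Xmx64m GcOverheadDemo". All map entries stay
    // reachable, so the heap stays nearly full and each GC pass reclaims
    // almost nothing; HotSpot then throws "GC overhead limit exceeded"
    // (some JVM configurations report plain "Java heap space" instead).
    public class GcOverheadDemo {
        public static void main(String[] args) {
            Map<Long, byte[]> live = new HashMap<>();
            for (long i = 0; ; i++) {
                // An ever-growing live set, like a hash table that only adds entries.
                live.put(i, new byte[1024]);
            }
        }
    }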
A:\sdfs>mountsdfs -v sdfs_vol1 -m z
Running SDFS Version 1.2.3
reading config file = C:\Program Files (x86)\sdfs\etc\sdfs_vol1-volume-cfg.xml
Loading Hashtable Entries
Loading Hashes |))))))))))))))))))))))))))))))))))))))))))))))))))| 100%
Loaded entries 1295
Running Consistancy Check on DSE, this may take a while
Scanning DSE |))))))))))))))))))))))))))))))))))))))))))))))))] | 96%
log4j:WARN No appenders could be found for logger (org.quartz.core.SchedulerSignalerImpl).
log4j:WARN Please initialize the log4j system properly.
-f
z
version = 600
driverVersion = 400
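Note the second mount stalls in the same DSE consistency check. Assuming mountsdfs is a wrapper that launches a JVM (the log4j and HotSpot output suggest so), a likely mitigation is raising the maximum heap with the standard -Xmx option wherever the launcher sets its JVM arguments. A tiny check of the ceiling the process actually gets (the class name here is hypothetical):

    // Prints the maximum heap this JVM may use; the default heap of a
    // 32-bit HotSpot VM is often only a few hundred MB.
    public class MaxHeapCheck {
        public static void main(String[] args) {
            System.out.printf("max heap = %d MB%n",
                    Runtime.getRuntime().maxMemory() / (1024 * 1024));
        }
    }

Since this install is 32-bit ("Program Files (x86)"), even -Xmx can only go to roughly 1.5 GB on Windows, so a 64-bit JVM or a smaller volume capacity may also be needed.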
Original issue reported on code.google.com by WayneChe...@gmail.com on 12 Apr 2013 at 12:58