The M1 test fails with an OutOfMemoryError (OOM) if the number of keys is set high.
It happens during distributeTask, while data are being merged: there are too many updates in flight. This is a duplicate of #30.
Possible solutions:
- apply back pressure (slow down updates)
- increase memory
- reduce memory overhead (replace Pair(Key, Value) with a single byte[])
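A minimal sketch of the back-pressure idea, assuming a bounded-permit scheme (the names BoundedUpdates, submitUpdate, and updateMerged are illustrative, not IODB's actual API): writers block once a fixed number of updates are waiting for the background distribution task, instead of letting unmerged updates accumulate on the heap.

```java
import java.util.concurrent.Semaphore;

// Hypothetical back-pressure sketch: bound the number of updates queued for
// the background distribution task with a Semaphore, so writer threads block
// instead of piling unmerged updates onto the heap until OOM.
public class BoundedUpdates {
    private final Semaphore pending;

    BoundedUpdates(int maxPending) {
        this.pending = new Semaphore(maxPending);
    }

    // Called by the writer thread: blocks once maxPending updates are in flight.
    void submitUpdate(Runnable update) throws InterruptedException {
        pending.acquire(); // back pressure: wait until a slot frees up
        // ... enqueue `update` for the background distribution task ...
    }

    // Called by the background task after one update has been merged.
    void updateMerged() {
        pending.release(); // free the slot, unblocking a waiting writer
    }

    int freeSlots() {
        return pending.availablePermits();
    }
}
```

This trades write throughput for a hard cap on queued updates, which would keep heap usage bounded regardless of the key count.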
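A sketch of the third option, packing each key and value into one byte[] with a length prefix instead of keeping a Pair of two separate arrays (the PackedEntry class is an illustration, not existing IODB code). With ~10M entries, dropping one array object header and the pair wrapper per entry saves a meaningful fraction of the heap:

```java
import java.nio.ByteBuffer;
import java.util.Arrays;

// Hypothetical sketch: encode key + value as [keyLen:int][key][value] in a
// single byte[], so each entry is one heap object instead of three
// (key array, value array, pair wrapper).
public class PackedEntry {

    // Pack key and value into one array with a 4-byte key-length prefix.
    static byte[] pack(byte[] key, byte[] value) {
        ByteBuffer buf = ByteBuffer.allocate(4 + key.length + value.length);
        buf.putInt(key.length);
        buf.put(key);
        buf.put(value);
        return buf.array();
    }

    // Recover the key from a packed entry.
    static byte[] key(byte[] packed) {
        int keyLen = ByteBuffer.wrap(packed).getInt();
        return Arrays.copyOfRange(packed, 4, 4 + keyLen);
    }

    // Recover the value from a packed entry.
    static byte[] value(byte[] packed) {
        int keyLen = ByteBuffer.wrap(packed).getInt();
        return Arrays.copyOfRange(packed, 4 + keyLen, packed.length);
    }

    public static void main(String[] args) {
        byte[] packed = pack(new byte[]{1, 2}, new byte[]{3, 4, 5});
        System.out.println(Arrays.toString(key(packed)));   // [1, 2]
        System.out.println(Arrays.toString(value(packed))); // [3, 4, 5]
    }
}
```

A further refinement would be to compare keys directly inside the packed array during the merge, avoiding the copy in key() entirely.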
[info] KeyCount: 10000000
[info] Shard count: 16
[info] Dir: /tmp/iodb0.8201001663280878
[info] Store populated, dir size: 1.0100049 GB
[info] rollback
[info] rollback
[info] rollback
[info] rollback
[info] rollback
[info] rollback
[info] rollback
[error] java.util.concurrent.ExecutionException: Background distribution task failed
[error] at io.iohk.iodb.ShardedStore.$anonfun$new$4(ShardedStore.scala:81)
[error] at io.iohk.iodb.Store.$anonfun$runnable$1(Store.scala:184)
[error] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
[error] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
[error] at java.lang.Thread.run(Thread.java:745)
[error] Caused by: java.lang.OutOfMemoryError: Java heap space
[error] at java.nio.ByteBuffer.wrap(ByteBuffer.java:373)
[error] at java.nio.ByteBuffer.wrap(ByteBuffer.java:396)
[error] at io.iohk.iodb.LogStore.fileReadData(LogStore.scala:1121)
[error] at io.iohk.iodb.LogStore$ret$2$.next(LogStore.scala:1184)
[error] at io.iohk.iodb.LogStore$ret$2$.next(LogStore.scala:1174)
[error] at scala.collection.Iterator$$anon$10.next(Iterator.scala:448)
[error] at scala.collection.convert.Wrappers$IteratorWrapper.next(Wrappers.scala:28)
[error] at com.google.common.collect.Iterators$PeekingImpl.peek(Iterators.java:1194)
[error] at com.google.common.collect.Iterators$MergingIterator$1.compare(Iterators.java:1304)
[error] at com.google.common.collect.Iterators$MergingIterator$1.compare(Iterators.java:1301)
[error] at java.util.PriorityQueue.siftUpUsingComparator(PriorityQueue.java:669)
[error] at java.util.PriorityQueue.siftUp(PriorityQueue.java:645)
[error] at java.util.PriorityQueue.offer(PriorityQueue.java:344)
[error] at java.util.PriorityQueue.add(PriorityQueue.java:321)
[error] at com.google.common.collect.Iterators$MergingIterator.next(Iterators.java:1327)
[error] at scala.collection.convert.Wrappers$JIteratorWrapper.next(Wrappers.scala:40)
[error] at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:501)
[error] at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:500)
[error] at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:447)
[error] at io.iohk.iodb.ShardedStore.flushShard$1(ShardedStore.scala:229)
[error] at io.iohk.iodb.ShardedStore.taskDistribute(ShardedStore.scala:251)
[error] at io.iohk.iodb.ShardedStore.$anonfun$new$4(ShardedStore.scala:76)
[error] at io.iohk.iodb.ShardedStore$$Lambda$17/1810132623.apply$mcV$sp(Unknown Source)
[error] at io.iohk.iodb.Store.$anonfun$runnable$1(Store.scala:184)
[error] at io.iohk.iodb.Store$$Lambda$16/760563749.run(Unknown Source)
[error] ... 3 more
[error] Nov 08, 2017 12:44:40 AM io.iohk.iodb.Store $anonfun$runnable$1
[error] SEVERE: Background task failed
[error] java.lang.OutOfMemoryError: Java heap space
[error] at java.util.concurrent.ConcurrentHashMap.putVal(ConcurrentHashMap.java:1043)
[error] at java.util.concurrent.ConcurrentHashMap.putIfAbsent(ConcurrentHashMap.java:1535)
[error] at java.lang.ClassLoader.getClassLoadingLock(ClassLoader.java:463)
[error] at java.lang.ClassLoader.loadClass(ClassLoader.java:404)
[error] at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
[error] at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
[error] at io.iohk.iodb.ShardedStore.$anonfun$new$2(ShardedStore.scala:65)
[error] at io.iohk.iodb.ShardedStore$$Lambda$15/66233253.apply$mcV$sp(Unknown Source)
[error] at io.iohk.iodb.Store.$anonfun$runnable$1(Store.scala:184)
[error] at io.iohk.iodb.Store$$Lambda$16/760563749.run(Unknown Source)
[error] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
[error] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
[error] at java.lang.Thread.run(Thread.java:745)
[error]
[error] java.util.concurrent.ExecutionException: Background distribution task failed
[error] at io.iohk.iodb.ShardedStore.$anonfun$new$4(ShardedStore.scala:81)
[error] at io.iohk.iodb.Store.$anonfun$runnable$1(Store.scala:184)
[error] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
[error] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
[error] at java.lang.Thread.run(Thread.java:745)
[error] Caused by: java.lang.OutOfMemoryError: GC overhead limit exceeded
[error] at io.iohk.iodb.LogStore.$anonfun$loadKeyValues$5(LogStore.scala:786)
[error] at io.iohk.iodb.LogStore$$Lambda$68/2008966511.apply(Unknown Source)
[error] at scala.collection.Iterator$$anon$10.next(Iterator.scala:448)
[error] at io.iohk.iodb.ShardedStore.flushShard$1(ShardedStore.scala:229)
[error] at io.iohk.iodb.ShardedStore.taskDistribute(ShardedStore.scala:251)
[error] at io.iohk.iodb.ShardedStore.$anonfun$new$4(ShardedStore.scala:76)
[error] at io.iohk.iodb.ShardedStore$$Lambda$17/1810132623.apply$mcV$sp(Unknown Source)
[error] at io.iohk.iodb.Store.$anonfun$runnable$1(Store.scala:184)
[error] at io.iohk.iodb.Store$$Lambda$16/760563749.run(Unknown Source)
[error] ... 3 more
[error] Exception in thread "main" java.lang.OutOfMemoryError: GC overhead limit exceeded
[error] java.util.concurrent.ExecutionException: Background distribution task failed
[error] at io.iohk.iodb.ShardedStore.$anonfun$new$4(ShardedStore.scala:81)
[error] at io.iohk.iodb.Store.$anonfun$runnable$1(Store.scala:184)
[error] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
[error] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
[error] at java.lang.Thread.run(Thread.java:745)
[error] Caused by: java.lang.OutOfMemoryError: GC overhead limit exceeded