isrc-cas / tarsier-oerv

Project management for porting openEuler to RISC-V
Apache License 2.0

Testing the Renaissance Suite on the oe rv64 system #242

Closed BigBrotherJu closed 2 years ago

BigBrotherJu commented 2 years ago

The Renaissance Suite's website is https://renaissance.dev/. The latest version is v0.13.0, which can be downloaded at https://renaissance.dev/download. Because of the licenses of some benchmarks, v0.13.0 comes in two variants: renaissance-gpl-0.13.0.jar and renaissance-mit-0.13.0.jar. I downloaded both and had a look: renaissance-gpl-0.13.0.jar contains all of the benchmarks in renaissance-mit-0.13.0.jar plus a few more, so we will run renaissance-gpl-0.13.0.jar in what follows.

All of the benchmarks in renaissance-gpl-0.13.0.jar are listed below:

[juhan@fedora renaissance]$ java -jar renaissance-gpl-0.13.0.jar --list
scrabble
    Solves the Scrabble puzzle using JDK Streams.
    Minimum JVM version required: 1.8
    Default repetitions: 50

page-rank
    Runs a number of PageRank iterations, using RDDs.
    Minimum JVM version required: 1.8
    Default repetitions: 20

future-genetic
    Runs a genetic algorithm using the Jenetics library and futures.
    Minimum JVM version required: 1.8
    Default repetitions: 50

akka-uct
    Runs the Unbalanced Cobwebbed Tree actor workload in Akka.
    Minimum JVM version required: 1.8
    Default repetitions: 24

movie-lens
    Recommends movies using the ALS algorithm.
    Minimum JVM version required: 1.8
    Default repetitions: 20

scala-doku
    Solves Sudoku Puzzles using Scala collections.
    Minimum JVM version required: 1.8
    Default repetitions: 20

chi-square
    Runs the chi-square test from Spark MLlib.
    Minimum JVM version required: 1.8
    Default repetitions: 60

fj-kmeans
    Runs the k-means algorithm using the fork/join framework.
    Minimum JVM version required: 1.8
    Default repetitions: 30

rx-scrabble
    Solves the Scrabble puzzle using the Rx streams.
    Minimum JVM version required: 1.8
    Default repetitions: 80

db-shootout
    Executes a shootout test using several in-memory databases.
    Minimum JVM version required: 1.8
    Maximum JVM version supported: 11
    Default repetitions: 16

neo4j-analytics
    Executes Neo4J graph queries against a movie database.
    Minimum JVM version required: 11
    Maximum JVM version supported: 15
    Default repetitions: 20

finagle-http
    Sends many small Finagle HTTP requests to a Finagle HTTP server and awaits
    response.
    Minimum JVM version required: 1.8
    Default repetitions: 12

reactors
    Runs benchmarks inspired by the Savina microbenchmark workloads in a
    sequence on Reactors.IO.
    Minimum JVM version required: 1.8
    Default repetitions: 10

dec-tree
    Runs the Random Forest algorithm from the Spark ML library.
    Minimum JVM version required: 1.8
    Default repetitions: 40

scala-stm-bench7
    Runs the stmbench7 benchmark using ScalaSTM.
    Minimum JVM version required: 1.8
    Default repetitions: 60

naive-bayes
    Runs the multinomial Naive Bayes algorithm from the Spark ML library.
    Minimum JVM version required: 1.8
    Default repetitions: 30

als
    Runs the ALS algorithm from the Spark ML library.
    Minimum JVM version required: 1.8
    Default repetitions: 30

par-mnemonics
    Solves the phone mnemonics problem using parallel JDK streams.
    Minimum JVM version required: 1.8
    Default repetitions: 16

scala-kmeans
    Runs the K-Means algorithm using Scala collections.
    Minimum JVM version required: 1.8
    Default repetitions: 50

philosophers
    Solves a variant of the dining philosophers problem using ScalaSTM.
    Minimum JVM version required: 1.8
    Default repetitions: 30

log-regression
    Runs the Logistic Regression algorithm from the Spark ML library.
    Minimum JVM version required: 1.8
    Default repetitions: 20

gauss-mix
    Computes a Gaussian mixture model using expectation-maximization.
    Minimum JVM version required: 1.8
    Default repetitions: 40

mnemonics
    Solves the phone mnemonics problem using JDK streams.
    Minimum JVM version required: 1.8
    Default repetitions: 16

dotty
    Runs the Dotty compiler on a set of source code files.
    Minimum JVM version required: 1.8
    Default repetitions: 50

finagle-chirper
    Simulates a microblogging service using Twitter Finagle.
    Minimum JVM version required: 1.8
    Default repetitions: 90

The options accepted by renaissance-gpl-0.13.0.jar are:

[juhan@fedora renaissance]$ java -jar renaissance-gpl-0.13.0.jar
Renaissance Benchmark Suite, version 0.13.0
Usage: renaissance [options] [benchmark-specification]

  -h, --help               Prints this usage text.
  -r, --repetitions <count>
                           Execute the measured operation a fixed number of times.
  -t, --run-seconds <seconds>
                           Execute the measured operation for fixed time (wall-clock).
  --operation-run-seconds <seconds>
                           Execute the measured operation for fixed accumulated operation time (wall-clock).
  --policy <class-path>!<class-name>
                           Use policy plugin to control repetition of measured operation execution.
  --plugin <class-path>[!<class-name>]
                           Load external plugin. Can appear multiple times.
  --with-arg <value>       Adds an argument to the plugin or policy specified last. Can appear multiple times.
  --csv <csv-file>         Output results as CSV to <csv-file>.
  --json <json-file>       Output results as JSON to <json-file>.
  -c, --configuration <conf-name>
                           Use benchmark parameters from configuration <conf-name>.
  -o, --override <name>=<value>
                           Override the value of a configuration parameter <name> to <value>.
  --scratch-base <dir>     Create scratch directories in <dir>. Defaults to current directory.
  --keep-scratch           Keep the scratch directories after VM exit. Defaults to deleting scratch directories.
  --no-forced-gc           Do not force garbage collection before each measured operation. Defaults to forced GC.
  --no-jvm-check           Do not check benchmark JVM version requirements (for execution or raw-list).
  --list                   Print the names and descriptions of all benchmarks.
  --raw-list               Print the names of benchmarks compatible with this JVM (one per line).
  --group-list             Print the names of all benchmark groups (one per line).
  benchmark-specification  List of benchmarks (or groups) to execute (or 'all').

BigBrotherJu commented 2 years ago

When I run all benchmarks on amd64 Fedora with $ java -jar renaissance-gpl-0.13.0.jar all, the run never survives the session: over ssh, the connection is always dropped before all benchmarks finish; in a terminal emulator in the Fedora desktop, the emulator always closes itself before the run completes. There are 25 benchmarks in total, so I will now try splitting them into 3 runs of 10, 10, and 5 to see whether that works better.
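The 3-way split can also be generated by a short script instead of typed out by hand. A minimal sketch (the benchmark names are taken from the --list output above; the chunk size, CSV file names, and jar location are my own assumptions):

```python
# Split the 25 benchmark names into runs of at most 10 and print one
# java command line per run, so the chunks can be launched separately
# (e.g. under nohup, so a dropped ssh session does not kill the run).

BENCHMARKS = [
    "scrabble", "page-rank", "future-genetic", "akka-uct", "movie-lens",
    "scala-doku", "chi-square", "fj-kmeans", "rx-scrabble", "db-shootout",
    "neo4j-analytics", "finagle-http", "reactors", "dec-tree",
    "scala-stm-bench7", "naive-bayes", "als", "par-mnemonics",
    "scala-kmeans", "philosophers", "log-regression", "gauss-mix",
    "mnemonics", "dotty", "finagle-chirper",
]

def chunks(items, size):
    """Yield successive slices of at most `size` items."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

for n, group in enumerate(chunks(BENCHMARKS, 10), start=1):
    print("java -jar renaissance-gpl-0.13.0.jar"
          f" --csv csvout{n} -r 15 " + " ".join(group))
```

This prints three command lines covering all 25 benchmarks in the order reported by --list.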

BigBrotherJu commented 2 years ago

Next, the formal tests. We first run all of the Renaissance Suite benchmarks on Fedora 35 on amd64. Fedora 35 runs as a virtual machine on a 15-inch 2018 MacBook Pro and is allocated 4 GB of memory and 2 CPUs. The host machine's CPU is an i7-8750H (6 cores, 12 threads), with 16 GB of memory in total. The JDK on Fedora 35 is:

[juhan@fedora dacapo]$ java -version
openjdk version "11.0.13" 2021-10-19
OpenJDK Runtime Environment 18.9 (build 11.0.13+8)
OpenJDK 64-Bit Server VM 18.9 (build 11.0.13+8, mixed mode, sharing)

First, run the first 10 benchmarks. --csv csvout writes the results as CSV to the file csvout. -r 15 runs each benchmark only 15 times, because some benchmarks default to more than 50 repetitions, which takes too long. Only the output of the first benchmark is shown below; the rest is omitted.

[juhan@fedora renaissance]$ java -jar renaissance-gpl-0.13.0.jar --csv csvout -r 15 scrabble page-rank future-genetic akka-uct movie-lens scala-doku chi-square fj-kmeans rx-scrabble db-shootout
====== scrabble (functional) [default], iteration 0 started ======
GC before operation: completed in 150.477 ms, heap usage 128.898 MB -> 75.314 MB.
====== scrabble (functional) [default], iteration 0 completed (1934.943 ms) ======
====== scrabble (functional) [default], iteration 1 started ======
GC before operation: completed in 140.689 ms, heap usage 257.031 MB -> 75.491 MB.
====== scrabble (functional) [default], iteration 1 completed (1375.120 ms) ======
====== scrabble (functional) [default], iteration 2 started ======
GC before operation: completed in 128.243 ms, heap usage 202.854 MB -> 75.490 MB.
====== scrabble (functional) [default], iteration 2 completed (1375.969 ms) ======
====== scrabble (functional) [default], iteration 3 started ======
GC before operation: completed in 129.031 ms, heap usage 341.193 MB -> 75.490 MB.
====== scrabble (functional) [default], iteration 3 completed (1384.710 ms) ======
====== scrabble (functional) [default], iteration 4 started ======
GC before operation: completed in 127.120 ms, heap usage 191.368 MB -> 75.490 MB.
====== scrabble (functional) [default], iteration 4 completed (1432.037 ms) ======
====== scrabble (functional) [default], iteration 5 started ======
GC before operation: completed in 142.299 ms, heap usage 216.798 MB -> 75.490 MB.
====== scrabble (functional) [default], iteration 5 completed (1483.038 ms) ======
====== scrabble (functional) [default], iteration 6 started ======
GC before operation: completed in 135.799 ms, heap usage 299.695 MB -> 75.490 MB.
====== scrabble (functional) [default], iteration 6 completed (1420.099 ms) ======
====== scrabble (functional) [default], iteration 7 started ======
GC before operation: completed in 142.691 ms, heap usage 273.227 MB -> 75.491 MB.
====== scrabble (functional) [default], iteration 7 completed (1545.375 ms) ======
====== scrabble (functional) [default], iteration 8 started ======
GC before operation: completed in 140.436 ms, heap usage 275.636 MB -> 75.490 MB.
====== scrabble (functional) [default], iteration 8 completed (1416.463 ms) ======
====== scrabble (functional) [default], iteration 9 started ======
GC before operation: completed in 128.984 ms, heap usage 215.942 MB -> 75.490 MB.
====== scrabble (functional) [default], iteration 9 completed (1464.793 ms) ======
====== scrabble (functional) [default], iteration 10 started ======
GC before operation: completed in 149.485 ms, heap usage 251.865 MB -> 75.491 MB.
====== scrabble (functional) [default], iteration 10 completed (1419.488 ms) ======
====== scrabble (functional) [default], iteration 11 started ======
GC before operation: completed in 131.305 ms, heap usage 206.317 MB -> 75.491 MB.
====== scrabble (functional) [default], iteration 11 completed (1437.161 ms) ======
====== scrabble (functional) [default], iteration 12 started ======
GC before operation: completed in 144.270 ms, heap usage 132.602 MB -> 75.491 MB.
====== scrabble (functional) [default], iteration 12 completed (1659.846 ms) ======
====== scrabble (functional) [default], iteration 13 started ======
GC before operation: completed in 138.325 ms, heap usage 283.092 MB -> 75.491 MB.
====== scrabble (functional) [default], iteration 13 completed (1614.472 ms) ======
====== scrabble (functional) [default], iteration 14 started ======
GC before operation: completed in 136.292 ms, heap usage 268.395 MB -> 75.491 MB.
====== scrabble (functional) [default], iteration 14 completed (1362.901 ms) ======
# remaining output omitted

After the run finishes, we can inspect csvout. The duration_ns column below is the time a single iteration of a benchmark took, in nanoseconds. What uptime_ns means is not entirely clear; judging by the values, it appears to be the elapsed time since JVM start at which each iteration finished, and it does not seem to be used. vm_start_unix_ms is a Unix timestamp marking when the JVM started. We only need duration_ns.

benchmark,duration_ns,uptime_ns,vm_start_unix_ms
db-shootout,11539406821,1411464926630,1643109453573
db-shootout,12082063592,1423184531567,1643109453573
db-shootout,11428769601,1435470659369,1643109453573
db-shootout,8982290897,1447394962794,1643109453573
db-shootout,9973173108,1456625463821,1643109453573
db-shootout,9633865856,1466755882769,1643109453573
db-shootout,10623167037,1476587189786,1643109453573
db-shootout,9808136589,1487448896864,1643109453573
db-shootout,10014376672,1497499789852,1643109453573
db-shootout,9122991618,1507800631498,1643109453573
db-shootout,9109890581,1517114952547,1643109453573
db-shootout,10035837344,1526392823788,1643109453573
db-shootout,10115576846,1536633195855,1643109453573
db-shootout,10145907218,1546965392627,1643109453573
db-shootout,12573883772,1557287832996,1643109453573
akka-uct,29089386800,528127040387,1643109453573
akka-uct,23443073046,557573592959,1643109453573
akka-uct,23239949040,581411458061,1643109453573
akka-uct,22268010608,605044002252,1643109453573
akka-uct,22850961958,627690744688,1643109453573
akka-uct,22888217554,650888009890,1643109453573
akka-uct,23217387423,674171464885,1643109453573
akka-uct,22924318860,697765647045,1643109453573
akka-uct,22465484782,721054626452,1643109453573
akka-uct,22296642756,743901048464,1643109453573
akka-uct,21633666303,766542331746,1643109453573
akka-uct,20696939254,788551095290,1643109453573
akka-uct,21649967541,809587087986,1643109453573
akka-uct,22509159441,831619977877,1643109453573
akka-uct,21301197064,854503615803,1643109453573
movie-lens,31757096983,883065243171,1643109453573
movie-lens,16120359003,914966227930,1643109453573
movie-lens,16451005191,931217648995,1643109453573
movie-lens,15904784361,947793059677,1643109453573
movie-lens,14420805795,963839822286,1643109453573
movie-lens,13624830718,978390585483,1643109453573
movie-lens,14894908798,992147341097,1643109453573
movie-lens,13477238049,1007171652174,1643109453573
movie-lens,12724233251,1020776264917,1643109453573
movie-lens,13777661148,1033632515419,1643109453573
movie-lens,13682075097,1047557666585,1643109453573
movie-lens,12478395136,1061372796072,1643109453573
movie-lens,13760480512,1073986244423,1643109453573
movie-lens,14396586216,1087873592867,1643109453573
movie-lens,13355777500,1102405488635,1643109453573
page-rank,23383749090,33496777393,1643109453573
page-rank,14338216790,66865217643,1643109453573
page-rank,15428468358,87114410755,1643109453573
page-rank,15296051594,108146671833,1643109453573
page-rank,15485475454,129423181563,1643109453573
page-rank,15319068924,150926152667,1643109453573
page-rank,13302469430,171926787974,1643109453573
page-rank,15416199591,190792158162,1643109453573
page-rank,14812523976,212859064680,1643109453573
page-rank,15153891038,234299522257,1643109453573
page-rank,14220974912,255270083058,1643109453573
page-rank,15324006610,275179501124,1643109453573
page-rank,14638640971,296631452390,1643109453573
page-rank,13647424372,316808378958,1643109453573
page-rank,13525756265,335999083485,1643109453573
rx-scrabble,1100501733,1394484869771,1643109453573
rx-scrabble,712901348,1395765203592,1643109453573
rx-scrabble,444751017,1396648705145,1643109453573
rx-scrabble,397161106,1397253349025,1643109453573
rx-scrabble,335031589,1397812776001,1643109453573
rx-scrabble,381168882,1398304040559,1643109453573
rx-scrabble,349848723,1398856608001,1643109453573
rx-scrabble,304131197,1399362513258,1643109453573
rx-scrabble,318412006,1399819708009,1643109453573
rx-scrabble,297967216,1400295210337,1643109453573
rx-scrabble,308536435,1400747554347,1643109453573
rx-scrabble,296616410,1401217922407,1643109453573
rx-scrabble,297502218,1401671726326,1643109453573
rx-scrabble,292644947,1402136676787,1643109453573
rx-scrabble,299662911,1402592653596,1643109453573
scala-doku,5330888993,1116156691284,1643109453573
scala-doku,3587572863,1121618577788,1643109453573
scala-doku,3445005351,1125318804122,1643109453573
scala-doku,3536416670,1128884477258,1643109453573
scala-doku,3504108927,1132555973574,1643109453573
scala-doku,3401225255,1136179243621,1643109453573
scala-doku,3448284609,1139695070707,1643109453573
scala-doku,3431392795,1143259809181,1643109453573
scala-doku,3498149604,1146809816981,1643109453573
scala-doku,3414789281,1150424728780,1643109453573
scala-doku,3360107613,1153951740615,1643109453573
scala-doku,3360016034,1157428648937,1643109453573
scala-doku,3451357994,1160924232886,1643109453573
scala-doku,3382904327,1164497120733,1643109453573
scala-doku,3381428604,1167999928579,1643109453573
scrabble,1934943046,1599984438,1643109453573
scrabble,1375120214,3682941503,1643109453573
scrabble,1375969210,5188526736,1643109453573
scrabble,1384709880,6695609445,1643109453573
scrabble,1432037079,8208701863,1643109453573
scrabble,1483037884,9784568179,1643109453573
scrabble,1420098789,11406691752,1643109453573
scrabble,1545375175,12971708885,1643109453573
scrabble,1416462914,14658869645,1643109453573
scrabble,1464793261,16205598172,1643109453573
scrabble,1419487827,17821600193,1643109453573
scrabble,1437161399,19373808124,1643109453573
scrabble,1659846285,20957515312,1643109453573
scrabble,1614471945,22757413154,1643109453573
scrabble,1362900980,24509479169,1643109453573
chi-square,5712098042,1175410250367,1643109453573
chi-square,1923224980,1181437248448,1643109453573
chi-square,1409149712,1183625745930,1643109453573
chi-square,1427724358,1185308225443,1643109453573
chi-square,1328689625,1187019685863,1643109453573
chi-square,1237759207,1188609285785,1643109453573
chi-square,1240328869,1190119202771,1643109453573
chi-square,1140778592,1191616428342,1643109453573
chi-square,1170260782,1193028262006,1643109453573
chi-square,1034384355,1194458835815,1643109453573
chi-square,1176304921,1195758271528,1643109453573
chi-square,956588332,1197200274834,1643109453573
chi-square,939811694,1198419400497,1643109453573
chi-square,953981998,1199611197144,1643109453573
chi-square,733832078,1200821966894,1643109453573
future-genetic,16372811626,355480395518,1643109453573
future-genetic,11065712370,371928361443,1643109453573
future-genetic,11276151405,383069133698,1643109453573
future-genetic,11346445607,394415008958,1643109453573
future-genetic,10813141328,405837369428,1643109453573
future-genetic,11274090308,416721461856,1643109453573
future-genetic,11047555131,428064199745,1643109453573
future-genetic,10963599895,439184645439,1643109453573
future-genetic,11166771225,450218414620,1643109453573
future-genetic,11097153087,461450698885,1643109453573
future-genetic,10918803891,472621731572,1643109453573
future-genetic,10967146259,483606404914,1643109453573
future-genetic,10983139407,494642841641,1643109453573
future-genetic,10936157599,505704808338,1643109453573
future-genetic,10969561230,516717807811,1643109453573
fj-kmeans,12632380219,1202021650395,1643109453573
fj-kmeans,12227137631,1214899924700,1643109453573
fj-kmeans,12436134032,1227341421468,1643109453573
fj-kmeans,12621293368,1239991164099,1643109453573
fj-kmeans,12869042755,1252832178077,1643109453573
fj-kmeans,12754551624,1265923138409,1643109453573
fj-kmeans,13148985543,1278890369170,1643109453573
fj-kmeans,13209157880,1292251901566,1643109453573
fj-kmeans,11709560285,1305685898008,1643109453573
fj-kmeans,12167230852,1317602254771,1643109453573
fj-kmeans,13164836591,1329974110142,1643109453573
fj-kmeans,12495914367,1343351041150,1643109453573
fj-kmeans,12330638174,1356065860383,1643109453573
fj-kmeans,12347785610,1368605609609,1643109453573
fj-kmeans,12459292963,1381163702801,1643109453573
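Since only duration_ns matters, the per-benchmark mean can be extracted from a CSV like the one above with a few lines of Python (a sketch; reading the whole file into a string and the report format are my own choices):

```python
import csv
import io
from collections import defaultdict

def mean_durations(csv_text):
    """Return {benchmark: mean duration in seconds} from Renaissance CSV text."""
    totals = defaultdict(lambda: [0, 0])  # benchmark -> [sum_ns, count]
    for row in csv.DictReader(io.StringIO(csv_text)):
        acc = totals[row["benchmark"]]
        acc[0] += int(row["duration_ns"])
        acc[1] += 1
    return {b: s / n / 1e9 for b, (s, n) in totals.items()}

# Usage against the file written by --csv:
#   report = mean_durations(open("csvout").read())
#   for bench in sorted(report):
#       print(f"{bench:20s} {report[bench]:8.3f} s")
```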

Next, the following 10 benchmarks. Of these, neo4j-analytics failed, apparently because it ran out of heap space (raising the maximum heap with -Xmx might help):

[juhan@fedora renaissance]$ java -jar renaissance-gpl-0.13.0.jar --csv csvout2 -r 15 neo4j-analytics finagle-http reactors dec-tree scala-stm-bench7 naive-bayes als par-mnemonics scala-kmeans philosophers
Creating graph database...
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.commons.lang3.reflect.FieldUtils (file:/home/juhan/local/java-test/renaissance/harness-194849-16821980275150874665/neo4j/lib/commons-lang3-3.11.jar) to field java.io.FileDescriptor.fd
WARNING: Please consider reporting this to the maintainers of org.apache.commons.lang3.reflect.FieldUtils
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Populating database...
Benchmark 'neo4j-analytics' failed with exception:
java.lang.OutOfMemoryError: Java heap space
    at org.neo4j.internal.recordstorage.TransactionRecordState.extractCommands(TransactionRecordState.java:189)
    at org.neo4j.internal.recordstorage.RecordStorageEngine.createCommands(RecordStorageEngine.java:341)
    at org.neo4j.kernel.impl.api.KernelTransactionImplementation.commitTransaction(KernelTransactionImplementation.java:730)
    at org.neo4j.kernel.impl.api.KernelTransactionImplementation.closeTransaction(KernelTransactionImplementation.java:637)
    at org.neo4j.kernel.impl.api.KernelTransactionImplementation.commit(KernelTransactionImplementation.java:604)
    at org.neo4j.kernel.impl.coreapi.TransactionImpl$$Lambda$1002/0x0000000840822040.perform(Unknown Source)
    at org.neo4j.kernel.impl.coreapi.TransactionImpl.safeTerminalOperation(TransactionImpl.java:555)
    at org.neo4j.kernel.impl.coreapi.TransactionImpl.commit(TransactionImpl.java:138)
    at org.renaissance.neo4j.analytics.AnalyticsBenchmark.populateVertices(AnalyticsBenchmark.scala:93)
    at org.renaissance.neo4j.analytics.AnalyticsBenchmark.populateDatabase(AnalyticsBenchmark.scala:31)
    at org.renaissance.neo4j.Neo4jAnalytics.setUpBeforeAll(Neo4jAnalytics.scala:63)
    at org.renaissance.harness.ExecutionDriver.executeBenchmark(ExecutionDriver.java:82)
    at org.renaissance.harness.RenaissanceSuite$.$anonfun$runBenchmarks$1(RenaissanceSuite.scala:140)
    at org.renaissance.harness.RenaissanceSuite$.$anonfun$runBenchmarks$1$adapted(RenaissanceSuite.scala:136)
    at org.renaissance.harness.RenaissanceSuite$$$Lambda$165/0x0000000840182040.apply(Unknown Source)
    at scala.collection.immutable.List.foreach(List.scala:333)
    at org.renaissance.harness.RenaissanceSuite$.runBenchmarks(RenaissanceSuite.scala:136)
    at org.renaissance.harness.RenaissanceSuite$.main(RenaissanceSuite.scala:117)
    at org.renaissance.harness.RenaissanceSuite.main(RenaissanceSuite.scala)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.base/java.lang.reflect.Method.invoke(Method.java:566)
    at org.renaissance.core.Launcher.loadAndInvokeHarnessClass(Launcher.java:114)
    at org.renaissance.core.Launcher.launchHarnessClass(Launcher.java:73)
    at org.renaissance.core.Launcher.main(Launcher.java:37)
# omitted

csvout2 is shown below; note that it contains no neo4j-analytics rows.

benchmark,duration_ns,uptime_ns,vm_start_unix_ms
scala-kmeans,391772155,1150650383613,1643111329133
scala-kmeans,258158250,1151315075600,1643111329133
scala-kmeans,258742233,1151833240480,1643111329133
scala-kmeans,265728412,1152348783226,1643111329133
scala-kmeans,264240157,1152873929642,1643111329133
scala-kmeans,265770746,1153397469347,1643111329133
scala-kmeans,259843581,1153916181344,1643111329133
scala-kmeans,261675020,1154435949440,1643111329133
scala-kmeans,264092046,1154954601262,1643111329133
scala-kmeans,265130301,1155479751137,1643111329133
scala-kmeans,264001422,1156004248476,1643111329133
scala-kmeans,261285245,1156520752973,1643111329133
scala-kmeans,264559098,1157040056601,1643111329133
scala-kmeans,256833804,1157558620299,1643111329133
scala-kmeans,258940672,1158073896111,1643111329133
scala-stm-bench7,3533777436,361050927062,1643111329133
scala-stm-bench7,2320625162,364757167586,1643111329133
scala-stm-bench7,1801456286,367251954970,1643111329133
scala-stm-bench7,1733272304,369221934250,1643111329133
scala-stm-bench7,1830479437,371142320757,1643111329133
scala-stm-bench7,1799325367,373144392183,1643111329133
scala-stm-bench7,1739860907,375111477647,1643111329133
scala-stm-bench7,1767951985,377048601087,1643111329133
scala-stm-bench7,1757737040,378984881120,1643111329133
scala-stm-bench7,1773161160,380921613274,1643111329133
scala-stm-bench7,1656966905,382879962720,1643111329133
scala-stm-bench7,1752000492,384711420906,1643111329133
scala-stm-bench7,1926391022,386633962880,1643111329133
scala-stm-bench7,1737271966,388738084635,1643111329133
scala-stm-bench7,1681020060,390647886534,1643111329133
reactors,19238717337,71849970324,1643111329133
reactors,15732394082,91248276068,1643111329133
reactors,15481891135,107103344817,1643111329133
reactors,17675408534,122717928351,1643111329133
reactors,16255188573,140520133137,1643111329133
reactors,15552155348,156910860996,1643111329133
reactors,15271432980,172597001320,1643111329133
reactors,15214147925,188012467118,1643111329133
reactors,15458651934,203366015050,1643111329133
reactors,15404675661,219051225919,1643111329133
reactors,14731793783,234606548560,1643111329133
reactors,15105069610,249482474770,1643111329133
reactors,15512833349,264734871451,1643111329133
reactors,14967370938,280396226268,1643111329133
reactors,14969600102,295522349330,1643111329133
par-mnemonics,5309800971,1071910546926,1643111329133
par-mnemonics,4451503457,1078310001509,1643111329133
par-mnemonics,4607631892,1083293476093,1643111329133
par-mnemonics,4634443532,1088419169795,1643111329133
par-mnemonics,4641892014,1093559703911,1643111329133
par-mnemonics,4565056403,1098704019478,1643111329133
par-mnemonics,4673040389,1103821023023,1643111329133
par-mnemonics,4622936551,1109020427033,1643111329133
par-mnemonics,4633132900,1114163824576,1643111329133
par-mnemonics,4644343921,1119324175281,1643111329133
par-mnemonics,4658708652,1124479618765,1643111329133
par-mnemonics,4633479944,1129655659212,1643111329133
par-mnemonics,4643189653,1134826534092,1643111329133
par-mnemonics,4595582734,1139993096794,1643111329133
par-mnemonics,4587650668,1145096376108,1643111329133
philosophers,1844170451,1158617849898,1643111329133
philosophers,673491220,1160748881247,1643111329133
philosophers,619918526,1161670107519,1643111329133
philosophers,730621071,1162530230964,1643111329133
philosophers,737425575,1163497545440,1643111329133
philosophers,719187653,1164470942846,1643111329133
philosophers,661272613,1165423365963,1643111329133
philosophers,740249290,1166314969064,1643111329133
philosophers,741511016,1167298219381,1643111329133
philosophers,809107217,1168277502310,1643111329133
philosophers,917247974,1169332573636,1643111329133
philosophers,970216015,1170482224710,1643111329133
philosophers,838004711,1171690060207,1643111329133
philosophers,761275305,1172767839284,1643111329133
philosophers,746089692,1173770991948,1643111329133
als,20024716722,928412040125,1643111329133
als,11543627348,948772275161,1643111329133
als,9805238202,960638711304,1643111329133
als,8955192058,970905167924,1643111329133
als,8446099199,980170960955,1643111329133
als,8241907850,988931663544,1643111329133
als,7824908499,997515502430,1643111329133
als,8585849149,1005615170802,1643111329133
als,7632590474,1014546537075,1643111329133
als,8144100597,1022484897356,1643111329133
als,7738255828,1030915339213,1643111329133
als,8032608397,1038920101793,1643111329133
als,7769125512,1047289289697,1643111329133
als,7895567686,1055356635671,1643111329133
als,7851009941,1063542843399,1643111329133
naive-bayes,41980039026,402280987913,1643111329133
naive-bayes,40104725084,444648485760,1643111329133
naive-bayes,33802082675,485020734467,1643111329133
naive-bayes,33442751203,519067179578,1643111329133
naive-bayes,32906005670,552759818055,1643111329133
naive-bayes,32572868607,585898776189,1643111329133
naive-bayes,31513931444,618706800891,1643111329133
naive-bayes,33231571380,650420632248,1643111329133
naive-bayes,33790689640,683877152692,1643111329133
naive-bayes,36405125720,717920881622,1643111329133
naive-bayes,34647402331,754877921186,1643111329133
naive-bayes,32656265987,789748531959,1643111329133
naive-bayes,33060054132,822618570977,1643111329133
naive-bayes,32029811190,855918687570,1643111329133
naive-bayes,32012396070,888185061108,1643111329133
dec-tree,10998885860,319598720391,1643111329133
dec-tree,3881041044,330772092060,1643111329133
dec-tree,3113500610,334831739174,1643111329133
dec-tree,2493767813,338130708106,1643111329133
dec-tree,1836200732,340789952856,1643111329133
dec-tree,2084297851,342793698494,1643111329133
dec-tree,1579895624,345069941605,1643111329133
dec-tree,1454836191,346820186884,1643111329133
dec-tree,1831485279,348457742976,1643111329133
dec-tree,1647244618,350460468794,1643111329133
dec-tree,1822058704,352286873503,1643111329133
dec-tree,1466994306,354285778838,1643111329133
dec-tree,1440823731,355933615323,1643111329133
dec-tree,1441389586,357548691998,1643111329133
dec-tree,1476473287,359166063577,1643111329133
finagle-http,8094763720,22945835452,1643111329133
finagle-http,4940217491,31180787491,1643111329133
finagle-http,3309778767,36196895705,1643111329133
finagle-http,2744019166,39596464061,1643111329133
finagle-http,2680636375,42419218693,1643111329133
finagle-http,2601862050,45180726247,1643111329133
finagle-http,2641121032,47861108760,1643111329133
finagle-http,2625990471,50578009042,1643111329133
finagle-http,2612634865,53283269648,1643111329133
finagle-http,2545498607,55967631776,1643111329133
finagle-http,2736144225,58577483202,1643111329133
finagle-http,2483349364,61393884361,1643111329133
finagle-http,2339066712,63956856810,1643111329133
finagle-http,2385318559,66374648142,1643111329133
finagle-http,2469781541,68849904378,1643111329133
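Whether every requested benchmark actually produced rows can be checked mechanically by diffing the requested names against the CSV (a sketch; the REQUESTED list mirrors the command line above):

```python
import csv
import io

# The benchmark list passed on the command line for the second run.
REQUESTED = [
    "neo4j-analytics", "finagle-http", "reactors", "dec-tree",
    "scala-stm-bench7", "naive-bayes", "als", "par-mnemonics",
    "scala-kmeans", "philosophers",
]

def missing_benchmarks(csv_text, requested):
    """Return the requested benchmarks that have no rows in the CSV output."""
    present = {row["benchmark"] for row in csv.DictReader(io.StringIO(csv_text))}
    return [b for b in requested if b not in present]

# Usage: missing_benchmarks(open("csvout2").read(), REQUESTED)
# would flag neo4j-analytics for the run above.
```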

Finally, run the remaining 5 benchmarks:

[juhan@fedora renaissance]$ java -jar renaissance-gpl-0.13.0.jar --csv csvout3 -r 15 log-regression gauss-mix mnemonics dotty finagle-chirper
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
NOTE: 'log-regression' benchmark uses Spark local executor with 2 (out of 2) threads.
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.spark.util.SizeEstimator$ (file:/home/juhan/local/java-test/renaissance/harness-113030-5467052652953952181/apache-spark/lib/spark-core_2.12-3.1.2.jar) to field java.net.URI.scheme
WARNING: Please consider reporting this to the maintainers of org.apache.spark.util.SizeEstimator$
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
====== log-regression (apache-spark) [default], iteration 0 started ======
GC before operation: completed in 85.921 ms, heap usage 51.462 MB -> 27.040 MB.
====== log-regression (apache-spark) [default], iteration 0 completed (16159.039 ms) ======
====== log-regression (apache-spark) [default], iteration 1 started ======
GC before operation: completed in 146.124 ms, heap usage 242.157 MB -> 102.408 MB.
====== log-regression (apache-spark) [default], iteration 1 completed (4197.336 ms) ======
====== log-regression (apache-spark) [default], iteration 2 started ======
GC before operation: completed in 140.131 ms, heap usage 301.721 MB -> 102.949 MB.
====== log-regression (apache-spark) [default], iteration 2 completed (2957.423 ms) ======
====== log-regression (apache-spark) [default], iteration 3 started ======
GC before operation: completed in 97.862 ms, heap usage 302.306 MB -> 103.350 MB.
====== log-regression (apache-spark) [default], iteration 3 completed (2541.531 ms) ======
====== log-regression (apache-spark) [default], iteration 4 started ======
GC before operation: completed in 85.192 ms, heap usage 255.833 MB -> 103.593 MB.
====== log-regression (apache-spark) [default], iteration 4 completed (3409.590 ms) ======
====== log-regression (apache-spark) [default], iteration 5 started ======
GC before operation: completed in 119.912 ms, heap usage 278.948 MB -> 103.823 MB.
====== log-regression (apache-spark) [default], iteration 5 completed (2405.673 ms) ======
====== log-regression (apache-spark) [default], iteration 6 started ======
GC before operation: completed in 99.037 ms, heap usage 255.061 MB -> 103.969 MB.
====== log-regression (apache-spark) [default], iteration 6 completed (2534.754 ms) ======
====== log-regression (apache-spark) [default], iteration 7 started ======
GC before operation: completed in 103.666 ms, heap usage 278.998 MB -> 104.200 MB.
====== log-regression (apache-spark) [default], iteration 7 completed (2470.935 ms) ======
====== log-regression (apache-spark) [default], iteration 8 started ======
GC before operation: completed in 109.464 ms, heap usage 245.902 MB -> 104.349 MB.
====== log-regression (apache-spark) [default], iteration 8 completed (2466.112 ms) ======
====== log-regression (apache-spark) [default], iteration 9 started ======
GC before operation: completed in 104.299 ms, heap usage 223.216 MB -> 104.503 MB.
====== log-regression (apache-spark) [default], iteration 9 completed (2272.668 ms) ======
====== log-regression (apache-spark) [default], iteration 10 started ======
GC before operation: completed in 111.724 ms, heap usage 232.233 MB -> 104.695 MB.
====== log-regression (apache-spark) [default], iteration 10 completed (2128.703 ms) ======
====== log-regression (apache-spark) [default], iteration 11 started ======
GC before operation: completed in 108.033 ms, heap usage 276.428 MB -> 104.956 MB.
====== log-regression (apache-spark) [default], iteration 11 completed (2294.842 ms) ======
====== log-regression (apache-spark) [default], iteration 12 started ======
GC before operation: completed in 111.634 ms, heap usage 277.191 MB -> 105.139 MB.
====== log-regression (apache-spark) [default], iteration 12 completed (1954.784 ms) ======
====== log-regression (apache-spark) [default], iteration 13 started ======
GC before operation: completed in 107.812 ms, heap usage 255.546 MB -> 105.277 MB.
====== log-regression (apache-spark) [default], iteration 13 completed (2069.874 ms) ======
====== log-regression (apache-spark) [default], iteration 14 started ======
GC before operation: completed in 112.351 ms, heap usage 277.026 MB -> 105.494 MB.
====== log-regression (apache-spark) [default], iteration 14 completed (2318.310 ms) ======
# remainder omitted

The contents of csvout3:

benchmark,duration_ns,uptime_ns,vm_start_unix_ms
finagle-chirper,9284462891,294961405487,1643167830096
finagle-chirper,4957048470,304572238258,1643167830096
finagle-chirper,4059331817,310463134291,1643167830096
finagle-chirper,2990781766,314766676719,1643167830096
finagle-chirper,3061381329,317980576783,1643167830096
finagle-chirper,2980942454,321299533686,1643167830096
finagle-chirper,2721524390,324564168676,1643167830096
finagle-chirper,2556866853,327499017028,1643167830096
finagle-chirper,2554640963,330305186003,1643167830096
finagle-chirper,2550923767,333077447055,1643167830096
finagle-chirper,2565753274,335835976682,1643167830096
finagle-chirper,3059045863,338620519217,1643167830096
finagle-chirper,2553075764,341892749736,1643167830096
finagle-chirper,2676057171,344689723600,1643167830096
finagle-chirper,2489068580,347580274182,1643167830096
dotty,11388659219,232880153746,1643167830096
dotty,3831365059,245067814417,1643167830096
dotty,2828342393,249792489656,1643167830096
dotty,2684314689,253507714907,1643167830096
dotty,2421074928,256965889710,1643167830096
dotty,2583643353,260157937220,1643167830096
dotty,2232721904,263490345462,1643167830096
dotty,2179831660,266471035623,1643167830096
dotty,2271144727,269445680718,1643167830096
dotty,2061089545,272489873373,1643167830096
dotty,2049297825,275324868009,1643167830096
dotty,4156175681,278232001898,1643167830096
dotty,2421559430,283171068802,1643167830096
dotty,1972438356,286482000787,1643167830096
dotty,2141953014,289234652762,1643167830096
mnemonics,6411721253,134886046078,1643167830096
mnemonics,6126297711,141817428564,1643167830096
mnemonics,6132539392,148363719453,1643167830096
mnemonics,7080757449,154954659760,1643167830096
mnemonics,6364774471,162568092942,1643167830096
mnemonics,6176742864,169357462153,1643167830096
mnemonics,6031820851,175963769913,1643167830096
mnemonics,5750239863,182487680749,1643167830096
mnemonics,5882720582,188731539783,1643167830096
mnemonics,5748330430,195008109074,1643167830096
mnemonics,5888988608,201154827613,1643167830096
mnemonics,5740538096,207538918284,1643167830096
mnemonics,5702859582,213722680324,1643167830096
mnemonics,5711609457,219907292380,1643167830096
mnemonics,5674449151,226018627507,1643167830096
log-regression,16159039041,10693266431,1643167830096
log-regression,4197336136,27001945228,1643167830096
log-regression,2957423440,31341260371,1643167830096
log-regression,2541530897,34398213702,1643167830096
log-regression,3409589872,37025519499,1643167830096
log-regression,2405672732,40556569629,1643167830096
log-regression,2534753942,43062230091,1643167830096
log-regression,2470935189,45701542631,1643167830096
log-regression,2466112174,48283739825,1643167830096
log-regression,2272668342,50855881430,1643167830096
log-regression,2128703161,53241234312,1643167830096
log-regression,2294842220,55480351191,1643167830096
log-regression,1954784286,57889868798,1643167830096
log-regression,2069874429,59953146576,1643167830096
log-regression,2318309801,62136911317,1643167830096
gauss-mix,9495163777,70799212220,1643167830096
gauss-mix,3593591744,80639897368,1643167830096
gauss-mix,4108380800,84400240278,1643167830096
gauss-mix,3885991922,88672755182,1643167830096
gauss-mix,3573426601,92707090403,1643167830096
gauss-mix,4683462268,96426074361,1643167830096
gauss-mix,3564303041,101341640511,1643167830096
gauss-mix,3918728782,105058729628,1643167830096
gauss-mix,3318578601,109158942932,1643167830096
gauss-mix,3553654170,112689114454,1643167830096
gauss-mix,3260212579,116412697128,1643167830096
gauss-mix,3857366686,119855194280,1643167830096
gauss-mix,3059891354,123867527805,1643167830096
gauss-mix,4036006171,127084269705,1643167830096
gauss-mix,3208065577,131328652873,1643167830096
BigBrotherJu commented 2 years ago

We write a simple Python script that computes the average running time of every benchmark and prints a markdown table:

def read_and_cal(file_name):
    with open(file_name, 'r') as f:
        time_sum = 0
        count = 0
        name = ''
        for line in f:
            fields = line.strip().split(',')
            if fields[0] == 'benchmark':
                # Skip the CSV header line.
                continue
            if count == 0:
                name = fields[0]
            count += 1
            time_sum += int(fields[1])
            if count == 15:  # every benchmark was run with -r 15
                results.append(name + ' | ' + format(time_sum / 1e6 / 15, '0.3f') + ' ms')
                count = 0
                time_sum = 0
                name = ''

results = []
read_and_cal('csvout')
read_and_cal('csvout2')
read_and_cal('csvout3')
results.append('neo4j-analytics | failed due to heap memory error')

sorted_results = sorted(results)

print('benchmark | average time')
print(':- | :-')
for result in sorted_results:
    print(result)
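The same per-benchmark averaging can be written more compactly with the `csv` module and `itertools.groupby`, relying on the fact that each benchmark's rows are consecutive in the file. A minimal, self-contained sketch; the sample rows below are made up for illustration (durations in nanoseconds, as in the real files):

```python
import csv
import io
from itertools import groupby

# Hypothetical stand-in for a csvout file.
sample = """benchmark,duration_ns,uptime_ns,vm_start_unix_ms
dotty,2000000000,1,0
dotty,4000000000,2,0
mnemonics,6000000000,3,0
"""

def average_ms(text):
    """Return {benchmark: mean duration in ms}, grouping consecutive rows."""
    rows = csv.DictReader(io.StringIO(text))
    means = {}
    for name, group in groupby(rows, key=lambda r: r["benchmark"]):
        durations = [int(r["duration_ns"]) for r in group]
        means[name] = sum(durations) / len(durations) / 1e6
    return means

print(average_ms(sample))  # {'dotty': 3000.0, 'mnemonics': 6000.0}
```

Unlike the script above, this version does not assume exactly 15 repetitions per benchmark, so it also copes with interrupted runs.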

The resulting markdown table:

benchmark | average time
:- | :-
akka-uct | 22831.624 ms
als | 9232.720 ms
chi-square | 1492.328 ms
db-shootout | 10345.956 ms
dec-tree | 2571.260 ms
dotty | 3148.241 ms
finagle-chirper | 3404.060 ms
finagle-http | 3147.346 ms
fj-kmeans | 12571.596 ms
future-genetic | 11413.216 ms
gauss-mix | 4074.455 ms
log-regression | 3478.772 ms
mnemonics | 6028.293 ms
movie-lens | 15388.416 ms
naive-bayes | 34277.048 ms
neo4j-analytics | failed due to heap memory error
page-rank | 15286.194 ms
par-mnemonics | 4660.160 ms
philosophers | 833.986 ms
reactors | 15771.422 ms
rx-scrabble | 409.123 ms
scala-doku | 3568.910 ms
scala-kmeans | 270.718 ms
scala-stm-bench7 | 1920.753 ms
scrabble | 1488.428 ms
BigBrotherJu commented 2 years ago

Next, we run all the benchmarks on oe rv. oe rv runs under qemu inside Ubuntu 20.04.3 LTS. The qemu launch arguments are as follows, emulating 4 CPUs and 4 GB of memory for oe rv:

$ qemu-system-riscv64 \
  -nographic -machine virt \
  -smp 4 -m 4G \
  -kernel fw_payload_oe.elf \
  -drive file=openEuler-preview.riscv64.qcow2,format=qcow2,id=hd0 \
  -object rng-random,filename=/dev/urandom,id=rng0 \
  -device virtio-rng-device,rng=rng0 \
  -device virtio-blk-device,drive=hd0 \
  -device virtio-net-device,netdev=usernet \
  -netdev user,id=usernet,hostfwd=tcp::12055-:22 \
  -append 'root=/dev/vda1 rw console=ttyS0 systemd.default_timeout_start_sec=600 selinux=0 highres=off earlycon'

The qemu version information:

juhan@juhan-ubuntu:~$ qemu-system-riscv64 --version
QEMU emulator version 4.2.1 (Debian 1:4.2-3ubuntu6.19)
Copyright (c) 2003-2019 Fabrice Bellard and the QEMU Project developers

Ubuntu 20.04.3 LTS is itself a virtual machine running on a 15-inch MacBook Pro 2018, allocated 6 GB of memory and 2 CPUs. The MacBook Pro's own CPU is an i7-8750H (6 cores, 12 threads), with 16 GB of memory in total.

The JDK on oe rv is shown below. It is Bisheng JDK, and JIT should be enabled by default:

[root@openEuler-RISCV-rare ~]# java -version
openjdk version "11.0.11" 2021-04-20
OpenJDK Runtime Environment Bisheng (build 11.0.11+9)
OpenJDK 64-Bit Server VM Bisheng (build 11.0.11+9, mixed mode, sharing)

Preliminary testing shows that oe rv takes an extremely long time to run the benchmarks: the first 10 benchmarks did not finish after nearly a full day, even with each benchmark limited to 15 repetitions, so splitting them into groups of 10, 10 and 5 is impractical. Below we instead run the 25 benchmarks in 5 batches of 5.
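The batching can be scripted rather than typed by hand. A dry-run sketch: batches 1-3 mirror the command lines used in this thread, while batches 4 and 5 are an illustrative grouping of the remaining benchmarks, not the grouping actually used. It prints the commands instead of running them, since each batch takes hours on oe rv; swap `print` for `subprocess.run(cmd, check=True)` to execute.

```python
# Dry-run planner for running the 25 benchmarks in 5 batches of 5,
# 15 repetitions each, one CSV file per batch (csvout1 .. csvout5).
batches = [
    "scrabble page-rank future-genetic akka-uct movie-lens",
    "scala-doku chi-square fj-kmeans rx-scrabble db-shootout",
    "neo4j-analytics finagle-http reactors dec-tree scala-stm-bench7",
    "finagle-chirper dotty mnemonics log-regression gauss-mix",      # illustrative
    "als naive-bayes par-mnemonics philosophers scala-kmeans",       # illustrative
]
for i, batch in enumerate(batches, start=1):
    cmd = ["java", "-jar", "renaissance-gpl-0.13.0.jar",
           "--csv", f"csvout{i}", "-r", "15"] + batch.split()
    print(" ".join(cmd))
```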

BigBrotherJu commented 2 years ago

Running the first 5 benchmarks on oe rv:

[root@openEuler-RISCV-rare renaissance]# java -jar renaissance-gpl-0.13.0.jar --csv csvout1 -r 15 scrabble page-rank future-genetic akka-uct movie-lens
====== scrabble (functional) [default], iteration 0 started ======
GC before operation: completed in 1007.902 ms, heap usage 99.028 MB -> 75.072 MB.
====== scrabble (functional) [default], iteration 0 completed (45010.516 ms) ======
====== scrabble (functional) [default], iteration 1 started ======
GC before operation: completed in 6444.204 ms, heap usage 131.332 MB -> 75.317 MB.
====== scrabble (functional) [default], iteration 1 completed (20441.618 ms) ======
====== scrabble (functional) [default], iteration 2 started ======
GC before operation: completed in 1224.729 ms, heap usage 157.682 MB -> 75.317 MB.
====== scrabble (functional) [default], iteration 2 completed (17460.292 ms) ======
====== scrabble (functional) [default], iteration 3 started ======
GC before operation: completed in 2156.115 ms, heap usage 217.717 MB -> 75.317 MB.
====== scrabble (functional) [default], iteration 3 completed (17878.417 ms) ======
====== scrabble (functional) [default], iteration 4 started ======
GC before operation: completed in 4131.425 ms, heap usage 335.099 MB -> 75.317 MB.
====== scrabble (functional) [default], iteration 4 completed (18289.626 ms) ======
====== scrabble (functional) [default], iteration 5 started ======
GC before operation: completed in 990.793 ms, heap usage 203.036 MB -> 75.317 MB.
====== scrabble (functional) [default], iteration 5 completed (18985.840 ms) ======
====== scrabble (functional) [default], iteration 6 started ======
GC before operation: completed in 1062.312 ms, heap usage 110.965 MB -> 75.317 MB.
====== scrabble (functional) [default], iteration 6 completed (18671.895 ms) ======
====== scrabble (functional) [default], iteration 7 started ======
GC before operation: completed in 1044.787 ms, heap usage 233.724 MB -> 75.317 MB.
====== scrabble (functional) [default], iteration 7 completed (18518.409 ms) ======
====== scrabble (functional) [default], iteration 8 started ======
GC before operation: completed in 942.999 ms, heap usage 134.049 MB -> 75.317 MB.
====== scrabble (functional) [default], iteration 8 completed (19707.737 ms) ======
====== scrabble (functional) [default], iteration 9 started ======
GC before operation: completed in 7290.651 ms, heap usage 297.272 MB -> 75.317 MB.
====== scrabble (functional) [default], iteration 9 completed (20636.469 ms) ======
====== scrabble (functional) [default], iteration 10 started ======
GC before operation: completed in 1009.495 ms, heap usage 204.209 MB -> 75.317 MB.
====== scrabble (functional) [default], iteration 10 completed (19017.329 ms) ======
====== scrabble (functional) [default], iteration 11 started ======
GC before operation: completed in 9845.483 ms, heap usage 293.261 MB -> 75.317 MB.
====== scrabble (functional) [default], iteration 11 completed (19004.322 ms) ======
====== scrabble (functional) [default], iteration 12 started ======
GC before operation: completed in 7568.473 ms, heap usage 256.855 MB -> 75.317 MB.
====== scrabble (functional) [default], iteration 12 completed (18741.109 ms) ======
====== scrabble (functional) [default], iteration 13 started ======
GC before operation: completed in 7177.194 ms, heap usage 244.406 MB -> 75.317 MB.
====== scrabble (functional) [default], iteration 13 completed (19530.786 ms) ======
====== scrabble (functional) [default], iteration 14 started ======
GC before operation: completed in 1026.863 ms, heap usage 177.561 MB -> 75.317 MB.
====== scrabble (functional) [default], iteration 14 completed (20450.816 ms) ======
# remainder omitted

The contents of csvout1:

benchmark,duration_ns,uptime_ns,vm_start_unix_ms
page-rank,680830015987,547936863325,1643258667137
page-rank,342641713122,1462289328542,1643258667137
page-rank,427980139916,1942510877718,1643258667137
page-rank,425140223714,2513389896815,1643258667137
page-rank,413965986841,3155920354987,1643258667137
page-rank,393210686335,3741477635941,1643258667137
page-rank,399748219349,4273705477273,1643258667137
page-rank,408611083698,4833527676727,1643258667137
page-rank,424603938371,5444206093950,1643258667137
page-rank,398440852895,6059076979312,1643258667137
page-rank,438196428929,6662989992361,1643258667137
page-rank,526962162641,7310497374012,1643258667137
page-rank,431434862911,8036877855753,1643258667137
page-rank,404325939661,8669293757437,1643258667137
page-rank,469002741941,9239542612856,1643258667137
akka-uct,869540711145,11080828255005,1643258667137
akka-uct,629668030305,11954327811145,1643258667137
akka-uct,616172809352,12587751503664,1643258667137
akka-uct,598799124185,13207490059383,1643258667137
akka-uct,606223215345,13811452767736,1643258667137
akka-uct,649810265550,14421910319306,1643258667137
akka-uct,587032583535,15075430121941,1643258667137
akka-uct,572718471009,15666261849271,1643258667137
akka-uct,589600793163,16242837337274,1643258667137
akka-uct,615184103842,16837421213113,1643258667137
akka-uct,576596602032,17457236086313,1643258667137
akka-uct,586341408207,18038161393714,1643258667137
akka-uct,619650810980,18629581151363,1643258667137
akka-uct,618776133243,19253614679001,1643258667137
akka-uct,615667983648,19876677146725,1643258667137
scrabble,45010516004,28490813947,1643258667137
scrabble,20441618401,80352755687,1643258667137
scrabble,17460291910,102053584001,1643258667137
scrabble,17878417322,121727135830,1643258667137
scrabble,18289626351,143789891732,1643258667137
scrabble,18985840002,163105491629,1643258667137
scrabble,18671895363,183173200854,1643258667137
scrabble,18518408571,202944484006,1643258667137
scrabble,19707736849,222464854858,1643258667137
scrabble,20636468585,249487890622,1643258667137
scrabble,19017328810,271187012985,1643258667137
scrabble,19004322353,300079776628,1643258667137
scrabble,18741109119,326696955691,1643258667137
scrabble,19530785709,352666209336,1643258667137
scrabble,20450816309,373250579341,1643258667137
movie-lens,1219801068804,20775429102644,1643258667137
movie-lens,777014772993,22093577531051,1643258667137
movie-lens,931447363496,22965953879595,1643258667137
movie-lens,628798814468,23900102972650,1643258667137
movie-lens,704492638858,24531003829602,1643258667137
movie-lens,719813793975,25237816929772,1643258667137
movie-lens,711287721967,25959990536669,1643258667137
movie-lens,833456231923,26673183407849,1643258667137
movie-lens,753861343963,27509660998073,1643258667137
movie-lens,784348390798,28265610386256,1643258667137
movie-lens,1184058410553,29052421686987,1643258667137
movie-lens,637403915661,30238764839894,1643258667137
movie-lens,802827581423,30878183977421,1643258667137
movie-lens,837487616336,31682828653239,1643258667137
movie-lens,676631249208,32522292253820,1643258667137
future-genetic,137560840554,9976967539281,1643258667137
future-genetic,48261935288,10159908914412,1643258667137
future-genetic,45928334620,10209172397288,1643258667137
future-genetic,48270755763,10256095547656,1643258667137
future-genetic,53390721621,10305322652306,1643258667137
future-genetic,43652406481,10359986610111,1643258667137
future-genetic,45342210257,10448659104150,1643258667137
future-genetic,49423191558,10495921773851,1643258667137
future-genetic,43772472507,10547009163115,1643258667137
future-genetic,45520764411,10596298235258,1643258667137
future-genetic,47702649936,10644494688695,1643258667137
future-genetic,67202027647,10742215916894,1643258667137
future-genetic,68427053326,10857853294736,1643258667137
future-genetic,46973069654,10927661543683,1643258667137
future-genetic,45692268025,10975754754354,1643258667137

Running the next 5 benchmarks on oe rv. db-shootout fails; the error message is shown below, and the cause appears to be a missing shared library:

[root@openEuler-RISCV-rare renaissance]# java -jar renaissance-gpl-0.13.0.jar --csv csvout2 -r 15 scala-doku chi-square fj-kmeans rx-scrabble db-shootout
====== scala-doku (scala) [default], iteration 0 started ======
GC before operation: completed in 373.474 ms, heap usage 10.104 MB -> 4.189 MB.
====== scala-doku (scala) [default], iteration 0 completed (81548.014 ms) ======
====== scala-doku (scala) [default], iteration 1 started ======
GC before operation: completed in 3941.692 ms, heap usage 193.896 MB -> 4.474 MB.
====== scala-doku (scala) [default], iteration 1 completed (72169.273 ms) ======
====== scala-doku (scala) [default], iteration 2 started ======
GC before operation: completed in 921.904 ms, heap usage 100.976 MB -> 4.475 MB.
====== scala-doku (scala) [default], iteration 2 completed (65127.347 ms) ======
====== scala-doku (scala) [default], iteration 3 started ======
GC before operation: completed in 13520.913 ms, heap usage 327.431 MB -> 4.475 MB.
====== scala-doku (scala) [default], iteration 3 completed (69954.567 ms) ======
====== scala-doku (scala) [default], iteration 4 started ======
GC before operation: completed in 693.252 ms, heap usage 107.902 MB -> 4.475 MB.
====== scala-doku (scala) [default], iteration 4 completed (69234.362 ms) ======
====== scala-doku (scala) [default], iteration 5 started ======
GC before operation: completed in 594.236 ms, heap usage 182.151 MB -> 4.475 MB.
====== scala-doku (scala) [default], iteration 5 completed (64164.821 ms) ======
====== scala-doku (scala) [default], iteration 6 started ======
GC before operation: completed in 516.692 ms, heap usage 123.660 MB -> 4.476 MB.
====== scala-doku (scala) [default], iteration 6 completed (65921.344 ms) ======
====== scala-doku (scala) [default], iteration 7 started ======
GC before operation: completed in 586.748 ms, heap usage 129.978 MB -> 4.477 MB.
====== scala-doku (scala) [default], iteration 7 completed (71327.236 ms) ======
====== scala-doku (scala) [default], iteration 8 started ======
GC before operation: completed in 682.571 ms, heap usage 92.633 MB -> 4.477 MB.
====== scala-doku (scala) [default], iteration 8 completed (79081.786 ms) ======
====== scala-doku (scala) [default], iteration 9 started ======
GC before operation: completed in 1458.669 ms, heap usage 87.590 MB -> 4.477 MB.
====== scala-doku (scala) [default], iteration 9 completed (80359.045 ms) ======
====== scala-doku (scala) [default], iteration 10 started ======
GC before operation: completed in 2507.997 ms, heap usage 185.755 MB -> 4.477 MB.
====== scala-doku (scala) [default], iteration 10 completed (74667.533 ms) ======
====== scala-doku (scala) [default], iteration 11 started ======
GC before operation: completed in 3575.837 ms, heap usage 196.685 MB -> 4.477 MB.
====== scala-doku (scala) [default], iteration 11 completed (75110.432 ms) ======
====== scala-doku (scala) [default], iteration 12 started ======
GC before operation: completed in 2178.126 ms, heap usage 218.353 MB -> 4.477 MB.
====== scala-doku (scala) [default], iteration 12 completed (73404.866 ms) ======
====== scala-doku (scala) [default], iteration 13 started ======
GC before operation: completed in 10065.387 ms, heap usage 90.082 MB -> 3.859 MB.
====== scala-doku (scala) [default], iteration 13 completed (72783.088 ms) ======
====== scala-doku (scala) [default], iteration 14 started ======
GC before operation: completed in 1218.833 ms, heap usage 147.849 MB -> 3.311 MB.
====== scala-doku (scala) [default], iteration 14 completed (77401.598 ms) ======
# remainder omitted
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
Benchmark 'db-shootout' failed with exception:
java.lang.UnsatisfiedLinkError: Native library (com/sun/jna/linux-riscv64/libjnidispatch.so) not found in resource path ([file:/root/local/java-test/renaissance/harness-030018-12539238027838761898/database/lib/database_2.13-0.13.0.jar, file:/root/local/java-test/renaissance/harness-030018-12539238027838761898/database/lib/scala-library-2.13.6.jar, file:/root/local/java-test/renaissance/harness-030018-12539238027838761898/database/lib/jnr-posix-3.0.29.jar, file:/root/local/java-test/renaissance/harness-030018-12539238027838761898/database/lib/commons-math3-3.6.1.jar, file:/root/local/java-test/renaissance/harness-030018-12539238027838761898/database/lib/agrona-0.9.7.jar, file:/root/local/java-test/renaissance/harness-030018-12539238027838761898/database/lib/zero-allocation-hashing-0.6.jar, file:/root/local/java-test/renaissance/harness-030018-12539238027838761898/database/lib/mapdb-3.0.1.jar, file:/root/local/java-test/renaissance/harness-030018-12539238027838761898/database/lib/h2-mvstore-1.4.192.jar, file:/root/local/java-test/renaissance/harness-030018-12539238027838761898/database/lib/chronicle-core-2.17.2.jar, file:/root/local/java-test/renaissance/harness-030018-12539238027838761898/database/lib/chronicle-bytes-2.17.7.jar, file:/root/local/java-test/renaissance/harness-030018-12539238027838761898/database/lib/chronicle-threads-2.17.1.jar, file:/root/local/java-test/renaissance/harness-030018-12539238027838761898/database/lib/chronicle-map-3.17.0.jar, file:/root/local/java-test/renaissance/harness-030018-12539238027838761898/database/lib/jnr-ffi-2.0.9.jar, file:/root/local/java-test/renaissance/harness-030018-12539238027838761898/database/lib/jnr-constants-0.9.1.jar, file:/root/local/java-test/renaissance/harness-030018-12539238027838761898/database/lib/kotlin-stdlib-1.0.2.jar, file:/root/local/java-test/renaissance/harness-030018-12539238027838761898/database/lib/eclipse-collections-api-7.1.2.jar, 
file:/root/local/java-test/renaissance/harness-030018-12539238027838761898/database/lib/eclipse-collections-7.1.2.jar, file:/root/local/java-test/renaissance/harness-030018-12539238027838761898/database/lib/eclipse-collections-forkjoin-7.1.2.jar, file:/root/local/java-test/renaissance/harness-030018-12539238027838761898/database/lib/guava-19.0.jar, file:/root/local/java-test/renaissance/harness-030018-12539238027838761898/database/lib/lz4-1.3.0.jar, file:/root/local/java-test/renaissance/harness-030018-12539238027838761898/database/lib/elsa-3.0.0-M5.jar, file:/root/local/java-test/renaissance/harness-030018-12539238027838761898/database/lib/slf4j-api-1.7.25.jar, file:/root/local/java-test/renaissance/harness-030018-12539238027838761898/database/lib/annotations-12.0.jar, file:/root/local/java-test/renaissance/harness-030018-12539238027838761898/database/lib/affinity-3.1.10.jar, file:/root/local/java-test/renaissance/harness-030018-12539238027838761898/database/lib/chronicle-values-2.16.1.jar, file:/root/local/java-test/renaissance/harness-030018-12539238027838761898/database/lib/chronicle-wire-2.17.5.jar, file:/root/local/java-test/renaissance/harness-030018-12539238027838761898/database/lib/chronicle-algorithms-1.16.0.jar, file:/root/local/java-test/renaissance/harness-030018-12539238027838761898/database/lib/jna-4.2.1.jar, file:/root/local/java-test/renaissance/harness-030018-12539238027838761898/database/lib/jna-platform-4.2.1.jar, file:/root/local/java-test/renaissance/harness-030018-12539238027838761898/database/lib/xstream-1.4.9.jar, file:/root/local/java-test/renaissance/harness-030018-12539238027838761898/database/lib/jettison-1.3.8.jar, file:/root/local/java-test/renaissance/harness-030018-12539238027838761898/database/lib/pax-url-aether-2.4.5.jar, file:/root/local/java-test/renaissance/harness-030018-12539238027838761898/database/lib/jffi-1.2.11.jar, 
file:/root/local/java-test/renaissance/harness-030018-12539238027838761898/database/lib/jffi-1.2.11-native.jar, file:/root/local/java-test/renaissance/harness-030018-12539238027838761898/database/lib/asm-5.0.3.jar, file:/root/local/java-test/renaissance/harness-030018-12539238027838761898/database/lib/asm-commons-5.0.3.jar, file:/root/local/java-test/renaissance/harness-030018-12539238027838761898/database/lib/asm-analysis-5.0.3.jar, file:/root/local/java-test/renaissance/harness-030018-12539238027838761898/database/lib/asm-tree-5.0.3.jar, file:/root/local/java-test/renaissance/harness-030018-12539238027838761898/database/lib/asm-util-5.0.3.jar, file:/root/local/java-test/renaissance/harness-030018-12539238027838761898/database/lib/jnr-x86asm-1.0.2.jar, file:/root/local/java-test/renaissance/harness-030018-12539238027838761898/database/lib/kotlin-runtime-1.0.2.jar, file:/root/local/java-test/renaissance/harness-030018-12539238027838761898/database/lib/javapoet-1.5.1.jar, file:/root/local/java-test/renaissance/harness-030018-12539238027838761898/database/lib/xmlpull-1.1.3.1.jar, file:/root/local/java-test/renaissance/harness-030018-12539238027838761898/database/lib/xpp3_min-1.1.4c.jar, file:/root/local/java-test/renaissance/harness-030018-12539238027838761898/database/lib/stax-api-1.0.1.jar, file:/root/local/java-test/renaissance/harness-030018-12539238027838761898/database/lib/jcl-over-slf4j-1.6.6.jar])
    at com.sun.jna.Native.loadNativeDispatchLibraryFromClasspath(Native.java:866)
    at com.sun.jna.Native.loadNativeDispatchLibrary(Native.java:826)
    at com.sun.jna.Native.<clinit>(Native.java:140)
    at net.openhft.chronicle.hash.impl.util.jna.PosixFallocate.<clinit>(PosixFallocate.java:18)
    at net.openhft.chronicle.hash.impl.VanillaChronicleHash.map(VanillaChronicleHash.java:977)
    at net.openhft.chronicle.hash.impl.VanillaChronicleHash.createMappedStoreAndSegments(VanillaChronicleHash.java:485)
    at net.openhft.chronicle.map.ChronicleMapBuilder.createWithNewFile(ChronicleMapBuilder.java:1748)
    at net.openhft.chronicle.map.ChronicleMapBuilder.createWithFile(ChronicleMapBuilder.java:1652)
    at net.openhft.chronicle.map.ChronicleMapBuilder.createPersistedTo(ChronicleMapBuilder.java:1552)
    at org.lmdbjava.bench.Chronicle$CommonChronicleMap.setup(Chronicle.java:122)
    at org.lmdbjava.bench.Chronicle$Reader.setup(Chronicle.java:204)
    at org.renaissance.database.DbShootout.setUpBeforeAll(DbShootout.scala:65)
    at org.renaissance.harness.ExecutionDriver.executeBenchmark(ExecutionDriver.java:82)
    at org.renaissance.harness.RenaissanceSuite$.$anonfun$runBenchmarks$1(RenaissanceSuite.scala:140)
    at org.renaissance.harness.RenaissanceSuite$.$anonfun$runBenchmarks$1$adapted(RenaissanceSuite.scala:136)
    at scala.collection.immutable.List.foreach(List.scala:333)
    at org.renaissance.harness.RenaissanceSuite$.runBenchmarks(RenaissanceSuite.scala:136)
    at org.renaissance.harness.RenaissanceSuite$.main(RenaissanceSuite.scala:117)
    at org.renaissance.harness.RenaissanceSuite.main(RenaissanceSuite.scala)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.base/java.lang.reflect.Method.invoke(Method.java:566)
    at org.renaissance.core.Launcher.loadAndInvokeHarnessClass(Launcher.java:114)
    at org.renaissance.core.Launcher.launchHarnessClass(Launcher.java:73)
    at org.renaissance.core.Launcher.main(Launcher.java:37)
The following benchmarks failed: db-shootout
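The UnsatisfiedLinkError above is JNA failing to find a prebuilt native stub at `com/sun/jna/linux-riscv64/libjnidispatch.so` inside the bundled jna-4.2.1.jar: JNA ships one `libjnidispatch.so` per supported platform inside its jar, and that version evidently carries no linux-riscv64 build. A self-contained sketch of this lookup against a toy in-memory jar (the single x86-64 entry is illustrative, not the jar's real contents):

```python
import io
import zipfile

# Toy stand-in for jna-4.2.1.jar: native stubs exist only for the
# platforms the jar was built with, and linux-riscv64 is not among them.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as jar:
    jar.writestr("com/sun/jna/linux-x86-64/libjnidispatch.so", b"\x7fELF")

def has_native_dispatch(jar_bytes, platform):
    """Check whether the jar carries a JNA native stub for `platform`."""
    with zipfile.ZipFile(io.BytesIO(jar_bytes)) as jar:
        return f"com/sun/jna/{platform}/libjnidispatch.so" in jar.namelist()

print(has_native_dispatch(buf.getvalue(), "linux-x86-64"))   # True
print(has_native_dispatch(buf.getvalue(), "linux-riscv64"))  # False
```

The same check can be made against the real jar from the classpath in the log with `unzip -l`, which would show which platform directories it actually contains.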

The contents of csvout2:

benchmark,duration_ns,uptime_ns,vm_start_unix_ms
chi-square,137271987577,1326617244418,1643338807081
chi-square,29651791665,1477787556795,1643338807081
chi-square,25701144560,1512850001277,1643338807081
chi-square,22397410609,1540906274138,1643338807081
chi-square,20882216929,1565587306130,1643338807081
chi-square,21597325267,1588638682483,1643338807081
chi-square,25591552853,1641539827329,1643338807081
chi-square,19991560125,1698379362933,1643338807081
chi-square,18969317128,1720331092228,1643338807081
chi-square,17508679485,1741219620943,1643338807081
chi-square,20149883059,1760673944780,1643338807081
chi-square,19211397220,1782949756550,1643338807081
chi-square,19494418464,1804142340350,1643338807081
chi-square,21525499582,1825565845795,1643338807081
chi-square,18325959241,1850731303468,1643338807081
fj-kmeans,165714228990,1882238567645,1643338807081
fj-kmeans,76136230729,2086644794384,1643338807081
fj-kmeans,86254448311,2199907428178,1643338807081
fj-kmeans,85773322136,2323335163167,1643338807081
fj-kmeans,85462220788,2411100493493,1643338807081
fj-kmeans,83345824825,2501489674581,1643338807081
fj-kmeans,92464032175,2622101320457,1643338807081
fj-kmeans,86710288669,2716252400937,1643338807081
fj-kmeans,85462338074,2806135212979,1643338807081
fj-kmeans,84062446076,2893189209807,1643338807081
fj-kmeans,82379528419,2982683908084,1643338807081
fj-kmeans,79982734983,3077027989489,1643338807081
fj-kmeans,76948940693,3161768311305,1643338807081
fj-kmeans,76802010120,3240343541554,1643338807081
fj-kmeans,76361882480,3318529042126,1643338807081
rx-scrabble,31962474809,3446968279416,1643338807081
rx-scrabble,12522134178,3517623162229,1643338807081
rx-scrabble,9100051402,3531471307178,1643338807081
rx-scrabble,8763453076,3541508012750,1643338807081
rx-scrabble,8078788617,3551260716758,1643338807081
rx-scrabble,10829683879,3560264273307,1643338807081
rx-scrabble,7655524931,3572596870065,1643338807081
rx-scrabble,10165793541,3581207917816,1643338807081
rx-scrabble,8247313314,3593608238457,1643338807081
rx-scrabble,7936644775,3602808296852,1643338807081
rx-scrabble,5988560218,3613008360132,1643338807081
rx-scrabble,9821771561,3619939328055,1643338807081
rx-scrabble,8531031512,3631009415565,1643338807081
rx-scrabble,10024549755,3640396265002,1643338807081
rx-scrabble,8081442715,3651677986027,1643338807081
scala-doku,81548013556,18058792596,1643338807081
scala-doku,72169272557,103621753826,1643338807081
scala-doku,65127347186,176765090312,1643338807081
scala-doku,69954567468,255466522093,1643338807081
scala-doku,69234361662,326140727592,1643338807081
scala-doku,64164820555,396000924222,1643338807081
scala-doku,65921343985,460705297167,1643338807081
scala-doku,71327235908,527248983834,1643338807081
scala-doku,79081786129,599291270124,1643338807081
scala-doku,80359045011,679866774218,1643338807081
scala-doku,74667533322,762754523273,1643338807081
scala-doku,75110432269,841026686270,1643338807081
scala-doku,73404866209,918373489065,1643338807081
scala-doku,72783088047,1001891367851,1643338807081
scala-doku,77401597644,1075947465119,1643338807081
BigBrotherJu commented 2 years ago

Continuing with the next 5 benchmarks. I connect from the host to oe rv in the VM over ssh; every time the run reaches reactors, the terminal emulator on the host connected to oe rv hangs and stops responding. In addition, neo4j-analytics always fails, apparently also because a shared library cannot be found. The results:

[root@openEuler-RISCV-rare renaissance]# java -jar renaissance-gpl-0.13.0.jar --csv csvout3 -r 15 neo4j-analytics finagle-http reactors dec-tree scala-stm-bench7
Creating graph database...
Benchmark 'neo4j-analytics' failed with exception:
java.lang.UnsatisfiedLinkError: Native library (com/sun/jna/linux-riscv64/libjnidispatch.so) not found in resource path ([file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j_2.12-0.13.0.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/scala-library-2.12.15.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/lift-json_2.12-3.4.3.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/annotations-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-kernel-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-fabric-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-procedure-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-lucene-index-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-fulltext-index-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-graph-algo-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-data-collector-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-cypher-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-security-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-bolt-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-consistency-check-4.2.4.jar, 
file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-record-storage-engine-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-dbms-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-import-tool-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-batch-insert-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-server-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/scalap-2.12.15.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/paranamer-2.8.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/scala-xml_2.12-1.3.0.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/eclipse-collections-10.3.0.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/commons-lang3-3.11.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-native-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-graphdb-api-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-storage-engine-api-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-kernel-api-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-common-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-values-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-collections-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-io-4.2.4.jar, 
file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-logging-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-configuration-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-layout-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-index-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-spatial-index-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-id-generator-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-label-index-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-wal-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/jctools-core-3.1.0.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/commons-io-2.7.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/reactor-core-3.3.9.RELEASE.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-front-end-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/scala-reflect-2.12.15.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-procedure-api-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-codegen-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-cypher-expression-evaluator-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-resource-4.2.4.jar, 
file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/lucene-analyzers-common-8.5.1.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/lucene-core-8.5.1.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/lucene-queryparser-8.5.1.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/lucene-backward-codecs-8.5.1.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-cypher-runtime-util-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-parser-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-rewriting-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-exceptions-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-util-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-cypher-planner-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-cypher-planner-spi-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-cypher-interpreted-runtime-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/parboiled-scala_2.12-1.2.0.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/shiro-core-1.7.1.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-command-line-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/slf4j-nop-1.7.30.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-ssl-4.2.4.jar, 
file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/netty-all-4.1.55.Final.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/bcpkix-jdk15on-1.68.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-import-util-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/jProcesses-1.6.5.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/commons-compress-1.20.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/commons-text-1.9.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/zstd-proxy-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/server-api-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/jetty-server-9.4.38.v20210224.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/jetty-webapp-9.4.38.v20210224.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/jersey-server-2.32.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/jersey-hk2-2.32.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/jersey-container-servlet-2.32.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/commons-logging-1.2.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/jackson-core-2.11.3.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/jackson-jaxrs-json-provider-2.11.3.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/jackson-databind-2.11.3.jar, 
file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/bcprov-jdk15on-1.68.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/activation-1.1.1.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/jaxb-runtime-2.3.2.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/jaxb-api-2.3.0.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/scala-compiler-2.12.15.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/eclipse-collections-api-10.3.0.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/jna-5.6.0.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-lock-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-diagnostics-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-token-api-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-schema-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-monitoring-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-unsafe-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-concurrent-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/jettison-1.4.1.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/commons-exec-1.3.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/reactive-streams-1.0.3.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-expressions-4.2.4.jar, 
file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-cypher-macros-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/asm-8.0.1.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/asm-util-8.0.1.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/asm-analysis-8.0.1.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/asm-tree-8.0.1.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-ast-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-cypher-logical-plans-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/jamm-0.3.3.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-cypher-ir-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-cypher-javacc-parser-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-cypher-ast-factory-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/caffeine-2.8.5.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/neo4j-csv-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/parboiled-core-1.2.0.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/shiro-lang-1.7.1.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/shiro-cache-1.7.1.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/shiro-crypto-hash-1.7.1.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/shiro-crypto-cipher-1.7.1.jar, 
file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/shiro-config-core-1.7.1.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/shiro-config-ogdl-1.7.1.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/shiro-event-1.7.1.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/picocli-4.5.0.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/slf4j-api-1.7.30.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/WMI4Java-1.6.3.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/zstd-jni-1.4.5-6.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/javax.ws.rs-api-2.1.1.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/javax.servlet-api-3.1.0.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/jetty-http-9.4.38.v20210224.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/jetty-io-9.4.38.v20210224.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/jetty-xml-9.4.38.v20210224.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/jetty-servlet-9.4.38.v20210224.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/jersey-common-2.32.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/jersey-client-2.32.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/jakarta.ws.rs-api-2.1.6.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/jersey-media-jaxb-2.32.jar, 
file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/jakarta.annotation-api-1.3.5.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/jakarta.inject-2.6.1.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/jakarta.validation-api-2.0.2.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/hk2-locator-2.6.1.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/javassist-3.25.0-GA.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/jersey-container-servlet-core-2.32.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/jackson-jaxrs-base-2.11.3.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/jackson-module-jaxb-annotations-2.11.3.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/jackson-annotations-2.11.3.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/txw2-2.3.2.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/istack-commons-runtime-3.0.8.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/stax-ex-1.8.1.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/FastInfoset-1.2.16.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/jakarta.activation-api-1.2.1.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/cypher-ast-factory-4.2.4.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/shiro-crypto-core-1.7.1.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/commons-beanutils-1.9.4.jar, 
file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/jPowerShell-3.0.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/jetty-util-9.4.38.v20210224.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/jetty-security-9.4.38.v20210224.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/jetty-util-ajax-9.4.38.v20210224.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/osgi-resource-locator-1.0.3.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/hk2-api-2.6.1.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/hk2-utils-2.6.1.jar, file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/neo4j/lib/commons-collections-3.2.2.jar])
    at com.sun.jna.Native.loadNativeDispatchLibraryFromClasspath(Native.java:1032)
    at com.sun.jna.Native.loadNativeDispatchLibrary(Native.java:988)
    at com.sun.jna.Native.<clinit>(Native.java:195)
    at org.neo4j.internal.unsafe.UnsafeUtil.allocateMemory(UnsafeUtil.java:441)
    at org.neo4j.io.pagecache.impl.muninn.VictimPageReference.getVictimPage(VictimPageReference.java:42)
    at org.neo4j.io.pagecache.impl.muninn.MuninnPageCache.<init>(MuninnPageCache.java:284)
    at org.neo4j.io.pagecache.impl.muninn.MuninnPageCache.<init>(MuninnPageCache.java:256)
    at org.neo4j.kernel.impl.pagecache.ConfiguringPageCacheFactory.createPageCache(ConfiguringPageCacheFactory.java:99)
    at org.neo4j.kernel.impl.pagecache.ConfiguringPageCacheFactory.getOrCreatePageCache(ConfiguringPageCacheFactory.java:87)
    at org.neo4j.graphdb.factory.module.GlobalModule.createPageCache(GlobalModule.java:373)
    at org.neo4j.graphdb.factory.module.GlobalModule.lambda$new$1(GlobalModule.java:219)
    at org.neo4j.graphdb.factory.module.GlobalModule.tryResolveOrCreate(GlobalModule.java:261)
    at org.neo4j.graphdb.factory.module.GlobalModule.<init>(GlobalModule.java:218)
    at org.neo4j.graphdb.facade.DatabaseManagementServiceFactory.createGlobalModule(DatabaseManagementServiceFactory.java:252)
    at org.neo4j.graphdb.facade.DatabaseManagementServiceFactory.build(DatabaseManagementServiceFactory.java:126)
    at org.neo4j.dbms.api.DatabaseManagementServiceBuilder.newDatabaseManagementService(DatabaseManagementServiceBuilder.java:95)
    at org.neo4j.dbms.api.DatabaseManagementServiceBuilder.build(DatabaseManagementServiceBuilder.java:88)
    at org.renaissance.neo4j.Neo4jAnalytics.createGraphDatabase(Neo4jAnalytics.scala:85)
    at org.renaissance.neo4j.Neo4jAnalytics.setUpBeforeAll(Neo4jAnalytics.scala:49)
    at org.renaissance.harness.ExecutionDriver.executeBenchmark(ExecutionDriver.java:82)
    at org.renaissance.harness.RenaissanceSuite$.$anonfun$runBenchmarks$1(RenaissanceSuite.scala:140)
    at org.renaissance.harness.RenaissanceSuite$.$anonfun$runBenchmarks$1$adapted(RenaissanceSuite.scala:136)
    at scala.collection.immutable.List.foreach(List.scala:333)
    at org.renaissance.harness.RenaissanceSuite$.runBenchmarks(RenaissanceSuite.scala:136)
    at org.renaissance.harness.RenaissanceSuite$.main(RenaissanceSuite.scala:117)
    at org.renaissance.harness.RenaissanceSuite.main(RenaissanceSuite.scala)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.base/java.lang.reflect.Method.invoke(Method.java:566)
    at org.renaissance.core.Launcher.loadAndInvokeHarnessClass(Launcher.java:114)
    at org.renaissance.core.Launcher.launchHarnessClass(Launcher.java:73)
    at org.renaissance.core.Launcher.main(Launcher.java:37)
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by com.twitter.jvm.Hotspot (file:/root/local/java-test/renaissance/harness-090011-8473462232088143436/twitter-finagle/lib/util-jvm_2.12-19.4.0.jar) to field sun.management.ManagementFactoryHelper.jvm
WARNING: Please consider reporting this to the maintainers of com.twitter.jvm.Hotspot
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
[2022-01-28T09:01:43.875+0000] com.twitter.finagle (com.twitter.finagle.Init$ $anonfun$once$1)
INFO: Finagle version 19.4.0 (rev=15ae0aba979a2c11ed4a71774b2e995f5df918b4) built at 20190418-114348
finagle-http on :38289 spawning 4 client and default number of server workers.
====== finagle-http (web) [default], iteration 0 started ======
GC before operation: completed in 477.801 ms, heap usage 31.692 MB -> 23.645 MB.
====== finagle-http (web) [default], iteration 0 completed (777902.537 ms) ======
====== finagle-http (web) [default], iteration 1 started ======
GC before operation: completed in 19552.992 ms, heap usage 161.321 MB -> 27.048 MB.
====== finagle-http (web) [default], iteration 1 completed (271884.338 ms) ======
====== finagle-http (web) [default], iteration 2 started ======
GC before operation: completed in 31649.268 ms, heap usage 44.214 MB -> 22.151 MB.
====== finagle-http (web) [default], iteration 2 completed (283593.572 ms) ======
====== finagle-http (web) [default], iteration 3 started ======
GC before operation: completed in 1271.853 ms, heap usage 128.989 MB -> 22.142 MB.
====== finagle-http (web) [default], iteration 3 completed (272979.272 ms) ======
====== finagle-http (web) [default], iteration 4 started ======
GC before operation: completed in 1226.227 ms, heap usage 33.414 MB -> 22.260 MB.
====== finagle-http (web) [default], iteration 4 completed (259833.374 ms) ======
====== finagle-http (web) [default], iteration 5 started ======
GC before operation: completed in 43094.235 ms, heap usage 175.391 MB -> 22.173 MB.
====== finagle-http (web) [default], iteration 5 completed (306656.233 ms) ======
====== finagle-http (web) [default], iteration 6 started ======
GC before operation: completed in 1576.375 ms, heap usage 74.117 MB -> 22.181 MB.
====== finagle-http (web) [default], iteration 6 completed (336341.421 ms) ======
====== finagle-http (web) [default], iteration 7 started ======
GC before operation: completed in 4866.081 ms, heap usage 183.710 MB -> 22.209 MB.
====== finagle-http (web) [default], iteration 7 completed (282426.172 ms) ======
====== finagle-http (web) [default], iteration 8 started ======
GC before operation: completed in 2602.376 ms, heap usage 96.492 MB -> 22.176 MB.
====== finagle-http (web) [default], iteration 8 completed (253138.764 ms) ======
====== finagle-http (web) [default], iteration 9 started ======
GC before operation: completed in 3699.490 ms, heap usage 128.155 MB -> 22.262 MB.
====== finagle-http (web) [default], iteration 9 completed (220823.739 ms) ======
====== finagle-http (web) [default], iteration 10 started ======
GC before operation: completed in 1078.924 ms, heap usage 27.283 MB -> 22.219 MB.
====== finagle-http (web) [default], iteration 10 completed (250178.990 ms) ======
====== finagle-http (web) [default], iteration 11 started ======
GC before operation: completed in 4206.825 ms, heap usage 149.749 MB -> 22.175 MB.
====== finagle-http (web) [default], iteration 11 completed (244324.637 ms) ======
====== finagle-http (web) [default], iteration 12 started ======
GC before operation: completed in 1136.674 ms, heap usage 69.697 MB -> 22.195 MB.
====== finagle-http (web) [default], iteration 12 completed (226171.192 ms) ======
====== finagle-http (web) [default], iteration 13 started ======
GC before operation: completed in 2174.845 ms, heap usage 213.451 MB -> 22.258 MB.
====== finagle-http (web) [default], iteration 13 completed (242946.735 ms) ======
====== finagle-http (web) [default], iteration 14 started ======
GC before operation: completed in 2203.784 ms, heap usage 97.178 MB -> 22.196 MB.
====== finagle-http (web) [default], iteration 14 completed (232106.323 ms) ======
====== reactors (concurrency) [default], iteration 0 started ======
GC before operation: completed in 3941.496 ms, heap usage 205.943 MB -> 23.832 MB.
Baseline workload: Reactor scheduling events
BigBench workload: Many-to-many message ping pong
CountingActor workload: Single reactor event processing
Fibonacci workload: Dynamic reactor mix with varying lifetimes
ForkJoinCreation workload: Reactor creation performance
ForkJoinThroughput workload: Reactor processing performance
PingPong workload: Reactor pair sequential ping pong performance
StreamingPingPong workload: Reactor pair overlapping ping pong performance
Roundabout workload: Many channels reactor performance
ThreadRing workload: Reactor ring forwarding performance
====== reactors (concurrency) [default], iteration 0 completed (664465.706 ms) ======
====== reactors (concurrency) [default], iteration 1 started ======
GC before operation: completed in 49968.988 ms, heap usage 535.658 MB -> 26.362 MB.
Baseline workload: Reactor scheduling events
BigBench workload: Many-to-many message ping pong
CountingActor workload: Single reactor event processing
Fibonacci workload: Dynamic reactor mix with varying lifetimes
ForkJoinCreation workload: Reactor creation performance
ForkJoinThroughput workload: Reactor processing performance
PingPong workload: Reactor pair sequential ping pong performance
java.lang.NullPointerException
    at io.reactors.package$LowPriorityChannelOps$.$bang$extension(package.scala:171)
    at org.renaissance.actors.PingPong$PingPongInner$1.$anonfun$ping$3(Reactors.scala:413)
    at org.renaissance.actors.PingPong$PingPongInner$1.$anonfun$ping$3$adapted(Reactors.scala:412)
    at io.reactors.Events$OnEventOrDone.react(Events.scala:1665)
    at io.reactors.Events$Push.reactAll(Events.scala:1275)
    at io.reactors.Events$Push.reactAll$(Events.scala:1267)
    at io.reactors.Events$Emitter.reactAll(Events.scala:1489)
    at io.reactors.Events$Emitter.react(Events.scala:1494)
    at io.reactors.EventQueue$UnrolledRing.dequeue(EventQueue.scala:116)
    at io.reactors.Connector.dequeue(Connector.scala:61)
    at io.reactors.concurrent.Frame.drain$1(Frame.scala:322)
    at io.reactors.concurrent.Frame.processEvents(Frame.scala:345)
    at io.reactors.concurrent.Frame.processBatch(Frame.scala:494)
    at io.reactors.concurrent.Frame.isolateAndProcessBatch(Frame.scala:210)
    at io.reactors.concurrent.Frame.executeBatch(Frame.scala:183)
    at io.reactors.JvmScheduler$ReactorForkJoinWorkerThread.executeNow(JvmScheduler.scala:126)
    at io.reactors.JvmScheduler$ReactorForkJoinWorkerThread.postschedule(JvmScheduler.scala:151)
    at io.reactors.JvmScheduler$Executed.postschedule(JvmScheduler.scala:273)
    at io.reactors.concurrent.Frame.executeBatch(Frame.scala:202)
    at io.reactors.JvmScheduler$Executed$$anon$5.run(JvmScheduler.scala:255)
    at java.base/java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1426)
    at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
    at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
    at java.base/java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
    at java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
    at java.base/java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)

As shown above, reactors completed its first iteration normally, but during the second iteration it printed the error output above and then hung.

Next, I tried running the same 5 benchmarks directly in the VM's terminal emulator. The results were similar: neo4j-analytics failed with the same error, finagle-http ran normally, and reactors froze the terminal emulator during its 8th iteration. From here on, we run only finagle-http, dec-tree, and scala-stm-bench7.

finagle-http and scala-stm-bench7 ran without problems. dec-tree reported errors and warnings during the run, but still produced running times in the end. The errors in iterations 5, 6, 12, and 13 appear to be caused by timeouts.
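The timeout hypothesis fits the long forced-GC pauses: the Spark warning below says the driver heartbeat timeout is 120000 ms, and several of the "GC before operation" pauses approach or exceed that. A small sketch that scans the harness output for GC pauses at or above the timeout (sample lines copied from the log below):

```python
import re

HEARTBEAT_TIMEOUT_MS = 120_000  # Spark's default, as reported in the warning below

GC_RE = re.compile(r"GC before operation: completed in ([\d.]+) ms")

def long_gc_pauses(log: str, threshold_ms: float = HEARTBEAT_TIMEOUT_MS):
    """Return the GC pause durations (ms) that meet or exceed the threshold."""
    return [float(m.group(1)) for m in GC_RE.finditer(log)
            if float(m.group(1)) >= threshold_ms]

sample = """\
GC before operation: completed in 113465.305 ms, heap usage 319.015 MB -> 62.797 MB.
GC before operation: completed in 120201.246 ms, heap usage 168.041 MB -> 61.796 MB.
GC before operation: completed in 55175.155 ms, heap usage 69.503 MB -> 34.212 MB.
"""

print(long_gc_pauses(sample))  # [120201.246]
```

If the timeouts are the cause, raising them when launching the harness might help, e.g. `java -Dspark.network.timeout=600s -jar renaissance-gpl-0.13.0.jar -r 15 dec-tree` (SparkConf picks up `spark.*` system properties by default), though this is untested on this system.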

[root@openEuler-RISCV-rare renaissance]# java -jar renaissance-gpl-0.13.0.jar --csv csvout3 -r 15 finagle-http dec-tree scala-stm-bench7
# finagle-http results omitted
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
NOTE: 'dec-tree' benchmark uses Spark local executor with 4 (out of 4) threads.
====== dec-tree (apache-spark) [default], iteration 0 started ======
GC before operation: completed in 55175.155 ms, heap usage 69.503 MB -> 34.212 MB.
====== dec-tree (apache-spark) [default], iteration 0 completed (409482.279 ms) ======
====== dec-tree (apache-spark) [default], iteration 1 started ======
GC before operation: completed in 78464.802 ms, heap usage 109.751 MB -> 63.896 MB.
====== dec-tree (apache-spark) [default], iteration 1 completed (77799.202 ms) ======
====== dec-tree (apache-spark) [default], iteration 2 started ======
GC before operation: completed in 78717.050 ms, heap usage 99.018 MB -> 64.137 MB.
====== dec-tree (apache-spark) [default], iteration 2 completed (83360.592 ms) ======
====== dec-tree (apache-spark) [default], iteration 3 started ======
GC before operation: completed in 87624.017 ms, heap usage 143.410 MB -> 63.892 MB.
====== dec-tree (apache-spark) [default], iteration 3 completed (53809.056 ms) ======
====== dec-tree (apache-spark) [default], iteration 4 started ======
GC before operation: completed in 91045.691 ms, heap usage 205.630 MB -> 63.510 MB.
====== dec-tree (apache-spark) [default], iteration 4 completed (49684.374 ms) ======
====== dec-tree (apache-spark) [default], iteration 5 started ======
GC before operation: completed in 113465.305 ms, heap usage 319.015 MB -> 62.797 MB.
22/01/29 09:42:07 WARN HeartbeatReceiver: Removing executor driver with no recent heartbeats: 123135 ms exceeds timeout 120000 ms
22/01/29 09:42:09 WARN SparkContext: Killing executors is not supported by current scheduler.
====== dec-tree (apache-spark) [default], iteration 5 completed (61376.356 ms) ======
====== dec-tree (apache-spark) [default], iteration 6 started ======
GC before operation: completed in 120201.246 ms, heap usage 168.041 MB -> 61.796 MB.
22/01/29 09:45:13 WARN NettyRpcEnv: Ignored message: 0
22/01/29 09:45:14 ERROR BlockManagerMasterEndpoint: Fail to know the executor driver is alive or not.
org.apache.spark.SparkException: Exception thrown in awaitResult:
    at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301)
    at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
    at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:101)
    at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:109)
    at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36)
    at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:112)
    at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:111)
    at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$handleBlockRemovalFailure$1.applyOrElse(BlockManagerMasterEndpoint.scala:226)
    at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$handleBlockRemovalFailure$1.applyOrElse(BlockManagerMasterEndpoint.scala:217)
    at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:38)
    at scala.util.Failure.recover(Try.scala:234)
    at scala.concurrent.Future.$anonfun$recover$1(Future.scala:395)
    at scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:33)
    at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:41245
    at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148)
    at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144)
    at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307)
    at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41)
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
    at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99)
    at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138)
    at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72)
    at scala.concurrent.impl.Promise$DefaultPromise.dispatchOrAddCallback(Promise.scala:316)
    at scala.concurrent.impl.Promise$DefaultPromise.onComplete(Promise.scala:307)
    at scala.concurrent.impl.Promise.transformWith(Promise.scala:40)
    at scala.concurrent.impl.Promise.transformWith$(Promise.scala:38)
    at scala.concurrent.impl.Promise$DefaultPromise.transformWith(Promise.scala:187)
    at scala.concurrent.Future.flatMap(Future.scala:306)
    at scala.concurrent.Future.flatMap$(Future.scala:306)
    at scala.concurrent.impl.Promise$DefaultPromise.flatMap(Promise.scala:187)
    at org.apache.spark.rpc.netty.NettyRpcEnv.asyncSetupEndpointRefByURI(NettyRpcEnv.scala:150)
    ... 16 more
22/01/29 09:45:15 WARN BlockManagerMasterEndpoint: Error trying to remove broadcast 78. The executor driver may have been lost.
org.apache.spark.rpc.RpcTimeoutException: Cannot receive any reply from localhost:41245 in 120 seconds. This timeout is controlled by spark.rpc.askTimeout
    at org.apache.spark.rpc.RpcTimeout.org$apache$spark$rpc$RpcTimeout$$createRpcTimeoutException(RpcTimeout.scala:47)
    at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:62)
    at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:58)
    at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:38)
    at scala.util.Failure.recover(Try.scala:234)
    at scala.concurrent.Future.$anonfun$recover$1(Future.scala:395)
    at scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:33)
    at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
    at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99)
    at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138)
    at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72)
    at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288)
    at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288)
    at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288)
    at scala.concurrent.Promise.complete(Promise.scala:53)
    at scala.concurrent.Promise.complete$(Promise.scala:52)
    at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187)
    at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
    at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67)
    at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82)
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85)
    at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59)
    at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875)
    at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110)
    at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107)
    at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873)
    at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72)
    at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288)
    at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288)
    at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288)
    at scala.concurrent.Promise.tryFailure(Promise.scala:112)
    at scala.concurrent.Promise.tryFailure$(Promise.scala:112)
    at scala.concurrent.impl.Promise$DefaultPromise.tryFailure(Promise.scala:187)
    at org.apache.spark.rpc.netty.NettyRpcEnv.org$apache$spark$rpc$netty$NettyRpcEnv$$onFailure$1(NettyRpcEnv.scala:214)
    at org.apache.spark.rpc.netty.NettyRpcEnv$$anon$1.run(NettyRpcEnv.scala:264)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
    at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: java.util.concurrent.TimeoutException: Cannot receive any reply from localhost:41245 in 120 seconds
    at org.apache.spark.rpc.netty.NettyRpcEnv$$anon$1.run(NettyRpcEnv.scala:265)
    ... 6 more
====== dec-tree (apache-spark) [default], iteration 6 completed (42183.276 ms) ======
====== dec-tree (apache-spark) [default], iteration 7 started ======
GC before operation: completed in 125113.721 ms, heap usage 281.963 MB -> 61.997 MB.
====== dec-tree (apache-spark) [default], iteration 7 completed (45289.724 ms) ======
====== dec-tree (apache-spark) [default], iteration 8 started ======
GC before operation: completed in 112248.914 ms, heap usage 215.085 MB -> 61.933 MB.
====== dec-tree (apache-spark) [default], iteration 8 completed (184315.847 ms) ======
====== dec-tree (apache-spark) [default], iteration 9 started ======
GC before operation: completed in 3107.822 ms, heap usage 264.748 MB -> 62.319 MB.
====== dec-tree (apache-spark) [default], iteration 9 completed (42934.935 ms) ======
====== dec-tree (apache-spark) [default], iteration 10 started ======
GC before operation: completed in 135746.759 ms, heap usage 191.794 MB -> 62.342 MB.
====== dec-tree (apache-spark) [default], iteration 10 completed (45550.167 ms) ======
====== dec-tree (apache-spark) [default], iteration 11 started ======
GC before operation: completed in 125547.534 ms, heap usage 192.270 MB -> 62.246 MB.
====== dec-tree (apache-spark) [default], iteration 11 completed (42059.435 ms) ======
====== dec-tree (apache-spark) [default], iteration 12 started ======
GC before operation: completed in 143698.553 ms, heap usage 223.560 MB -> 62.339 MB.
22/01/29 10:02:55 WARN NettyRpcEnv: Ignored failure: java.util.concurrent.TimeoutException: Cannot receive any reply from localhost:41245 in 120 seconds
22/01/29 10:02:57 WARN Executor: Issue communicating with driver in heartbeater
java.lang.NullPointerException
    at org.apache.spark.storage.memory.MemoryStore.getSize(MemoryStore.scala:131)
    at org.apache.spark.storage.BlockManager.org$apache$spark$storage$BlockManager$$getCurrentBlockStatus(BlockManager.scala:815)
    at org.apache.spark.storage.BlockManager.$anonfun$reportAllBlocks$3(BlockManager.scala:571)
    at org.apache.spark.storage.BlockManager.$anonfun$reportAllBlocks$3$adapted(BlockManager.scala:570)
    at scala.collection.Iterator.foreach(Iterator.scala:943)
    at scala.collection.Iterator.foreach$(Iterator.scala:943)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
    at org.apache.spark.storage.BlockManager.reportAllBlocks(BlockManager.scala:570)
    at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:590)
    at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1000)
    at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:212)
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
    at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1996)
    at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
    at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305)
    at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:829)
====== dec-tree (apache-spark) [default], iteration 12 completed (41860.986 ms) ======
====== dec-tree (apache-spark) [default], iteration 13 started ======
GC before operation: completed in 145228.106 ms, heap usage 257.776 MB -> 62.131 MB.
22/01/29 10:06:05 WARN NettyRpcEnv: Ignored failure: java.util.concurrent.TimeoutException: Cannot receive any reply from localhost:41245 in 120 seconds
22/01/29 10:06:06 WARN NettyRpcEnv: Ignored message: 0
22/01/29 10:06:07 ERROR BlockManagerMasterEndpoint: Fail to know the executor driver is alive or not.
org.apache.spark.SparkException: Exception thrown in awaitResult:
    at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301)
    at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
    at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:101)
    at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:109)
    at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36)
    at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:112)
    at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:111)
    at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$handleBlockRemovalFailure$1.applyOrElse(BlockManagerMasterEndpoint.scala:226)
    at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$handleBlockRemovalFailure$1.applyOrElse(BlockManagerMasterEndpoint.scala:217)
    at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:38)
    at scala.util.Failure.recover(Try.scala:234)
    at scala.concurrent.Future.$anonfun$recover$1(Future.scala:395)
    at scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:33)
    at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:41245
    at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148)
    at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144)
    at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307)
    at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41)
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
    at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99)
    at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138)
    at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72)
    at scala.concurrent.impl.Promise$DefaultPromise.dispatchOrAddCallback(Promise.scala:316)
    at scala.concurrent.impl.Promise$DefaultPromise.onComplete(Promise.scala:307)
    at scala.concurrent.impl.Promise.transformWith(Promise.scala:40)
    at scala.concurrent.impl.Promise.transformWith$(Promise.scala:38)
    at scala.concurrent.impl.Promise$DefaultPromise.transformWith(Promise.scala:187)
    at scala.concurrent.Future.flatMap(Future.scala:306)
    at scala.concurrent.Future.flatMap$(Future.scala:306)
    at scala.concurrent.impl.Promise$DefaultPromise.flatMap(Promise.scala:187)
    at org.apache.spark.rpc.netty.NettyRpcEnv.asyncSetupEndpointRefByURI(NettyRpcEnv.scala:150)
    ... 16 more
22/01/29 10:06:07 WARN BlockManagerMasterEndpoint: Error trying to remove broadcast 176. The executor driver may have been lost.
org.apache.spark.rpc.RpcTimeoutException: Cannot receive any reply from localhost:41245 in 120 seconds. This timeout is controlled by spark.rpc.askTimeout
    at org.apache.spark.rpc.RpcTimeout.org$apache$spark$rpc$RpcTimeout$$createRpcTimeoutException(RpcTimeout.scala:47)
    at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:62)
    at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:58)
    at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:38)
    at scala.util.Failure.recover(Try.scala:234)
    at scala.concurrent.Future.$anonfun$recover$1(Future.scala:395)
    at scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:33)
    at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
    at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99)
    at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138)
    at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72)
    at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288)
    at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288)
    at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288)
    at scala.concurrent.Promise.complete(Promise.scala:53)
    at scala.concurrent.Promise.complete$(Promise.scala:52)
    at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187)
    at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
    at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67)
    at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82)
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85)
    at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59)
    at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875)
    at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110)
    at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107)
    at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873)
    at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72)
    at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288)
    at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288)
    at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288)
    at scala.concurrent.Promise.tryFailure(Promise.scala:112)
    at scala.concurrent.Promise.tryFailure$(Promise.scala:112)
    at scala.concurrent.impl.Promise$DefaultPromise.tryFailure(Promise.scala:187)
    at org.apache.spark.rpc.netty.NettyRpcEnv.org$apache$spark$rpc$netty$NettyRpcEnv$$onFailure$1(NettyRpcEnv.scala:214)
    at org.apache.spark.rpc.netty.NettyRpcEnv$$anon$1.run(NettyRpcEnv.scala:264)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
    at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: java.util.concurrent.TimeoutException: Cannot receive any reply from localhost:41245 in 120 seconds
    at org.apache.spark.rpc.netty.NettyRpcEnv$$anon$1.run(NettyRpcEnv.scala:265)
    ... 6 more
====== dec-tree (apache-spark) [default], iteration 13 completed (42430.799 ms) ======
====== dec-tree (apache-spark) [default], iteration 14 started ======
GC before operation: completed in 152010.117 ms, heap usage 297.578 MB -> 62.065 MB.
====== dec-tree (apache-spark) [default], iteration 14 completed (51648.828 ms) ======
# scala-stm-bench7 output omitted

The contents of csvout3 are as follows:

benchmark,duration_ns,uptime_ns,vm_start_unix_ms
dec-tree,409482279408,5302420258992,1643442888954
dec-tree,77799201948,5793520476078,1643442888954
dec-tree,83360591644,5952525426729,1643442888954
dec-tree,53809056349,6125902578513,1643442888954
dec-tree,49684373800,6271881139911,1643442888954
dec-tree,61376355807,6438784412132,1643442888954
dec-tree,42183276370,6624551409678,1643442888954
dec-tree,45289723653,6795494027799,1643442888954
dec-tree,184315847467,6955752583835,1643442888954
dec-tree,42934934923,7143218338184,1643442888954
dec-tree,45550166955,7325736280125,1643442888954
dec-tree,42059434806,7497701187979,1643442888954
dec-tree,41860986475,7686400401671,1643442888954
dec-tree,42430799068,7876846495664,1643442888954
dec-tree,51648827562,8073393618551,1643442888954
scala-stm-bench7,84045943064,8278249946759,1643442888954
scala-stm-bench7,61004438539,8497332613809,1643442888954
scala-stm-bench7,50443275432,8562288287157,1643442888954
scala-stm-bench7,56653354028,8760676701426,1643442888954
scala-stm-bench7,62060466180,8967348240019,1643442888954
scala-stm-bench7,65077299768,9184095848706,1643442888954
scala-stm-bench7,62325670802,9253900503938,1643442888954
scala-stm-bench7,56375477376,9319202044866,1643442888954
scala-stm-bench7,58233684277,9380354600082,1643442888954
scala-stm-bench7,58998584004,9441318886928,1643442888954
scala-stm-bench7,61521815176,9504725014997,1643442888954
scala-stm-bench7,65871952165,9715877079875,1643442888954
scala-stm-bench7,58679136355,9786790458066,1643442888954
scala-stm-bench7,59700009606,9854304541629,1643442888954
scala-stm-bench7,57509269275,9917196364010,1643442888954
finagle-http,468582002762,57246411594,1643442888954
finagle-http,358378299233,562437630066,1643442888954
finagle-http,318604098547,958098092343,1643442888954
finagle-http,328926803359,1315357393567,1643442888954
finagle-http,340027462090,1649572478874,1643442888954
finagle-http,348000915868,2027627346863,1643442888954
finagle-http,310347161900,2377241834702,1643442888954
finagle-http,315127047363,2692101587962,1643442888954
finagle-http,311213339093,3014495815330,1643442888954
finagle-http,333516775407,3332898767730,1643442888954
finagle-http,329654612343,3667994562677,1643442888954
finagle-http,282799927582,3999413023235,1643442888954
finagle-http,284077381079,4283792472506,1643442888954
finagle-http,268215338054,4570352988810,1643442888954
finagle-http,269907502433,4840554111262,1643442888954
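A per-benchmark average is easy to pull out of such a CSV. The sketch below uses a two-row sample file in place of the real csvout3; the file name and values are purely illustrative:

```shell
# Average per-benchmark duration (duration_ns -> ms) from a Renaissance --csv file.
# A two-row sample stands in for the real csvout3 produced above.
cat > sample.csv <<'EOF'
benchmark,duration_ns,uptime_ns,vm_start_unix_ms
dec-tree,42183276370,1,1
dec-tree,45289723653,2,1
EOF
awk -F, 'NR > 1 { sum[$1] += $2; cnt[$1]++ }
         END { for (b in sum) printf "%s: %.1f ms over %d iterations\n", b, sum[b] / cnt[b] / 1e6, cnt[b] }' sample.csv
```

Pointing the same awk command at csvout3 gives the mean over all 15 iterations of each benchmark.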
BigBrotherJu commented 2 years ago

Continuing with the remaining 5 benchmarks: naive-bayes and als emit some warnings due to insufficient memory and timeouts, while the other three benchmarks run normally.
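The memory and timeout warnings from the Spark-based benchmarks could plausibly be reduced by enlarging the JVM heap and relaxing Spark's timeouts; Spark picks up `spark.*`-prefixed JVM system properties through SparkConf. A sketch of such an invocation (all flag values are illustrative guesses, not tested here):

```shell
# Illustrative only: larger heap plus relaxed Spark timeouts for the
# memory-hungry Spark benchmarks. Values are guesses for a small board.
java -Xmx3g \
     -Dspark.rpc.askTimeout=600s \
     -Dspark.executor.heartbeatInterval=60s \
     -jar renaissance-gpl-0.13.0.jar -r 15 naive-bayes als
```

Note that spark.executor.heartbeatInterval should stay well below spark.network.timeout (120 s by default), which is why 60 s rather than something larger is used above.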

[root@openEuler-RISCV-rare renaissance]# java -jar renaissance-gpl-0.13.0.jar --csv csvout4 -r 15 naive-bayes als par-mnemonics scala-kmeans philosophers
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
NOTE: 'naive-bayes' benchmark uses Spark local executor with 4 (out of 4) threads.
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.spark.util.SizeEstimator$ (file:/root/local/java-test/renaissance/harness-064746-3962280706498894762/apache-spark/lib/spark-core_2.12-3.1.2.jar) to field java.net.URI.scheme
WARNING: Please consider reporting this to the maintainers of org.apache.spark.util.SizeEstimator$
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
====== naive-bayes (apache-spark) [default], iteration 0 started ======
GC before operation: completed in 466.381 ms, heap usage 52.464 MB -> 27.033 MB.
22/01/30 06:58:05 WARN HeartbeatReceiver: Removing executor driver with no recent heartbeats: 135347 ms exceeds timeout 120000 ms
22/01/30 06:58:47 WARN MemoryStore: Not enough space to cache rdd_3_2 in memory! (computed 132.0 MiB so far)
22/01/30 06:58:46 WARN MemoryStore: Not enough space to cache rdd_3_1 in memory! (computed 132.0 MiB so far)
22/01/30 06:58:46 WARN MemoryStore: Not enough space to cache rdd_3_0 in memory! (computed 132.0 MiB so far)
22/01/30 06:58:47 WARN SparkContext: Killing executors is not supported by current scheduler.
22/01/30 06:58:47 WARN Executor: Issue communicating with driver in heartbeater
org.apache.spark.rpc.RpcTimeoutException: Futures timed out after [10000 milliseconds]. This timeout is controlled by spark.executor.heartbeatInterval
    at org.apache.spark.rpc.RpcTimeout.org$apache$spark$rpc$RpcTimeout$$createRpcTimeoutException(RpcTimeout.scala:47)
    at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:62)
    at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:58)
    at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:38)
    at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:76)
    at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:103)
    at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:996)
    at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:212)
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
    at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1996)
    at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
    at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305)
    at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: java.util.concurrent.TimeoutException: Futures timed out after [10000 milliseconds]
    at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:259)
    at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:263)
    at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:293)
    at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
    ... 12 more
22/01/30 06:58:47 WARN NettyRpcEnv: Ignored message: HeartbeatResponse(true)
22/01/30 06:58:47 WARN BlockManager: Block rdd_3_0 could not be removed as it was not found on disk or in memory
22/01/30 06:58:47 WARN BlockManager: Block rdd_3_2 could not be removed as it was not found on disk or in memory
22/01/30 06:58:47 WARN BlockManager: Block rdd_3_1 could not be removed as it was not found on disk or in memory
22/01/30 06:58:51 WARN BlockManager: Putting block rdd_3_2 failed
22/01/30 06:58:51 WARN BlockManager: Putting block rdd_3_1 failed
22/01/30 06:58:51 WARN BlockManager: Putting block rdd_3_0 failed
22/01/30 06:59:28 WARN MemoryStore: Not enough space to cache rdd_3_3 in memory! (computed 132.0 MiB so far)
22/01/30 06:59:28 WARN BlockManager: Block rdd_3_3 could not be removed as it was not found on disk or in memory
22/01/30 06:59:28 WARN BlockManager: Putting block rdd_3_3 failed
22/01/30 07:08:10 WARN MemoryStore: Not enough space to cache rdd_3_5 in memory! (computed 132.0 MiB so far)
22/01/30 07:08:10 WARN BlockManager: Block rdd_3_5 could not be removed as it was not found on disk or in memory
22/01/30 07:08:10 WARN BlockManager: Putting block rdd_3_5 failed
====== naive-bayes (apache-spark) [default], iteration 0 completed (1248941.855 ms) ======
====== naive-bayes (apache-spark) [default], iteration 1 started ======
GC before operation: completed in 53347.997 ms, heap usage 733.296 MB -> 299.390 MB.
22/01/30 07:14:11 WARN MemoryStore: Not enough space to cache rdd_3_0 in memory! (computed 68.0 MiB so far)
22/01/30 07:14:11 WARN BlockManager: Block rdd_3_0 could not be removed as it was not found on disk or in memory
22/01/30 07:14:12 WARN BlockManager: Putting block rdd_3_0 failed
22/01/30 07:14:58 WARN MemoryStore: Not enough space to cache rdd_3_2 in memory! (computed 68.0 MiB so far)
22/01/30 07:14:59 WARN BlockManager: Block rdd_3_2 could not be removed as it was not found on disk or in memory
22/01/30 07:14:59 WARN BlockManager: Putting block rdd_3_2 failed
22/01/30 07:16:03 WARN MemoryStore: Not enough space to cache rdd_3_3 in memory! (computed 68.0 MiB so far)
22/01/30 07:16:03 WARN BlockManager: Block rdd_3_3 could not be removed as it was not found on disk or in memory
22/01/30 07:16:03 WARN BlockManager: Putting block rdd_3_3 failed
22/01/30 07:16:26 WARN MemoryStore: Not enough space to cache rdd_3_1 in memory! (computed 132.0 MiB so far)
22/01/30 07:16:26 WARN BlockManager: Block rdd_3_1 could not be removed as it was not found on disk or in memory
22/01/30 07:16:26 WARN BlockManager: Putting block rdd_3_1 failed
22/01/30 07:22:17 WARN MemoryStore: Not enough space to cache rdd_3_5 in memory! (computed 132.0 MiB so far)
22/01/30 07:22:17 WARN BlockManager: Block rdd_3_5 could not be removed as it was not found on disk or in memory
22/01/30 07:22:17 WARN BlockManager: Putting block rdd_3_5 failed
====== naive-bayes (apache-spark) [default], iteration 1 completed (690744.734 ms) ======
====== naive-bayes (apache-spark) [default], iteration 2 started ======
GC before operation: completed in 62629.916 ms, heap usage 706.126 MB -> 298.374 MB.
22/01/30 07:27:11 WARN MemoryStore: Not enough space to cache rdd_3_1 in memory! (computed 68.0 MiB so far)
22/01/30 07:27:11 WARN BlockManager: Block rdd_3_1 could not be removed as it was not found on disk or in memory
22/01/30 07:27:11 WARN BlockManager: Putting block rdd_3_1 failed
22/01/30 07:28:08 WARN MemoryStore: Not enough space to cache rdd_3_2 in memory! (computed 68.0 MiB so far)
22/01/30 07:28:08 WARN BlockManager: Block rdd_3_2 could not be removed as it was not found on disk or in memory
22/01/30 07:28:08 WARN BlockManager: Putting block rdd_3_2 failed
22/01/30 07:29:29 WARN MemoryStore: Not enough space to cache rdd_3_3 in memory! (computed 68.0 MiB so far)
22/01/30 07:29:29 WARN BlockManager: Block rdd_3_3 could not be removed as it was not found on disk or in memory
22/01/30 07:29:29 WARN BlockManager: Putting block rdd_3_3 failed
22/01/30 07:29:45 WARN MemoryStore: Not enough space to cache rdd_3_0 in memory! (computed 132.0 MiB so far)
22/01/30 07:29:45 WARN BlockManager: Block rdd_3_0 could not be removed as it was not found on disk or in memory
22/01/30 07:29:45 WARN BlockManager: Putting block rdd_3_0 failed
22/01/30 07:35:55 WARN MemoryStore: Not enough space to cache rdd_3_5 in memory! (computed 132.0 MiB so far)
22/01/30 07:35:55 WARN BlockManager: Block rdd_3_5 could not be removed as it was not found on disk or in memory
22/01/30 07:35:55 WARN BlockManager: Putting block rdd_3_5 failed
====== naive-bayes (apache-spark) [default], iteration 2 completed (743265.638 ms) ======
====== naive-bayes (apache-spark) [default], iteration 3 started ======
GC before operation: completed in 68270.950 ms, heap usage 597.632 MB -> 298.462 MB.
22/01/30 07:40:43 WARN MemoryStore: Not enough space to cache rdd_3_0 in memory! (computed 68.0 MiB so far)
22/01/30 07:40:43 WARN BlockManager: Block rdd_3_0 could not be removed as it was not found on disk or in memory
22/01/30 07:40:43 WARN BlockManager: Putting block rdd_3_0 failed
22/01/30 07:41:45 WARN MemoryStore: Not enough space to cache rdd_3_2 in memory! (computed 68.0 MiB so far)
22/01/30 07:41:45 WARN BlockManager: Block rdd_3_2 could not be removed as it was not found on disk or in memory
22/01/30 07:41:45 WARN BlockManager: Putting block rdd_3_2 failed
22/01/30 07:43:11 WARN MemoryStore: Not enough space to cache rdd_3_3 in memory! (computed 68.0 MiB so far)
22/01/30 07:43:11 WARN BlockManager: Block rdd_3_3 could not be removed as it was not found on disk or in memory
22/01/30 07:43:11 WARN BlockManager: Putting block rdd_3_3 failed
22/01/30 07:43:35 WARN MemoryStore: Not enough space to cache rdd_3_1 in memory! (computed 132.0 MiB so far)
22/01/30 07:43:35 WARN BlockManager: Block rdd_3_1 could not be removed as it was not found on disk or in memory
22/01/30 07:43:35 WARN BlockManager: Putting block rdd_3_1 failed
22/01/30 07:50:29 WARN MemoryStore: Not enough space to cache rdd_3_5 in memory! (computed 132.0 MiB so far)
22/01/30 07:50:29 WARN BlockManager: Block rdd_3_5 could not be removed as it was not found on disk or in memory
22/01/30 07:50:29 WARN BlockManager: Putting block rdd_3_5 failed
====== naive-bayes (apache-spark) [default], iteration 3 completed (810962.600 ms) ======
====== naive-bayes (apache-spark) [default], iteration 4 started ======
GC before operation: completed in 38567.672 ms, heap usage 361.209 MB -> 298.642 MB.
22/01/30 07:54:40 WARN MemoryStore: Not enough space to cache rdd_3_1 in memory! (computed 68.0 MiB so far)
22/01/30 07:54:40 WARN BlockManager: Block rdd_3_1 could not be removed as it was not found on disk or in memory
22/01/30 07:54:40 WARN BlockManager: Putting block rdd_3_1 failed
22/01/30 07:55:49 WARN MemoryStore: Not enough space to cache rdd_3_2 in memory! (computed 68.0 MiB so far)
22/01/30 07:55:49 WARN BlockManager: Block rdd_3_2 could not be removed as it was not found on disk or in memory
22/01/30 07:55:49 WARN BlockManager: Putting block rdd_3_2 failed
22/01/30 07:57:24 WARN MemoryStore: Not enough space to cache rdd_3_3 in memory! (computed 68.0 MiB so far)
22/01/30 07:57:24 WARN BlockManager: Block rdd_3_3 could not be removed as it was not found on disk or in memory
22/01/30 07:57:24 WARN BlockManager: Putting block rdd_3_3 failed
22/01/30 07:57:25 WARN MemoryStore: Not enough space to cache rdd_3_0 in memory! (computed 132.0 MiB so far)
22/01/30 07:57:25 WARN BlockManager: Block rdd_3_0 could not be removed as it was not found on disk or in memory
22/01/30 07:57:25 WARN BlockManager: Putting block rdd_3_0 failed
22/01/30 08:02:55 WARN MemoryStore: Not enough space to cache rdd_3_5 in memory! (computed 132.0 MiB so far)
22/01/30 08:02:55 WARN BlockManager: Block rdd_3_5 could not be removed as it was not found on disk or in memory
22/01/30 08:02:55 WARN BlockManager: Putting block rdd_3_5 failed
====== naive-bayes (apache-spark) [default], iteration 4 completed (711482.070 ms) ======
====== naive-bayes (apache-spark) [default], iteration 5 started ======
GC before operation: completed in 74351.185 ms, heap usage 882.892 MB -> 298.783 MB.
22/01/30 08:05:26 WARN Executor: Issue communicating with driver in heartbeater
java.lang.NullPointerException
    at org.apache.spark.storage.memory.MemoryStore.getSize(MemoryStore.scala:131)
    at org.apache.spark.storage.BlockManager.org$apache$spark$storage$BlockManager$$getCurrentBlockStatus(BlockManager.scala:815)
    at org.apache.spark.storage.BlockManager.$anonfun$reportAllBlocks$3(BlockManager.scala:571)
    at org.apache.spark.storage.BlockManager.$anonfun$reportAllBlocks$3$adapted(BlockManager.scala:570)
    at scala.collection.Iterator.foreach(Iterator.scala:943)
    at scala.collection.Iterator.foreach$(Iterator.scala:943)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
    at org.apache.spark.storage.BlockManager.reportAllBlocks(BlockManager.scala:570)
    at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:590)
    at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1000)
    at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:212)
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
    at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1996)
    at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
    at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305)
    at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:829)
22/01/30 08:07:35 WARN MemoryStore: Not enough space to cache rdd_3_1 in memory! (computed 68.0 MiB so far)
22/01/30 08:07:35 WARN BlockManager: Block rdd_3_1 could not be removed as it was not found on disk or in memory
22/01/30 08:07:35 WARN BlockManager: Putting block rdd_3_1 failed
22/01/30 08:08:12 WARN MemoryStore: Not enough space to cache rdd_3_2 in memory! (computed 68.0 MiB so far)
22/01/30 08:08:14 WARN BlockManager: Block rdd_3_2 could not be removed as it was not found on disk or in memory
22/01/30 08:08:14 WARN BlockManager: Putting block rdd_3_2 failed
22/01/30 08:09:38 WARN MemoryStore: Not enough space to cache rdd_3_3 in memory! (computed 68.0 MiB so far)
22/01/30 08:09:41 WARN BlockManager: Block rdd_3_3 could not be removed as it was not found on disk or in memory
22/01/30 08:09:41 WARN BlockManager: Putting block rdd_3_3 failed
22/01/30 08:10:07 WARN MemoryStore: Not enough space to cache rdd_3_0 in memory! (computed 132.0 MiB so far)
22/01/30 08:10:07 WARN BlockManager: Block rdd_3_0 could not be removed as it was not found on disk or in memory
22/01/30 08:10:07 WARN BlockManager: Putting block rdd_3_0 failed
22/01/30 08:17:30 WARN MemoryStore: Not enough space to cache rdd_3_5 in memory! (computed 132.0 MiB so far)
22/01/30 08:17:30 WARN BlockManager: Block rdd_3_5 could not be removed as it was not found on disk or in memory
22/01/30 08:17:30 WARN BlockManager: Putting block rdd_3_5 failed
====== naive-bayes (apache-spark) [default], iteration 5 completed (804328.721 ms) ======
====== naive-bayes (apache-spark) [default], iteration 6 started ======
GC before operation: completed in 78564.825 ms, heap usage 557.454 MB -> 299.367 MB.
22/01/30 08:21:52 WARN MemoryStore: Not enough space to cache rdd_3_0 in memory! (computed 68.0 MiB so far)
22/01/30 08:21:52 WARN BlockManager: Block rdd_3_0 could not be removed as it was not found on disk or in memory
22/01/30 08:21:52 WARN BlockManager: Putting block rdd_3_0 failed
22/01/30 08:22:28 WARN MemoryStore: Not enough space to cache rdd_3_2 in memory! (computed 68.0 MiB so far)
22/01/30 08:22:28 WARN BlockManager: Block rdd_3_2 could not be removed as it was not found on disk or in memory
22/01/30 08:22:28 WARN BlockManager: Putting block rdd_3_2 failed
22/01/30 08:23:52 WARN MemoryStore: Not enough space to cache rdd_3_3 in memory! (computed 68.0 MiB so far)
22/01/30 08:23:52 WARN BlockManager: Block rdd_3_3 could not be removed as it was not found on disk or in memory
22/01/30 08:23:52 WARN BlockManager: Putting block rdd_3_3 failed
22/01/30 08:24:00 WARN MemoryStore: Not enough space to cache rdd_3_1 in memory! (computed 132.0 MiB so far)
22/01/30 08:24:00 WARN BlockManager: Block rdd_3_1 could not be removed as it was not found on disk or in memory
22/01/30 08:24:00 WARN BlockManager: Putting block rdd_3_1 failed
22/01/30 08:31:08 WARN MemoryStore: Not enough space to cache rdd_3_5 in memory! (computed 132.0 MiB so far)
22/01/30 08:31:08 WARN BlockManager: Block rdd_3_5 could not be removed as it was not found on disk or in memory
22/01/30 08:31:08 WARN BlockManager: Putting block rdd_3_5 failed
====== naive-bayes (apache-spark) [default], iteration 6 completed (743938.899 ms) ======
====== naive-bayes (apache-spark) [default], iteration 7 started ======
GC before operation: completed in 81584.209 ms, heap usage 773.066 MB -> 298.988 MB.
22/01/30 08:36:11 WARN MemoryStore: Not enough space to cache rdd_3_0 in memory! (computed 68.0 MiB so far)
22/01/30 08:36:11 WARN BlockManager: Block rdd_3_0 could not be removed as it was not found on disk or in memory
22/01/30 08:36:11 WARN BlockManager: Putting block rdd_3_0 failed
22/01/30 08:36:53 WARN MemoryStore: Not enough space to cache rdd_3_2 in memory! (computed 68.0 MiB so far)
22/01/30 08:36:53 WARN BlockManager: Block rdd_3_2 could not be removed as it was not found on disk or in memory
22/01/30 08:36:53 WARN BlockManager: Putting block rdd_3_2 failed
22/01/30 08:38:19 WARN MemoryStore: Not enough space to cache rdd_3_3 in memory! (computed 68.0 MiB so far)
22/01/30 08:38:19 WARN BlockManager: Block rdd_3_3 could not be removed as it was not found on disk or in memory
22/01/30 08:38:19 WARN BlockManager: Putting block rdd_3_3 failed
22/01/30 08:38:31 WARN MemoryStore: Not enough space to cache rdd_3_1 in memory! (computed 132.0 MiB so far)
22/01/30 08:38:31 WARN BlockManager: Block rdd_3_1 could not be removed as it was not found on disk or in memory
22/01/30 08:38:31 WARN BlockManager: Putting block rdd_3_1 failed
22/01/30 08:45:50 WARN MemoryStore: Not enough space to cache rdd_3_5 in memory! (computed 132.0 MiB so far)
22/01/30 08:45:50 WARN BlockManager: Block rdd_3_5 could not be removed as it was not found on disk or in memory
22/01/30 08:45:50 WARN BlockManager: Putting block rdd_3_5 failed
====== naive-bayes (apache-spark) [default], iteration 7 completed (804715.298 ms) ======
====== naive-bayes (apache-spark) [default], iteration 8 started ======
GC before operation: completed in 86632.017 ms, heap usage 529.783 MB -> 299.083 MB.
22/01/30 08:48:48 WARN Executor: Issue communicating with driver in heartbeater
java.lang.NullPointerException
    at org.apache.spark.storage.memory.MemoryStore.getSize(MemoryStore.scala:131)
    at org.apache.spark.storage.BlockManager.org$apache$spark$storage$BlockManager$$getCurrentBlockStatus(BlockManager.scala:815)
    at org.apache.spark.storage.BlockManager.$anonfun$reportAllBlocks$3(BlockManager.scala:571)
    at org.apache.spark.storage.BlockManager.$anonfun$reportAllBlocks$3$adapted(BlockManager.scala:570)
    at scala.collection.Iterator.foreach(Iterator.scala:943)
    at scala.collection.Iterator.foreach$(Iterator.scala:943)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
    at org.apache.spark.storage.BlockManager.reportAllBlocks(BlockManager.scala:570)
    at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:590)
    at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1000)
    at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:212)
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
    at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1996)
    at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
    at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305)
    at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:829)
22/01/30 08:50:52 WARN MemoryStore: Not enough space to cache rdd_3_1 in memory! (computed 68.0 MiB so far)
22/01/30 08:50:52 WARN BlockManager: Block rdd_3_1 could not be removed as it was not found on disk or in memory
22/01/30 08:50:52 WARN BlockManager: Putting block rdd_3_1 failed
22/01/30 08:51:36 WARN MemoryStore: Not enough space to cache rdd_3_2 in memory! (computed 68.0 MiB so far)
22/01/30 08:51:36 WARN BlockManager: Block rdd_3_2 could not be removed as it was not found on disk or in memory
22/01/30 08:51:38 WARN BlockManager: Putting block rdd_3_2 failed
22/01/30 08:52:49 WARN MemoryStore: Not enough space to cache rdd_3_3 in memory! (computed 68.0 MiB so far)
22/01/30 08:52:49 WARN BlockManager: Block rdd_3_3 could not be removed as it was not found on disk or in memory
22/01/30 08:52:49 WARN BlockManager: Putting block rdd_3_3 failed
22/01/30 08:53:09 WARN MemoryStore: Not enough space to cache rdd_3_0 in memory! (computed 132.0 MiB so far)
22/01/30 08:53:09 WARN BlockManager: Block rdd_3_0 could not be removed as it was not found on disk or in memory
22/01/30 08:53:09 WARN BlockManager: Putting block rdd_3_0 failed
22/01/30 09:00:50 WARN MemoryStore: Not enough space to cache rdd_3_5 in memory! (computed 132.0 MiB so far)
22/01/30 09:00:50 WARN BlockManager: Block rdd_3_5 could not be removed as it was not found on disk or in memory
22/01/30 09:00:50 WARN BlockManager: Putting block rdd_3_5 failed
====== naive-bayes (apache-spark) [default], iteration 8 completed (811372.398 ms) ======
====== naive-bayes (apache-spark) [default], iteration 9 started ======
GC before operation: completed in 94249.080 ms, heap usage 804.389 MB -> 299.184 MB.
22/01/30 09:06:00 WARN MemoryStore: Not enough space to cache rdd_3_0 in memory! (computed 68.0 MiB so far)
22/01/30 09:06:00 WARN BlockManager: Block rdd_3_0 could not be removed as it was not found on disk or in memory
22/01/30 09:06:00 WARN BlockManager: Putting block rdd_3_0 failed
22/01/30 09:06:45 WARN MemoryStore: Not enough space to cache rdd_3_2 in memory! (computed 68.0 MiB so far)
22/01/30 09:06:45 WARN BlockManager: Block rdd_3_2 could not be removed as it was not found on disk or in memory
22/01/30 09:06:45 WARN BlockManager: Putting block rdd_3_2 failed
22/01/30 09:08:23 WARN MemoryStore: Not enough space to cache rdd_3_3 in memory! (computed 68.0 MiB so far)
22/01/30 09:08:23 WARN BlockManager: Block rdd_3_3 could not be removed as it was not found on disk or in memory
22/01/30 09:08:23 WARN BlockManager: Putting block rdd_3_3 failed
22/01/30 09:08:36 WARN MemoryStore: Not enough space to cache rdd_3_1 in memory! (computed 132.0 MiB so far)
22/01/30 09:08:36 WARN BlockManager: Block rdd_3_1 could not be removed as it was not found on disk or in memory
22/01/30 09:08:36 WARN BlockManager: Putting block rdd_3_1 failed
22/01/30 09:14:42 WARN MemoryStore: Not enough space to cache rdd_3_5 in memory! (computed 132.0 MiB so far)
22/01/30 09:14:42 WARN BlockManager: Block rdd_3_5 could not be removed as it was not found on disk or in memory
22/01/30 09:14:42 WARN BlockManager: Putting block rdd_3_5 failed
====== naive-bayes (apache-spark) [default], iteration 9 completed (730579.803 ms) ======
====== naive-bayes (apache-spark) [default], iteration 10 started ======
GC before operation: completed in 86044.515 ms, heap usage 382.676 MB -> 299.304 MB.
22/01/30 09:19:33 WARN MemoryStore: Not enough space to cache rdd_3_1 in memory! (computed 68.0 MiB so far)
22/01/30 09:19:34 WARN BlockManager: Block rdd_3_1 could not be removed as it was not found on disk or in memory
22/01/30 09:19:34 WARN BlockManager: Putting block rdd_3_1 failed
22/01/30 09:20:32 WARN MemoryStore: Not enough space to cache rdd_3_2 in memory! (computed 68.0 MiB so far)
22/01/30 09:20:34 WARN BlockManager: Block rdd_3_2 could not be removed as it was not found on disk or in memory
22/01/30 09:20:34 WARN BlockManager: Putting block rdd_3_2 failed
22/01/30 09:22:40 WARN MemoryStore: Not enough space to cache rdd_3_0 in memory! (computed 132.0 MiB so far)
22/01/30 09:22:40 WARN BlockManager: Block rdd_3_0 could not be removed as it was not found on disk or in memory
22/01/30 09:22:40 WARN BlockManager: Putting block rdd_3_0 failed
22/01/30 09:22:44 WARN MemoryStore: Not enough space to cache rdd_3_3 in memory! (computed 68.0 MiB so far)
22/01/30 09:22:44 WARN BlockManager: Block rdd_3_3 could not be removed as it was not found on disk or in memory
22/01/30 09:22:44 WARN BlockManager: Putting block rdd_3_3 failed
22/01/30 09:29:24 WARN MemoryStore: Not enough space to cache rdd_3_5 in memory! (computed 132.0 MiB so far)
22/01/30 09:29:24 WARN BlockManager: Block rdd_3_5 could not be removed as it was not found on disk or in memory
22/01/30 09:29:24 WARN BlockManager: Putting block rdd_3_5 failed
====== naive-bayes (apache-spark) [default], iteration 10 completed (796089.710 ms) ======
====== naive-bayes (apache-spark) [default], iteration 11 started ======
GC before operation: completed in 92910.445 ms, heap usage 358.587 MB -> 299.416 MB.
22/01/30 09:34:07 WARN MemoryStore: Not enough space to cache rdd_3_1 in memory! (computed 68.0 MiB so far)
22/01/30 09:34:07 WARN BlockManager: Block rdd_3_1 could not be removed as it was not found on disk or in memory
22/01/30 09:34:08 WARN BlockManager: Putting block rdd_3_1 failed
22/01/30 09:34:57 WARN MemoryStore: Not enough space to cache rdd_3_2 in memory! (computed 68.0 MiB so far)
22/01/30 09:34:57 WARN BlockManager: Block rdd_3_2 could not be removed as it was not found on disk or in memory
22/01/30 09:34:57 WARN BlockManager: Putting block rdd_3_2 failed
22/01/30 09:36:40 WARN MemoryStore: Not enough space to cache rdd_3_3 in memory! (computed 68.0 MiB so far)
22/01/30 09:36:40 WARN BlockManager: Block rdd_3_3 could not be removed as it was not found on disk or in memory
22/01/30 09:36:40 WARN BlockManager: Putting block rdd_3_3 failed
22/01/30 09:37:02 WARN MemoryStore: Not enough space to cache rdd_3_0 in memory! (computed 132.0 MiB so far)
22/01/30 09:37:02 WARN BlockManager: Block rdd_3_0 could not be removed as it was not found on disk or in memory
22/01/30 09:37:02 WARN BlockManager: Putting block rdd_3_0 failed
22/01/30 09:43:54 WARN MemoryStore: Not enough space to cache rdd_3_5 in memory! (computed 132.0 MiB so far)
22/01/30 09:43:55 WARN BlockManager: Block rdd_3_5 could not be removed as it was not found on disk or in memory
22/01/30 09:43:55 WARN BlockManager: Putting block rdd_3_5 failed
====== naive-bayes (apache-spark) [default], iteration 11 completed (769510.463 ms) ======
====== naive-bayes (apache-spark) [default], iteration 12 started ======
GC before operation: completed in 101325.516 ms, heap usage 885.850 MB -> 299.566 MB.
22/01/30 09:49:59 WARN MemoryStore: Not enough space to cache rdd_3_0 in memory! (computed 68.0 MiB so far)
22/01/30 09:49:59 WARN BlockManager: Block rdd_3_0 could not be removed as it was not found on disk or in memory
22/01/30 09:49:59 WARN BlockManager: Putting block rdd_3_0 failed
22/01/30 09:51:01 WARN MemoryStore: Not enough space to cache rdd_3_2 in memory! (computed 68.0 MiB so far)
22/01/30 09:51:01 WARN BlockManager: Block rdd_3_2 could not be removed as it was not found on disk or in memory
22/01/30 09:51:01 WARN BlockManager: Putting block rdd_3_2 failed
22/01/30 09:52:49 WARN MemoryStore: Not enough space to cache rdd_3_3 in memory! (computed 68.0 MiB so far)
22/01/30 09:52:49 WARN BlockManager: Block rdd_3_3 could not be removed as it was not found on disk or in memory
22/01/30 09:52:49 WARN BlockManager: Putting block rdd_3_3 failed
22/01/30 09:55:06 WARN MemoryStore: Not enough space to cache rdd_3_1 in memory! (computed 132.0 MiB so far)
22/01/30 09:55:06 WARN BlockManager: Block rdd_3_1 could not be removed as it was not found on disk or in memory
22/01/30 09:55:06 WARN BlockManager: Putting block rdd_3_1 failed
22/01/30 10:01:31 WARN MemoryStore: Not enough space to cache rdd_3_5 in memory! (computed 132.0 MiB so far)
22/01/30 10:01:31 WARN BlockManager: Block rdd_3_5 could not be removed as it was not found on disk or in memory
22/01/30 10:01:31 WARN BlockManager: Putting block rdd_3_5 failed
====== naive-bayes (apache-spark) [default], iteration 12 completed (976406.766 ms) ======
====== naive-bayes (apache-spark) [default], iteration 13 started ======
GC before operation: completed in 101933.019 ms, heap usage 445.992 MB -> 299.621 MB.
22/01/30 10:07:18 WARN MemoryStore: Not enough space to cache rdd_3_0 in memory! (computed 68.0 MiB so far)
22/01/30 10:07:18 WARN BlockManager: Block rdd_3_0 could not be removed as it was not found on disk or in memory
22/01/30 10:07:18 WARN BlockManager: Putting block rdd_3_0 failed
22/01/30 10:08:22 WARN MemoryStore: Not enough space to cache rdd_3_2 in memory! (computed 68.0 MiB so far)
22/01/30 10:08:22 WARN BlockManager: Block rdd_3_2 could not be removed as it was not found on disk or in memory
22/01/30 10:08:22 WARN BlockManager: Putting block rdd_3_2 failed
22/01/30 10:09:52 WARN MemoryStore: Not enough space to cache rdd_3_3 in memory! (computed 68.0 MiB so far)
22/01/30 10:09:55 WARN BlockManager: Block rdd_3_3 could not be removed as it was not found on disk or in memory
22/01/30 10:09:55 WARN BlockManager: Putting block rdd_3_3 failed
22/01/30 10:14:32 WARN MemoryStore: Not enough space to cache rdd_3_1 in memory! (computed 132.0 MiB so far)
22/01/30 10:14:37 WARN BlockManager: Block rdd_3_1 could not be removed as it was not found on disk or in memory
22/01/30 10:14:37 WARN BlockManager: Putting block rdd_3_1 failed
22/01/30 10:25:05 WARN MemoryStore: Not enough space to cache rdd_3_5 in memory! (computed 132.0 MiB so far)
22/01/30 10:25:05 WARN BlockManager: Block rdd_3_5 could not be removed as it was not found on disk or in memory
22/01/30 10:25:05 WARN BlockManager: Putting block rdd_3_5 failed
====== naive-bayes (apache-spark) [default], iteration 13 completed (1354585.423 ms) ======
====== naive-bayes (apache-spark) [default], iteration 14 started ======
GC before operation: completed in 102381.363 ms, heap usage 540.563 MB -> 300.243 MB.
22/01/30 10:32:03 WARN MemoryStore: Not enough space to cache rdd_3_0 in memory! (computed 68.0 MiB so far)
22/01/30 10:32:03 WARN BlockManager: Block rdd_3_0 could not be removed as it was not found on disk or in memory
22/01/30 10:32:03 WARN BlockManager: Putting block rdd_3_0 failed
22/01/30 10:33:05 WARN MemoryStore: Not enough space to cache rdd_3_2 in memory! (computed 68.0 MiB so far)
22/01/30 10:33:05 WARN BlockManager: Block rdd_3_2 could not be removed as it was not found on disk or in memory
22/01/30 10:33:05 WARN BlockManager: Putting block rdd_3_2 failed
22/01/30 10:35:05 WARN MemoryStore: Not enough space to cache rdd_3_3 in memory! (computed 68.0 MiB so far)
22/01/30 10:35:05 WARN BlockManager: Block rdd_3_3 could not be removed as it was not found on disk or in memory
22/01/30 10:35:05 WARN BlockManager: Putting block rdd_3_3 failed
22/01/30 10:41:11 WARN MemoryStore: Not enough space to cache rdd_3_1 in memory! (computed 132.0 MiB so far)
22/01/30 10:41:20 WARN BlockManager: Block rdd_3_1 could not be removed as it was not found on disk or in memory
22/01/30 10:41:20 WARN BlockManager: Putting block rdd_3_1 failed
22/01/30 10:48:59 WARN MemoryStore: Not enough space to cache rdd_3_5 in memory! (computed 132.0 MiB so far)
22/01/30 10:48:59 WARN BlockManager: Block rdd_3_5 could not be removed as it was not found on disk or in memory
22/01/30 10:48:59 WARN BlockManager: Putting block rdd_3_5 failed
====== naive-bayes (apache-spark) [default], iteration 14 completed (1283823.993 ms) ======
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
NOTE: 'als' benchmark uses Spark local executor with 4 (out of 4) threads.
====== als (apache-spark) [default], iteration 0 started ======
GC before operation: completed in 5907.299 ms, heap usage 314.397 MB -> 45.899 MB.
22/01/30 10:59:46 WARN HeartbeatReceiver: Removing executor driver with no recent heartbeats: 138906 ms exceeds timeout 120000 ms
22/01/30 11:02:03 WARN NettyRpcEnv: Ignored message: true
22/01/30 11:02:03 WARN NettyRpcEnv: Ignored message: true
22/01/30 11:02:03 WARN SparkContext: Killing executors is not supported by current scheduler.
22/01/30 11:04:26 WARN NettyRpcEnv: Ignored failure: java.util.concurrent.TimeoutException: Cannot receive any reply from localhost:43431 in 120 seconds
22/01/30 11:09:28 WARN Executor: Issue communicating with driver in heartbeater
org.apache.spark.rpc.RpcTimeoutException: Futures timed out after [120 seconds]. This timeout is controlled by spark.rpc.askTimeout
    at org.apache.spark.rpc.RpcTimeout.org$apache$spark$rpc$RpcTimeout$$createRpcTimeoutException(RpcTimeout.scala:47)
    at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:62)
    at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:58)
    at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:38)
    at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:76)
    at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:103)
    at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:87)
    at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:78)
    at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:589)
    at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1000)
    at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:212)
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
    at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1996)
    at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
    at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305)
    at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: java.util.concurrent.TimeoutException: Futures timed out after [120 seconds]
    at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:259)
    at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:263)
    at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:293)
    at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
    ... 15 more
22/01/30 11:09:30 ERROR BlockManagerMasterEndpoint: Fail to know the executor driver is alive or not.
org.apache.spark.SparkException: Exception thrown in awaitResult:
    at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301)
    at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
    at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:101)
    at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:109)
    at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36)
    at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:112)
    at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:111)
    at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$handleBlockRemovalFailure$1.applyOrElse(BlockManagerMasterEndpoint.scala:226)
    at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$handleBlockRemovalFailure$1.applyOrElse(BlockManagerMasterEndpoint.scala:217)
    at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:38)
    at scala.util.Failure.recover(Try.scala:234)
    at scala.concurrent.Future.$anonfun$recover$1(Future.scala:395)
    at scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:33)
    at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43431
    at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148)
    at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144)
    at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307)
    at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41)
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
    at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99)
    at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138)
    at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72)
    at scala.concurrent.impl.Promise$DefaultPromise.dispatchOrAddCallback(Promise.scala:316)
    at scala.concurrent.impl.Promise$DefaultPromise.onComplete(Promise.scala:307)
    at scala.concurrent.impl.Promise.transformWith(Promise.scala:40)
    at scala.concurrent.impl.Promise.transformWith$(Promise.scala:38)
    at scala.concurrent.impl.Promise$DefaultPromise.transformWith(Promise.scala:187)
    at scala.concurrent.Future.flatMap(Future.scala:306)
    at scala.concurrent.Future.flatMap$(Future.scala:306)
    at scala.concurrent.impl.Promise$DefaultPromise.flatMap(Promise.scala:187)
    at org.apache.spark.rpc.netty.NettyRpcEnv.asyncSetupEndpointRefByURI(NettyRpcEnv.scala:150)
    ... 16 more
22/01/30 11:09:28 ERROR BlockManagerStorageEndpoint: Error in removing broadcast 4
org.apache.spark.rpc.RpcTimeoutException: Futures timed out after [120 seconds]. This timeout is controlled by spark.rpc.askTimeout
    at org.apache.spark.rpc.RpcTimeout.org$apache$spark$rpc$RpcTimeout$$createRpcTimeoutException(RpcTimeout.scala:47)
    at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:62)
    at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:58)
    at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:38)
    at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:76)
    at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:103)
    at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:87)
    at org.apache.spark.storage.BlockManagerMaster.updateBlockInfo(BlockManagerMaster.scala:90)
    at org.apache.spark.storage.BlockManager.tryToReportBlockStatus(BlockManager.scala:791)
    at org.apache.spark.storage.BlockManager.reportBlockStatus(BlockManager.scala:770)
    at org.apache.spark.storage.BlockManager.removeBlockInternal(BlockManager.scala:1903)
    at org.apache.spark.storage.BlockManager.removeBlock(BlockManager.scala:1877)
    at org.apache.spark.storage.BlockManager.$anonfun$removeBroadcast$3(BlockManager.scala:1863)
    at org.apache.spark.storage.BlockManager.$anonfun$removeBroadcast$3$adapted(BlockManager.scala:1863)
    at scala.collection.Iterator.foreach(Iterator.scala:943)
    at scala.collection.Iterator.foreach$(Iterator.scala:943)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
    at org.apache.spark.storage.BlockManager.removeBroadcast(BlockManager.scala:1863)
    at org.apache.spark.storage.BlockManagerStorageEndpoint$$anonfun$receiveAndReply$1.$anonfun$applyOrElse$4(BlockManagerStorageEndpoint.scala:69)
    at scala.runtime.java8.JFunction0$mcI$sp.apply(JFunction0$mcI$sp.java:23)
    at org.apache.spark.storage.BlockManagerStorageEndpoint.$anonfun$doAsync$1(BlockManagerStorageEndpoint.scala:89)
    at scala.concurrent.Future$.$anonfun$apply$1(Future.scala:659)
    at scala.util.Success.$anonfun$map$1(Try.scala:255)
    at scala.util.Success.map(Try.scala:213)
    at scala.concurrent.Future.$anonfun$map$1(Future.scala:292)
    at scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:33)
    at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: java.util.concurrent.TimeoutException: Futures timed out after [120 seconds]
    at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:259)
    at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:263)
    at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:293)
    at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
    ... 26 more
22/01/30 11:09:30 WARN NettyRpcEnv: Ignored failure: org.apache.spark.rpc.RpcTimeoutException: Futures timed out after [120 seconds]. This timeout is controlled by spark.rpc.askTimeout
22/01/30 11:09:30 WARN BlockManagerMasterEndpoint: Error trying to remove broadcast 4. The executor driver may have been lost.
org.apache.spark.rpc.RpcTimeoutException: Cannot receive any reply from localhost:43431 in 120 seconds. This timeout is controlled by spark.rpc.askTimeout
    at org.apache.spark.rpc.RpcTimeout.org$apache$spark$rpc$RpcTimeout$$createRpcTimeoutException(RpcTimeout.scala:47)
    at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:62)
    at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:58)
    at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:38)
    at scala.util.Failure.recover(Try.scala:234)
    at scala.concurrent.Future.$anonfun$recover$1(Future.scala:395)
    at scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:33)
    at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
    at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99)
    at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138)
    at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72)
    at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288)
    at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288)
    at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288)
    at scala.concurrent.Promise.complete(Promise.scala:53)
    at scala.concurrent.Promise.complete$(Promise.scala:52)
    at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187)
    at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
    at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67)
    at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82)
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85)
    at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59)
    at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875)
    at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110)
    at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107)
    at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873)
    at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72)
    at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288)
    at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288)
    at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288)
    at scala.concurrent.Promise.tryFailure(Promise.scala:112)
    at scala.concurrent.Promise.tryFailure$(Promise.scala:112)
    at scala.concurrent.impl.Promise$DefaultPromise.tryFailure(Promise.scala:187)
    at org.apache.spark.rpc.netty.NettyRpcEnv.org$apache$spark$rpc$netty$NettyRpcEnv$$onFailure$1(NettyRpcEnv.scala:214)
    at org.apache.spark.rpc.netty.NettyRpcEnv$$anon$1.run(NettyRpcEnv.scala:264)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
    at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: java.util.concurrent.TimeoutException: Cannot receive any reply from localhost:43431 in 120 seconds
    at org.apache.spark.rpc.netty.NettyRpcEnv$$anon$1.run(NettyRpcEnv.scala:265)
    ... 6 more
22/01/30 11:14:05 WARN NettyRpcEnv: Ignored failure: java.util.concurrent.TimeoutException: Cannot receive any reply from localhost:43431 in 120 seconds
22/01/30 11:14:05 WARN Executor: Issue communicating with driver in heartbeater
org.apache.spark.rpc.RpcTimeoutException: Futures timed out after [120 seconds]. This timeout is controlled by spark.rpc.askTimeout
    at org.apache.spark.rpc.RpcTimeout.org$apache$spark$rpc$RpcTimeout$$createRpcTimeoutException(RpcTimeout.scala:47)
    at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:62)
    at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:58)
    at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:38)
    at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:76)
    at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:103)
    at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:87)
    at org.apache.spark.storage.BlockManagerMaster.updateBlockInfo(BlockManagerMaster.scala:90)
    at org.apache.spark.storage.BlockManager.tryToReportBlockStatus(BlockManager.scala:791)
    at org.apache.spark.storage.BlockManager.$anonfun$reportAllBlocks$3(BlockManager.scala:572)
    at org.apache.spark.storage.BlockManager.$anonfun$reportAllBlocks$3$adapted(BlockManager.scala:570)
    at scala.collection.Iterator.foreach(Iterator.scala:943)
    at scala.collection.Iterator.foreach$(Iterator.scala:943)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
    at org.apache.spark.storage.BlockManager.reportAllBlocks(BlockManager.scala:570)
    at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:590)
    at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1000)
    at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:212)
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
    at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1996)
    at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
    at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305)
    at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: java.util.concurrent.TimeoutException: Futures timed out after [120 seconds]
    at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:259)
    at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:263)
    at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:293)
    at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
    ... 22 more
22/01/30 11:18:47 WARN Executor: Issue communicating with driver in heartbeater
java.lang.NullPointerException
    at org.apache.spark.storage.memory.MemoryStore.getSize(MemoryStore.scala:131)
    at org.apache.spark.storage.BlockManager.org$apache$spark$storage$BlockManager$$getCurrentBlockStatus(BlockManager.scala:815)
    at org.apache.spark.storage.BlockManager.$anonfun$reportAllBlocks$3(BlockManager.scala:571)
    at org.apache.spark.storage.BlockManager.$anonfun$reportAllBlocks$3$adapted(BlockManager.scala:570)
    at scala.collection.Iterator.foreach(Iterator.scala:943)
    at scala.collection.Iterator.foreach$(Iterator.scala:943)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
    at org.apache.spark.storage.BlockManager.reportAllBlocks(BlockManager.scala:570)
    at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:590)
    at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1000)
    at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:212)
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
    at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1996)
    at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
    at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305)
    at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:829)
WARNING: This benchmark provides no result that can be validated.
There is no way to check that no silent failure occurred.
====== als (apache-spark) [default], iteration 0 completed (1518542.858 ms) ======
====== als (apache-spark) [default], iteration 1 started ======
GC before operation: completed in 152811.362 ms, heap usage 259.382 MB -> 75.526 MB.
WARNING: This benchmark provides no result that can be validated.
There is no way to check that no silent failure occurred.
====== als (apache-spark) [default], iteration 1 completed (212820.953 ms) ======
====== als (apache-spark) [default], iteration 2 started ======
GC before operation: completed in 4159.271 ms, heap usage 415.666 MB -> 78.059 MB.
WARNING: This benchmark provides no result that can be validated.
There is no way to check that no silent failure occurred.
====== als (apache-spark) [default], iteration 2 completed (251653.345 ms) ======
====== als (apache-spark) [default], iteration 3 started ======
GC before operation: completed in 3304.829 ms, heap usage 270.346 MB -> 78.427 MB.
WARNING: This benchmark provides no result that can be validated.
There is no way to check that no silent failure occurred.
====== als (apache-spark) [default], iteration 3 completed (549575.716 ms) ======
====== als (apache-spark) [default], iteration 4 started ======
GC before operation: completed in 5969.558 ms, heap usage 282.726 MB -> 78.696 MB.
WARNING: This benchmark provides no result that can be validated.
There is no way to check that no silent failure occurred.
====== als (apache-spark) [default], iteration 4 completed (225435.808 ms) ======
====== als (apache-spark) [default], iteration 5 started ======
GC before operation: completed in 2501.973 ms, heap usage 311.101 MB -> 78.460 MB.
WARNING: This benchmark provides no result that can be validated.
There is no way to check that no silent failure occurred.
====== als (apache-spark) [default], iteration 5 completed (203897.476 ms) ======
====== als (apache-spark) [default], iteration 6 started ======
GC before operation: completed in 2969.698 ms, heap usage 225.348 MB -> 78.797 MB.
WARNING: This benchmark provides no result that can be validated.
There is no way to check that no silent failure occurred.
====== als (apache-spark) [default], iteration 6 completed (173543.443 ms) ======
====== als (apache-spark) [default], iteration 7 started ======
GC before operation: completed in 6189.160 ms, heap usage 272.029 MB -> 79.211 MB.
WARNING: This benchmark provides no result that can be validated.
There is no way to check that no silent failure occurred.
====== als (apache-spark) [default], iteration 7 completed (178648.488 ms) ======
====== als (apache-spark) [default], iteration 8 started ======
GC before operation: completed in 2651.284 ms, heap usage 186.416 MB -> 79.460 MB.
WARNING: This benchmark provides no result that can be validated.
There is no way to check that no silent failure occurred.
====== als (apache-spark) [default], iteration 8 completed (259257.205 ms) ======
====== als (apache-spark) [default], iteration 9 started ======
GC before operation: completed in 3075.529 ms, heap usage 205.074 MB -> 80.476 MB.
WARNING: This benchmark provides no result that can be validated.
There is no way to check that no silent failure occurred.
====== als (apache-spark) [default], iteration 9 completed (251503.113 ms) ======
====== als (apache-spark) [default], iteration 10 started ======
GC before operation: completed in 3065.641 ms, heap usage 271.234 MB -> 80.419 MB.
WARNING: This benchmark provides no result that can be validated.
There is no way to check that no silent failure occurred.
====== als (apache-spark) [default], iteration 10 completed (161369.743 ms) ======
====== als (apache-spark) [default], iteration 11 started ======
GC before operation: completed in 10517.972 ms, heap usage 345.947 MB -> 80.859 MB.
22/01/30 12:07:47 WARN Executor: Issue communicating with driver in heartbeater
java.lang.NullPointerException
    at org.apache.spark.storage.memory.MemoryStore.getSize(MemoryStore.scala:131)
    at org.apache.spark.storage.BlockManager.org$apache$spark$storage$BlockManager$$getCurrentBlockStatus(BlockManager.scala:815)
    at org.apache.spark.storage.BlockManager.$anonfun$reportAllBlocks$3(BlockManager.scala:571)
    at org.apache.spark.storage.BlockManager.$anonfun$reportAllBlocks$3$adapted(BlockManager.scala:570)
    at scala.collection.Iterator.foreach(Iterator.scala:943)
    at scala.collection.Iterator.foreach$(Iterator.scala:943)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
    at org.apache.spark.storage.BlockManager.reportAllBlocks(BlockManager.scala:570)
    at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:590)
    at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1000)
    at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:212)
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
    at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1996)
    at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
    at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305)
    at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:829)
WARNING: This benchmark provides no result that can be validated.
There is no way to check that no silent failure occurred.
====== als (apache-spark) [default], iteration 11 completed (191529.961 ms) ======
====== als (apache-spark) [default], iteration 12 started ======
GC before operation: completed in 3535.279 ms, heap usage 217.529 MB -> 80.889 MB.
WARNING: This benchmark provides no result that can be validated.
There is no way to check that no silent failure occurred.
====== als (apache-spark) [default], iteration 12 completed (176219.247 ms) ======
====== als (apache-spark) [default], iteration 13 started ======
GC before operation: completed in 2919.890 ms, heap usage 366.036 MB -> 81.541 MB.
WARNING: This benchmark provides no result that can be validated.
There is no way to check that no silent failure occurred.
====== als (apache-spark) [default], iteration 13 completed (179134.730 ms) ======
====== als (apache-spark) [default], iteration 14 started ======
GC before operation: completed in 3845.676 ms, heap usage 227.311 MB -> 81.907 MB.
WARNING: This benchmark provides no result that can be validated.
There is no way to check that no silent failure occurred.
====== als (apache-spark) [default], iteration 14 completed (208189.231 ms) ======
# The last three benchmarks are omitted

csvout4 is as follows:

benchmark,duration_ns,uptime_ns,vm_start_unix_ms
par-mnemonics,124125968798,19981921384090,1643525254510
par-mnemonics,99447751596,20288485917568,1643525254510
par-mnemonics,123724684341,20392915617394,1643525254510
par-mnemonics,132867731400,20689309442177,1643525254510
par-mnemonics,134618961565,21011621956504,1643525254510
par-mnemonics,135579676282,21151710484705,1643525254510
par-mnemonics,128393864357,21292962903544,1643525254510
par-mnemonics,125936902769,21426240044717,1643525254510
par-mnemonics,130616502802,21556927348783,1643525254510
par-mnemonics,120491343511,21692268779979,1643525254510
par-mnemonics,143062579411,21818372977377,1643525254510
par-mnemonics,152223973204,21966063297004,1643525254510
par-mnemonics,143970415838,22122654394550,1643525254510
par-mnemonics,131400437108,22270708642668,1643525254510
par-mnemonics,135110619203,22406498504862,1643525254510
naive-bayes,1248941854957,166727025230,1643525254510
naive-bayes,690744733868,1469631626516,1643525254510
naive-bayes,743265638366,2223609213842,1643525254510
naive-bayes,810962599523,3035545426116,1643525254510
naive-bayes,711482069880,3885598782818,1643525254510
naive-bayes,804328721292,4671728581197,1643525254510
naive-bayes,743938899106,5556177127039,1643525254510
naive-bayes,804715297971,6382339679428,1643525254510
naive-bayes,811372397839,7273917598291,1643525254510
naive-bayes,730579803046,8179875939811,1643525254510
naive-bayes,796089709905,8996886603607,1643525254510
naive-bayes,769510463184,9886150090761,1643525254510
naive-bayes,976406766141,10757510272672,1643525254510
naive-bayes,1354585422665,11836165561865,1643525254510
naive-bayes,1283823993460,13293407755434,1643525254510
scala-kmeans,13023392094,22703411982055,1643525254510
scala-kmeans,6142216231,22718690674384,1643525254510
scala-kmeans,5058675288,22727428500350,1643525254510
scala-kmeans,4653927563,22734551763814,1643525254510
scala-kmeans,4030069663,22741367638109,1643525254510
scala-kmeans,3818421905,22747528169362,1643525254510
scala-kmeans,3620097734,22753483138053,1643525254510
scala-kmeans,3553412032,22759280253836,1643525254510
scala-kmeans,3619250338,22764945697661,1643525254510
scala-kmeans,3607931844,22771034914249,1643525254510
scala-kmeans,3613502293,22777187713038,1643525254510
scala-kmeans,3953726756,22783300111182,1643525254510
scala-kmeans,5182301948,22789682463874,1643525254510
scala-kmeans,4905958994,22798586980510,1643525254510
scala-kmeans,4486041896,22806546982539,1643525254510
als,1518542857841,14838230044535,1643525254510
als,212820952930,16514286428152,1643525254510
als,251653345198,16731326742477,1643525254510
als,549575716020,16986311498155,1643525254510
als,225435808180,17541887868462,1643525254510
als,203897476065,17769846741442,1643525254510
als,173543443346,17976728951024,1643525254510
als,178648487776,18156548131627,1643525254510
als,259257204893,18337857736519,1643525254510
als,251503112691,18600377184669,1643525254510
als,161369743469,18854996968249,1643525254510
als,191529960601,19026888947383,1643525254510
als,176219246762,19221965304548,1643525254510
als,179134730258,19401118607413,1643525254510
als,208189230967,19584119645250,1643525254510
philosophers,94895057473,23002496041270,1643525254510
philosophers,25745013176,23288711127153,1643525254510
philosophers,23506678392,23318076504006,1643525254510
philosophers,19572090612,23344001291018,1643525254510
philosophers,20197784791,23366902846689,1643525254510
philosophers,18575396888,23389593014055,1643525254510
philosophers,20144742669,23410798392630,1643525254510
philosophers,19685497375,23433452205540,1643525254510
philosophers,17026115448,23455580258419,1643525254510
philosophers,196542576229,23477351879606,1643525254510
philosophers,166528893060,23859737975439,1643525254510
philosophers,18581921751,24028670951272,1643525254510
philosophers,18928290489,24049783983418,1643525254510
philosophers,20566466333,24071000208033,1643525254510
philosophers,19734599337,24094172253157,1643525254510
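The `duration_ns` column is in nanoseconds, so comparing benchmarks by eye is awkward. A quick way to summarize a CSV like csvout4 is to group rows by benchmark and report the mean duration in milliseconds. This is a minimal sketch, not part of the Renaissance harness; the inline sample rows are copied from the scala-kmeans output above, and in practice you would read the csvout4 file instead:

```python
import csv
import io
from collections import defaultdict

# A few rows copied from csvout4 above; replace the StringIO with
# open("csvout4") to process the real file.
sample = """benchmark,duration_ns,uptime_ns,vm_start_unix_ms
scala-kmeans,13023392094,22703411982055,1643525254510
scala-kmeans,6142216231,22718690674384,1643525254510
scala-kmeans,5058675288,22727428500350,1643525254510
"""

# Collect all durations per benchmark name.
durations = defaultdict(list)
for row in csv.DictReader(io.StringIO(sample)):
    durations[row["benchmark"]].append(int(row["duration_ns"]))

# Print iteration count and mean duration in milliseconds.
for name, ns_values in durations.items():
    mean_ms = sum(ns_values) / len(ns_values) / 1e6
    print(f"{name}: {len(ns_values)} iterations, mean {mean_ms:.1f} ms")
```

Note that the first iteration is usually much slower than the rest (JIT warm-up), so for steady-state numbers it may make sense to drop it before averaging.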
lazyparser commented 2 years ago

goooooooooood

BigBrotherJu commented 2 years ago

Continuing with the last 5 benchmarks: gauss-mix takes extremely long, and its first iteration still had not finished after several hours, while the other 4 benchmarks run normally. So only those 4 are tested below:

[root@openEuler-RISCV-rare renaissance]# java -jar renaissance-gpl-0.13.0.jar --csv csvout5 -r 15 log-regression mnemonics dotty finagle-chirper
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
NOTE: 'log-regression' benchmark uses Spark local executor with 4 (out of 4) threads.
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.spark.util.SizeEstimator$ (file:/root/local/java-test/renaissance/harness-112827-9768669690835747919/apache-spark/lib/spark-core_2.12-3.1.2.jar) to field java.net.URI.scheme
WARNING: Please consider reporting this to the maintainers of org.apache.spark.util.SizeEstimator$
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
====== log-regression (apache-spark) [default], iteration 0 started ======
GC before operation: completed in 462.988 ms, heap usage 32.254 MB -> 27.015 MB.
====== log-regression (apache-spark) [default], iteration 0 completed (273059.686 ms) ======
====== log-regression (apache-spark) [default], iteration 1 started ======
GC before operation: completed in 59005.457 ms, heap usage 252.991 MB -> 105.374 MB.
====== log-regression (apache-spark) [default], iteration 1 completed (68928.697 ms) ======
====== log-regression (apache-spark) [default], iteration 2 started ======
GC before operation: completed in 59836.451 ms, heap usage 243.835 MB -> 105.847 MB.
====== log-regression (apache-spark) [default], iteration 2 completed (105611.163 ms) ======
====== log-regression (apache-spark) [default], iteration 3 started ======
GC before operation: completed in 1228.365 ms, heap usage 299.716 MB -> 106.270 MB.
====== log-regression (apache-spark) [default], iteration 3 completed (55862.138 ms) ======
====== log-regression (apache-spark) [default], iteration 4 started ======
GC before operation: completed in 1652.555 ms, heap usage 262.101 MB -> 106.446 MB.
====== log-regression (apache-spark) [default], iteration 4 completed (58405.907 ms) ======
====== log-regression (apache-spark) [default], iteration 5 started ======
GC before operation: completed in 5027.869 ms, heap usage 366.374 MB -> 106.259 MB.
====== log-regression (apache-spark) [default], iteration 5 completed (56341.931 ms) ======
====== log-regression (apache-spark) [default], iteration 6 started ======
GC before operation: completed in 77840.875 ms, heap usage 447.425 MB -> 106.029 MB.
====== log-regression (apache-spark) [default], iteration 6 completed (243075.722 ms) ======
====== log-regression (apache-spark) [default], iteration 7 started ======
GC before operation: completed in 94095.248 ms, heap usage 339.122 MB -> 104.194 MB.
====== log-regression (apache-spark) [default], iteration 7 completed (53042.528 ms) ======
====== log-regression (apache-spark) [default], iteration 8 started ======
GC before operation: completed in 1549.888 ms, heap usage 334.226 MB -> 104.383 MB.
====== log-regression (apache-spark) [default], iteration 8 completed (42967.179 ms) ======
====== log-regression (apache-spark) [default], iteration 9 started ======
GC before operation: completed in 6270.843 ms, heap usage 368.129 MB -> 104.717 MB.
====== log-regression (apache-spark) [default], iteration 9 completed (48722.177 ms) ======
====== log-regression (apache-spark) [default], iteration 10 started ======
GC before operation: completed in 1470.116 ms, heap usage 277.419 MB -> 104.696 MB.
====== log-regression (apache-spark) [default], iteration 10 completed (44332.977 ms) ======
====== log-regression (apache-spark) [default], iteration 11 started ======
GC before operation: completed in 5422.626 ms, heap usage 491.106 MB -> 105.317 MB.
====== log-regression (apache-spark) [default], iteration 11 completed (55525.957 ms) ======
====== log-regression (apache-spark) [default], iteration 12 started ======
GC before operation: completed in 2261.990 ms, heap usage 214.038 MB -> 105.015 MB.
====== log-regression (apache-spark) [default], iteration 12 completed (42460.969 ms) ======
====== log-regression (apache-spark) [default], iteration 13 started ======
GC before operation: completed in 10227.166 ms, heap usage 363.069 MB -> 105.507 MB.
====== log-regression (apache-spark) [default], iteration 13 completed (41689.956 ms) ======
====== log-regression (apache-spark) [default], iteration 14 started ======
GC before operation: completed in 4740.434 ms, heap usage 385.429 MB -> 105.736 MB.
====== log-regression (apache-spark) [default], iteration 14 completed (55224.566 ms) ======
====== mnemonics (functional) [default], iteration 0 started ======
GC before operation: completed in 123560.391 ms, heap usage 200.694 MB -> 32.781 MB.
====== mnemonics (functional) [default], iteration 0 completed (134313.891 ms) ======
====== mnemonics (functional) [default], iteration 1 started ======
GC before operation: completed in 2302.145 ms, heap usage 157.283 MB -> 32.799 MB.
====== mnemonics (functional) [default], iteration 1 completed (106766.454 ms) ======
====== mnemonics (functional) [default], iteration 2 started ======
GC before operation: completed in 1632.080 ms, heap usage 156.299 MB -> 32.799 MB.
====== mnemonics (functional) [default], iteration 2 completed (113007.691 ms) ======
====== mnemonics (functional) [default], iteration 3 started ======
GC before operation: completed in 2131.326 ms, heap usage 151.799 MB -> 32.800 MB.
====== mnemonics (functional) [default], iteration 3 completed (117138.700 ms) ======
====== mnemonics (functional) [default], iteration 4 started ======
GC before operation: completed in 1845.772 ms, heap usage 158.300 MB -> 32.797 MB.
====== mnemonics (functional) [default], iteration 4 completed (141775.103 ms) ======
====== mnemonics (functional) [default], iteration 5 started ======
GC before operation: completed in 2092.741 ms, heap usage 157.297 MB -> 32.797 MB.
====== mnemonics (functional) [default], iteration 5 completed (120133.149 ms) ======
====== mnemonics (functional) [default], iteration 6 started ======
GC before operation: completed in 130875.631 ms, heap usage 151.797 MB -> 30.995 MB.
====== mnemonics (functional) [default], iteration 6 completed (112059.644 ms) ======
====== mnemonics (functional) [default], iteration 7 started ======
GC before operation: completed in 2578.757 ms, heap usage 163.995 MB -> 30.994 MB.
====== mnemonics (functional) [default], iteration 7 completed (119314.514 ms) ======
====== mnemonics (functional) [default], iteration 8 started ======
GC before operation: completed in 2930.917 ms, heap usage 157.494 MB -> 30.994 MB.
====== mnemonics (functional) [default], iteration 8 completed (112967.516 ms) ======
====== mnemonics (functional) [default], iteration 9 started ======
GC before operation: completed in 1649.154 ms, heap usage 161.494 MB -> 30.994 MB.
====== mnemonics (functional) [default], iteration 9 completed (115082.981 ms) ======
====== mnemonics (functional) [default], iteration 10 started ======
GC before operation: completed in 1988.455 ms, heap usage 155.494 MB -> 30.994 MB.
====== mnemonics (functional) [default], iteration 10 completed (107771.138 ms) ======
====== mnemonics (functional) [default], iteration 11 started ======
GC before operation: completed in 1678.180 ms, heap usage 163.994 MB -> 30.994 MB.
====== mnemonics (functional) [default], iteration 11 completed (107544.224 ms) ======
====== mnemonics (functional) [default], iteration 12 started ======
GC before operation: completed in 1979.470 ms, heap usage 162.994 MB -> 30.994 MB.
====== mnemonics (functional) [default], iteration 12 completed (110139.923 ms) ======
====== mnemonics (functional) [default], iteration 13 started ======
GC before operation: completed in 1533.733 ms, heap usage 153.994 MB -> 30.994 MB.
====== mnemonics (functional) [default], iteration 13 completed (128004.982 ms) ======
====== mnemonics (functional) [default], iteration 14 started ======
GC before operation: completed in 2796.238 ms, heap usage 185.494 MB -> 30.994 MB.
====== mnemonics (functional) [default], iteration 14 completed (119906.135 ms) ======
====== dotty (scala) [default], iteration 0 started ======
GC before operation: completed in 119673.207 ms, heap usage 145.699 MB -> 32.873 MB.
====== dotty (scala) [default], iteration 0 completed (900021.116 ms) ======
====== dotty (scala) [default], iteration 1 started ======
GC before operation: completed in 185521.869 ms, heap usage 113.774 MB -> 49.042 MB.
====== dotty (scala) [default], iteration 1 completed (181488.853 ms) ======
====== dotty (scala) [default], iteration 2 started ======
GC before operation: completed in 3764.922 ms, heap usage 106.909 MB -> 48.889 MB.
====== dotty (scala) [default], iteration 2 completed (112577.802 ms) ======
====== dotty (scala) [default], iteration 3 started ======
GC before operation: completed in 5737.312 ms, heap usage 92.473 MB -> 48.901 MB.
====== dotty (scala) [default], iteration 3 completed (101879.522 ms) ======
====== dotty (scala) [default], iteration 4 started ======
GC before operation: completed in 5481.282 ms, heap usage 101.210 MB -> 48.906 MB.
====== dotty (scala) [default], iteration 4 completed (76364.724 ms) ======
====== dotty (scala) [default], iteration 5 started ======
GC before operation: completed in 4146.871 ms, heap usage 90.254 MB -> 48.920 MB.
====== dotty (scala) [default], iteration 5 completed (115206.013 ms) ======
====== dotty (scala) [default], iteration 6 started ======
GC before operation: completed in 5929.294 ms, heap usage 110.587 MB -> 48.294 MB.
====== dotty (scala) [default], iteration 6 completed (134599.806 ms) ======
====== dotty (scala) [default], iteration 7 started ======
GC before operation: completed in 4009.479 ms, heap usage 86.680 MB -> 48.299 MB.
====== dotty (scala) [default], iteration 7 completed (88853.225 ms) ======
====== dotty (scala) [default], iteration 8 started ======
GC before operation: completed in 3826.674 ms, heap usage 98.741 MB -> 48.304 MB.
====== dotty (scala) [default], iteration 8 completed (102189.865 ms) ======
====== dotty (scala) [default], iteration 9 started ======
GC before operation: completed in 3209.832 ms, heap usage 111.274 MB -> 48.307 MB.
====== dotty (scala) [default], iteration 9 completed (64358.068 ms) ======
====== dotty (scala) [default], iteration 10 started ======
GC before operation: completed in 3766.409 ms, heap usage 106.951 MB -> 48.344 MB.
====== dotty (scala) [default], iteration 10 completed (87102.559 ms) ======
====== dotty (scala) [default], iteration 11 started ======
GC before operation: completed in 3685.323 ms, heap usage 99.927 MB -> 48.347 MB.
====== dotty (scala) [default], iteration 11 completed (85173.507 ms) ======
====== dotty (scala) [default], iteration 12 started ======
GC before operation: completed in 3354.972 ms, heap usage 103.068 MB -> 48.317 MB.
====== dotty (scala) [default], iteration 12 completed (69024.388 ms) ======
====== dotty (scala) [default], iteration 13 started ======
GC before operation: completed in 3558.359 ms, heap usage 99.602 MB -> 48.319 MB.
====== dotty (scala) [default], iteration 13 completed (119997.221 ms) ======
====== dotty (scala) [default], iteration 14 started ======
GC before operation: completed in 3785.869 ms, heap usage 92.241 MB -> 48.321 MB.
====== dotty (scala) [default], iteration 14 completed (106231.190 ms) ======
22/02/01 13:23:27 INFO finagle: Finagle version 19.4.0 (rev=15ae0aba979a2c11ed4a71774b2e995f5df918b4) built at 20190418-114348
Master port: 34195
Cache ports: 34227, 43817, 39135, 33135
====== finagle-chirper (web) [default], iteration 0 started ======
Resetting master, feed map size: 5000
GC before operation: completed in 4299.535 ms, heap usage 96.685 MB -> 58.522 MB.
WARNING: This benchmark provides no result that can be validated.
There is no way to check that no silent failure occurred.
====== finagle-chirper (web) [default], iteration 0 completed (569423.243 ms) ======
====== finagle-chirper (web) [default], iteration 1 started ======
Resetting master, feed map size: 5000
GC before operation: completed in 291969.472 ms, heap usage 76.447 MB -> 61.361 MB.
WARNING: This benchmark provides no result that can be validated.
There is no way to check that no silent failure occurred.
====== finagle-chirper (web) [default], iteration 1 completed (269681.801 ms) ======
====== finagle-chirper (web) [default], iteration 2 started ======
Resetting master, feed map size: 5000
GC before operation: completed in 296393.506 ms, heap usage 168.239 MB -> 60.513 MB.
WARNING: This benchmark provides no result that can be validated.
There is no way to check that no silent failure occurred.
====== finagle-chirper (web) [default], iteration 2 completed (236780.557 ms) ======
====== finagle-chirper (web) [default], iteration 3 started ======
Resetting master, feed map size: 5000
GC before operation: completed in 5312.689 ms, heap usage 132.557 MB -> 60.476 MB.
WARNING: This benchmark provides no result that can be validated.
There is no way to check that no silent failure occurred.
====== finagle-chirper (web) [default], iteration 3 completed (221228.145 ms) ======
====== finagle-chirper (web) [default], iteration 4 started ======
Resetting master, feed map size: 5000
GC before operation: completed in 6456.228 ms, heap usage 198.384 MB -> 60.553 MB.
WARNING: This benchmark provides no result that can be validated.
There is no way to check that no silent failure occurred.
====== finagle-chirper (web) [default], iteration 4 completed (220272.619 ms) ======
====== finagle-chirper (web) [default], iteration 5 started ======
Resetting master, feed map size: 5000
GC before operation: completed in 10886.845 ms, heap usage 293.976 MB -> 60.542 MB.
WARNING: This benchmark provides no result that can be validated.
There is no way to check that no silent failure occurred.
====== finagle-chirper (web) [default], iteration 5 completed (234579.933 ms) ======
====== finagle-chirper (web) [default], iteration 6 started ======
Resetting master, feed map size: 5000
GC before operation: completed in 5850.789 ms, heap usage 109.134 MB -> 60.578 MB.
WARNING: This benchmark provides no result that can be validated.
There is no way to check that no silent failure occurred.
====== finagle-chirper (web) [default], iteration 6 completed (210049.970 ms) ======
====== finagle-chirper (web) [default], iteration 7 started ======
Resetting master, feed map size: 5000
GC before operation: completed in 10282.722 ms, heap usage 218.080 MB -> 60.629 MB.
WARNING: This benchmark provides no result that can be validated.
There is no way to check that no silent failure occurred.
====== finagle-chirper (web) [default], iteration 7 completed (190401.194 ms) ======
====== finagle-chirper (web) [default], iteration 8 started ======
Resetting master, feed map size: 5000
GC before operation: completed in 10802.876 ms, heap usage 228.926 MB -> 60.631 MB.
WARNING: This benchmark provides no result that can be validated.
There is no way to check that no silent failure occurred.
====== finagle-chirper (web) [default], iteration 8 completed (187195.498 ms) ======
====== finagle-chirper (web) [default], iteration 9 started ======
Resetting master, feed map size: 5000
GC before operation: completed in 6774.376 ms, heap usage 310.837 MB -> 60.629 MB.
WARNING: This benchmark provides no result that can be validated.
There is no way to check that no silent failure occurred.
====== finagle-chirper (web) [default], iteration 9 completed (209329.762 ms) ======
====== finagle-chirper (web) [default], iteration 10 started ======
Resetting master, feed map size: 5000
GC before operation: completed in 6103.830 ms, heap usage 132.635 MB -> 60.626 MB.
WARNING: This benchmark provides no result that can be validated.
There is no way to check that no silent failure occurred.
====== finagle-chirper (web) [default], iteration 10 completed (228389.682 ms) ======
====== finagle-chirper (web) [default], iteration 11 started ======
Resetting master, feed map size: 5000
GC before operation: completed in 10710.940 ms, heap usage 223.459 MB -> 60.630 MB.
WARNING: This benchmark provides no result that can be validated.
There is no way to check that no silent failure occurred.
====== finagle-chirper (web) [default], iteration 11 completed (227582.271 ms) ======
====== finagle-chirper (web) [default], iteration 12 started ======
Resetting master, feed map size: 5000
GC before operation: completed in 6704.343 ms, heap usage 135.391 MB -> 60.627 MB.
WARNING: This benchmark provides no result that can be validated.
There is no way to check that no silent failure occurred.
====== finagle-chirper (web) [default], iteration 12 completed (202037.207 ms) ======
====== finagle-chirper (web) [default], iteration 13 started ======
Resetting master, feed map size: 5000
GC before operation: completed in 7843.162 ms, heap usage 274.476 MB -> 60.642 MB.
WARNING: This benchmark provides no result that can be validated.
There is no way to check that no silent failure occurred.
====== finagle-chirper (web) [default], iteration 13 completed (196448.589 ms) ======
====== finagle-chirper (web) [default], iteration 14 started ======
Resetting master, feed map size: 5000
GC before operation: completed in 7949.937 ms, heap usage 199.797 MB -> 60.641 MB.
WARNING: This benchmark provides no result that can be validated.
There is no way to check that no silent failure occurred.
====== finagle-chirper (web) [default], iteration 14 completed (189380.795 ms) ======

The contents of csvout5 are as follows:

benchmark,duration_ns,uptime_ns,vm_start_unix_ms
mnemonics,134313891162,1878030216185,1643714894219
mnemonics,106766454155,2019942901217,1643714894219
mnemonics,113007691447,2129914249032,1643714894219
mnemonics,117138700176,2246724201174,1643714894219
mnemonics,141775102934,2367522911844,1643714894219
mnemonics,120133148701,2513288098850,1643714894219
mnemonics,112059644349,2765797631090,1643714894219
mnemonics,119314513561,2883712090472,1643714894219
mnemonics,112967516137,3007787790288,1643714894219
mnemonics,115082981384,3124141730451,1643714894219
mnemonics,107771138004,3243093813996,1643714894219
mnemonics,107544224468,3354693348629,1643714894219
mnemonics,110139922541,3466194898226,1643714894219
mnemonics,128004981830,3579140515905,1643714894219
mnemonics,119906134884,3712629964088,1643714894219
dotty,900021116256,3962525306788,1643714894219
dotty,181488853023,5070770795999,1643714894219
dotty,112577802398,5278862141062,1643714894219
dotty,101879521927,5422177828194,1643714894219
dotty,76364724185,5552049852682,1643714894219
dotty,115206013204,5655137071538,1643714894219
dotty,134599805759,5799609751426,1643714894219
dotty,88853225254,5959822475811,1643714894219
dotty,102189864831,6072964784250,1643714894219
dotty,64358067849,6198678153588,1643714894219
dotty,87102558896,6287349270583,1643714894219
dotty,85173506649,6398613129094,1643714894219
dotty,69024388450,6508777807553,1643714894219
dotty,119997221098,6603401782712,1643714894219
dotty,106231190453,6748519664016,1643714894219
finagle-chirper,569423243018,6947934172603,1643714894219
finagle-chirper,269681800899,7811136024019,1643714894219
finagle-chirper,236780556917,8378613402612,1643714894219
finagle-chirper,221228145477,8620820885359,1643714894219
finagle-chirper,220272619324,8848632275139,1643714894219
finagle-chirper,234579933111,9079909843255,1643714894219
finagle-chirper,210049970488,9320429072761,1643714894219
finagle-chirper,190401193605,9540863797238,1643714894219
finagle-chirper,187195498230,9742174117861,1643714894219
finagle-chirper,209329762356,9936283547622,1643714894219
finagle-chirper,228389681899,10151991868491,1643714894219
finagle-chirper,227582270700,10391247204737,1643714894219
finagle-chirper,202037207291,10625661199003,1643714894219
finagle-chirper,196448588527,10835670542674,1643714894219
finagle-chirper,189380794857,11040206588642,1643714894219
log-regression,273059685592,152381528370,1643714894219
log-regression,68928696920,486878947330,1643714894219
log-regression,105611163094,620646569610,1643714894219
log-regression,55862137525,727511962177,1643714894219
log-regression,58405906910,785092239883,1643714894219
log-regression,56341930558,848625552079,1643714894219
log-regression,243075722193,985014791458,1643714894219
log-regression,53042527748,1326389580127,1643714894219
log-regression,42967179396,1381000075808,1643714894219
log-regression,48722177131,1430246135971,1643714894219
log-regression,44332977389,1480453534682,1643714894219
log-regression,55525956787,1530243153839,1643714894219
log-regression,42460969204,1588056870612,1643714894219
log-regression,41689956343,1640758297196,1643714894219
log-regression,55224565535,1687245102974,1643714894219
BigBrotherJu commented 2 years ago

We wrote a simple Python script to compute the average running time of every benchmark and print a markdown table:

def read_and_cal(file_name):
    with open(file_name, 'r') as f:
        time_sum = 0
        count = 0
        name = ''
        for line in f.readlines():
            fields = line.strip().split(',')
            if fields[0] == 'benchmark':  # skip the CSV header row
                continue
            if count == 0:
                name = fields[0]
            count += 1
            time_sum += int(fields[1])  # duration_ns
            if count == 15:  # each benchmark ran for 15 iterations
                results.append(name + ' | ' + format(time_sum / 1e6 / 15, '0.3f') + ' ms')
                count = 0
                time_sum = 0
                name = ''

results = []
read_and_cal('csvout1')
read_and_cal('csvout2')
read_and_cal('csvout3')
read_and_cal('csvout4')
read_and_cal('csvout5')
results.append('db-shootout | failed: shared library not found')
results.append('neo4j-analytics | failed: shared library not found')
results.append('reactors | NullPointerException')
results.append('gauss-mix | no response for too long')

sorted_results = sorted(results)

print('benchmark | average time')
print(':- | :-')
for result in sorted_results:
    print(result)
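The script above hard-codes 15 iterations per benchmark. As a hypothetical sketch (not part of the original run), a more robust variant could use the `csv` module and group rows by the benchmark column, so it keeps working if the iteration count changes:

```python
import csv
from collections import defaultdict

def mean_times_ms(path):
    """Return {benchmark: mean duration in ms} for one Renaissance CSV file,
    without assuming a fixed number of iterations per benchmark."""
    totals = defaultdict(lambda: [0, 0])  # name -> [total_ns, row_count]
    with open(path, newline='') as f:
        for row in csv.DictReader(f):
            entry = totals[row['benchmark']]
            entry[0] += int(row['duration_ns'])
            entry[1] += 1
    # duration_ns is in nanoseconds; divide by 1e6 for milliseconds
    return {name: total / count / 1e6 for name, (total, count) in totals.items()}
```
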

The resulting markdown table is:

benchmark | average time
:- | :-
akka-uct | 623452.203 ms
als | 316088.088 ms
chi-square | 29218.010 ms
db-shootout | failed: shared library not found
dec-tree | 84919.057 ms
dotty | 156337.857 ms
finagle-chirper | 239518.751 ms
finagle-http | 324491.911 ms
fj-kmeans | 88257.365 ms
future-genetic | 55808.047 ms
gauss-mix | no response for too long
log-regression | 83016.770 ms
mnemonics | 117728.403 ms
movie-lens | 813515.394 ms
naive-bayes | 885383.225 ms
neo4j-analytics | failed: shared library not found
page-rank | 439006.333 ms
par-mnemonics | 130771.427 ms
philosophers | 46682.075 ms
reactors | NullPointerException
rx-scrabble | 10513.948 ms
scala-doku | 72817.021 ms
scala-kmeans | 4884.595 ms
scala-stm-bench7 | 61233.358 ms
scrabble | 20823.012 ms
BigBrotherJu commented 2 years ago

Comparing the Fedora and oe rv64 results side by side:

benchmark | oe rv64 average time | fedora amd64 average time
:- | :- | :-
akka-uct | 623452.203 ms | 22831.624 ms
als | 316088.088 ms | 9232.720 ms
chi-square | 29218.010 ms | 1492.328 ms
db-shootout | failed: shared library not found | 10345.956 ms
dec-tree | 84919.057 ms | 2571.260 ms
dotty | 156337.857 ms | 3148.241 ms
finagle-chirper | 239518.751 ms | 3404.060 ms
finagle-http | 324491.911 ms | 3147.346 ms
fj-kmeans | 88257.365 ms | 12571.596 ms
future-genetic | 55808.047 ms | 11413.216 ms
gauss-mix | no response for too long | 4074.455 ms
log-regression | 83016.770 ms | 3478.772 ms
mnemonics | 117728.403 ms | 6028.293 ms
movie-lens | 813515.394 ms | 15388.416 ms
naive-bayes | 885383.225 ms | 34277.048 ms
neo4j-analytics | failed: shared library not found | failed due to heap memory
page-rank | 439006.333 ms | 15286.194 ms
par-mnemonics | 130771.427 ms | 4660.160 ms
philosophers | 46682.075 ms | 833.986 ms
reactors | NullPointerException | 15771.422 ms
rx-scrabble | 10513.948 ms | 409.123 ms
scala-doku | 72817.021 ms | 3568.910 ms
scala-kmeans | 4884.595 ms | 270.718 ms
scala-stm-bench7 | 61233.358 ms | 1920.753 ms
scrabble | 20823.012 ms | 1488.428 ms

Then we wrote another simple Python script that reads the table text above and computes the time ratios:

with open('table', 'r') as f:
    for line in f.readlines():
        cells = [cell.strip() for cell in line.strip().split('|')]
        if cells[0] in ('benchmark', ':-'):  # skip header and separator rows
            continue
        try:
            oe = float(cells[1].split()[0])
            fedora = float(cells[2].split()[0])
            print(format(oe / fedora, '0.2f'))
        except ValueError:
            print('N/A')  # rows where one of the runs failed
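Rather than re-parsing the rendered table text, the ratios could also be computed directly from two dictionaries of mean times, e.g. as produced by an averaging step (a hypothetical sketch; `time_ratios` and its inputs are not part of the original scripts):

```python
def time_ratios(rv64_ms, amd64_ms):
    """rv64/amd64 time ratio per benchmark; None where either run failed
    (i.e. its value is missing or not numeric)."""
    out = {}
    for name, rv in rv64_ms.items():
        x86 = amd64_ms.get(name)
        if isinstance(rv, (int, float)) and isinstance(x86, (int, float)):
            out[name] = round(rv / x86, 2)
        else:
            out[name] = None
    return out
```
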

The table with the time ratios added:

benchmark | oe rv64 average time | fedora amd64 average time | time ratio
:- | :- | :- | :-
akka-uct | 623452.203 ms | 22831.624 ms | 27.31
als | 316088.088 ms | 9232.720 ms | 34.24
chi-square | 29218.010 ms | 1492.328 ms | 19.58
db-shootout | failed: shared library not found | 10345.956 ms | N/A
dec-tree | 84919.057 ms | 2571.260 ms | 33.03
dotty | 156337.857 ms | 3148.241 ms | 49.66
finagle-chirper | 239518.751 ms | 3404.060 ms | 70.36
finagle-http | 324491.911 ms | 3147.346 ms | 103.10
fj-kmeans | 88257.365 ms | 12571.596 ms | 7.02
future-genetic | 55808.047 ms | 11413.216 ms | 4.89
gauss-mix | no response for too long | 4074.455 ms | N/A
log-regression | 83016.770 ms | 3478.772 ms | 23.86
mnemonics | 117728.403 ms | 6028.293 ms | 19.53
movie-lens | 813515.394 ms | 15388.416 ms | 52.87
naive-bayes | 885383.225 ms | 34277.048 ms | 25.83
neo4j-analytics | failed: shared library not found | failed due to heap memory | N/A
page-rank | 439006.333 ms | 15286.194 ms | 28.72
par-mnemonics | 130771.427 ms | 4660.160 ms | 28.06
philosophers | 46682.075 ms | 833.986 ms | 55.97
reactors | NullPointerException | 15771.422 ms | N/A
rx-scrabble | 10513.948 ms | 409.123 ms | 25.70
scala-doku | 72817.021 ms | 3568.910 ms | 20.40
scala-kmeans | 4884.595 ms | 270.718 ms | 18.04
scala-stm-bench7 | 61233.358 ms | 1920.753 ms | 31.88
scrabble | 20823.012 ms | 1488.428 ms | 13.99

As with the other benchmark suites tested earlier, the time ratio is usually around 20x, and in some cases reaches 50x to 100x.
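As a rough single-number summary of the slowdown, the geometric mean of the 21 numeric ratios from the table above can be computed (a sketch; the geometric mean is a common way to aggregate benchmark ratios, though it is not used elsewhere in this thread):

```python
import math

# Time ratios (oe rv64 / fedora amd64) from the table above;
# benchmarks that failed on either platform are omitted.
ratios = [27.31, 34.24, 19.58, 33.03, 49.66, 70.36, 103.10, 7.02, 4.89,
          23.86, 19.53, 52.87, 25.83, 28.72, 28.06, 55.97, 25.70, 20.40,
          18.04, 31.88, 13.99]

# Geometric mean: exp of the arithmetic mean of the logs.
geo_mean = math.exp(sum(math.log(r) for r in ratios) / len(ratios))
print(format(geo_mean, '0.2f'))
```
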