When running the second example, the part that uses the Spark platform fails; all other examples in this notebook run without issue.
I get this error log in step #8:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/jovyan/.cache/coursier/v1/https/repo1.maven.org/maven2/org/slf4j/slf4j-log4j12/1.7.10/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/jovyan/.cache/coursier/v1/https/repo1.maven.org/maven2/org/slf4j/slf4j-log4j12/1.7.30/slf4j-log4j12-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
23/10/03 08:52:59 INFO SparkContext: Running Spark version 3.1.2
23/10/03 08:53:00 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
23/10/03 08:53:00 INFO ResourceUtils: ==============================================================
23/10/03 08:53:00 INFO ResourceUtils: No custom resources configured for spark.driver.
23/10/03 08:53:00 INFO ResourceUtils: ==============================================================
23/10/03 08:53:00 INFO SparkContext: Submitted application: WordCount (file:/home/jovyan/book.txt)
23/10/03 08:53:00 INFO ResourceProfile: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 1024, script: , vendor: , offHeap -> name: offHeap, amount: 0, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0)
23/10/03 08:53:00 INFO ResourceProfile: Limiting resource is cpu
23/10/03 08:53:00 INFO ResourceProfileManager: Added ResourceProfile id: 0
23/10/03 08:53:00 INFO SecurityManager: Changing view acls to: jovyan
23/10/03 08:53:00 INFO SecurityManager: Changing modify acls to: jovyan
23/10/03 08:53:00 INFO SecurityManager: Changing view acls groups to:
23/10/03 08:53:00 INFO SecurityManager: Changing modify acls groups to:
23/10/03 08:53:00 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(jovyan); groups with view permissions: Set(); users with modify permissions: Set(jovyan); groups with modify permissions: Set()
23/10/03 08:53:00 INFO Utils: Successfully started service 'sparkDriver' on port 33033.
23/10/03 08:53:00 INFO SparkEnv: Registering MapOutputTracker
23/10/03 08:53:00 INFO SparkEnv: Registering BlockManagerMaster
23/10/03 08:53:00 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
23/10/03 08:53:00 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
23/10/03 08:53:00 INFO SparkEnv: Registering BlockManagerMasterHeartbeat
23/10/03 08:53:01 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-d281332b-82d7-4978-a015-91c7dbc58c73
23/10/03 08:53:01 INFO MemoryStore: MemoryStore started with capacity 2.8 GiB
23/10/03 08:53:01 INFO SparkEnv: Registering OutputCommitCoordinator
23/10/03 08:53:01 INFO Utils: Successfully started service 'SparkUI' on port 4040.
23/10/03 08:53:01 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://e1600cbd6265:4040/
23/10/03 08:53:01 INFO Executor: Starting executor ID driver on host e1600cbd6265
23/10/03 08:53:01 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 35425.
23/10/03 08:53:01 INFO NettyBlockTransferService: Server created on e1600cbd6265:35425
23/10/03 08:53:01 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
23/10/03 08:53:01 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, e1600cbd6265, 35425, None)
23/10/03 08:53:01 INFO BlockManagerMasterEndpoint: Registering block manager e1600cbd6265:35425 with 2.8 GiB RAM, BlockManagerId(driver, e1600cbd6265, 35425, None)
23/10/03 08:53:01 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, e1600cbd6265, 35425, None)
23/10/03 08:53:01 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, e1600cbd6265, 35425, None)
org.apache.wayang.core.api.exception.WayangException: Job execution failed.
org.apache.wayang.core.api.Job.doExecute(Job.java:335)
org.apache.wayang.core.util.OneTimeExecutable.tryExecute(OneTimeExecutable.java:41)
org.apache.wayang.core.util.OneTimeExecutable.execute(OneTimeExecutable.java:54)
org.apache.wayang.core.api.Job.execute(Job.java:249)
org.apache.wayang.core.api.WayangContext.execute(WayangContext.java:133)
org.apache.wayang.core.api.WayangContext.execute(WayangContext.java:121)
org.apache.wayang.api.PlanBuilder.buildAndExecute(PlanBuilder.scala:105)
org.apache.wayang.api.DataQuanta.collect(DataQuanta.scala:758)
ammonite.$sess.cmd1$Helper.wordcount(cmd1.sc:12)
ammonite.$sess.cmd7$Helper.<init>(cmd7.sc:1)
ammonite.$sess.cmd7$.<init>(cmd7.sc:7)
ammonite.$sess.cmd7$.<clinit>(cmd7.sc:-1)
java.lang.ExceptionInInitializerError
org.apache.spark.SparkContext.withScope(SparkContext.scala:786)
org.apache.spark.SparkContext.textFile(SparkContext.scala:917)
org.apache.spark.api.java.JavaSparkContext.textFile(JavaSparkContext.scala:175)
org.apache.wayang.spark.operators.SparkTextFileSource.evaluate(SparkTextFileSource.java:70)
org.apache.wayang.spark.execution.SparkExecutor.execute(SparkExecutor.java:114)
org.apache.wayang.core.platform.PushExecutorTemplate.execute(PushExecutorTemplate.java:73)
org.apache.wayang.core.platform.PushExecutorTemplate.access$100(PushExecutorTemplate.java:47)
org.apache.wayang.core.platform.PushExecutorTemplate$StageExecution.execute(PushExecutorTemplate.java:195)
org.apache.wayang.core.platform.PushExecutorTemplate$StageExecution.doExecute(PushExecutorTemplate.java:166)
org.apache.wayang.core.util.OneTimeExecutable.tryExecute(OneTimeExecutable.java:41)
org.apache.wayang.core.util.OneTimeExecutable.execute(OneTimeExecutable.java:54)
org.apache.wayang.core.platform.PushExecutorTemplate$StageExecution.executeStage(PushExecutorTemplate.java:156)
org.apache.wayang.core.platform.PushExecutorTemplate.execute(PushExecutorTemplate.java:61)
org.apache.wayang.core.platform.CrossPlatformExecutor.execute(CrossPlatformExecutor.java:378)
org.apache.wayang.core.platform.CrossPlatformExecutor.executeSingleStage(CrossPlatformExecutor.java:248)
org.apache.wayang.core.platform.CrossPlatformExecutor.runToBreakpoint(CrossPlatformExecutor.java:320)
org.apache.wayang.core.platform.CrossPlatformExecutor.executeUntilBreakpoint(CrossPlatformExecutor.java:156)
org.apache.wayang.core.api.Job.execute(Job.java:529)
org.apache.wayang.core.api.Job.doExecute(Job.java:314)
org.apache.wayang.core.util.OneTimeExecutable.tryExecute(OneTimeExecutable.java:41)
org.apache.wayang.core.util.OneTimeExecutable.execute(OneTimeExecutable.java:54)
org.apache.wayang.core.api.Job.execute(Job.java:249)
org.apache.wayang.core.api.WayangContext.execute(WayangContext.java:133)
org.apache.wayang.core.api.WayangContext.execute(WayangContext.java:121)
org.apache.wayang.api.PlanBuilder.buildAndExecute(PlanBuilder.scala:105)
org.apache.wayang.api.DataQuanta.collect(DataQuanta.scala:758)
ammonite.$sess.cmd1$Helper.wordcount(cmd1.sc:12)
ammonite.$sess.cmd7$Helper.<init>(cmd7.sc:1)
ammonite.$sess.cmd7$.<init>(cmd7.sc:7)
ammonite.$sess.cmd7$.<clinit>(cmd7.sc:-1)
com.fasterxml.jackson.databind.JsonMappingException: Scala module 2.10.0 requires Jackson Databind version >= 2.10.0 and < 2.11.0
com.fasterxml.jackson.module.scala.JacksonModule.setupModule(JacksonModule.scala:61)
com.fasterxml.jackson.module.scala.JacksonModule.setupModule$(JacksonModule.scala:46)
com.fasterxml.jackson.module.scala.DefaultScalaModule.setupModule(DefaultScalaModule.scala:17)
com.fasterxml.jackson.databind.ObjectMapper.registerModule(ObjectMapper.java:835)
org.apache.spark.rdd.RDDOperationScope$.<init>(RDDOperationScope.scala:82)
org.apache.spark.rdd.RDDOperationScope$.<clinit>(RDDOperationScope.scala:-1)
org.apache.spark.SparkContext.withScope(SparkContext.scala:786)
org.apache.spark.SparkContext.textFile(SparkContext.scala:917)
org.apache.spark.api.java.JavaSparkContext.textFile(JavaSparkContext.scala:175)
org.apache.wayang.spark.operators.SparkTextFileSource.evaluate(SparkTextFileSource.java:70)
org.apache.wayang.spark.execution.SparkExecutor.execute(SparkExecutor.java:114)
org.apache.wayang.core.platform.PushExecutorTemplate.execute(PushExecutorTemplate.java:73)
org.apache.wayang.core.platform.PushExecutorTemplate.access$100(PushExecutorTemplate.java:47)
org.apache.wayang.core.platform.PushExecutorTemplate$StageExecution.execute(PushExecutorTemplate.java:195)
org.apache.wayang.core.platform.PushExecutorTemplate$StageExecution.doExecute(PushExecutorTemplate.java:166)
org.apache.wayang.core.util.OneTimeExecutable.tryExecute(OneTimeExecutable.java:41)
org.apache.wayang.core.util.OneTimeExecutable.execute(OneTimeExecutable.java:54)
org.apache.wayang.core.platform.PushExecutorTemplate$StageExecution.executeStage(PushExecutorTemplate.java:156)
org.apache.wayang.core.platform.PushExecutorTemplate.execute(PushExecutorTemplate.java:61)
org.apache.wayang.core.platform.CrossPlatformExecutor.execute(CrossPlatformExecutor.java:378)
org.apache.wayang.core.platform.CrossPlatformExecutor.executeSingleStage(CrossPlatformExecutor.java:248)
org.apache.wayang.core.platform.CrossPlatformExecutor.runToBreakpoint(CrossPlatformExecutor.java:320)
org.apache.wayang.core.platform.CrossPlatformExecutor.executeUntilBreakpoint(CrossPlatformExecutor.java:156)
org.apache.wayang.core.api.Job.execute(Job.java:529)
org.apache.wayang.core.api.Job.doExecute(Job.java:314)
org.apache.wayang.core.util.OneTimeExecutable.tryExecute(OneTimeExecutable.java:41)
org.apache.wayang.core.util.OneTimeExecutable.execute(OneTimeExecutable.java:54)
org.apache.wayang.core.api.Job.execute(Job.java:249)
org.apache.wayang.core.api.WayangContext.execute(WayangContext.java:133)
org.apache.wayang.core.api.WayangContext.execute(WayangContext.java:121)
org.apache.wayang.api.PlanBuilder.buildAndExecute(PlanBuilder.scala:105)
org.apache.wayang.api.DataQuanta.collect(DataQuanta.scala:758)
ammonite.$sess.cmd1$Helper.wordcount(cmd1.sc:12)
ammonite.$sess.cmd7$Helper.<init>(cmd7.sc:1)
ammonite.$sess.cmd7$.<init>(cmd7.sc:7)
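For context, the root cause appears to be the final exception: jackson-module-scala 2.10.0 is on the classpath next to a jackson-databind version outside the >= 2.10.0 and < 2.11.0 range it requires. A possible workaround (my assumption, not verified) is to pin matching Jackson versions in the Ammonite session before any Spark code runs, e.g.:

```scala
// Hypothetical workaround: align all Jackson artifacts on one 2.10.x version,
// matching what jackson-module-scala 2.10.0 requires. The exact version to pin
// is an assumption based on the error message above.
import $ivy.`com.fasterxml.jackson.core:jackson-databind:2.10.0`
import $ivy.`com.fasterxml.jackson.module::jackson-module-scala:2.10.0`
```

If that has no effect because a newer jackson-databind is already resolved by another dependency, excluding it from the conflicting artifact (or aligning every Jackson dependency on the same 2.10.x version) would be the next thing to try.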