streaming-graphs / NOUS

NOUS: Construction, Querying and Reasoning with Knowledge Graphs
http://aim.pnnl.gov/projects/nous-incremental-maintenance-knowledge-graphs

NumberFormatException error on hello world example #5

Closed hack-er closed 7 years ago

hack-er commented 7 years ago

Hi there! So, this is what I've got:

mvn --version:
Apache Maven 3.3.9
Maven home: /usr/share/maven
Java version: 1.8.0_121, vendor: Oracle Corporation
Java home: /usr/lib/jvm/java-8-oracle/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "4.8.0-41-generic", arch: "amd64", family: "unix"
spark-submit --version:
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.1.0
      /_/

Using Scala version 2.11.8, Java HotSpot(TM) 64-Bit Server VM, 1.7.0_80
Branch 
Compiled by user jenkins on 2016-12-16T02:04:48Z
Revision 
Url 
Type --help for more information.

and I'm getting this when trying to run the example as shown in the README:

[SPARK_HOME]/bin/spark-submit --verbose --jars "[PATH_TO_NOUS_JAR]" --master [SPARK_MASTER] --class "gov.pnnl.aristotle.algorithms.GraphMiner" target/knowledge_graph-0.1-SNAPSHOT-jar-with-dependencies.jar rdf:type 10 5 3 ../examples/graphmining/dronedata.ttl

What I end up running into is this:

root@Nous:/mnt/stor# /usr/lib/spark/bin/spark-submit --verbose --jars /root/NOUS/knowledge_graph/target/ --class gov.pnnl.aristotle.algorithms.GraphMiner /root/NOUS/knowledge_graph/target/knowledge_graph-0.1-SNAPSHOT.jar rdf:type 10 5 3 ../examples/graphmining/dronedata.ttl
Using properties file: null
Parsed arguments:
  master                  local[*]
  deployMode              null
  executorMemory          null
  executorCores           null
  totalExecutorCores      null
  propertiesFile          null
  driverMemory            null
  driverCores             null
  driverExtraClassPath    null
  driverExtraLibraryPath  null
  driverExtraJavaOptions  null
  supervise               false
  queue                   null
  numExecutors            null
  files                   null
  pyFiles                 null
  archives                null
  mainClass               gov.pnnl.aristotle.algorithms.GraphMiner
  primaryResource         file:/root/NOUS/knowledge_graph/target/knowledge_graph-0.1-SNAPSHOT.jar
  name                    gov.pnnl.aristotle.algorithms.GraphMiner
  childArgs               [rdf:type 10 5 3 ../examples/graphmining/dronedata.ttl]
  jars                    file:/root/NOUS/knowledge_graph/target/
  packages                null
  packagesExclusions      null
  repositories            null
  verbose                 true

Spark properties used, including those specified through
 --conf and those from the properties file null:

Main class:
gov.pnnl.aristotle.algorithms.GraphMiner
Arguments:
rdf:type
10
5
3
../examples/graphmining/dronedata.ttl
System properties:
SPARK_SUBMIT -> true
spark.app.name -> gov.pnnl.aristotle.algorithms.GraphMiner
spark.jars -> file:/root/NOUS/knowledge_graph/target/,file:/root/NOUS/knowledge_graph/target/knowledge_graph-0.1-SNAPSHOT.jar
spark.submit.deployMode -> client
spark.master -> local[*]
Classpath elements:
file:/root/NOUS/knowledge_graph/target/knowledge_graph-0.1-SNAPSHOT.jar
file:/root/NOUS/knowledge_graph/target/

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
17/04/13 11:08:04 INFO SparkContext: Running Spark version 2.1.0
17/04/13 11:08:04 WARN SparkContext: Support for Java 7 is deprecated as of Spark 2.0.0
17/04/13 11:08:04 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/04/13 11:08:05 WARN Utils: Your hostname, Nous resolves to a loopback address: 127.0.1.1; using 10.90.90.13 instead (on interface ens160)
17/04/13 11:08:05 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
17/04/13 11:08:05 INFO SecurityManager: Changing view acls to: root
17/04/13 11:08:05 INFO SecurityManager: Changing modify acls to: root
17/04/13 11:08:05 INFO SecurityManager: Changing view acls groups to: 
17/04/13 11:08:05 INFO SecurityManager: Changing modify acls groups to: 
17/04/13 11:08:05 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(root); groups with view permissions: Set(); users  with modify permissions: Set(root); groups with modify permissions: Set()
17/04/13 11:08:05 INFO Utils: Successfully started service 'sparkDriver' on port 42755.
17/04/13 11:08:05 INFO SparkEnv: Registering MapOutputTracker
17/04/13 11:08:05 INFO SparkEnv: Registering BlockManagerMaster
17/04/13 11:08:05 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
17/04/13 11:08:05 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
17/04/13 11:08:05 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-338feee7-c9d6-40af-9181-f38f6083b331
17/04/13 11:08:06 INFO MemoryStore: MemoryStore started with capacity 366.3 MB
17/04/13 11:08:06 INFO SparkEnv: Registering OutputCommitCoordinator
17/04/13 11:08:06 INFO Utils: Successfully started service 'SparkUI' on port 4040.
17/04/13 11:08:06 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://10.90.90.13:4040
17/04/13 11:08:06 INFO SparkContext: Added JAR file:/root/NOUS/knowledge_graph/target/ at spark://10.90.90.13:42755/jars/target with timestamp 1492099686705
17/04/13 11:08:06 INFO SparkContext: Added JAR file:/root/NOUS/knowledge_graph/target/knowledge_graph-0.1-SNAPSHOT.jar at spark://10.90.90.13:42755/jars/knowledge_graph-0.1-SNAPSHOT.jar with timestamp 1492099686706
17/04/13 11:08:06 INFO Executor: Starting executor ID driver on host localhost
17/04/13 11:08:06 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 46637.
17/04/13 11:08:06 INFO NettyBlockTransferService: Server created on 10.90.90.13:46637
17/04/13 11:08:06 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
17/04/13 11:08:06 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 10.90.90.13, 46637, None)
17/04/13 11:08:06 INFO BlockManagerMasterEndpoint: Registering block manager 10.90.90.13:46637 with 366.3 MB RAM, BlockManagerId(driver, 10.90.90.13, 46637, None)
17/04/13 11:08:06 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 10.90.90.13, 46637, None)
17/04/13 11:08:06 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 10.90.90.13, 46637, None)
Exception in thread "main" java.lang.NumberFormatException: For input string: "rdf:type"
        at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
        at java.lang.Integer.parseInt(Integer.java:492)
        at java.lang.Integer.parseInt(Integer.java:527)
        at scala.collection.immutable.StringLike$class.toInt(StringLike.scala:272)
        at scala.collection.immutable.StringOps.toInt(StringOps.scala:29)
        at gov.pnnl.aristotle.algorithms.GraphMiner$.main(GraphMiner.scala:66)
        at gov.pnnl.aristotle.algorithms.GraphMiner.main(GraphMiner.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
17/04/13 11:08:07 INFO SparkContext: Invoking stop() from shutdown hook
17/04/13 11:08:07 INFO SparkUI: Stopped Spark web UI at http://10.90.90.13:4040
17/04/13 11:08:07 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
17/04/13 11:08:07 INFO MemoryStore: MemoryStore cleared
17/04/13 11:08:07 INFO BlockManager: BlockManager stopped
17/04/13 11:08:07 INFO BlockManagerMaster: BlockManagerMaster stopped
17/04/13 11:08:07 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
17/04/13 11:08:07 INFO SparkContext: Successfully stopped SparkContext
17/04/13 11:08:07 INFO ShutdownHookManager: Shutdown hook called
17/04/13 11:08:07 INFO ShutdownHookManager: Deleting directory /tmp/spark-75e2a284-3418-47aa-a10e-02157e28ce0
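For what it's worth, the exception itself comes from the first positional argument ("rdf:type") being parsed as an integer inside GraphMiner (GraphMiner.scala:66 in the trace, via StringOps.toInt, which delegates to Integer.parseInt). That may mean the checked-out code expects the numeric arguments in a different position than the README command shows. A minimal sketch of the failure mode (ArgParseDemo and parseIntArg are hypothetical names, not part of NOUS):

```java
public class ArgParseDemo {
    // Parse a CLI argument expected to be an int; return null instead of
    // letting a raw NumberFormatException abort the whole spark-submit run.
    static Integer parseIntArg(String s) {
        try {
            return Integer.parseInt(s);
        } catch (NumberFormatException e) {
            return null;
        }
    }

    public static void main(String[] args) {
        // "rdf:type" is the first CLI argument in the command above; any
        // non-numeric string makes Integer.parseInt throw, exactly as in
        // the stack trace.
        System.out.println(parseIntArg("rdf:type")); // null
        System.out.println(parseIntArg("10"));       // 10
    }
}
```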

Any help would be appreciated, thanks!

hack-er commented 7 years ago

I'd also like to mention that if I change the --jars parameter to the full path to the jar file itself, I get the same error, i.e. --jars /root/NOUS/knowledge_graph/target/uber-knowledge_graph-0.1-SNAPSHOT.jar OR --jars /root/NOUS/knowledge_graph/target/knowledge_graph-0.1-SNAPSHOT.jar

purohitsumit commented 7 years ago

Hi @hack-er, this bug refers to an older version of the code. We are in the process of releasing a major refactoring early next week. I will notify you once we have a stable version for you to try. Sorry for the inconvenience.

purohitsumit commented 7 years ago

Hi @hack-er, the refactored code is committed. Please take a look and let me know if you find any issues in the current "master" branch.