ICT-BDA / EasyML

Easy Machine Learning is a general-purpose, dataflow-based system that eases the process of applying machine learning algorithms to real-world tasks.
Apache License 2.0

Error when running RMSE evaluation; how can I fix it? #102

Closed · TangVVV closed this issue 5 years ago

TangVVV commented 5 years ago

```
19/04/24 06:29:36 WARN spark.SparkConf: SPARK_CLASSPATH was detected (set to '/usr/local/spark/jars/:/usr/local/spark/ext_libs/'). This is deprecated in Spark 1.0+.

Please instead use:

19/04/24 06:29:36 WARN spark.SparkConf: Setting 'spark.executor.extraClassPath' to '/usr/local/spark/jars/:/usr/local/spark/ext_libs/' as a work-around.
19/04/24 06:29:36 WARN spark.SparkConf: Setting 'spark.driver.extraClassPath' to '/usr/local/spark/jars/:/usr/local/spark/ext_libs/' as a work-around.
19/04/24 06:29:37 INFO slf4j.Slf4jLogger: Slf4jLogger started
19/04/24 06:29:37 INFO Remoting: Starting remoting
19/04/24 06:29:37 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@172.18.0.5:40703]
Exception in thread "main" org.apache.hadoop.mapred.InvalidInputException: Input path does not exist: hdfs://hadoop-master:9000/user/root/597B1331-CDAA-4917-B5A7-DD7D9C73CEAD
    at org.apache.hadoop.mapred.FileInputFormat.singleThreadedListStatus(FileInputFormat.java:285)
    at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:228)
    at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:313)
    at org.apache.spark.rdd.HadoopRDD.getPartitions(HadoopRDD.scala:199)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
    at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
    at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
    at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:1952)
    at org.apache.spark.rdd.RDD$$anonfun$reduce$1.apply(RDD.scala:1025)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:316)
    at org.apache.spark.rdd.RDD.reduce(RDD.scala:1007)
    at bda.spark.evaluate.Regression$.RMSE(Regression.scala:19)
    at bda.spark.runnable.evaluate.RMSERunner$.run(RMSERunner.scala:60)
    at bda.spark.runnable.evaluate.RMSERunner$.main(RMSERunner.scala:42)
    at bda.spark.runnable.evaluate.RMSERunner.main(RMSERunner.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
19/04/24 06:29:39 INFO remote.RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
19/04/24 06:29:39 INFO remote.RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
19/04/24 06:29:39 INFO remote.RemoteActorRefProvider$RemotingTerminator: Remoting shut down.
```
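The actual failure in the log is the `InvalidInputException`: the RMSE runner's input directory (`hdfs://hadoop-master:9000/user/root/597B1331-...`) does not exist on HDFS, which usually means the upstream task never wrote its output or the dataset was never uploaded. A minimal sketch of a pre-flight check, assuming the namenode URI and path taken verbatim from the log above, could look like this:

```shell
# Sketch: verify the job's HDFS input path exists before re-running the RMSE task.
# The URI and directory below are copied from the error message in the log;
# substitute your own if they differ.
INPUT=hdfs://hadoop-master:9000/user/root/597B1331-CDAA-4917-B5A7-DD7D9C73CEAD

if ! command -v hdfs >/dev/null 2>&1; then
    # No HDFS client on this machine; run the check from a cluster node instead.
    MSG="hdfs client not found on PATH"
elif hdfs dfs -test -e "$INPUT"; then
    MSG="input exists: $INPUT"
else
    MSG="input missing: $INPUT"
fi
echo "$MSG"
```

If the path is missing, upload the evaluation data (e.g. with `hdfs dfs -put <local-file> <hdfs-path>`) or re-run the upstream node that should have produced it, then retry the RMSE step.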