I am getting the following error while running the command below:
pyspark rnn.py -e 10 -hl 5 -i dataset/iris.data -t dataset/data.config -p 5
Caused by: org.apache.spark.api.python.PythonException: Traceback (most recent call last):
File "/home/carl/project/spark-1.6.3-bin-hadoop2.6/python/lib/pyspark.zip/pyspark/worker.py", line 111, in main
process()
File "/home/carl/project/spark-1.6.3-bin-hadoop2.6/python/lib/pyspark.zip/pyspark/worker.py", line 106, in process
serializer.dump_stream(func(split_index, iterator), outfile)
File "/home/carl/project/spark-1.6.3-bin-hadoop2.6/python/lib/pyspark.zip/pyspark/rdd.py", line 2346, in pipeline_func
File "/home/carl/project/spark-1.6.3-bin-hadoop2.6/python/lib/pyspark.zip/pyspark/rdd.py", line 2346, in pipeline_func
File "/home/carl/project/spark-1.6.3-bin-hadoop2.6/python/lib/pyspark.zip/pyspark/rdd.py", line 2346, in pipeline_func
File "/home/carl/project/spark-1.6.3-bin-hadoop2.6/python/lib/pyspark.zip/pyspark/rdd.py", line 2346, in pipeline_func
File "/home/carl/project/spark-1.6.3-bin-hadoop2.6/python/lib/pyspark.zip/pyspark/rdd.py", line 2346, in pipeline_func
File "/home/carl/project/spark-1.6.3-bin-hadoop2.6/python/lib/pyspark.zip/pyspark/rdd.py", line 2346, in pipeline_func
File "/home/carl/project/spark-1.6.3-bin-hadoop2.6/python/lib/pyspark.zip/pyspark/rdd.py", line 317, in func
File "/home/carl/gitprojects/LSTM-TensorSpark-master/src/rnn.py", line 442, in <lambda>
lambda x: train_rnn(x, multilayer_props, epoch=epoch, target=target), True)
File "/home/carl/gitprojects/LSTM-TensorSpark-master/src/rnn.py", line 337, in train_rnn
ret = rnn.fit_layer(partition[0][1], s, partition=int(partition[0][0]))
File "/home/carl/gitprojects/LSTM-TensorSpark-master/src/rnn.py", line 258, in fit_layer
self.layer.fit_next(in_data, s)
File "/home/carl/gitprojects/LSTM-TensorSpark-master/src/rnn.py", line 157, in fit_next
self.train_layer(input_data_T, s)
File "/home/carl/gitprojects/LSTM-TensorSpark-master/src/rnn.py", line 121, in train_layer
self.forget_gate_layer(input_data_T)
File "/home/carl/gitprojects/LSTM-TensorSpark-master/src/rnn.py", line 95, in forget_gate_layer
'ft_%d' % self.node_id))
File "/home/carl/gitprojects/LSTM-TensorSpark-master/src/rnn.py", line 87, in layer_step
data_concat = tf.concat(0, [self.ht, input_data])
File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/ops/array_ops.py", line 1029, in concat
dtype=dtypes.int32).get_shape(
File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/framework/ops.py", line 639, in convert_to_tensor
as_ref=False)
File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/framework/ops.py", line 704, in internal_convert_to_tensor
ret = conversion_func(value, dtype=dtype, name=name, as_ref=as_ref)
File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/framework/constant_op.py", line 113, in _constant_tensor_conversion_function
return constant(v, dtype=dtype, name=name)
File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/framework/constant_op.py", line 102, in constant
tensor_util.make_tensor_proto(value, dtype=dtype, shape=shape, verify_shape=verify_shape))
File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/framework/tensor_util.py", line 370, in make_tensor_proto
_AssertCompatible(values, dtype)
File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/framework/tensor_util.py", line 302, in _AssertCompatible
(dtype.name, repr(mismatch), type(mismatch).__name__))
TypeError: Expected int32, got <tf.Variable 'optimizer/ht_16610936633777551933:0' shape=(5, 1) dtype=float32_ref> of type 'Variable' instead.
at org.apache.spark.api.python.PythonRunner$$anon$1.read(PythonRDD.scala:166)
at org.apache.spark.api.python.PythonRunner$$anon$1.<init>(PythonRDD.scala:207)
at org.apache.spark.api.python.PythonRunner.compute(PythonRDD.scala:125)
at org.apache.spark.api.python.PythonRDD.compute(PythonRDD.scala:70)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
at org.apache.spark.scheduler.Task.run(Task.scala:89)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:227)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
... 1 more
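The key line in the traceback is `data_concat = tf.concat(0, [self.ht, input_data])` in `rnn.py`. That is the pre-1.0 TensorFlow signature `tf.concat(axis, values)`; in TensorFlow 1.0 and later the argument order was swapped to `tf.concat(values, axis)`, so TensorFlow tries to convert the Variable list into an `int32` axis and raises exactly this "Expected int32, got <tf.Variable ...>" TypeError. A minimal sketch of the likely fix, assuming TensorFlow >= 1.0 is installed (the tensors here are stand-ins for `self.ht` and `input_data`):

```python
import tensorflow as tf

# Stand-ins for self.ht and input_data from rnn.py (shapes are illustrative).
ht = tf.constant([[1.0], [2.0]])
x = tf.constant([[3.0], [4.0]])

# Pre-1.0 signature (what rnn.py uses): tf.concat(0, [ht, x])
# -> on TF 1.0+ this raises: TypeError: Expected int32, got <tf.Variable ...>

# TF 1.0+ signature: values first, axis second.
c = tf.concat([ht, x], axis=0)  # shape (4, 1)
```

If this is the cause, updating the `tf.concat` calls in `rnn.py` to the `tf.concat(values, axis)` order (or pinning TensorFlow to a pre-1.0 release that LSTM-TensorSpark was written against) should resolve the error.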