Hi,
It looks like you created your own Extractor, but you have paths in your data that are longer than config.MAX_PATH_LENGTH.
Is this true?
If so, you can either increase this configuration value, or limit the paths that your Extractor produces to be up to 9 nodes long.
If this is not the case, please provide more details.
Uri
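For context, the limit Uri mentions is config.MAX_PATH_LENGTH in code2seq's config.py (9 by default in the copies I have seen, which is also the second dimension of the [200, 9] shapes in the error output further down in this thread). A minimal, hypothetical Extractor-side filter, assuming each path is available as a list of node labels, could look like this:

MAX_PATH_LENGTH = 9  # keep in sync with config.MAX_PATH_LENGTH

def keep_context(path_nodes):
    # Drop any path-context whose path has more nodes than the model
    # reserves slots for. path_nodes is the list of AST node labels
    # along the path (an illustrative representation, not code2seq's own API).
    return len(path_nodes) <= MAX_PATH_LENGTH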
Hi, Yes, the paths in my data are longer than config.MAX_PATH_LENGTH. I still get the same error after increasing config.MAX_PATH_LENGTH to 51, and increasing it further does not seem like a good solution, so I would like to limit the path length in my data instead.
How should I limit the path length? Keep only the first config.MAX_PATH_LENGTH nodes, or truncate in some other way?
Xi
Hi, This really depends on what kinds of paths you are using and what their typical values are (avg, min, max, median, etc.). Which approach would keep the most information: taking only the path's prefix of size N, taking the path's suffix of size N, taking N/2 nodes from its beginning and N/2 nodes from its end, or using some other building blocks that make the paths shorter?
Thanks!
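To make these options concrete, here is a small Python sketch (not part of code2seq; the truncate_path helper and the list-of-node-labels representation are illustrative only) showing the prefix, suffix, and both-ends strategies:

def truncate_path(nodes, max_len):
    # Shorten a path, given as a list of AST node labels, to at most
    # max_len nodes. Which strategy preserves the most information
    # depends on your data, as discussed above.
    if len(nodes) <= max_len:
        return nodes
    prefix = nodes[:max_len]                              # keep the beginning
    suffix = nodes[-max_len:]                             # keep the end
    half = max_len // 2
    both_ends = nodes[:half] + nodes[-(max_len - half):]  # keep both ends, drop the middle
    return both_ends  # or return prefix / suffix instead

For example, truncate_path(list('abcdefghijkl'), 9) keeps 'abcd' and 'hijkl' and drops 'efg' from the middle.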
How can I deal with this? Does it result from my preprocessed Python data?
WARNING:tensorflow:From /usr/local/lib/python3.5/dist-packages/tensorflow/python/ops/sparse_ops.py:1165: sparse_to_dense (from tensorflow.python.ops.sparse_ops) is deprecated and will be removed in a future version.
Instructions for updating: Create a tf.sparse.SparseTensor and use tf.sparse.to_dense instead.
2019-12-14 19:30:28.721583: W tensorflow/core/framework/op_kernel.cc:1273] OP_REQUIRES failed at sparse_to_dense_op.cc:128 : Invalid argument: indices[12] = [3,9] is out of bounds: need 0 <= index < [200,9]
2019-12-14 19:30:28.724205: W tensorflow/core/framework/op_kernel.cc:1273] OP_REQUIRES failed at sparse_to_dense_op.cc:128 : Invalid argument: indices[9] = [0,9] is out of bounds: need 0 <= index < [200,9]
2019-12-14 19:30:28.724281: W tensorflow/core/framework/op_kernel.cc:1273] OP_REQUIRES failed at sparse_to_dense_op.cc:128 : Invalid argument: indices[11] = [2,9] is out of bounds: need 0 <= index < [200,9]
2019-12-14 19:30:28.730303: W tensorflow/core/framework/op_kernel.cc:1273] OP_REQUIRES failed at sparse_to_dense_op.cc:128 : Invalid argument: indices[9] = [0,9] is out of bounds: need 0 <= index < [200,9]
2019-12-14 19:30:28.730911: W tensorflow/core/framework/op_kernel.cc:1273] OP_REQUIRES failed at sparse_to_dense_op.cc:128 : Invalid argument: indices[9] = [0,9] is out of bounds: need 0 <= index < [200,9]
2019-12-14 19:30:28.731179: W tensorflow/core/framework/op_kernel.cc:1273] OP_REQUIRES failed at sparse_to_dense_op.cc:128 : Invalid argument: indices[15] = [1,9] is out of bounds: need 0 <= index < [200,9]
2019-12-14 19:30:28.731549: W tensorflow/core/framework/op_kernel.cc:1273] OP_REQUIRES failed at sparse_to_dense_op.cc:128 : Invalid argument: indices[9] = [0,9] is out of bounds: need 0 <= index < [200,9]
2019-12-14 19:30:29.089994: W tensorflow/core/framework/op_kernel.cc:1273] OP_REQUIRES failed at sparse_to_dense_op.cc:128 : Invalid argument: indices[10] = [1,9] is out of bounds: need 0 <= index < [200,9]
2019-12-14 19:30:29.095160: W tensorflow/core/framework/op_kernel.cc:1273] OP_REQUIRES failed at sparse_to_dense_op.cc:128 : Invalid argument: indices[15] = [1,9] is out of bounds: need 0 <= index < [200,9]
2019-12-14 19:30:29.099648: W tensorflow/core/framework/op_kernel.cc:1273] OP_REQUIRES failed at sparse_to_dense_op.cc:128 : Invalid argument: indices[9] = [0,9] is out of bounds: need 0 <= index < [200,9]
2019-12-14 19:30:33.144993: W tensorflow/core/framework/op_kernel.cc:1273] OP_REQUIRES failed at sparse_to_dense_op.cc:128 : Invalid argument: indices[25] = [4,9] is out of bounds: need 0 <= index < [200,9]
2019-12-14 19:30:33.145895: W tensorflow/core/framework/op_kernel.cc:1273] OP_REQUIRES failed at sparse_to_dense_op.cc:128 : Invalid argument: indices[75] = [13,9] is out of bounds: need 0 <= index < [200,9]
2019-12-14 19:30:33.345387: W tensorflow/core/framework/op_kernel.cc:1273] OP_REQUIRES failed at sparse_to_dense_op.cc:128 : Invalid argument: indices[17] = [2,9] is out of bounds: need 0 <= index < [200,9]
2019-12-14 19:30:33.345840: W tensorflow/core/framework/op_kernel.cc:1273] OP_REQUIRES failed at sparse_to_dense_op.cc:128 : Invalid argument: indices[9] = [0,9] is out of bounds: need 0 <= index < [200,9]
Traceback (most recent call last):
File "/usr/local/lib/python3.5/dist-packages/tensorflow/python/client/session.py", line 1334, in _do_call
return fn(*args)
File "/usr/local/lib/python3.5/dist-packages/tensorflow/python/client/session.py", line 1319, in _run_fn
options, feed_dict, fetch_list, target_list, run_metadata)
File "/usr/local/lib/python3.5/dist-packages/tensorflow/python/client/session.py", line 1407, in _call_tf_sessionrun
run_metadata)
tensorflow.python.framework.errors_impl.OutOfRangeError: End of sequence
[[{{node IteratorGetNext}} = IteratorGetNextoutput_shapes=[[?,200,9], [?,200], [?,200,5], [?,200], [?,200,1], [?,200,1], [?,200,5], [?,200], [?,200,1], [?,?], [?], [?], [?,200]], output_types=[DT_INT32, DT_INT32, DT_INT32, DT_INT32, DT_STRING, DT_STRING, DT_INT32, DT_INT32, DT_STRING, DT_INT32, DT_INT64, DT_STRING, DT_FLOAT], _device="/job:localhost/replica:0/task:0/device:CPU:0"]]
[[{{node model/dense/Tensordot/GatherV2/_183}} = _Recv[client_terminated=false, recv_device="/job:localhost/replica:0/task:0/device:GPU:0", send_device="/job:localhost/replica:0/task:0/device:CPU:0", send_device_incarnation=1, tensor_name="edge_1720_model/dense/Tensordot/GatherV2", tensor_type=DT_INT32, _device="/job:localhost/replica:0/task:0/device:GPU:0"]()]]
During handling of the above exception, another exception occurred:
Traceback (most recent call last): File "/home/lxr/baseline/code2seq/code2seq/model.py", line 95, in train _, batch_loss = self.sess.run([optimizer, train_loss]) File "/usr/local/lib/python3.5/dist-packages/tensorflow/python/client/session.py", line 929, in run run_metadata_ptr) File "/usr/local/lib/python3.5/dist-packages/tensorflow/python/client/session.py", line 1152, in _run feed_dict_tensor, options, run_metadata) File "/usr/local/lib/python3.5/dist-packages/tensorflow/python/client/session.py", line 1328, in _do_run run_metadata) File "/usr/local/lib/python3.5/dist-packages/tensorflow/python/client/session.py", line 1348, in _do_call raise type(e)(node_def, op, message) tensorflow.python.framework.errors_impl.OutOfRangeError: End of sequence [[node IteratorGetNext (defined at /home/lxr/baseline/code2seq/code2seq/reader.py:192) = IteratorGetNextoutput_shapes=[[?,200,9], [?,200], [?,200,5], [?,200], [?,200,1], [?,200,1], [?,200,5], [?,200], [?,200,1], [?,?], [?], [ ?], [?,200]], output_types=[DT_INT32, DT_INT32, DT_INT32, DT_INT32, DT_STRING, DT_STRING, DT_INT32, DT_INT32, DT_STRING, DT_INT32, DT_INT64, DT_STRING, DT_FLOAT], _device="/job:localhost/replica:0/task:0/device:CPU:0"]] [[{{node model/dense/Tensordot/GatherV2/_183}} = _Recv[client_terminated=false, recv_device="/job:localhost/replica:0/task:0/device:GPU:0", send_device="/job:localhost/replica:0/task:0/device:CPU:0", send_device_incarnation=1, tensor_name="edge_1720_model/dense/Tensordot/GatherV2", tensor_type=DT_INT32, _device="/job:localhost/replica:0/task:0/device:GPU:0"]()]]
Caused by op 'IteratorGetNext', defined at:
File "code2seq.py", line 33, in <module>
model.train()
File "/home/lxr/baseline/code2seq/code2seq/model.py", line 76, in train
config=self.config)
File "/home/lxr/baseline/code2seq/code2seq/reader.py", line 43, in init
self.output_tensors = self.compute_output()
File "/home/lxr/baseline/code2seq/code2seq/reader.py", line 192, in compute_output
return self.iterator.get_next()
File "/usr/local/lib/python3.5/dist-packages/tensorflow/python/data/ops/iterator_ops.py", line 421, in get_next
name=name)), self._output_types,
File "/usr/local/lib/python3.5/dist-packages/tensorflow/python/ops/gen_dataset_ops.py", line 2069, in iterator_get_next
output_shapes=output_shapes, name=name)
File "/usr/local/lib/python3.5/dist-packages/tensorflow/python/framework/op_def_library.py", line 787, in _apply_op_helper
op_def=op_def)
File "/usr/local/lib/python3.5/dist-packages/tensorflow/python/util/deprecation.py", line 488, in new_func
return func(*args, **kwargs)
File "/usr/local/lib/python3.5/dist-packages/tensorflow/python/framework/ops.py", line 3274, in create_op
op_def=op_def)
File "/usr/local/lib/python3.5/dist-packages/tensorflow/python/framework/ops.py", line 1770, in init
self._traceback = tf_stack.extract_stack()
OutOfRangeError (see above for traceback): End of sequence
[[node IteratorGetNext (defined at /home/lxr/baseline/code2seq/code2seq/reader.py:192) = IteratorGetNextoutput_shapes=[[?,200,9], [?,200], [?,200,5], [?,200], [?,200,1], [?,200,1], [?,200,5], [?,200], [?,200,1], [?,?], [?], [?], [?,200]], output_types=[DT_INT32, DT_INT32, DT_INT32, DT_INT32, DT_STRING, DT_STRING, DT_INT32, DT_INT32, DT_STRING, DT_INT32, DT_INT64, DT_STRING, DT_FLOAT], _device="/job:localhost/replica:0/task:0/device:CPU:0"]]
[[{{node model/dense/Tensordot/GatherV2/_183}} = _Recv[client_terminated=false, recv_device="/job:localhost/replica:0/task:0/device:GPU:0", send_device="/job:localhost/replica:0/task:0/device:CPU:0", send_device_incarnation=1, tensor_name="edge_1720_model/dense/Tensordot/GatherV2", tensor_type=DT_INT32, _device="/job:localhost/replica:0/task:0/device:GPU:0"]()]]
During handling of the above exception, another exception occurred:
Traceback (most recent call last): File "code2seq.py", line 33, in
model.train()
File "/home/lxr/baseline/code2seq/code2seq/model.py", line 106, in train
results, precision, recall, f1 = self.evaluate()
File "/home/lxr/baseline/code2seq/code2seq/model.py", line 220, in evaluate
output_file.write(str(num_correct_predictions / total_predictions) + '\n')
ZeroDivisionError: division by zero
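For anyone hitting the same warnings: the indices like [3,9] against the bound [200,9] indicate that a path is trying to occupy a 10th node slot while only 9 (config.MAX_PATH_LENGTH) are reserved per path, and the final ZeroDivisionError is most likely a downstream symptom of evaluation finishing without producing any predictions (total_predictions stays 0). A quick, hypothetical way to check the preprocessed data is to collect path-length statistics; the sketch below assumes the usual code2seq context format "start_token,node1|node2|...,end_token" per context, so adapt it if your Extractor writes something different:

import sys
from statistics import mean, median

MAX_PATH_LENGTH = 9  # keep in sync with config.MAX_PATH_LENGTH

def path_lengths(c2s_file):
    # Collect the number of nodes in every path of a preprocessed file,
    # where each line is: target_label context context ...
    lengths = []
    with open(c2s_file) as f:
        for line in f:
            parts = line.rstrip('\n').split(' ')
            for context in parts[1:]:  # parts[0] is the target label
                pieces = context.split(',')
                if len(pieces) != 3:
                    continue  # skip empty padding or unexpected contexts
                lengths.append(len(pieces[1].split('|')))
    return lengths

if __name__ == '__main__':
    lengths = path_lengths(sys.argv[1])
    if not lengths:
        sys.exit('no contexts found')
    too_long = sum(1 for n in lengths if n > MAX_PATH_LENGTH)
    print('paths: %d  min: %d  max: %d  avg: %.1f  median: %.1f  longer than %d: %d'
          % (len(lengths), min(lengths), max(lengths), mean(lengths),
             median(lengths), MAX_PATH_LENGTH, too_long))

If the "longer than" count is non-zero, either regenerate the data with a shorter path limit in the Extractor or truncate the paths as discussed earlier in the thread.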