Using the index 1 means that I squeeze the second dimension. `tf.squeeze` only removes "useless" size-1 dimensions. In this particular case: let batch_size = 64, cnn_out_width = 36 (example values), and char_count = 69. The function `CNN()` will output a tensor of shape (64, 1, 36, 69), and `tf.squeeze(x, [1])` makes it (64, 36, 69).
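For reference, a minimal sketch of that shape change (the names batch_size, cnn_out_width, and char_count come from the example above; the zero tensor just stands in for the CNN output):

```python
import tensorflow as tf

# Dummy tensor standing in for the CNN() output described above.
batch_size, cnn_out_width, char_count = 64, 36, 69
x = tf.zeros([batch_size, 1, cnn_out_width, char_count])

# Squeeze only the second dimension (index 1), which has size 1.
y = tf.squeeze(x, [1])

print(x.shape)  # (64, 1, 36, 69)
print(y.shape)  # (64, 36, 69)
```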
Can you tell me which page of the paper mentions the `map_to_sequence` part?
I got a problem: `InvalidArgumentError (see above for traceback): sequence_length(0) <= 31`. Can you give me some advice? Thanks a lot.
It's bugging me too. I know a workaround, but the network still won't train correctly.
Basically, the BLSTM returns 31 time steps, so the seq_lens are automatically capped at 31.
I'll push a fix later today.
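For anyone hitting this in the meantime, a minimal sketch of that kind of workaround, capping the per-example lengths at the 31 time steps the BLSTM emits (the names `logits`, `sparse_labels`, and `seq_lens` are placeholders, not the repo's actual variables, and this uses the TF1-style `tf.nn.ctc_loss` signature):

```python
import tensorflow as tf

# Placeholder values; `logits` stands in for the time-major BLSTM
# output and `sparse_labels` for the CTC-encoded ground truth.
MAX_TIME, batch_size, num_classes = 31, 4, 70
logits = tf.zeros([MAX_TIME, batch_size, num_classes])
seq_lens = tf.constant([40, 12, 31, 50])  # some exceed 31 -> the error
sparse_labels = tf.SparseTensor(
    indices=[[0, 0], [1, 0], [2, 0], [3, 0]],
    values=[1, 2, 3, 4],
    dense_shape=[batch_size, 1],
)

# CTC requires sequence_length(b) <= number of time steps, so cap
# the lengths at what the BLSTM actually produces.
capped = tf.minimum(seq_lens, MAX_TIME)
loss = tf.nn.ctc_loss(sparse_labels, logits, sequence_length=capped)
```

Note this only avoids the crash; as mentioned below, the underlying cause was in the CNN part of the network.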
Okay, thanks.
I have the same problem. Have you solved it? @Belval @ajiaxiaoyi
The sequence length issue is solved in origin/remove-ctc, but it does not converge as of today.
Er... do you mean the network still won't train correctly? What should I do with my code? Thanks a lot.
You can get it to converge on a small dataset with short words (it could read about a hundred of them), but indeed it is not close to the paper performance-wise.
I see. I will try again, thank you very much.
The sequence length error was related to an error in the CNN part of the network. Please retry with current master.
I have a little question about this part below. Does this mean you slice it along the first axis, i.e. along the batch-size dimension? But according to the paper, shouldn't it be sliced along the 'w' dimension?
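To make the question concrete, a small sketch of the two slicings (the shapes reuse the (64, 36, 69) example above; this is not the repo's actual code):

```python
import tensorflow as tf

# After the squeeze, the tensor is (batch, width, features) = (64, 36, 69).
x = tf.zeros([64, 36, 69])

# Slicing along axis 0 splits by batch example, while the paper's
# map-to-sequence takes one feature vector per image column, which
# would mean splitting along the 'w' axis (axis 1 here).
per_example = tf.unstack(x, axis=0)  # 64 tensors of shape (36, 69)
per_column = tf.unstack(x, axis=1)   # 36 tensors of shape (64, 69)

print(len(per_example), per_example[0].shape)  # 64 (36, 69)
print(len(per_column), per_column[0].shape)    # 36 (64, 69)
```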