In the file model.py, in the `__init__` of class LSTMPolicy, step_size is defined as `tf.shape(self.x)[:1]`, which is the batch size (1 in our case). However, in the subsequent call to `tf.nn.dynamic_rnn()`, step_size is passed as the sequence_length argument, i.e. the length of each input sequence. Since we have only one batch, shouldn't step_size be `tf.shape(self.x)[1:2]`? Am I right?
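To make the distinction concrete, here is a minimal NumPy sketch of the two slices; the shape (batch=1, 20 timesteps, 256 features) is an assumption standing in for the `[batch, time, features]` tensor fed to dynamic_rnn:

```python
import numpy as np

# Hypothetical input mirroring self.x: [batch, time, features].
x = np.zeros((1, 20, 256))

shape = np.array(x.shape)   # stand-in for tf.shape(self.x)
batch_size = shape[:1]      # [:1]  -> [1],  the batch dimension
seq_len = shape[1:2]        # [1:2] -> [20], the per-sequence length

print(batch_size)  # [1]
print(seq_len)     # [20]
```

So `[:1]` selects the batch dimension while `[1:2]` selects the time dimension, which is what sequence_length is documented to expect.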
Thanks.