When we run p7_textcnn_train.py, we set a batch size and train the model, and training works well. But when it reaches the test step, the do_eval() function, there is a bug: a batch-size mismatch. I think that in do_eval() the batch size is 1, while in the model the batch size is fixed to the training batch size, which is why they mismatch. But I do not know how to correct it.
In the model, the variables below are created with shape [self.batch_size, self.embed_size], so during evaluation the graph still expects the training batch size:
self.left_side_first_word = tf.get_variable("left_side_first_word", shape=[self.batch_size, self.embed_size], initializer=self.initializer)
self.right_side_last_word = tf.get_variable("right_side_last_word", shape=[self.batch_size, self.embed_size], initializer=self.initializer)
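For reference, here is a minimal sketch of the mismatch, using hypothetical names (batch_size_train, inputs) and assuming TF 1.x: a variable whose first dimension is baked in as the training batch size cannot be combined with a feed of batch size 1.

    import numpy as np
    import tensorflow as tf

    batch_size_train = 64   # hypothetical training batch size
    embed_size = 100        # hypothetical embedding size

    # Variable with the training batch size baked into its first dimension,
    # like left_side_first_word / right_side_last_word in the model.
    left_side_first_word = tf.get_variable(
        "left_side_first_word",
        shape=[batch_size_train, embed_size],
        initializer=tf.random_normal_initializer(stddev=0.1))

    # Placeholder whose batch dimension is left flexible.
    inputs = tf.placeholder(tf.float32, shape=[None, embed_size])

    # Any op that pairs the two forces the batch dimensions to agree.
    combined = tf.concat([left_side_first_word, inputs], axis=1)

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        # Feeding one example, as do_eval() does, fails at run time:
        # InvalidArgumentError: ConcatOp dimensions mismatch (64 vs. 1)
        sess.run(combined,
                 feed_dict={inputs: np.zeros([1, embed_size], np.float32)})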
Thanks!