I'm trying to implement a sentiment classifier using Keras, but I've run into a problem with the batch_size parameter. It might be a rather specific problem, and I didn't find anything online that helped me.
I reshape the outputs of both embeddings (also using the batch_size) because the Keras Concatenate layer needs tensors of equal shape on every axis except the one you concatenate on. To make the reshape possible, I pad the number of samples to a multiple of the batch_size. When I train the network, everything works fine until the last batch of the epoch, where I get an error that the number of input values does not match the expected number:
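The padding step works out like this (a small sketch; the sample count of 1000 is just an illustrative number, not from my actual data):

```python
batch_size = 32

def padded_count(n_samples, batch_size):
    """Number of samples after padding up to the next multiple of batch_size."""
    pad = (-n_samples) % batch_size  # samples to add; 0 if already a multiple
    return n_samples + pad

print(padded_count(1000, batch_size))  # 1000 + 24 -> 1024
```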
InvalidArgumentError (see above for traceback): Input to reshape is a tensor with 320000 values, but the requested shape has 1280000
[[Node: Reshape_Batch_size/Reshape = Reshape[T=DT_FLOAT, Tshape=DT_INT32, _device="/job:localhost/replica:0/task:0/device:CPU:0"](Embedding_basic_em/Gather, Reshape_Batch_size/Reshape/shape)]]
320000 = 8 * 200 * 200
1280000 = 32 * 200 * 200
This looks like Keras is cutting 24 samples from the last batch. But the training output tells me there are 32 remaining samples.
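The mismatch can be reproduced outside of Keras with plain NumPy, which follows the same shape rules as keras.backend.reshape: a final batch holding only 8 samples cannot be reshaped to the hard-coded batch size of 32.

```python
import numpy as np

# last batch: only 8 samples of shape (200, 200) -> 320000 values in total
last_batch = np.zeros((8, 200, 200), dtype=np.float32)

try:
    # what the Lambda layer requests: a fixed batch dimension of 32
    last_batch.reshape((32, 200, 200))
except ValueError as err:
    print(err)  # cannot reshape array of size 320000 into shape (32,200,200)
```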
Does anybody know how Keras handles batches depending on the given batch_size parameter, or what I'm missing here? Or maybe someone has a hint on how I can do the concatenation without reshaping my outputs.

My Configuration

I'm working on macOS Sierra 10.12.6, using Python 3.5.3 from Anaconda 4.4.0 (x86_64). As Keras backend I'm using TensorFlow 1.4.0; Keras is at version 2.1.1.

For completeness, the reshaping is done by a Lambda layer using the reshape function from keras.backend:

output = Lambda(lambda x: bd.reshape(x, (batch_size, 200, em_dim)), name='Reshape_Batch_size')(embedded)
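For comparison, reshape can also infer the batch dimension when it is given as -1 instead of a fixed batch_size, which accepts both full and partial batches. Sketched here with NumPy, whose reshape semantics keras.backend.reshape mirrors (em_dim = 200 is assumed, matching the 8 * 200 * 200 from the error):

```python
import numpy as np

em_dim = 200  # assumed embedding dimension, matching the error numbers above

full = np.zeros((32 * 200, em_dim))  # values of a full batch
part = np.zeros((8 * 200, em_dim))   # values of the final partial batch

# -1 lets reshape infer the batch dimension from the data,
# so both the full and the partial batch go through:
for values in (full, part):
    print(values.reshape((-1, 200, em_dim)).shape)  # (32, 200, 200) then (8, 200, 200)
```

With -1 the batch dimension no longer has to be known when the graph is built, so the reshape itself would not require padding the sample count to a multiple of batch_size.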
Thanks in advance