Closed vandenBergArthur closed 1 year ago
@vandenBergArthur thanks for the report.
Just a quick response before I look into it in more detail later. Keras usually assumes that the first dimension of the input to the model (i.e. the batch size) can be variable, and encodes it as `None`. This of course still supports sending in inputs with a fixed batch size of 1.
On the hls4ml side, we expect models to be supplied this way (i.e. variable batch size), but then we only create HLS code / synthesize the model with a batch size of 1.
Is there a particular reason you need to specify the fixed batch size of 1 when building the model?
Otherwise, as you point out, the workaround is simply to not supply the batch size when building the model.
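A minimal sketch of that convention (the layer sizes here are made up for illustration, not taken from the issue):

```python
from tensorflow.keras.layers import Input, Dense
from tensorflow.keras.models import Model

# No batch_size argument: Keras leaves the batch dimension variable.
inp = Input(shape=(16,), name='input_x')
out = Dense(4)(inp)
model = Model(inputs=inp, outputs=out)

# The first dimension is None; hls4ml will still generate HLS
# for an effective batch size of 1.
print(model.input_shape)  # (None, 16)
```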
Nonetheless, we should be able to fix this in the hls4ml parsing.
Hi @jmduarte, thank you for the very swift response!
Is there a particular reason you need to specify the fixed batch size of 1 when building the model?
I was experimenting to see whether I could 'trick' the model into using this batch dimension for something useful. Originally, I added an extra dimension to my tensor and concatenated other tensors along it, like this:
from tensorflow.keras.layers import Input, Permute, Conv2D, Reshape, Concatenate
from tensorflow.keras.models import Model

input_shape_x = (64, 9, 25)
input_x = Input(shape=input_shape_x, name='input_x')
# Change the dimensions of the original input graph frame
a = Permute((2, 3, 1))(input_x)
self_conv1 = Conv2D(filters=128, kernel_size=1, data_format='channels_last')(a)
self_conv2 = Conv2D(filters=128, kernel_size=1, data_format='channels_last')(a)
self_conv3 = Conv2D(filters=128, kernel_size=1, data_format='channels_last')(a)
# Reshape each 4D tensor to add an extra 5th dimension
self_conv1 = Reshape(target_shape=(9, 25, 128, 1))(self_conv1)
self_conv2 = Reshape(target_shape=(9, 25, 128, 1))(self_conv2)
self_conv3 = Reshape(target_shape=(9, 25, 128, 1))(self_conv3)
# Concatenate the 3 tensors along the 5th dimension
b = Concatenate(axis=-1)([self_conv1, self_conv2, self_conv3])
# Change the dimension order for the correct broadcasting of the adjacency matrix
c = Permute((1, 4, 3, 2))(b)
model = Model(inputs=input_x, outputs=c)
Note: this model is part of a bigger model (that's why it starts and ends with Permute), and in the meantime I have learned that Permute / transposing these kinds of tensors is not supported, as your colleague mentioned in one of my previous posts. (See #746)
But I quickly found out that concatenation of tensors with rank > 3 is not yet supported. Hence, I tried to use the batch dimension to stack the tensors on. So I specified batch_size=1 and tried to concatenate with a = Concatenate(axis=0)([self_conv1, self_conv2]).
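For intuition, a small NumPy sketch (shapes assumed from the snippet above) of why stacking on axis 0 conflicts with the batch dimension: the concatenation changes the apparent batch size, so the model no longer maps one input sample to one output sample.

```python
import numpy as np

# Two feature maps with batch size 1, shapes as in the model above.
t1 = np.zeros((1, 9, 25, 128))
t2 = np.zeros((1, 9, 25, 128))

stacked = np.concatenate([t1, t2], axis=0)
print(stacked.shape)  # (2, 9, 25, 128): the "batch" is now 2
```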
I can conclude that using the batch dimension for something else is not a good solution, so I will close this issue; there is no need to look into it further.
Thanks again for your input!
Hi all,
To demonstrate the issue, I created the following model:
When running config = hls4ml.utils.config_from_keras_model(model, granularity='model') I encounter the following error. I noticed that if I simply remove the batch_size=1 argument in input_x, the error is gone. But I don't understand where this error comes from, since the batch dimension is usually stripped away (right?). Any ideas where this problem comes from and how it can be solved?
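For completeness, the only difference between the failing and working variants is the batch_size argument on the input layer (the input shape here is assumed for illustration, and the second layer name is made up to keep the two inputs distinct); without the argument, Keras leaves the batch dimension as None:

```python
from tensorflow.keras.layers import Input

# Fixed batch size: the first dimension is baked in as 1.
x_fixed = Input(shape=(64, 9, 25), batch_size=1, name='input_x')
# No batch_size: Keras encodes the batch dimension as None.
x_var = Input(shape=(64, 9, 25), name='input_x_var')

print(tuple(x_fixed.shape))  # (1, 64, 9, 25)
print(tuple(x_var.shape))    # (None, 64, 9, 25)
```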
Kind regards, Arthur