Closed dmkr001 closed 3 years ago
It's explicitly stated here https://github.com/keras-team/keras/blob/369854e5da3f9c58eedd9b77e778cf87755bd6a4/keras/engine/topology.py#L1415 that the shape argument of Input should not include the batch size.
So how about
input_a = Input(shape=input_shape[1:])
input_b = Input(shape=input_shape[1:])
instead?
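To make the distinction concrete, here is a minimal NumPy sketch (the array name and shapes are taken from this thread; no Keras needed) of the per-sample shape that Input expects:

```python
import numpy as np

# Dummy data with the same shape as tr_pair0 in this thread
# (a batch of 10840 samples, each 1 x 28 x 28 x 10).
tr_pair0 = np.zeros((10840, 1, 28, 28, 10))

# Input(shape=...) expects the shape of ONE sample, without the
# leading batch dimension, so slice the batch axis off:
per_sample_shape = tr_pair0.shape[1:]
print(per_sample_shape)  # (1, 28, 28, 10)

# Keras prepends its own batch axis, so the resulting placeholder
# has rank len(per_sample_shape) + 1 = 5, which is what a Conv3D
# layer (channels_first) expects: (batch, channels, d1, d2, d3).
```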
Yeah, you are right.
I am trying to use a combination of a Siamese network and a 3D CNN, and I set up the base network like this:
def create_base_network():
and the input shape like this:
tr_pair0 = tr_pair0.reshape(10840, 1, 28, 28, 10)
tr_pair1 = tr_pair1.reshape(10840, 1, 28, 28, 10)
tr_y = tr_y.reshape(10840, 10)
input_shape = tr_pair0.shape
print(input_shape)
input_a = Input(shape=input_shape)
input_b = Input(shape=input_shape)
print(input_a)
Because we re-use the same instance base_network, the weights of the network will be shared across the two branches:
processed_a = base_network(input_a)
processed_b = base_network(input_b)
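The weight sharing itself can be illustrated without Keras; a plain-Python sketch (the BaseNetwork class is hypothetical, not the real base_network above) of why calling one instance on two inputs shares parameters:

```python
import numpy as np

# Hypothetical stand-in for base_network: the instance holds a
# single weight matrix, so every call reads the same parameters.
class BaseNetwork:
    def __init__(self, in_dim, out_dim, seed=0):
        rng = np.random.default_rng(seed)
        self.w = rng.standard_normal((in_dim, out_dim))

    def __call__(self, x):
        return x @ self.w  # both branches go through self.w

base_network = BaseNetwork(in_dim=4, out_dim=2)
input_a = np.ones((3, 4))
input_b = np.ones((3, 4))

# Same instance on both branches -> shared weights, so identical
# inputs produce identical outputs, and any weight update is seen
# by both branches at once.
processed_a = base_network(input_a)
processed_b = base_network(input_b)
```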
but the problem is:
Traceback (most recent call last):
  File "/home/xzh/PycharmProjects/test/3DCNN.py", line 144, in
    processed_a = base_network(input_a)
ValueError: Dimension must be 6 but is 5 for 'sequential_1/conv3d_1/transpose' (op: 'Transpose') with input shapes: [?,10840,1,28,28,10], [5].
I really do not know what the problem is. Can anyone help me? Thank you very much.
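For what it's worth, the rank mismatch in the traceback is consistent with the batch-size point made earlier in the thread; a small arithmetic sketch using the shapes posted above:

```python
# Passing the full data shape (batch size included) to Input:
input_shape = (10840, 1, 28, 28, 10)

# Keras prepends its own batch axis, so the placeholder becomes
# rank len(input_shape) + 1 = 6, i.e. [?, 10840, 1, 28, 28, 10] --
# exactly the shape named in the ValueError.
wrong_rank = len(input_shape) + 1

# With the batch axis sliced off, the rank is 5, which is what a
# Conv3D layer expects: (batch, channels, dim1, dim2, dim3).
right_rank = len(input_shape[1:]) + 1

print(wrong_rank, right_rank)  # 6 5
```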