naturomics / CapsLayer

CapsLayer: An advanced library for capsule theory
Apache License 2.0
361 stars 116 forks

Variable batch size problem #18

Closed 0x454447415244 closed 6 years ago

0x454447415244 commented 6 years ago

I'm trying to integrate this capsule implementation with an RNN. I'm using the latest version of TensorFlow.

I'm getting errors like this:

Traceback (most recent call last):
  File "train.py", line 46, in <module>
    Inputs = CAPSULE_NET(x_expanded, phase_train, 'CAPSULE_NET_1')
  File "/home/user/Testing/DeepLearning/Systems/Experimental/Capsule/capsulenet.py", line 43, in CAPSULE_NET
    primaryCapsules, activation = primaryCaps(conv1, method='logistic', filters=32, kernel_size=9, strides=2, out_caps_shape=[8, 1])
  File "/home/user/Testing/DeepLearning/Systems/Experimental/Capsule/layers.py", line 91, in primaryCaps
    pose = tf.reshape(pose, shape=pose_shape)
  File "/usr/lib/python2.7/site-packages/tensorflow/python/ops/gen_array_ops.py", line 3997, in reshape
    "Reshape", tensor=tensor, shape=shape, name=name)
  File "/usr/lib/python2.7/site-packages/tensorflow/python/framework/op_def_library.py", line 513, in _apply_op_helper
    raise err
TypeError: Failed to convert object of type <type 'list'> to Tensor. Contents: [None, 24, 24, 32, 8, 1]. Consider casting elements to a supported type.
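For what it's worth, the root cause is that pose.get_shape().as_list() returns a Python None for the dynamic batch dimension, and None cannot be cast to int32 when tf.reshape converts the shape list to a tensor. A minimal numpy sketch of the same failing conversion (numpy here stands in for the cast TensorFlow attempts internally):

```python
import numpy as np

# The static shape of the pose tensor: None marks the dynamic batch dim.
shape = [None, 24, 24, 32, 8, 1]

try:
    # Same kind of int32 conversion tf.reshape tries on the shape argument.
    np.array(shape, dtype=np.int32)
except TypeError as e:
    print("conversion failed:", e)
```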

This can be resolved by changing

pose_shape = pose.get_shape().as_list()[:3] + [filters] + out_caps_shape

to

pose_shape = np.array([-1] + pose.get_shape().as_list()[1:3] + [filters] + out_caps_shape, dtype=np.int32)

The tensor has the following shape: [None, 24, 24, 32, 8, 1]. The batch size is variable, which is why it is None. I fixed many of these problems in the code related to the batch size being a None TensorFlow dimension (or -1 as with numpy), but I'm now stuck on similar problems in the EM routing part.
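The general pattern behind the fix above is to replace the None batch dimension with -1 and let reshape infer it at run time. A minimal numpy sketch of that idea (the function name, filters, and out_caps_shape values are illustrative, mirroring the primaryCaps call in the traceback):

```python
import numpy as np

def reshape_primary_caps(pose, filters=32, out_caps_shape=(8, 1)):
    # Keep only the known spatial dims from the static shape; use -1 for
    # the batch dim so reshape infers it, whatever the batch size is.
    h, w = pose.shape[1], pose.shape[2]
    return pose.reshape([-1, h, w, filters, *out_caps_shape])

# Works unchanged for any batch size.
for batch in (1, 7):
    pose = np.zeros((batch, 24, 24, 32 * 8 * 1))
    print(reshape_primary_caps(pose).shape)
    # (1, 24, 24, 32, 8, 1) then (7, 24, 24, 32, 8, 1)
```

In TensorFlow the same trick applies: build the shape list from -1 plus the statically known dimensions, or use tf.shape(pose)[0] when the batch size is needed as a tensor.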

What do you guys think can be done to properly support variable batch size?

Thanks

Wowoho commented 6 years ago

I met the same problem.

naturomics commented 6 years ago

Fixed, thank you for your feedback.

SafaaDaf commented 4 years ago

How did you fix it, please?