Hi, I get the following error when using attention_module.py together with another Keras model of mine. The features come from `InceptionResNetV2` (via `from tensorflow.keras.applications.inception_resnet_v2 import InceptionResNetV2`) wrapped in a `TimeDistributed` layer:
Error:

```
TypeError: attach_attention_module() missing 1 required positional argument: 'net'
```
My code:

```python
base_model = TimeDistributed(
    InceptionResNetV2(
        weights="imagenet",
        pooling=None,
        include_top=False,
        input_shape=(270, 480, 3),
    ),
    input_shape=(self.time_steps, 270, 480, 3),
)(inputs)
base_model2 = TimeDistributed(attach_attention_module())(base_model)
```
NOTE: I have changed the second argument (`attention_module`) to default to `'cbam_block'`.
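For context, the error happens because `attach_attention_module` is a plain function whose first positional argument is the feature tensor `net`, so calling it with empty parentheses (as `attach_attention_module()`) fails before `TimeDistributed` is ever involved; `TimeDistributed` also expects a `Layer` instance, not a function's return value. A minimal stub (the real function builds Keras ops on `net`; this signature mirrors attention_module.py with the changed default mentioned above) reproduces the behavior:

```python
# Hypothetical stub mirroring the signature in attention_module.py;
# the real function applies CBAM/SE ops to the tensor `net`.
def attach_attention_module(net, attention_module='cbam_block'):
    return f"{attention_module}({net})"

# Calling with no arguments reproduces the reported TypeError:
try:
    attach_attention_module()
except TypeError as e:
    print(e)  # missing 1 required positional argument: 'net'

# The function must be given the feature tensor itself:
out = attach_attention_module("base_model_features")
print(out)  # cbam_block(base_model_features)
```

In other words, one option (an assumption about your intent, not the module's documented usage) is to apply the function directly to the tensor, e.g. `base_model2 = attach_attention_module(base_model)`, or to wrap the call in a `Lambda` layer if you need it inside `TimeDistributed`.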