kobiso / CBAM-keras

CBAM implementation on Keras
MIT License

TypeError: attach_attention_module() missing 1 required positional argument: 'net' #7

Open rohitsaluja22 opened 2 years ago

rohitsaluja22 commented 2 years ago

Hi, I am getting the following error when using attention_module.py with another Keras model of mine. The features come from InceptionResNetV2 (imported via `from tensorflow.keras.applications.inception_resnet_v2 import InceptionResNetV2`) wrapped in a `TimeDistributed` layer:

error:

```
TypeError: attach_attention_module() missing 1 required positional argument: 'net'
```

my code:

```python
base_model = TimeDistributed(
    InceptionResNetV2(
        weights="imagenet",
        pooling=None,
        include_top=False,
        input_shape=(270, 480, 3),
    ),
    input_shape=(self.time_steps, 270, 480, 3),
)(inputs)
base_model2 = TimeDistributed(attach_attention_module())(base_model)
```

NOTE: I have changed the 2nd argument (`attention_module`) to default to `'cbam_block'`.
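For reference, a minimal sketch of why the call fails: `attach_attention_module(attach_attention_module())` evaluates the inner call with zero arguments before `TimeDistributed` ever sees it, so Python raises the `TypeError` for the missing `net` parameter. The stub below is hypothetical (it only passes the tensor through, unlike the repo's real function, which builds SE/CBAM layers on it), but it reproduces the error and shows that passing the feature tensor directly avoids it:

```python
# Hypothetical stand-in for attach_attention_module from attention_module.py;
# the real function applies an SE or CBAM block to `net` and returns the result.
def attach_attention_module(net, attention_module='cbam_block'):
    return net  # pass-through stub for illustration only

# Writing TimeDistributed(attach_attention_module()) evaluates the call first,
# with no arguments, which raises the reported error:
try:
    attach_attention_module()
except TypeError as e:
    print(e)  # missing 1 required positional argument: 'net'

# Calling it on the feature tensor directly does not raise:
features = "placeholder_tensor"
out = attach_attention_module(features, 'cbam_block')
```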