zeynepdenizgundogan opened this issue 11 months ago
You can't use 19 for the 3rd dimension. Try 32.

If you want to use 19, then you need to use a custom stride size, something like: `((2, 2, 1), (2, 2, 1), (2, 2, 2), (2, 2, 2), (2, 2, 2))` or `((2, 2, 2), (2, 2, 2), (2, 2, 2), (2, 2, 2), (2, 2, 1))`.
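A minimal sketch of the downsampling arithmetic behind this, assuming each downsampling conv uses kernel size equal to its stride with VALID padding (patchify-style, as ConvNeXt does; the stage counts here are illustrative): each stage maps `d -> (d - k) // s + 1` per axis, so a depth of 19 collapses to 1 before the last stage, while 32 survives and a trailing stride of 1 on the depth axis avoids the collapse.

```python
def out_dim(d, k, s):
    # VALID convolution output size: floor((d - k) / s) + 1
    return (d - k) // s + 1

def trace(depth, strides):
    # Track one spatial axis through patchify-style stages (kernel == stride)
    dims = [depth]
    for s in strides:
        d = dims[-1]
        if d < s:
            dims.append(None)  # dimension exhausted -> "Negative dimension size"
            break
        dims.append(out_dim(d, s, s))
    return dims

print(trace(19, [2, 2, 2, 2]))  # [19, 9, 4, 2, 1] -- depth used up
print(trace(32, [2, 2, 2, 2]))  # [32, 16, 8, 4, 2]
print(trace(19, [2, 2, 2, 1]))  # [19, 9, 4, 2, 2] -- stride 1 keeps depth alive
```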
So is this the part that I should update?
```python
if type(stride_size) not in (tuple, list):
    stride_size = [
        (stride_size, stride_size, stride_size,),
        (stride_size, stride_size, stride_size,),
        (stride_size, stride_size, stride_size,),
        (stride_size, stride_size, stride_size,),
    ]
else:
    stride_size = list(stride_size)
```
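For what it's worth, the snippet above just broadcasts a scalar stride to one `(s, s, s)` tuple per downsampling stage and passes a tuple/list through as a list. A standalone sketch of that logic (the function name and `num_stages` parameter are mine, not the library's; the 4-stage default is taken from the snippet above, while most models use 5):

```python
def normalize_stride(stride_size, num_stages=4):
    # Scalar -> one (s, s, s) tuple per stage; tuple/list -> list, unchanged.
    if type(stride_size) not in (tuple, list):
        return [(stride_size,) * 3 for _ in range(num_stages)]
    return list(stride_size)

print(normalize_stride(2))
# [(2, 2, 2), (2, 2, 2), (2, 2, 2), (2, 2, 2)]
print(normalize_stride(((2, 2, 2), (2, 2, 2), (2, 2, 2), (2, 2, 1))))
# [(2, 2, 2), (2, 2, 2), (2, 2, 2), (2, 2, 1)]
```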
I don't know which code you use. This works for me:
```python
def tst_keras():
    # for keras
    from keras import __version__
    from keras import backend as K
    from keras import models
    from classification_models_3D.keras import Classifiers

    print('Keras version: {}'.format(__version__))
    include_top = False
    # use_weights = 'imagenet'
    use_weights = None
    list_of_models = [
        'convnext_tiny',
    ]
    for type in list_of_models:
        modelPoint, preprocess_input = Classifiers.get(type)
        model = modelPoint(input_shape=(128, 128, 32, 1), include_top=include_top, weights=use_weights)
        print(model.summary())
        K.clear_session()


if __name__ == '__main__':
    tst_keras()
```
Also this works for me:
```python
modelPoint, preprocess_input = Classifiers.get('convnext_tiny')
model = modelPoint(
    input_shape=(128, 128, 19, 1),
    include_top=include_top,
    stride_size=((2, 2, 2), (2, 2, 2), (2, 2, 2), (2, 2, 1)),
    weights=use_weights
)
```
Note: the number of strides for convnext is 4. Usually it's 5 for almost all other models.
Okay thank you so much!
I am trying to use convnext for images of size 128x128x19x1, but I get the following error. Why might that happen?

```
ValueError: Exception encountered when calling layer 'convnext_small_downsampling_conv_2' (type Conv3D).

Negative dimension size caused by subtracting 2 from 1 for '{{node convnext_small_downsampling_block_2/convnext_small_downsampling_conv_2/Conv3D}} = Conv3D[T=DT_FLOAT, data_format="NDHWC", dilations=[1, 1, 1, 1, 1], padding="VALID", strides=[1, 2, 2, 2, 1]](convnext_small_downsampling_block_2/convnext_small_downsampling_layernorm_2/batchnorm/add_1, convnext_small_downsampling_block_2/convnext_small_downsampling_conv_2/Conv3D/ReadVariableOp)' with input shapes: [?,8,8,1,384], [2,2,2,384,768].

Call arguments received by layer 'convnext_small_downsampling_conv_2' (type Conv3D):
  • inputs=tf.Tensor(shape=(None, 8, 8, 1, 384), dtype=float32)
```
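The shapes in that error can be reproduced with simple VALID-conv arithmetic, assuming a stride-4 patchify stem followed by stride-2 downsampling convs with kernel size equal to stride (my reading of the default ConvNeXt layout; I haven't verified the exact kernel sizes in the library):

```python
def out_dim(d, k):
    # VALID conv with kernel == stride: floor((d - k) / k) + 1
    return (d - k) // k + 1

h = w = 128
depth = 19
# stem (stride 4), then downsampling convs 0 and 1 (stride 2 each)
for k in (4, 2, 2):
    h, w, depth = out_dim(h, k), out_dim(w, k), out_dim(depth, k)
print(h, w, depth)  # 8 8 1 -> matches the [?, 8, 8, 1, 384] input above
# downsampling_conv_2 then tries a 2x2x2 VALID conv on depth 1:
print(depth - 2)  # -1 -> "Negative dimension size caused by subtracting 2 from 1"
```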