keras-team / keras-preprocessing

Utilities for working with image data, text data, and sequence data.

ImageDataGenerator Issue with Transforming Images and Masks Together #274

Open Vaevin opened 4 years ago

Vaevin commented 4 years ago

https://gist.github.com/joshleiph/c4fac7f1f6e51c3739c41d2408395d70

I am having issues with ImageDataGenerator when transforming images and masks together. I am attempting to do semantic segmentation, and I have a folder of ultrasound images and their corresponding annotation masks. I created an ImageDataGenerator for both the images and the masks, following the example given in the Keras documentation:

import os

from keras.preprocessing.image import ImageDataGenerator

image_size = 128
epochs = 5
batch_size = 4

seed = 1

image_datagen = ImageDataGenerator(rescale=1./255)  # Rescale all images by 1/255
mask_datagen = ImageDataGenerator(rescale=1./255)

image_datagen.fit(frame_images, augment=True, seed=seed)
mask_datagen.fit(mask_images, augment=True, seed=seed)

train_image_generator = image_datagen.flow_from_directory(
    os.path.dirname(train_frames_fol),  # Our target directory
    target_size=(image_size, image_size),
    batch_size=batch_size,
    class_mode='binary',
    seed=seed)

train_mask_generator = mask_datagen.flow_from_directory(
    os.path.dirname(train_mask_fol),
    target_size=(image_size, image_size),
    batch_size=batch_size,
    class_mode='binary',
    seed=seed)

val_image_generator = image_datagen.flow_from_directory(
    os.path.dirname(val_frames_fol),
    target_size=(image_size, image_size),
    batch_size=batch_size,
    class_mode='binary',
    seed=seed)

val_mask_generator = mask_datagen.flow_from_directory(
    os.path.dirname(val_mask_fol),
    target_size=(image_size, image_size),
    batch_size=batch_size,
    class_mode='binary',
    seed=seed)
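For reference, flow_from_directory treats each subfolder of the directory it is given as a class, so passing os.path.dirname(train_frames_fol) makes the frames folder itself act as the single class folder. A tiny sketch with a made-up path (not my real one), just to show the layout being relied on:

# Hypothetical example path, purely to illustrate the layout flow_from_directory expects:
#   data/train_frames/          <- what actually gets passed (the parent directory)
#   data/train_frames/frames/   <- single "class" subfolder; the ultrasound images live in here
example_frames_fol = 'data/train_frames/frames'   # made-up counterpart of train_frames_fol
print(os.path.dirname(example_frames_fol))        # -> 'data/train_frames'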

train_generator = (pair for pair in zip(train_image_generator, train_mask_generator))
val_generator = (pair for pair in zip(val_image_generator, val_mask_generator))

This leaves me with my two generators, train_generator and val_generator. I then build my model (in this case a U-Net; a rough sketch of what I mean is below) and finally run the training step, which is where the error occurs.
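A minimal sketch of the kind of single-output U-Net I am using (a hypothetical stand-in with made-up layer sizes, not my exact model):

from keras.models import Model
from keras.layers import Input, Conv2D, MaxPooling2D, UpSampling2D, concatenate

def build_tiny_unet(image_size=128):
    # Encoder
    inputs = Input((image_size, image_size, 3))
    c1 = Conv2D(16, 3, activation='relu', padding='same')(inputs)
    p1 = MaxPooling2D(2)(c1)
    c2 = Conv2D(32, 3, activation='relu', padding='same')(p1)
    p2 = MaxPooling2D(2)(c2)

    # Bottleneck
    b = Conv2D(64, 3, activation='relu', padding='same')(p2)

    # Decoder with skip connections
    u2 = concatenate([UpSampling2D(2)(b), c2])
    c3 = Conv2D(32, 3, activation='relu', padding='same')(u2)
    u1 = concatenate([UpSampling2D(2)(c3), c1])
    c4 = Conv2D(16, 3, activation='relu', padding='same')(u1)

    # Single-channel sigmoid mask, i.e. the model expects one target array per batch
    outputs = Conv2D(1, 1, activation='sigmoid')(c4)

    model = Model(inputs, outputs)
    model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
    return model

model = build_tiny_unet(image_size)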

# %% -------------------- Training the model ------------------------------------

NO_OF_TRAINING_IMAGES = len(os.listdir(train_frames_fol))
NO_OF_VAL_IMAGES = len(os.listdir(val_frames_fol))

train_steps = NO_OF_TRAINING_IMAGES // batch_size
valid_steps = NO_OF_VAL_IMAGES // batch_size

# %%

model.fit_generator(train_generator,
                    validation_data=val_generator,
                    steps_per_epoch=train_steps,
                    validation_steps=valid_steps,
                    epochs=epochs)

After running this, I get the error: Error when checking model target: the list of Numpy arrays that you are passing to your model is not the size the model expected. Expected to see 1 array(s), but instead got the following list of 2 arrays:

I am not sure what is going on here. I can add more code if it helps to clarify.
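In case it is useful, here is a quick snippet (using the generators defined above) that I can run to inspect what one step of the combined generator actually hands to fit_generator:

# Pull a single batch pair off the combined generator and look at its structure.
image_batch, mask_batch = next(train_generator)

# Each half of the pair may itself be an (images, labels) tuple, since
# class_mode='binary' is set on the flow_from_directory iterators.
print(type(image_batch), len(image_batch))
print(image_batch[0].shape, image_batch[1].shape)   # image array, image class labels
print(mask_batch[0].shape, mask_batch[1].shape)     # mask array, mask class labels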