SrijanShovit / HealthLearning

A repo comprising various Machine Learning and Deep Learning projects in the healthcare domain.

Add polyp segmentation #169

Open Kaushal-11 opened 3 weeks ago

Kaushal-11 commented 3 weeks ago

- Resizing images to a fixed size (256x256)
- Normalizing pixel values (0-255 to 0-1 range)
- Using metrics like accuracy, recall, and precision
- Logging training progress and metrics to a CSV file
- Applying a threshold (0.5) to convert probabilities to binary masks
- Calculating evaluation metrics: Accuracy, F1 Score, Jaccard Index (IoU), Recall, Precision
- Visualizing results: [Original image | Ground truth mask | Predicted mask]
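
A minimal sketch of this pipeline (the OpenCV/NumPy helpers below are my assumptions about the implementation, not code from the PR):

```python
import numpy as np
import cv2

def preprocess(image_path, mask_path, size=(256, 256)):
    """Resize image/mask to 256x256 and scale pixel values to [0, 1]."""
    image = cv2.imread(image_path, cv2.IMREAD_COLOR)
    image = cv2.resize(image, size) / 255.0
    mask = cv2.imread(mask_path, cv2.IMREAD_GRAYSCALE)
    mask = cv2.resize(mask, size) / 255.0
    return image.astype(np.float32), mask.astype(np.float32)

def evaluate(y_true, y_prob, threshold=0.5):
    """Binarize predictions at the 0.5 threshold and compute the reported metrics."""
    y_pred = (y_prob > threshold).astype(np.uint8).flatten()
    y_true = (y_true > threshold).astype(np.uint8).flatten()
    tp = np.sum((y_pred == 1) & (y_true == 1))
    fp = np.sum((y_pred == 1) & (y_true == 0))
    fn = np.sum((y_pred == 0) & (y_true == 1))
    tn = np.sum((y_pred == 0) & (y_true == 0))
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    precision = tp / (tp + fp + 1e-7)
    recall = tp / (tp + fn + 1e-7)
    f1 = 2 * precision * recall / (precision + recall + 1e-7)
    jaccard = tp / (tp + fp + fn + 1e-7)  # IoU for the positive class
    return accuracy, f1, jaccard, recall, precision
```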

SrijanShovit commented 2 weeks ago

Modify it for different activation functions and optimizers

Kaushal-11 commented 2 weeks ago

Other activation functions and optimizers did not yield results as good in terms of segmentation accuracy and training efficiency.

There is a clear reason for using this combination in such a complex model: in the hidden layers, ReLU mitigates the vanishing gradient problem, and in the output layer, Sigmoid is suitable for binary segmentation because it maps each pixel to a probability in [0, 1].

The Adam optimizer is a good fit because it dynamically adjusts the learning rate during training.

So this combination is giving me good accuracy for this segmentation task.
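
For concreteness, a minimal Keras sketch of this combination (the tiny architecture and hyperparameters here are illustrative assumptions, not the actual U-Net from this PR):

```python
from tensorflow import keras
from tensorflow.keras import layers

def tiny_segmenter(input_shape=(256, 256, 3)):
    inputs = keras.Input(shape=input_shape)
    # ReLU in the hidden layers avoids saturating gradients
    x = layers.Conv2D(32, 3, padding="same", activation="relu")(inputs)
    x = layers.Conv2D(32, 3, padding="same", activation="relu")(x)
    # Sigmoid output gives per-pixel probabilities for the binary mask
    outputs = layers.Conv2D(1, 1, activation="sigmoid")(x)
    return keras.Model(inputs, outputs)

model = tiny_segmenter()
# Adam adapts per-parameter step sizes during training
model.compile(optimizer=keras.optimizers.Adam(learning_rate=1e-4),
              loss="binary_crossentropy",
              metrics=["accuracy", keras.metrics.Recall(), keras.metrics.Precision()])
```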

SrijanShovit commented 1 week ago

Can you try the K-Fold method, tune the early-stopping epochs, and test the SELU and Mish activations and the Lion optimizer?

Run these experiments with several models, then come to a conclusion.
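
Such an experiment could look like the sketch below, assuming scikit-learn's `KFold` and a Keras version that ships `keras.optimizers.Lion`; `build_model` is a hypothetical factory for whichever architecture is under test:

```python
import numpy as np
from sklearn.model_selection import KFold
from tensorflow import keras

def run_fold_experiment(images, masks, activation="selu", n_splits=5):
    """Cross-validate one activation/optimizer combination."""
    kfold = KFold(n_splits=n_splits, shuffle=True, random_state=42)
    scores = []
    for fold, (train_idx, val_idx) in enumerate(kfold.split(images)):
        model = build_model(activation=activation)  # hypothetical model factory
        model.compile(optimizer=keras.optimizers.Lion(learning_rate=1e-4),
                      loss="binary_crossentropy", metrics=["accuracy"])
        callbacks = [
            # Stop once val_loss plateaus; the patience value is a tunable guess
            keras.callbacks.EarlyStopping(monitor="val_loss", patience=5,
                                          restore_best_weights=True),
            keras.callbacks.CSVLogger(f"fold_{fold}_log.csv"),
        ]
        model.fit(images[train_idx], masks[train_idx],
                  validation_data=(images[val_idx], masks[val_idx]),
                  epochs=50, callbacks=callbacks, verbose=0)
        scores.append(model.evaluate(images[val_idx], masks[val_idx], verbose=0))
    return np.mean(scores, axis=0)  # average loss/metrics across folds
```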

Kaushal-11 commented 1 week ago

I implemented my model with SELU activation and the Lion optimizer, using K-Fold validation and EarlyStopping callbacks, but it does not give good results.

Here is a comparison of the results:

| Metric | SELU + Lion + K-Fold + EarlyStopping | ReLU + Adam |
| --- | --- | --- |
| Accuracy | 0.92316 | 0.94052 |
| F1 Score | 0.34482 | 0.55769 |
| Jaccard | 0.25670 | 0.44812 |
| Recall | 0.34664 | 0.60501 |
| Precision | 0.52644 | 0.61600 |

Result of (ReLU + Adam)

*(image: original image | ground truth mask | predicted mask)*

Result of (SELU + Lion)

*(image: original image | ground truth mask | predicted mask)*

Kaushal-11 commented 1 week ago

I performed all possible combinations and experimented with the hyperparameters. I can conclude that ReLU often provides more stable and faster convergence during training compared to SELU, Sigmoid, Mish, and tanh. The Adam optimizer, which works well with ReLU, provided better optimization and generalization in this scenario.

SrijanShovit commented 1 week ago

> I performed all possible combinations and experimented with the hyperparameters. I can conclude that ReLU often provides more stable and faster convergence during training compared to SELU, Sigmoid, Mish, and tanh. The Adam optimizer, which works well with ReLU, provided better optimization and generalization in this scenario.

I don't agree. I have seen Mish and SELU completely outperform ReLU sometimes. Anyway, have you tried other augmentation strategies or a different model?

Kaushal-11 commented 1 week ago

I posted the metric results and the final prediction image derived using SELU above. Unfortunately, SELU and Mish do not work well on this problem.

Kaushal-11 commented 1 week ago

Yes, I have tried data augmentation locally, and I have also implemented other models such as DeepLabV3+ and ResUNet.
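
For context, a minimal sketch of segmentation-friendly augmentation with Keras preprocessing layers (the exact transforms I used locally are not shown here, so these are illustrative):

```python
import tensorflow as tf
from tensorflow.keras import layers

# Geometric transforms must hit image and mask identically, so stack them
# along the channel axis, augment once, then split back apart.
augment = tf.keras.Sequential([
    layers.RandomFlip("horizontal_and_vertical"),
    layers.RandomRotation(0.1),
])

def augment_pair(image, mask):
    stacked = tf.concat([image, mask], axis=-1)   # (H, W, 3+1)
    stacked = augment(stacked, training=True)
    return stacked[..., :3], stacked[..., 3:]
```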

Kaushal-11 commented 1 week ago

DeepLabV3+'s results are not as good as U-Net's. ResUNet's results are not great either, but they are slightly better than the U-Net model's.

Kaushal-11 commented 4 days ago

Hey @SrijanShovit,

Could you respond to my message?