SrijanShovit / HealthLearning

A repo comprising various Machine Learning and Deep Learning projects in the healthcare domain.

Add polyp segmentation #169

Closed Kaushal-11 closed 4 months ago

Kaushal-11 commented 5 months ago

- Resizing images to a fixed size (256x256)
- Normalizing pixel values (0-255 to 0-1 range)
- Using metrics like accuracy, recall, and precision
- Logging training progress and metrics to a CSV file
- Applying a threshold (0.5) to convert probabilities to binary masks
- Calculating evaluation metrics: Accuracy, F1 Score, Jaccard Index (IoU), Recall, Precision
- Visualizing results: [Original image | Ground truth mask | Predicted mask]
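For reference, a minimal sketch of this preprocessing and evaluation pipeline, assuming OpenCV, NumPy, and scikit-learn (function names like `preprocess` and `evaluate_mask` are illustrative, not the PR's actual code):

```python
import cv2
import numpy as np
from sklearn.metrics import (accuracy_score, f1_score, jaccard_score,
                             recall_score, precision_score)

IMG_SIZE = (256, 256)  # fixed input size used in the PR

def preprocess(image_path):
    """Resize to 256x256 and scale pixel values from 0-255 to 0-1."""
    img = cv2.imread(image_path, cv2.IMREAD_COLOR)
    img = cv2.resize(img, IMG_SIZE)
    return img.astype(np.float32) / 255.0

def evaluate_mask(y_prob, y_true, threshold=0.5):
    """Threshold predicted probabilities to a binary mask, then compute metrics."""
    y_pred = (y_prob > threshold).astype(np.uint8).flatten()
    y_true = y_true.astype(np.uint8).flatten()
    return {
        "Accuracy": accuracy_score(y_true, y_pred),
        "F1 Score": f1_score(y_true, y_pred),
        "Jaccard (IoU)": jaccard_score(y_true, y_pred),
        "Recall": recall_score(y_true, y_pred),
        "Precision": precision_score(y_true, y_pred),
    }
```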

SrijanShovit commented 5 months ago

Modify it for different activation functions and optimizers

Kaushal-11 commented 5 months ago

Other activation functions and optimizers did not yield results as good in terms of segmentation accuracy and training efficiency.

There is a clear reason for this choice in such a complex model: in the hidden layers, ReLU mitigates the vanishing gradient problem, while in the output layer, Sigmoid is suitable for binary segmentation tasks.

The Adam optimizer is a good fit because it dynamically adjusts the learning rate during training.

So this combination gives me good accuracy for this segmentation task.
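A minimal Keras sketch of this activation/optimizer combination; the tiny encoder-decoder here is a stand-in for the actual U-Net, not the PR's code:

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Toy encoder-decoder stand-in for the actual U-Net (illustrative only).
inputs = layers.Input(shape=(256, 256, 3))
x = layers.Conv2D(32, 3, padding="same", activation="relu")(inputs)  # ReLU in hidden layers
x = layers.MaxPooling2D()(x)
x = layers.Conv2D(64, 3, padding="same", activation="relu")(x)
x = layers.UpSampling2D()(x)
outputs = layers.Conv2D(1, 1, activation="sigmoid")(x)  # Sigmoid for binary masks

model = models.Model(inputs, outputs)
model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=1e-4),  # adaptive learning rates
    loss="binary_crossentropy",
    metrics=["accuracy", tf.keras.metrics.Recall(), tf.keras.metrics.Precision()],
)
```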

SrijanShovit commented 5 months ago

Can you try the K-Fold method, tune the early stopping epochs, and try the SELU and Mish activations and the Lion optimizer?

Experiment with these across several models, then come to a conclusion.

Kaushal-11 commented 5 months ago

I implemented my model with SELU activation and the Lion optimizer, using K-Fold validation and EarlyStopping callbacks, but it does not give good results.

Here is a comparison between the results:

| Metric | SELU + Lion + K-Fold + EarlyStopping | ReLU + Adam |
| --- | --- | --- |
| Accuracy | 0.92316 | 0.94052 |
| F1 Score | 0.34482 | 0.55769 |
| Jaccard | 0.25670 | 0.44812 |
| Recall | 0.34664 | 0.60501 |
| Precision | 0.52644 | 0.61600 |
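A sketch of how such a SELU + Lion + K-Fold + EarlyStopping run could be wired up, assuming a recent TensorFlow/Keras where `tf.keras.optimizers.Lion` is available; `build_model`, `images`, and `masks` are illustrative stand-ins for the PR's actual model builder and data arrays:

```python
import tensorflow as tf
from tensorflow.keras import layers, models
from sklearn.model_selection import KFold

def build_model(activation="selu"):
    # Tiny stand-in for the actual U-Net builder (illustrative only).
    inputs = layers.Input(shape=(256, 256, 3))
    x = layers.Conv2D(32, 3, padding="same", activation=activation,
                      kernel_initializer="lecun_normal")(inputs)  # lecun_normal pairs with SELU
    outputs = layers.Conv2D(1, 1, activation="sigmoid")(x)
    return models.Model(inputs, outputs)

# images, masks: NumPy arrays of shape (N, 256, 256, 3) and (N, 256, 256, 1)
early_stop = tf.keras.callbacks.EarlyStopping(
    monitor="val_loss", patience=5, restore_best_weights=True)

kfold = KFold(n_splits=5, shuffle=True, random_state=42)
for fold, (train_idx, val_idx) in enumerate(kfold.split(images)):
    model = build_model(activation="selu")
    model.compile(
        optimizer=tf.keras.optimizers.Lion(learning_rate=1e-4),
        loss="binary_crossentropy",
        metrics=["accuracy"])
    model.fit(images[train_idx], masks[train_idx],
              validation_data=(images[val_idx], masks[val_idx]),
              epochs=50, batch_size=8, callbacks=[early_stop])
```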

Result of (ReLU + Adam):

[image]

Result of (SeLU + Lion):

[image]

Kaushal-11 commented 5 months ago

I tried all possible combinations and played with the hyperparameters. I can conclude that ReLU often provides more stable and faster convergence during training compared to SELU, Sigmoid, Mish, and tanh. The Adam optimizer, which works well with ReLU, provided better optimization and generalization in this scenario.

SrijanShovit commented 5 months ago

> I tried all possible combinations and played with the hyperparameters. I can conclude that ReLU often provides more stable and faster convergence during training compared to SELU, Sigmoid, Mish, and tanh. The Adam optimizer, which works well with ReLU, provided better optimization and generalization in this scenario.

I don't agree. I have seen Mish and SELU completely outperform ReLU sometimes. Anyway, have you tried other augmentation strategies or any different model?

Kaushal-11 commented 5 months ago

I posted the metric results and the final prediction image derived using SELU above. Unfortunately, SELU and Mish do not work well on this problem.

Kaushal-11 commented 5 months ago

Yes, I have tried data augmentation locally, and I have also implemented other models like DeepLabV3+ and a ResUNet model.
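One common way to augment segmentation data so image and mask stay aligned, assuming the `albumentations` library (`image` and `mask` here are NumPy arrays; this is a sketch, not the PR's exact pipeline):

```python
import albumentations as A

# Apply the same spatial transforms to image and mask so the ground truth stays aligned.
transform = A.Compose([
    A.HorizontalFlip(p=0.5),
    A.VerticalFlip(p=0.5),
    A.Rotate(limit=30, p=0.5),
    A.RandomBrightnessContrast(p=0.3),  # photometric transform, applied to the image only
])

augmented = transform(image=image, mask=mask)
aug_image, aug_mask = augmented["image"], augmented["mask"]
```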

Kaushal-11 commented 5 months ago

The DeepLabV3+ model's results are not as good as U-Net's. ResUNet's results are not great either, but they are slightly better than the U-Net model's.
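For context, the main architectural difference in ResUNet is the residual block in the encoder/decoder; a minimal Keras sketch of such a block (illustrative, not the contributor's implementation):

```python
from tensorflow.keras import layers

def residual_block(x, filters):
    """Conv block with an identity shortcut, as used in ResUNet."""
    shortcut = layers.Conv2D(filters, 1, padding="same")(x)  # 1x1 conv to match channels
    y = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
    y = layers.Conv2D(filters, 3, padding="same")(y)
    y = layers.Add()([shortcut, y])  # residual connection
    return layers.Activation("relu")(y)
```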

Kaushal-11 commented 5 months ago

Hey @SrijanShovit,

Can you answer my message?

sanjay-kv commented 4 months ago

@SrijanShovit I understand the above PR doesn't meet your expectations; let the contributor know how to proceed with it. It's common in open source for PRs to get rejected, so let the contributor know whether we are proceeding with it or not.

github-actions[bot] commented 4 months ago

This issue has been automatically closed because it has been inactive for more than 7 days. If you believe this is still relevant, feel free to reopen it or create a new one. Thank you!

sanjay-kv commented 4 months ago

@SrijanShovit why is there no reply? We can help the contributor if you are not planning to merge.