SrijanShovit / HealthLearning

A repo comprising various Machine Learning and Deep Learning projects in the healthcare domain.

Brain Tumor MRI Classification | Step 2 Cross Validation Techniques and Hyperparameter Tuning #136

Closed by theiturhs 4 weeks ago

theiturhs commented 1 month ago

Description

This PR adds cross-validation techniques and hyperparameter tuning. They are implemented as follows.

Cross-Validation Techniques:

Different techniques for splitting the dataset into training and testing sets are implemented:

Along with average overall accuracies, class-wise accuracies are calculated.

Out of these techniques, K-Fold CV gives the best overall and class-wise accuracy; a sketch of this comparison step is shown below.
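
For reference, a minimal sketch of the K-Fold comparison step, assuming image paths and labels are available as arrays and that a hypothetical `train_and_evaluate()` helper fits the model on a fold and returns predictions for its test split (none of these names are from the notebook itself):

```python
# Minimal K-Fold comparison sketch: overall and class-wise accuracy per fold.
import numpy as np
from sklearn.model_selection import KFold
from sklearn.metrics import confusion_matrix

def kfold_class_wise_accuracy(image_paths, labels, n_splits=5, n_classes=4):
    """Run K-Fold CV and report mean overall and mean class-wise accuracy."""
    kf = KFold(n_splits=n_splits, shuffle=True, random_state=42)
    overall, class_wise = [], []
    for train_idx, test_idx in kf.split(image_paths):
        # train_and_evaluate is a hypothetical helper: it trains the CNN on the
        # training split and returns predicted labels for the test split.
        y_true = labels[test_idx]
        y_pred = train_and_evaluate(image_paths[train_idx], labels[train_idx],
                                    image_paths[test_idx])
        overall.append((y_pred == y_true).mean())
        cm = confusion_matrix(y_true, y_pred, labels=list(range(n_classes)))
        class_wise.append(cm.diagonal() / cm.sum(axis=1))  # per-class recall
    return np.mean(overall), np.mean(class_wise, axis=0)
```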

Hyper-parameter Tuning

For this, the FastAI Learner module has been used, which provides a way to create and fine-tune CNN models: we can specify a pre-trained model architecture and fine-tune it on the dataset.
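
A minimal sketch of this FastAI fine-tuning flow is shown below; the dataset path, the resnet34 backbone, and the epoch count are illustrative assumptions, not the exact settings from the notebook:

```python
from fastai.vision.all import *

# Load the dataset from an image folder (assumed layout: one sub-folder per tumour class).
dls = ImageDataLoaders.from_folder(
    "brain_tumor_mri/",   # hypothetical dataset path
    valid_pct=0.2,
    seed=42,
    item_tfms=Resize(224),
)

# Create a Learner from a pre-trained architecture and fine-tune it on the MRI data.
learn = vision_learner(dls, resnet34, metrics=accuracy)
learn.fine_tune(5)  # epoch count is illustrative only
```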

The following implementations are done:

Related Issues

Fixes #96

Testing Instructions

The necessary modules are already specified in the notebook itself. The work was carried out on Kaggle with the accelerator set to GPU T4 x2.

Checklist

Make sure to check off all the items before submitting. Mark with [x] if done.

SrijanShovit commented 1 month ago

@theiturhs Looks like you have used a very low number of epochs. Why so?

theiturhs commented 1 month ago

Yes. The number of epochs for deciding the cross-validation technique was set to 3 purely for comparison purposes, to find the best technique. In hyper-parameter tuning, the best parameters are searched for, and these include the number of epochs, with the range provided being 5-15. The obtained results are not the final ones; they were for comparison and for selecting the best method, which we will run at the end on the entire dataset while training the model. A rough sketch of the search is shown below.
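
For context, a rough sketch of how an epoch range of 5-15 could sit inside a simple random search; the search space, trial count, and the `objective()` wrapper are illustrative assumptions rather than the notebook's exact code:

```python
import random

# Hypothetical search space; only the 5-15 epoch range comes from the discussion above.
search_space = {
    "epochs": list(range(5, 16)),
    "learning_rate": [1e-4, 3e-4, 1e-3, 3e-3],
    "batch_size": [16, 32, 64],
}

def sample_config():
    """Draw one random configuration from the search space."""
    return {name: random.choice(values) for name, values in search_space.items()}

best_score, best_config = 0.0, None
for _ in range(20):                  # number of random trials is arbitrary
    config = sample_config()
    score = objective(config)        # hypothetical: trains a learner and returns validation accuracy
    if score > best_score:
        best_score, best_config = score, config
```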

@SrijanShovit

SrijanShovit commented 4 weeks ago

Hmmm... still good work, I appreciate it. Let me merge this. Go ahead with the same thing for prediction and plots.