This repository contains the final project of the Complex Data Mining course at Unicamp (MDC013).
Knee osteoarthritis is a pathology that occurs due to wear on the cartilage that protects the bones in this region from friction and impacts.
Medical imaging procedures such as X-rays or magnetic resonance imaging are needed to identify this pathology; they make it possible to assess the loss of joint spacing and thus the severity of the disease.
The severity of osteoarthritis is classified into 5 levels based on the KL (Kellgren-Lawrence) score, from healthy to severe: the greater the severity, the smaller the joint spacing.
The following image shows the different levels, taken from the Knee Osteoarthritis Dataset with Severity Grading.
The purpose of this project is to correctly classify the severity of osteoarthritis based on X-ray images.
.
├── README.md
├── app
│   ├── app.py
│   └── img
├── assets
├── dataset
│   ├── test
│   ├── train
│   └── val
├── environment.yml
└── src
    ├── 01_data_preparation.ipynb
    ├── 02_ensemble_models.ipynb
    ├── 02_model_inception_resnet_v2.ipynb
    ├── 02_model_resnet50.ipynb
    ├── 02_model_xception.ipynb
    ├── 03_best_model_on_test_xception.ipynb
    └── models
        └── model_Xception_ft.hdf5
How does the Web App that predicts the knee osteoarthritis grade using Deep Learning and Streamlit work?
(base)$: git clone git@github.com:mafda/knee_OA_dl_app.git
(base)$: cd knee_OA_dl_app
Create the conda environment
(base)$: conda env create -f environment.yml
Activate the environment
(base)$: conda activate knee_dl_app
Download the model_Xception_ft.hdf5 model from here to run the application. Create the models folder and copy the model into it.
(knee_dl_app)$: mkdir src/models
Download the dataset from Knee Osteoarthritis Dataset with Severity Grading to train the model and test the application. Create the dataset folder and copy the data into it.
(knee_dl_app)$: mkdir dataset
Run it:
(knee_dl_app)$: streamlit run app/app.py
As soon as you run the script, a local Streamlit server will spin up, and your app will open in a new tab in your default web browser.
Or you can navigate to http://localhost:8501.
The following methodology has been proposed to correctly classify the degree of osteoarthritis based on X-ray images:
The dataset consists of approximately 8,000 knee X-ray images obtained from the Knee Osteoarthritis Dataset with Severity Grading.
The bar chart shows the image distribution of the 5 grades (classes) for the training, validation, and test sets, and the pie chart shows the average percentage of data per class. The dataset is therefore imbalanced.
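As a quick sanity check of that imbalance, a minimal sketch like the following (not taken from the notebooks; it assumes the Kaggle folder layout `dataset/{train,val,test}/{0..4}` with PNG images) counts the images per grade and split:

```python
# Hedged sketch: count images per grade and split, assuming the Kaggle layout
# dataset/{train,val,test}/{0..4} with PNG files.
from pathlib import Path

import pandas as pd

DATASET_DIR = Path("dataset")
GRADES = ["0", "1", "2", "3", "4"]  # healthy .. severe

counts = {
    split: {g: len(list((DATASET_DIR / split / g).glob("*.png"))) for g in GRADES}
    for split in ("train", "val", "test")
}
df = pd.DataFrame(counts)
print(df)                                # images per grade (rows) and split (columns)
print(df.sum(axis=1) / df.values.sum())  # overall share of each grade
```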
Three strategies were implemented to reduce the impact that class imbalance can have on the models; see the 01_data_preparation.ipynb notebook for details, and the sketch below for one common example.
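One common way to mitigate class imbalance, shown here only as a hedged illustration and not necessarily one of the exact strategies used in the notebook, is to pass class weights inversely proportional to class frequency to Keras during training:

```python
# Hedged sketch: class weights inversely proportional to class frequency,
# one common imbalance mitigation (assumed; see the notebook for the
# strategies actually used in this project).
import numpy as np
from sklearn.utils.class_weight import compute_class_weight

# y_train holds the grade (0-4) of every training image; a toy array is used
# here in place of, e.g., the `classes` attribute of a Keras image generator.
y_train = np.array([0, 0, 0, 0, 1, 1, 2, 2, 3, 4])

weights = compute_class_weight(
    class_weight="balanced", classes=np.unique(y_train), y=y_train
)
class_weight_dict = dict(enumerate(weights))
print(class_weight_dict)
# model.fit(..., class_weight=class_weight_dict)
```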
Three pre-trained networks were chosen: Xception, ResNet-50, and Inception-ResNet-v2.
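As a rough illustration of the transfer-learning setup, the sketch below fine-tunes Xception with Keras; the input size, classification head, and learning rates are assumptions for the example, not necessarily those used in 02_model_xception.ipynb:

```python
# Minimal transfer-learning sketch with Keras (illustrative hyperparameters,
# not necessarily those used in the project's notebooks).
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 5  # healthy, doubtful, minimal, moderate, severe

base = tf.keras.applications.Xception(
    weights="imagenet", include_top=False, input_shape=(224, 224, 3)
)
base.trainable = False  # stage 1: train only the new classification head

model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dropout(0.3),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(
    optimizer=tf.keras.optimizers.Adam(1e-3),
    loss="categorical_crossentropy",
    metrics=["accuracy"],
)
# model.fit(train_ds, validation_data=val_ds, epochs=..., class_weight=...)

# Stage 2 (fine-tuning): unfreeze the base and continue with a lower learning rate.
base.trainable = True
model.compile(
    optimizer=tf.keras.optimizers.Adam(1e-5),
    loss="categorical_crossentropy",
    metrics=["accuracy"],
)
# model.fit(train_ds, validation_data=val_ds, epochs=...)
```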
The following table summarizes the partial results obtained with the different pre-trained networks after fine-tuning. Our metric is balanced accuracy. Models were trained on an Apple M1 Pro chip with an 8-core CPU, 14-core GPU, and 16-core Neural Engine.
Model | Balanced Accuracy | Execution Time |
---|---|---|
Xception fine-tuning | 67% | 68 min |
ResNet-50 fine-tuning | 65% | 80 min |
Inception-ResNet-v2 fine-tuning | 64% | 56 min |
Highlighting the highest per-class success rate of each model, we have:
Three ensemble approaches were applied to the previous results:
Model | Balanced Accuracy | Execution Time |
---|---|---|
Ensemble mean | 68.63% | 16 s |
Ensemble accuracy | 68.48% | 16 s |
Ensemble f1 | 68.69% | 16 s |
The three ensembles produced similar results, but we selected the f1 ensemble.
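Conceptually, each ensemble averages the softmax outputs of the three models, optionally weighting each model by a validation score (e.g. accuracy or f1). The sketch below is a hedged illustration with placeholder variable names and weights, not the project's exact implementation:

```python
# Hedged ensemble sketch: average (or weighted-average) the softmax outputs
# of the three models. Variable names and weights are placeholders.
import numpy as np

def ensemble_mean(probs_list):
    """Average the per-class probabilities of several models."""
    return np.mean(probs_list, axis=0)

def ensemble_weighted(probs_list, weights):
    """Weighted average, e.g. weights taken from each model's validation score."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return np.tensordot(w, np.stack(probs_list), axes=1)

# Usage (placeholder names): probs_* are (n_samples, 5) arrays from model.predict()
# probs = ensemble_weighted([probs_xception, probs_resnet50, probs_irv2],
#                           weights=[0.69, 0.67, 0.66])
# y_pred = probs.argmax(axis=1)
```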
We evaluated the best model on the test set, obtaining a balanced accuracy of 71%; in the confusion matrix, the moderate and severe classes stand out.
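For reference, a hedged sketch of this evaluation with scikit-learn (toy labels shown in place of the real test-set predictions):

```python
# Illustrative evaluation sketch: balanced accuracy and confusion matrix.
import numpy as np
from sklearn.metrics import balanced_accuracy_score, confusion_matrix

# Toy arrays in place of the real test set (y_true: ground-truth grades 0-4,
# y_pred: argmax of the ensemble probabilities per image).
y_true = np.array([0, 1, 2, 3, 4, 2, 3, 0])
y_pred = np.array([0, 1, 2, 3, 3, 2, 3, 1])

print(f"Balanced accuracy: {balanced_accuracy_score(y_true, y_pred):.2%}")
print(confusion_matrix(y_true, y_pred))
```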
We implemented the Grad-CAM explainability technique to better understand how classes are classified. The Grad-CAM indicates the parts of the image that most impact the classification score.
We can see in the images that for the healthy, doubtful, and minimal classes, the most prominent areas are located in the center of the knee, while for the moderate and severe classes they are on the left or right edges of the knee.
Grad-CAM results were obtained from the last convolutional layer of the Xception model.
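A minimal Grad-CAM sketch for a Keras model is shown below; the layer name block14_sepconv2_act is the last convolutional activation in Keras' stock Xception and is assumed to be accessible by name in the model used here (for a nested base model, fetch the layer from the base instead):

```python
# Minimal Grad-CAM sketch for a Keras Xception-based model (assumptions noted
# in the text above; not the project's exact implementation).
import numpy as np
import tensorflow as tf

def grad_cam(model, image, layer_name="block14_sepconv2_act", class_index=None):
    """Return a heatmap in [0, 1] for one preprocessed image of shape (H, W, 3)."""
    grad_model = tf.keras.models.Model(
        model.inputs, [model.get_layer(layer_name).output, model.output]
    )
    with tf.GradientTape() as tape:
        conv_out, preds = grad_model(image[np.newaxis, ...])
        if class_index is None:
            class_index = int(tf.argmax(preds[0]))  # predicted class by default
        score = preds[:, class_index]
    grads = tape.gradient(score, conv_out)        # d(score) / d(feature maps)
    weights = tf.reduce_mean(grads, axis=(1, 2))  # global-average-pool the gradients
    cam = tf.reduce_sum(conv_out * weights[:, None, None, :], axis=-1)[0]
    cam = tf.nn.relu(cam)                         # keep only positive influence
    cam = cam / (tf.reduce_max(cam) + 1e-8)       # normalize to [0, 1]
    return cam.numpy()
```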
The web application lets you select and upload an X-ray image, then predicts the loss in joint spacing, reports the probability of each disease severity grade, and highlights the area that most impacted the classification score.
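A simplified sketch of how such a Streamlit app can be structured is shown below; the widget layout, input size, and preprocessing are assumptions for illustration, and the actual app/app.py may differ:

```python
# Simplified Streamlit sketch (assumptions: 224x224 input, Xception preprocessing;
# the real app/app.py may differ): upload an X-ray, predict the KL grade with the
# saved model, and show the class probabilities.
import numpy as np
import streamlit as st
import tensorflow as tf
from PIL import Image

CLASS_NAMES = ["Healthy", "Doubtful", "Minimal", "Moderate", "Severe"]

@st.cache_resource
def load_model():
    return tf.keras.models.load_model("src/models/model_Xception_ft.hdf5")

st.title("Knee Osteoarthritis Severity Prediction")
uploaded = st.file_uploader("Upload a knee X-ray", type=["png", "jpg", "jpeg"])

if uploaded is not None:
    image = Image.open(uploaded).convert("RGB").resize((224, 224))
    st.image(image, caption="Input X-ray")

    x = tf.keras.applications.xception.preprocess_input(
        np.array(image, dtype=np.float32)[np.newaxis, ...]
    )
    probs = load_model().predict(x)[0]

    st.write(f"Predicted grade: **{CLASS_NAMES[int(probs.argmax())]}**")
    st.bar_chart({"probability": probs.tolist()})
```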
made with 💙 by mafda