UTSAVS26 / PyVerse

PyVerse is an open-source collection of diverse Python projects, tools, and scripts, ranging from beginner to advanced, across various domains like machine learning, web development, and automation.
MIT License

[Code Addition Request]: Predicting Obesity Risk and Identifying Contributing Factors through XAI Techniques #1117

Closed · inkerton closed this issue 3 weeks ago

inkerton commented 3 weeks ago

Have you completed your first issue?

Guidelines

Latest Merged PR Link

#1109

Project Description

Background:

The continued rise in obesity brings growing costs and risks for individuals, society, and businesses, and tackling it is a top government priority in many countries. Obesity is a prevalent health issue globally, contributing to various chronic diseases and reducing overall quality of life. The goal is to prevent, reduce, and tackle obesity by helping people build healthy eating and physical activity habits.

Project Objective:

The objective of this project is to develop a machine learning model capable of predicting the risk of obesity across multiple classes. By (1) accurately predicting obesity risk and (2) explaining the model's results, we aim to empower individuals with insights into their health status and provide healthcare professionals with a tool for early intervention and personalized recommendations.
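
As a rough illustration of the intended pipeline, the sketch below trains a multi-class classifier on a hypothetical obesity dataset. The file path `obesity.csv`, the target column `NObeyesdad`, and the choice of a random forest are placeholders for illustration, not the project's final design.

```python
# Minimal sketch: multi-class obesity-risk classifier on a hypothetical
# tabular dataset ("obesity.csv" with a categorical target "NObeyesdad").
# File path, column names, and model choice are placeholders.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import LabelEncoder

df = pd.read_csv("obesity.csv")                              # placeholder path
target = "NObeyesdad"                                        # placeholder target column

# One-hot encode categorical features; label-encode the multi-class target.
X = pd.get_dummies(df.drop(columns=[target]), dtype=int)
y = pd.Series(LabelEncoder().fit_transform(df[target]), name=target)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

model = RandomForestClassifier(n_estimators=300, random_state=42)
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```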

Summary of XAI Techniques applied:

| XAI Method | Type | Description |
| -- | -- | -- |
| Permutation Feature Importance (PFI) | Global | Assesses the importance of input features by measuring the change in model performance when the values of those features are randomly permuted. For example, if the model's accuracy drops a lot when a feature is shuffled, that feature is very important. |
| SHapley Additive exPlanations (SHAP) | Global | Shows how much each feature contributes to a model's prediction by considering all possible combinations of features and their interactions. Features with positive SHAP values positively impact the prediction, while those with negative values have a negative impact. |
| Partial Dependence Plot (PDP) | Global | Shows how changes in one feature affect a model's prediction while keeping other features constant. For example, a flat line implies little or no impact, while an upward slope indicates a positive influence. |
| Local Interpretable Model-agnostic Explanations (LIME) | Local | Explains individual predictions of a model by approximating its behavior with a simpler, understandable model around a specific data point (local). |
| Diverse Counterfactual Explanations (DiCE) | Local | Generates alternative or "what-if" scenarios to explain why a model made a specific prediction, offering insights into how changes in input features could lead to different outcomes. |
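
A minimal sketch of how these five techniques could be wired up with standard libraries (`scikit-learn`, `shap`, `lime`, `dice_ml`) follows. It reuses `model`, `X_train`/`X_test`, and `y_train`/`y_test` from the training sketch above; feature names such as `Age`, `Height`, and `Weight` are placeholders, the exact SHAP output shape and DiCE multi-class handling vary by library version, and the plotting calls assume an interactive/notebook environment.

```python
import shap
import dice_ml
import pandas as pd
from lime.lime_tabular import LimeTabularExplainer
from sklearn.inspection import PartialDependenceDisplay, permutation_importance

# 1) Permutation Feature Importance (global): performance drop when a feature is shuffled.
pfi = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=42)
print(pd.Series(pfi.importances_mean, index=X_test.columns)
        .sort_values(ascending=False).head(10))

# 2) SHAP (global): per-feature contribution to each prediction.
#    Note: the shape of shap_values for multi-class models differs across shap versions.
shap_values = shap.TreeExplainer(model).shap_values(X_test)
shap.summary_plot(shap_values, X_test)

# 3) Partial Dependence Plot (global): marginal effect of a feature on one class.
PartialDependenceDisplay.from_estimator(
    model, X_test, features=["Age", "Weight"], target=model.classes_[0]
)

# 4) LIME (local): a simple surrogate model fitted around one test instance.
lime_explainer = LimeTabularExplainer(
    X_train.values,
    feature_names=X_train.columns.tolist(),
    class_names=[str(c) for c in model.classes_],
    mode="classification",
)
lime_exp = lime_explainer.explain_instance(
    X_test.iloc[0].values, model.predict_proba, num_features=8
)
print(lime_exp.as_list())

# 5) DiCE (local): counterfactual "what-if" rows that would change the predicted class.
#    desired_class is a placeholder class index; multi-class support depends on the
#    dice_ml version installed.
dice_data = dice_ml.Data(
    dataframe=X_train.assign(**{y_train.name: y_train}),
    continuous_features=["Age", "Height", "Weight"],      # placeholder numeric columns
    outcome_name=y_train.name,
)
dice_exp = dice_ml.Dice(
    dice_data, dice_ml.Model(model=model, backend="sklearn"), method="random"
)
cfs = dice_exp.generate_counterfactuals(X_test.iloc[[0]], total_CFs=3, desired_class=1)
cfs.visualize_as_dataframe(show_only_changes=True)
```

The global methods (PFI, SHAP, PDP) summarize the model over the whole test set, while the local methods (LIME, DiCE) explain a single instance, here the first test row as an example.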

Full Name

inkerton

Participant Role

GSSOC

github-actions[bot] commented 3 weeks ago

🙌 Thank you for bringing this issue to our attention! We appreciate your input and will investigate it as soon as possible.

Feel free to join our community on Discord to discuss more!

github-actions[bot] commented 3 weeks ago

✅ This issue has been closed. Thank you for your contribution! If you have any further questions or issues, feel free to join our community on Discord to discuss more!