openclimatefix / open-source-quartz-solar-forecast

Open Source Solar Site Level Forecast

Evaluate Model Using XAI and Increase Interpretability #104

ombhojane opened this issue 7 months ago


Detailed Description

The current solar forecasting model is a gradient boosted tree, which can achieve high predictive accuracy but offers little interpretability. This issue proposes evaluating the model with Explainable AI (XAI) techniques and increasing its interpretability. This will involve:

  1. Researching and evaluating XAI techniques suited to gradient boosted tree models, such as feature importance analysis, Shapley values (SHAP), Local Interpretable Model-agnostic Explanations (LIME), or Microsoft's Explainable Boosting Machine (EBM); a minimal SHAP sketch follows this list.
  2. Implementing the selected XAI techniques and integrating them into the existing model evaluation and analysis pipeline.
  3. Analyzing and visualizing the model's behavior, feature importances, and decision-making process using the XAI techniques.
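To make the first step concrete, here is a minimal sketch of computing Shapley values for a gradient boosted tree with the `shap` library. The `XGBRegressor`, the synthetic feature names (`irradiance`, `temperature`, etc.), and the random data are placeholder assumptions for illustration, not the project's actual model or inputs.

```python
# Minimal SHAP sketch for a gradient boosted tree (illustrative only;
# the model, features, and data are placeholders, not the project's
# actual forecasting model).
import numpy as np
import shap
import xgboost as xgb

rng = np.random.default_rng(42)

# Hypothetical weather features standing in for the real inputs.
feature_names = ["irradiance", "temperature", "cloud_cover", "hour_of_day"]
X = rng.random((500, len(feature_names)))
y = 2.0 * X[:, 0] - 0.5 * X[:, 2] + 0.1 * rng.standard_normal(500)

model = xgb.XGBRegressor(n_estimators=100, max_depth=3)
model.fit(X, y)

# TreeExplainer computes exact Shapley values efficiently for tree models.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)

# Global importance: mean absolute Shapley value per feature.
importance = np.abs(shap_values).mean(axis=0)
for name, imp in sorted(zip(feature_names, importance), key=lambda t: -t[1]):
    print(f"{name}: {imp:.3f}")

# Per-sample attribution summary plot (requires matplotlib).
shap.summary_plot(shap_values, X, feature_names=feature_names)
```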

Context

Understanding the model's decision-making process and the relative importance of input features is crucial for trust, transparency, and accountability in this open-source project. Increasing the model's interpretability using XAI techniques can:

• Facilitate the integration of domain knowledge, potentially improving model performance and interpretability.
• Enhance transparency and trust among users and stakeholders.
• Guide model refinement and improvement efforts based on insights gained from XAI analysis.
• Assess the model's robustness and fairness across different geographic regions or weather conditions, identifying potential biases or inconsistencies.

Possible Implementation

  1. Evaluate and select XAI techniques like SHAP or LIME for interpretability analysis of the gradient boosted tree model.
  2. Develop a separate module or script to integrate the chosen XAI techniques with the existing model evaluation pipeline.
  3. Visualize and analyze feature importances, decision paths, and local explanations using the chosen techniques; a minimal LIME sketch of a local explanation follows this list.
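Complementing the SHAP sketch above, here is a hedged illustration of a local explanation with LIME for a single prediction. As before, the model, feature names, and data are placeholder assumptions, not the project's actual pipeline.

```python
# Minimal LIME sketch for a single local explanation (illustrative only;
# model, features, and data are placeholders).
import numpy as np
import xgboost as xgb
from lime.lime_tabular import LimeTabularExplainer

rng = np.random.default_rng(0)
feature_names = ["irradiance", "temperature", "cloud_cover", "hour_of_day"]
X = rng.random((500, len(feature_names)))
y = 2.0 * X[:, 0] - 0.5 * X[:, 2]

model = xgb.XGBRegressor(n_estimators=100, max_depth=3).fit(X, y)

# LIME fits a simple surrogate model locally around one instance.
explainer = LimeTabularExplainer(
    X, feature_names=feature_names, mode="regression"
)
explanation = explainer.explain_instance(
    X[0], model.predict, num_features=len(feature_names)
)

# Feature/weight pairs for this one prediction.
for feature, weight in explanation.as_list():
    print(f"{feature}: {weight:+.3f}")
```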
ombhojane commented 7 months ago

Implementation Procedure

Aim: To increase the interpretability of the solar forecasting model and enable users to understand its decision-making process.

Steps:

  1. A new command-line option or flag (e.g., --explain) will be added to the existing forecasting script (e.g., python forecast.py --explain).

  2. When the --explain flag is provided, the forecasting script will:
     a. Execute the existing forecasting function to generate solar energy predictions.
     b. Integrate the selected XAI technique (e.g., SHAP or LIME) with the trained XGBoost tree model; the most suitable technique will be chosen after testing with data over several iterations.
     c. Compute and analyze feature importances, decision paths, and local explanations using that technique.
     d. Display or visualize the model's decision-making process, highlighting the influential features and their respective contributions to the final prediction.

  3. The visualizations and explanations generated by the XAI module will be presented in a user-friendly format, such as interactive plots; a hedged sketch of the --explain wiring follows this list.
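To show how these steps could fit together, here is a minimal sketch of wiring the --explain flag into a forecasting script with argparse. The file name forecast.py, the functions run_forecast and explain, and the model and data are hypothetical names and placeholders for illustration, not the project's actual API.

```python
# Hypothetical forecast.py sketch showing --explain wiring (run_forecast,
# explain, and the feature set are illustrative assumptions).
import argparse

import numpy as np
import shap
import xgboost as xgb

FEATURE_NAMES = ["irradiance", "temperature", "cloud_cover", "hour_of_day"]


def run_forecast(model, X):
    """Existing prediction step: return solar generation estimates."""
    return model.predict(X)


def explain(model, X):
    """XAI step: compute Shapley values and show a summary plot."""
    explainer = shap.TreeExplainer(model)
    shap_values = explainer.shap_values(X)
    shap.summary_plot(shap_values, X, feature_names=FEATURE_NAMES)


def main():
    parser = argparse.ArgumentParser(description="Solar forecast with optional XAI.")
    parser.add_argument(
        "--explain",
        action="store_true",
        help="Also produce SHAP explanations for the predictions.",
    )
    args = parser.parse_args()

    # Placeholder model and data; the real script would load both.
    rng = np.random.default_rng(1)
    X = rng.random((200, len(FEATURE_NAMES)))
    y = 2.0 * X[:, 0] - 0.5 * X[:, 2]
    model = xgb.XGBRegressor(n_estimators=50).fit(X, y)

    predictions = run_forecast(model, X)
    print(f"Mean predicted output: {predictions.mean():.3f}")

    if args.explain:
        explain(model, X)


if __name__ == "__main__":
    main()
```

Keeping the explanation step behind an opt-in flag means the default forecasting path stays unchanged, and the XAI module can live in a separate file as proposed above.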

I hope this helps improve the model and build trust in its predictions. @peterdudfield @zakwatts, what do you say?

I've submitted a GSoC proposal related to this issue and I'm happy to hear feedback on it. I'm also learning XAI daily and getting very interested, and I'm willing to implement this feature in OpenClimateFix to make the model transparent, trustworthy, and ultimately beneficial to users.