Quantify uncertainty in any ML model
Ensemble Automator
Conformal Prediction framework for Uncertainty Quantification of Stochastic Models.
YouTube presentation
- Conformal prediction is a framework that quantifies uncertainty by estimating the confidence and credibility of predictions on test points.
- Our implementation uses a nearest centroid classifier, computing a nonconformity score and a p-value for each candidate label.
- Currently, our framework produces reliable prediction sets on new data for ANN, CNN, and DQN models.
- This is part of our project on ML interpretability via Actor-Critic networks.
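The pipeline in the bullets above can be illustrated with a minimal, hypothetical sketch (not the repo's actual code): a nearest-centroid nonconformity score, p-values computed against a calibration set, and the resulting prediction set with confidence and credibility. The toy Gaussian data and all function names here are illustrative assumptions.

```python
import numpy as np

def centroids(X, y):
    """Per-class centroids from the training set."""
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def nonconformity(x, label, cents):
    """Distance to the assumed class centroid divided by the distance
    to the nearest other-class centroid (larger = less conforming)."""
    d_own = np.linalg.norm(x - cents[label])
    d_other = min(np.linalg.norm(x - cents[k]) for k in cents if k != label)
    return d_own / d_other

def p_value(score, cal_scores):
    """Fraction of calibration scores at least as nonconforming as `score`."""
    return (np.sum(cal_scores >= score) + 1) / (len(cal_scores) + 1)

# Toy data: two well-separated Gaussian blobs (hypothetical example).
rng = np.random.default_rng(0)
X_train = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])
y_train = np.array([0] * 50 + [1] * 50)
X_cal = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(3, 0.3, (20, 2))])
y_cal = np.array([0] * 20 + [1] * 20)

cents = centroids(X_train, y_train)
# Calibration scores use each calibration point's true label.
cal_scores = np.array([nonconformity(x, c, cents) for x, c in zip(X_cal, y_cal)])

x_test = np.array([0.1, 0.0])  # clearly belongs to class 0
p = {c: p_value(nonconformity(x_test, c, cents), cal_scores) for c in cents}

# 95% prediction set: every label whose p-value exceeds 0.05.
pred_set = [c for c in p if p[c] > 0.05]
credibility = max(p.values())            # p-value of the best label
confidence = 1 - sorted(p.values())[-2]  # 1 minus the runner-up p-value
print(pred_set, credibility, confidence)
```

Because the test point sits inside the class-0 blob, only label 0 survives the 0.05 threshold, so the prediction set is a confident singleton.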
To run our models:
- Fork/download this repo.
- `cd src/code/pick_your_model`
- Run the respective Jupyter notebook, either by entering `jupyter notebook` in a terminal at the root directory of this project, or in VSCode.
Requirements
- TensorFlow
- PyTorch
- TorchVision
- NumPy
- gym
- pandas
- matplotlib
- keras
- time
- tqdm