
BayesDiff: Estimating Pixel-wise Uncertainty in Diffusion via Bayesian Inference

This repository is our codebase for BayesDiff.

Installation

conda create --name BayesDiff python==3.8
conda activate BayesDiff
conda install pip
git clone https://github.com/karrykkk/BayesDiff.git
cd BayesDiff
pip install -r requirements.txt

Framework

This repository integrates uncertainty quantification into three models, each in its own folder:

  1. ddpm_and_guided - built on the Guided Diffusion repository
  2. sd - built on the Stable Diffusion repository
  3. uvit - built on the U-ViT repository

Each folder contains a custom_model.py that augments the original model with uncertainty quantification.
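
Conceptually, each wrapper replaces the deterministic noise prediction with a distribution: last-layer weights are drawn from a fitted Laplace posterior, and the spread of the resulting predictions gives a per-pixel uncertainty. Below is a minimal, hypothetical sketch of that idea in PyTorch; the names eps_model and sample_last_layer_weights are illustrative placeholders, not the repository's actual API.

import torch

@torch.no_grad()
def pixelwise_uncertainty(eps_model, sample_last_layer_weights, x_t, t, n_samples=10):
    """Monte-Carlo estimate of the per-pixel mean and variance of the
    noise prediction under weights drawn from the LLLA posterior."""
    preds = []
    for _ in range(n_samples):
        sample_last_layer_weights(eps_model)  # redraw last-layer weights from the posterior
        preds.append(eps_model(x_t, t))       # noise prediction under this weight sample
    preds = torch.stack(preds)                # (n_samples, B, C, H, W)
    return preds.mean(dim=0), preds.var(dim=0)  # per-pixel mean and uncertainty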

Usage

1. Guided Diffusion

cd ddpm_and_guided

Download pre-trained model checkpoint
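
As one hedged example, the ImageNet 256x256 diffusion checkpoint published in the Guided Diffusion repository can be fetched as below; substitute the checkpoint your chosen config actually expects.

import urllib.request

# Checkpoint URL from the openai/guided-diffusion README; pick the file
# matching your config (classifier checkpoints are hosted alongside it).
url = ("https://openaipublic.blob.core.windows.net/"
       "diffusion/jul-2021/256x256_diffusion.pt")
urllib.request.urlretrieve(url, "256x256_diffusion.pt")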

2. Stable Diffusion

cd sd

Download pre-trained model checkpoint

Download Stable Diffusion v1.5 to your_local_model_path
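
If you obtain the weights from the Hugging Face Hub, one way to download them is sketched below; the repo id is the commonly used v1.5 mirror, and your_local_model_path is the placeholder from above.

from huggingface_hub import snapshot_download

# Download the full Stable Diffusion v1.5 snapshot to the local path
# referenced in sd.sh; use your own source if you already have the weights.
snapshot_download(
    repo_id="runwayml/stable-diffusion-v1-5",
    local_dir="your_local_model_path",
)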

Download data to fit last-layer Laplace (LLLA)

Please download a subset of laion-art to your_local_image_path. These images are drawn from the LAION-Art dataset; store that dataset in your_laion_art_path so that the prompts corresponding to the downloaded images can be retrieved. Note that a subset of approximately 1,000 images is sufficient for effectively fitting the LLLA.
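
For reference, here is a minimal sketch of fitting an LLLA with the laplace-torch library (https://github.com/aleximmer/Laplace). The toy network and random tensors stand in for the features and targets derived from the downloaded images; this is not the repository's exact fitting code.

import torch
from torch.utils.data import DataLoader, TensorDataset
from laplace import Laplace

# Placeholder network; in BayesDiff the Laplace posterior is fit over the
# final layer of the diffusion model's noise-prediction network.
model = torch.nn.Sequential(
    torch.nn.Linear(16, 64), torch.nn.ReLU(), torch.nn.Linear(64, 16)
)

# Dummy data standing in for the ~1000 LAION-Art training examples.
x, y = torch.randn(1000, 16), torch.randn(1000, 16)
train_loader = DataLoader(TensorDataset(x, y), batch_size=64)

# Laplace approximation restricted to the last layer's weights.
la = Laplace(
    model,
    likelihood="regression",
    subset_of_weights="last_layer",
    hessian_structure="diag",
)
la.fit(train_loader)
la.optimize_prior_precision()  # tune the prior via marginal likelihood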

Sample and estimate corresponding pixel-wise uncertainty

The file sd.sh provides a usage template. Please adjust it to match your local file paths and the specific prompt you intend to use.

bash sd.sh

3. U-ViT

cd uvit

Download pre-trained model checkpoint

Citation

If you find our work useful, please cite our paper:

@inproceedings{kou2023bayesdiff,
  title={BayesDiff: Estimating Pixel-wise Uncertainty in Diffusion via Bayesian Inference},
  author={Kou, Siqi and Gan, Lei and Wang, Dequan and Li, Chongxuan and Deng, Zhijie},
  booktitle={The Twelfth International Conference on Learning Representations},
  year={2024}
}