ubc-vision / StableKeypoints

Apache License 2.0

Unsupervised Keypoints from Pretrained Diffusion Models (CVPR 2024 Highlight)

Eric Hedlin, Gopal Sharma, Shweta Mahajan, Xingzhe He, Hossam Isack, Abhishek Kar, Helge Rhodin, Andrea Tagliasacchi, Kwang Moo Yi

Project Page

For more detailed information, visit our project page or read our paper.

Interactive Demo

We provide an interactive demo in Google Colab, which lets you upload custom images and then optimizes and visualizes the discovered keypoints on them.

Requirements

Set up environment

Create a conda environment using the provided requirements.yaml:

conda env create -f requirements.yaml
conda activate StableKeypoints

Download datasets

The CelebA, Taichi, Human3.6M, DeepFashion, and CUB datasets are available from their respective websites.

Preprocessed data for CelebA and CUB can be found in Autolink's repository.

Usage

To use the code, run:

python3 -m unsupervised_keypoints.main [arguments]

Main Arguments

Example Usage

python3 -m unsupervised_keypoints.main --dataset_loc /path/to/dataset --dataset_name celeba_wild --evaluation_method inter_eye_distance --save_folder /path/to/save
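The `inter_eye_distance` evaluation normalizes keypoint error by each face's inter-ocular distance, the standard metric for CelebA-style benchmarks. A minimal sketch of that metric on hypothetical predicted and ground-truth keypoint arrays (the function name, array shapes, and eye indices are illustrative assumptions, not this repo's API):

```python
import numpy as np

def normalized_keypoint_error(pred, gt, left_eye_idx=0, right_eye_idx=1):
    """Mean keypoint error normalized by inter-eye distance.

    pred, gt: (num_images, num_keypoints, 2) arrays of (x, y) coordinates.
    left_eye_idx / right_eye_idx: which ground-truth keypoints are the eyes
    (these indices are hypothetical, not the repo's convention).
    """
    # Per-keypoint Euclidean error, shape (num_images, num_keypoints)
    err = np.linalg.norm(pred - gt, axis=-1)
    # Per-image inter-ocular distance, shape (num_images,)
    iod = np.linalg.norm(gt[:, left_eye_idx] - gt[:, right_eye_idx], axis=-1)
    # Normalize each image's errors by its inter-ocular distance, then average
    return (err / iod[:, None]).mean()

# Toy example: predictions offset from ground truth by 3 px in x,
# eyes 60 px apart, so the normalized error is 3 / 60 = 0.05
gt = np.array([[[100.0, 100.0], [160.0, 100.0], [130.0, 140.0]]])
pred = gt + np.array([3.0, 0.0])
print(normalized_keypoint_error(pred, gt))  # -> 0.05
```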

If you want to use a custom dataset, run:

python3 -m unsupervised_keypoints.main --dataset_loc /path/to/dataset --dataset_name custom
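A simple way to prepare such a dataset is to gather your images into a single flat folder (this layout is our assumption about what `--dataset_name custom` expects; check the repo's custom dataset class for the exact convention). A quick sketch of assembling that folder from scattered image directories:

```python
import shutil
from pathlib import Path

def build_custom_dataset(src_dirs, dst, exts=(".jpg", ".jpeg", ".png")):
    """Copy all images from src_dirs into a single flat folder dst.

    The flat-folder layout is an assumption about the custom loader,
    not a documented contract of the repo.
    """
    dst = Path(dst)
    dst.mkdir(parents=True, exist_ok=True)
    count = 0
    for src in map(Path, src_dirs):
        # Recurse into subfolders and keep only image files
        for img in sorted(src.rglob("*")):
            if img.suffix.lower() in exts:
                # Prefix with a counter so filenames never collide
                shutil.copy(img, dst / f"{count:06d}{img.suffix.lower()}")
                count += 1
    return count
```

You would then pass the resulting folder via `--dataset_loc`.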

Precomputed tokens

We provide the precomputed tokens here.

BibTeX

@article{hedlin2023keypoints,
  title={Unsupervised Keypoints from Pretrained Diffusion Models},
  author={Hedlin, Eric and Sharma, Gopal and Mahajan, Shweta and He, Xingzhe and Isack, Hossam and Kar, Abhishek and Rhodin, Helge and Tagliasacchi, Andrea and Yi, Kwang Moo},
  journal={arXiv preprint arXiv:2312.00065},
  year={2023}
}