Official code repository of "HyperDiffusion: Generating Implicit Neural Fields with Weight-Space Diffusion" @ ICCV 2023
Paper | Project website | Video | Data
I'll release the rest of the weights/checkpoints after post-refactor tests are complete. You can see here what has been uploaded so far.
For the full list of dependencies, please see the hyperdiffusion_env.yaml file.
All the data needed to train and evaluate HyperDiffusion is in this Drive folder. There are three main folders there: MLP Weights, Point Clouds (2048), and Checkpoints.
We have a .yaml file that you can create a conda environment from. Simply run:

```
conda env create --file hyperdiffusion_env.yaml
conda activate hyper-diffusion
```
We specify our runtime parameters using .yaml files inside the configs folder. There is a different yaml file for each category and task.
Then, download the MLP Weights folder from our Drive and put its contents into the mlp_weights folder. Config files assume that the weights are in that folder.
For 3D, download the Point Clouds (2048) folder from Drive and save its contents to the data folder. Eventually, the data folder should look like this:
```
data
|-- 02691156
|-- 02691156_2048_pc
|-- 02958343
|-- 02958343_2048_pc
|-- 03001627
|-- 03001627_2048_pc
|-- animals
```
Note: Category id to name conversion is as follows: 02691156 -> airplane, 02958343 -> car, 03001627 -> chair
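The ID-to-name conversion above can be captured in a small helper; this is an illustrative sketch based on the folder-name convention shown in the tree (not a utility from the repo):

```python
# Mapping from ShapeNet synset IDs (as used in the data folder) to category names.
SYNSET_TO_NAME = {
    "02691156": "airplane",
    "02958343": "car",
    "03001627": "chair",
}

def category_name(folder: str) -> str:
    """Return the category name for a data folder like '02691156_2048_pc'."""
    synset = folder.split("_")[0]
    return SYNSET_TO_NAME.get(synset, folder)

print(category_name("02691156_2048_pc"))  # airplane
```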
Download the Checkpoints folder from Drive. Assign the path of the relevant checkpoint to the best_model_save_path parameter.
To start evaluating the airplane category:

```
python main.py --config-name=train_plane mode=test best_model_save_path=<path/to/checkpoint>
```
Car category (checkpoints coming soon!):

```
python main.py --config-name=train_car mode=test best_model_save_path=<path/to/checkpoint>
```
Chair category (checkpoints coming soon!); we apply special operations for chairs, see our Supplementary Material for details:

```
python main.py --config-name=train_chair mode=test best_model_save_path=<path/to/checkpoint> test_sample_mult=2 dedup=True
```
4D animals category (checkpoints coming soon):

```
python main.py --config-name=train_4d_animals mode=test best_model_save_path=<path/to/checkpoint>
```
To start training the airplane category:

```
python main.py --config-name=train_plane
```
Car category (MLP weights coming soon):

```
python main.py --config-name=train_car
```
Chair category (MLP weights coming soon):

```
python main.py --config-name=train_chair
```
4D animals category (MLP weights coming soon):

```
python main.py --config-name=train_4d_animals
```
We are using Hydra; you can either specify parameters in the corresponding yaml file or override them directly from the terminal. For instance, to change the number of epochs:

```
python main.py --config-name=train_plane epochs=1
```
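Conceptually, a Hydra-style key=value override just rewrites one entry of the nested config before the run starts. A minimal pure-Python sketch of that behavior (an illustration only, not Hydra's actual implementation):

```python
def apply_override(cfg: dict, override: str) -> dict:
    """Apply a single Hydra-style 'key=value' override to a nested config dict."""
    key, _, raw = override.partition("=")
    node = cfg
    parts = key.split(".")  # dotted keys address nested config groups
    for part in parts[:-1]:
        node = node.setdefault(part, {})
    # Hydra infers value types from YAML; for brevity we only coerce ints here.
    node[parts[-1]] = int(raw) if raw.isdigit() else raw
    return cfg

cfg = {"epochs": 100, "optimizer": {"lr": "1e-4"}}
apply_override(cfg, "epochs=1")
print(cfg["epochs"])  # 1
```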
We already provide overfitted shapes, but if you want to do the overfitting yourself, make sure you put the downloaded ShapeNet shapes (we applied ManifoldPlus pre-processing) into the data folder. After that, we first create point clouds and then start overfitting on those point clouds; the following lines do exactly that:
```
python siren/experiment_scripts/train_sdf.py --config-name=overfit_plane strategy=save_pc
python siren/experiment_scripts/train_sdf.py --config-name=overfit_plane
```
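The save_pc step produces fixed-size point clouds (2048 points, matching the `_2048_pc` folder names above) from meshes. Conceptually, area-weighted uniform surface sampling looks like the following numpy sketch (an illustration of the idea, not the repo's code):

```python
import numpy as np

def sample_surface(vertices, faces, n_points=2048, seed=0):
    """Uniformly sample points on a triangle mesh surface (area-weighted)."""
    rng = np.random.default_rng(seed)
    tri = vertices[faces]  # (F, 3, 3): vertex coordinates per face
    # Triangle areas via the cross-product magnitude.
    areas = 0.5 * np.linalg.norm(
        np.cross(tri[:, 1] - tri[:, 0], tri[:, 2] - tri[:, 0]), axis=1)
    # Pick faces proportionally to their area.
    idx = rng.choice(len(faces), size=n_points, p=areas / areas.sum())
    # Uniform barycentric coordinates inside each chosen triangle.
    u, v = rng.random((2, n_points))
    flip = u + v > 1.0
    u[flip], v[flip] = 1.0 - u[flip], 1.0 - v[flip]
    t = tri[idx]
    return t[:, 0] + u[:, None] * (t[:, 1] - t[:, 0]) + v[:, None] * (t[:, 2] - t[:, 0])

# Example: sample 2048 points from a unit square made of two triangles.
verts = np.array([[0., 0., 0.], [1., 0., 0.], [1., 1., 0.], [0., 1., 0.]])
faces = np.array([[0, 1, 2], [0, 2, 3]])
pc = sample_surface(verts, faces)
print(pc.shape)  # (2048, 3)
```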
- Utils: `WeightDataset` and `VoxelDataset` definitions, which are `torch.Dataset` descendants. The former is used by our HyperDiffusion method, while the latter is for the voxel baseline.
- Evaluation
- Entry Point
- Models
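HyperDiffusion treats each overfitted MLP as a single flat weight vector that the diffusion model operates on. A conceptual numpy sketch of the flattening a WeightDataset-style loader performs (an illustration of the idea, not the repo's implementation):

```python
import numpy as np

def flatten_mlp(state: dict) -> np.ndarray:
    """Concatenate all MLP parameters into one flat vector (deterministic key order)."""
    return np.concatenate([state[k].ravel() for k in sorted(state)])

# Toy 2-layer MLP state dict (weights + biases), shapes chosen for illustration.
state = {
    "layer0.weight": np.zeros((16, 3)),
    "layer0.bias": np.zeros(16),
    "layer1.weight": np.zeros((1, 16)),
    "layer1.bias": np.zeros(1),
}
vec = flatten_mlp(state)
print(vec.shape)  # (81,)
```

Because every network is overfitted with the same architecture, all vectors share one length, so a batch of them forms the fixed-size training matrix the diffusion backbone expects.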
We share training plots for better reproducibility; the links take you to Weights & Biases reports. (Note: some links occasionally don't load, for unknown reasons.)
Plane | Car | Chair | 4D Animals
We mainly built our repository on the codebases of the SIREN and G.pt papers. We also referred to DPC for code such as the evaluation metrics. We used OpenAI's Guided Diffusion as our diffusion backbone, and the LDM codebase was useful for implementing our voxel baseline.
If you find our work useful, please cite using the following BibTeX entry:
```bibtex
@misc{erkoç2023hyperdiffusion,
      title={HyperDiffusion: Generating Implicit Neural Fields with Weight-Space Diffusion},
      author={Ziya Erkoç and Fangchang Ma and Qi Shan and Matthias Nießner and Angela Dai},
      year={2023},
      eprint={2303.17015},
      archivePrefix={arXiv},
      primaryClass={cs.CV}
}
```