This repository is the official PyTorch implementation of "FunkNN: Neural Interpolation for Functional Generation", published at ICLR 2023.
| Project Page |
(This code has been tested with PyTorch 1.12.1, Python 3.8.3, CUDA 11.6 and cuDNN 7.)
Run the following command to create the conda environment from environment.yml:
conda env create -f environment.yml
Download the CelebA-HQ, LoDoPaB-CT and LSUN-bedroom validation datasets, split them into train and test sets, and place them in the data folder. Specify the data folder paths in config_funknn.py and config_generative.py.
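The train/test split is not scripted in the repository, so here is a minimal sketch of one way to do it. The function name `split_dataset`, the `train`/`test` folder names, and the 90/10 ratio are assumptions for illustration; adjust them to match the paths you set in config_funknn.py and config_generative.py.

```python
# Hypothetical dataset-splitting helper (not part of the official repo):
# copies files from a source folder into out_dir/train and out_dir/test.
import random
import shutil
from pathlib import Path

def split_dataset(src_dir, out_dir, train_frac=0.9, seed=0):
    """Shuffle the files in src_dir and copy them into train/test subfolders."""
    files = sorted(Path(src_dir).iterdir())
    random.Random(seed).shuffle(files)  # seeded for a reproducible split
    n_train = int(len(files) * train_frac)
    for split, subset in (("train", files[:n_train]), ("test", files[n_train:])):
        split_dir = Path(out_dir) / split
        split_dir.mkdir(parents=True, exist_ok=True)
        for f in subset:
            shutil.copy(f, split_dir / f.name)
```

For example, `split_dataset("celeba_hq_raw", "data/celeba_hq", train_frac=0.9)` would produce `data/celeba_hq/train` and `data/celeba_hq/test`, which you can then point the config files at.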
All arguments for training the FunkNN model are explained in config_funknn.py. After specifying your arguments, run the following command to train the model:
python3 train_funknn.py
All arguments for training the generative autoencoder are explained in config_generative.py. After specifying your arguments, run the following command to train the model:
python3 train_generative.py
All arguments for solving inverse problems by combining FunkNN and the generative autoencoder are explained in config_IP_solver.py. After specifying your arguments, including the folders of the trained FunkNN and generator models, run the following command to solve the inverse problem of your choice (CT or PDE):
python3 IP_solver.py
If you find the code useful in your research, please consider citing the paper:
@inproceedings{
khorashadizadeh2023funknn,
title={Funk{NN}: Neural Interpolation for Functional Generation},
author={AmirEhsan Khorashadizadeh and Anadi Chaman and Valentin Debarnot and Ivan Dokmani{\'c}},
booktitle={The Eleventh International Conference on Learning Representations},
year={2023},
url={https://openreview.net/forum?id=BT4N_v7CLrk}
}