Deep Learning Project for Deep Learning Course (263-3210-00L)
by Department of Computer Science, ETH Zurich, Autumn Semester 2021
Authors:
Sebastian Frey (sefrey@student.ethz.ch)
Remo Kellenberger (remok@student.ethz.ch)
Aron Schmied (aronsch@student.ethz.ch)
Guney Tombak (gtombak@student.ethz.ch)
Professors:
Dr. Fernando Perez-Cruz
Dr. Aurelien Lucchi
All dependencies can be installed by creating a conda environment from environment.yaml:
conda env create -f environment.yaml && conda activate dalcs
For GPU support, additionally install the CUDA-enabled PyTorch packages:
conda install pytorch torchvision torchaudio cudatoolkit=X.Y -c pytorch -c conda-forge
where X.Y is a CUDA version compatible with your GPU and driver.
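For example, assuming your driver supports CUDA 11.3 (adjust the version to your setup):
conda install pytorch torchvision torchaudio cudatoolkit=11.3 -c pytorch -c conda-forge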
Before running, you should log in to your wandb (Weights and Biases) account:
wandb login
You can change the Weights and Biases parameters in main.py at line 83.
For more information, please visit Weights and Biases.
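As a rough, hypothetical sketch of the kind of parameters involved (the project and entity names below are placeholders, not the values used in this repository):
import wandb

# Hypothetical values; replace with your own Weights and Biases workspace settings.
wandb.init(
    project="my-project",   # name of the wandb project to log runs into
    entity="my-team",       # your wandb username or team name
    name="example-run",     # optional display name for this run
)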
The parameters of a run can be configured using YAML files. The predefined configurations can be found in the configs folder.
python main.py --config <configuration_file_path>
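For example, to run the provided default configuration:
python main.py --config configs/default_config.yaml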
To run more than one configuration sequentially, multi_main.sh can be used with the path of a folder containing the configuration files.
conda activate dalcs
chmod +x multi_main.sh
./multi_main.sh -c <folder_path_containing_configuration_files>
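For example, assuming all configurations you want to run are collected in the configs folder:
./multi_main.sh -c configs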
An explanatory configuration file can be found at configs/default_config.yaml.
The results are saved both locally, in the folder named wandb, and in the Weights and Biases cloud. You can inspect the results directly on the web; to use the local files, please check the documentation and visualization/plot_results.ipynb.
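As a hedged sketch of pulling run histories from the cloud with the wandb public API (the entity and project names below are placeholders):
import wandb

# Hypothetical entity/project path; replace with your own workspace.
api = wandb.Api()
runs = api.runs("my-entity/my-project")
for run in runs:
    history = run.history()  # logged metrics as a pandas DataFrame
    print(run.name, list(history.columns))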
The code also contains a separate variational autoencoder trainer so that pretrained models can be used.
To use it, set your current directory to vae_training. Usage is similar to the main file:
python train_vae.py --config <configuration_file_path>
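For example, with a hypothetical configuration file vae_config.yaml inside vae_training (use your own configuration file):
cd vae_training
python train_vae.py --config vae_config.yaml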