mlangguth89 / downscaling_benchmark


Add method to load model for inference to AbstractModelClass #38

Closed mlangguth89 closed 1 month ago

mlangguth89 commented 1 month ago

Loading saved models for inference can depend heavily on the model type. In particular, for (Wasserstein) GAN models only the generator must be loaded for inference. The current approach via an if-statement in the get_model_info-method in postprocess.py is fairly hacky (cf. here) and not generic. In this branch, a load_inference_model-method is therefore added to the AbstractModelClass, which can be overridden in custom models if required. Additionally, the get_model_info-method will be revised to directly load the checkpointed model for inference.
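A minimal sketch of the intended pattern (the class and method names AbstractModelClass and load_inference_model are from this issue; the subclass names, constructor signature, and placeholder loading logic are hypothetical illustrations, not the actual implementation):

```python
from abc import ABC


class AbstractModelClass(ABC):
    """Base class for all models; subclasses may override load_inference_model."""

    def __init__(self, checkpoint_dir: str):
        self.checkpoint_dir = checkpoint_dir

    def load_inference_model(self) -> str:
        # Default behaviour: load the full checkpointed model.
        # (Placeholder string instead of a real framework call.)
        return f"full model from {self.checkpoint_dir}"


class UNetModel(AbstractModelClass):
    # A deterministic model can simply use the default loader.
    pass


class WGANModel(AbstractModelClass):
    def load_inference_model(self) -> str:
        # (W)GAN models override the default: only the generator
        # sub-model is needed at inference time.
        return f"generator from {self.checkpoint_dir}/generator"
```

With this, postprocessing code can call model.load_inference_model() uniformly, without the model-type if-statement in get_model_info.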

Note that the corresponding issue-branch is based on #37. Thus, it is sufficient to merge this branch into develop only once the work in both issues is accomplished.

mlangguth89 commented 1 month ago

Work in this branch and in issue-branch #37 is finished and has thus been merged into develop.