MingSun-Tse / Collaborative-Distillation

[CVPR'20] Collaborative Distillation for Ultra-Resolution Universal Style Transfer (PyTorch)

WCT original #23

Closed tanbuinhat closed 2 years ago

tanbuinhat commented 3 years ago

Can I ask whether the `original` mode is the same as the original WCT method?

MingSun-Tse commented 3 years ago

Hi @tanbuinhat, during training (i.e., model compression), the architecture of the original decoder is the same as that in WCT, but we train our own decoder weights, following the training protocols in WCT. The encoder is exactly the same as the one in WCT.
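
For context, the WCT training protocol referenced here trains a decoder to invert frozen VGG encoder features, using a pixel reconstruction loss plus a feature reconstruction loss (Li et al., "Universal Style Transfer via Feature Transforms"). Below is a minimal PyTorch sketch of that protocol; the function name, loss weight, and loop structure are illustrative assumptions, not this repository's actual code.

```python
import torch
import torch.nn as nn

def train_decoder(encoder, decoder, loader, epochs=2, feat_weight=1.0, device="cpu"):
    """Hypothetical sketch of the WCT decoder-training protocol:
    the encoder (a VGG slice) stays frozen; only decoder weights are learned."""
    encoder.to(device).eval()
    for p in encoder.parameters():   # freeze the encoder, as in WCT
        p.requires_grad_(False)
    decoder.to(device).train()
    opt = torch.optim.Adam(decoder.parameters(), lr=1e-4)
    mse = nn.MSELoss()

    for _ in range(epochs):
        for img, _ in loader:
            img = img.to(device)
            with torch.no_grad():
                feat = encoder(img)          # target VGG features (no grad needed)
            recon = decoder(feat)            # decode features back to pixels
            pixel_loss = mse(recon, img)     # pixel reconstruction loss
            feat_loss = mse(encoder(recon), feat)  # feature reconstruction loss
            loss = pixel_loss + feat_weight * feat_loss
            opt.zero_grad()
            loss.backward()
            opt.step()
```

In the sketch, the frozen encoder is still used in the forward pass of the feature loss so that gradients flow through it to the decoder, which matches the spirit of training only the decoder weights while keeping the WCT encoder fixed.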