pietrocarbo / deep-transfer

PyTorch implementation of "Universal Style Transfer via Feature Transforms"
Apache License 2.0
87 stars · 19 forks

Can decoders have shared weights? #1

Closed vinsis closed 5 years ago

vinsis commented 5 years ago

Hi, a quick question - since decoder #5 contains all the layers that decoder #4 has, is it possible to share the weights for the common layers between them? Right now the weights are totally separate.

I believe sharing the weights would save computation and time. Thanks.

pietrocarbo commented 5 years ago

No. Each decoder, from #1 to #5, was trained separately, paired with its respective encoder, for general-purpose image reconstruction. As a result, the decoder weights differ even in the layers the decoders have in common, so they cannot simply be shared.
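To illustrate the point, here is a minimal PyTorch sketch (not the repository's actual architecture; the layer sizes and the `common_layers` helper are hypothetical) of two decoders whose trailing layers share a structure but hold independent parameters:

```python
import torch.nn as nn


def common_layers():
    # Hypothetical trailing stage shared *structurally* by both decoders.
    return [
        nn.Conv2d(256, 128, 3, padding=1), nn.ReLU(),
        nn.Upsample(scale_factor=2),
        nn.Conv2d(128, 64, 3, padding=1), nn.ReLU(),
        nn.Conv2d(64, 3, 3, padding=1),
    ]


# decoder5 contains every layer decoder4 has, plus an extra stage in front.
decoder4 = nn.Sequential(*common_layers())
decoder5 = nn.Sequential(
    nn.Conv2d(512, 256, 3, padding=1), nn.ReLU(),
    nn.Upsample(scale_factor=2),
    *common_layers(),
)

# The structurally common layers have identical weight shapes...
p4 = decoder4[0].weight
p5 = decoder5[3].weight
assert p4.shape == p5.shape

# ...but each decoder was trained separately with its own encoder, so the
# actual values diverge (here, each call to common_layers() gets its own
# random initialization), and the parameters cannot be shared after the fact.
print(bool((p4 == p5).all()))
```

Tying the weights would only be possible by retraining the decoders jointly with that constraint, not by merging the checkpoints this repository loads.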

vinsis commented 5 years ago

Thanks.