Closed HuaZheLei closed 5 years ago
It is not a bug. Actually, we find that using 'in' in flow_net introduces artifacts into the final results, while 'ln' performs better than 'in' on this task.
Hi, thanks for your reply. As you mentioned, 'norm_flow' should be 'ln' rather than 'in'. But in 'model.py', you pass 'in' to 'norm_flow', which makes the default parameter 'ln' invalid. I think in the training process, 'norm_flow' should be 'ln' and 'norm_conv' should be 'in'. Am I right?
Sorry for the confusion. The parameters in the code are correct. Since the Texture Generator (as named in the paper) outputs the final results directly, we find that its normalization needs to be 'ln' for better performance; therefore, the parameter 'norm_conv' in 'FlowGen' should be 'ln'. The parameter 'norm_flow' specifies the normalization of the flow-field generator of the Texture Generator. It can be 'ln' or 'in'.
In 'model.py' the correct parameters are passed to the network. However, the default parameters in 'network.py' are incorrect. We will correct them.
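To make the resolution concrete: since 'model.py' always passes 'norm_conv' and 'norm_flow' explicitly, the (incorrect) defaults in 'network.py' are never used during training. A minimal sketch of this override behavior, with a hypothetical plain-Python stand-in for the FlowGen constructor (no PyTorch, names assumed for illustration):

```python
# Stand-in for network.py's FlowGen signature: note the swapped defaults
# (norm_flow='ln', norm_conv='in'), which are the reported inconsistency.
def flow_gen(input_dim=3, dim=64, n_res=2, activ='relu',
             norm_flow='ln', norm_conv='in',
             pad_type='reflect', use_sn=True):
    # Return the effective configuration so we can inspect what the
    # network would actually be built with.
    return {'norm_flow': norm_flow, 'norm_conv': norm_conv, 'use_sn': use_sn}

# models.py passes every parameter explicitly, so the defaults above
# are masked and the intended values take effect.
flow_param = {'input_dim': 3, 'dim': 64, 'n_res': 2, 'activ': 'relu',
              'norm_conv': 'ln', 'norm_flow': 'in',
              'pad_type': 'reflect', 'use_sn': False}

effective = flow_gen(**flow_param)
print(effective)  # {'norm_flow': 'in', 'norm_conv': 'ln', 'use_sn': False}
```

This is why training is unaffected: keyword arguments supplied at the call site always take precedence over the declared defaults.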
Cool!
In 'models.py', the parameters for FlowGen are
self.flow_param = {'input_dim':3, 'dim':64, 'n_res':2, 'activ':'relu', 'norm_conv':'ln', 'norm_flow':'in', 'pad_type':'reflect', 'use_sn':False}
However, in 'network.py', the default parameters for FlowGen are
class FlowGen(nn.Module): def __init__(self, input_dim=3, dim=64, n_res=2, activ='relu', norm_flow='ln', norm_conv='in', pad_type='reflect', use_sn=True):
The parameters of norm_flow and norm_conv are inconsistent. I wonder if it is a bug.