Open swjtulinxi opened 1 month ago
class MLP(nn.Module):
    def __init__(self, in_dim: int = 256, hidden_dim: int = 512, out_dim: int = 256,
                 norm_layer=nn.BatchNorm1d, act_layer=nn.ReLU, drop: float = 0.):

The MLP uses nn.BatchNorm1d and nn.ReLU. Aren't those intended for convolutional layers? Can they be placed after nn.Linear without causing a dimension error?
Hello. No, there is no dimension error: nn.BatchNorm1d can be applied as a sequence-wise operation (it accepts inputs of shape (N, C) or (N, C, L)), and a plain 1x1 convolution is equivalent to a linear layer.
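A minimal sketch illustrating both points (the tensor shapes and layer sizes below are chosen for illustration and are not taken from the repository's code): nn.BatchNorm1d consumes the (batch, features) output of nn.Linear directly, and a Conv1d with kernel_size=1 reproduces a Linear layer once it carries the same weights.

```python
import torch
import torch.nn as nn

# 1) BatchNorm1d after a Linear layer: BatchNorm1d accepts (N, C) or (N, C, L),
#    so the (batch, features) output of nn.Linear normalizes without error.
x = torch.randn(8, 256)            # (batch, in_dim)
fc = nn.Linear(256, 512)
bn = nn.BatchNorm1d(512)
act = nn.ReLU()
y = act(bn(fc(x)))                 # -> shape (8, 512), no dimension error

# 2) A 1x1 convolution is a Linear layer applied at every position:
#    copy the Linear weights into a Conv1d kernel and the outputs match.
seq = torch.randn(8, 256, 10)      # (batch, channels, length)
conv = nn.Conv1d(256, 512, kernel_size=1)
with torch.no_grad():
    conv.weight.copy_(fc.weight.unsqueeze(-1))   # (512, 256) -> (512, 256, 1)
    conv.bias.copy_(fc.bias)
out_conv = conv(seq)                                   # (8, 512, 10)
out_fc = fc(seq.transpose(1, 2)).transpose(1, 2)       # same result via Linear
print(torch.allclose(out_conv, out_fc, atol=1e-5))     # True (up to float error)
```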