Closed · ffs333 closed 1 year ago
Added nn.SiLU activation.
Since the PyTorch version only supports beta = 1.0 (F.silu has no beta argument), the TF version is called with beta=1.0 only:
```python
@converter(F.silu, channel_ordering_strategy=ChannelOrderingStrategy.MINIMUM_TRANSPOSITIONS)
def converter_silu(input: Tensor, inplace=False):
    def func(input: Tensor, inplace=False):
        return tf.nn.silu(input, beta=1.0)
    return func
```
This also covers nn.SiLU() (inplace=False is the default argument value).
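For reference, a minimal sketch (not part of the PR) that checks the assumption the converter relies on: with beta fixed at 1.0, tf.nn.silu should match PyTorch's F.silu numerically. It assumes a TensorFlow version whose tf.nn.silu accepts a beta argument, as used in the converter above.

```python
# Sanity check: PyTorch F.silu vs. tf.nn.silu with beta=1.0.
# Both compute x * sigmoid(beta * x); PyTorch fixes beta at 1.0.
import numpy as np
import torch
import torch.nn.functional as F
import tensorflow as tf

x = np.random.randn(2, 3, 8, 8).astype(np.float32)

out_torch = F.silu(torch.from_numpy(x)).numpy()
out_tf = tf.nn.silu(tf.constant(x), beta=1.0).numpy()  # beta arg requires a recent TF version

# With beta=1.0 the two results should agree up to float tolerance.
assert np.allclose(out_torch, out_tf, atol=1e-6)
```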