Closed MasterJH5574 closed 1 year ago
As tracked by #332, this PR is the O2h milestone of the high-level operator introduction plan.
For the first part, following #341, this PR continues the introduction of neural network operators. Specifically, it introduces:

- relu
- gelu
- silu
- softmax
- batch_norm
- layer_norm
- dropout
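As a rough sketch of the semantics these operators typically follow, here are NumPy reference versions of two of them, softmax and layer_norm. This is an illustration of the usual definitions, not the implementation added in this PR:

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the per-row max before exponentiating for numerical stability.
    e = np.exp(x - np.max(x, axis=axis, keepdims=True))
    return e / np.sum(e, axis=axis, keepdims=True)

def layer_norm(x, gamma, beta, axis=-1, eps=1e-5):
    # Normalize over the given axis, then apply learned scale and shift.
    mean = np.mean(x, axis=axis, keepdims=True)
    var = np.var(x, axis=axis, keepdims=True)
    return (x - mean) / np.sqrt(var + eps) * gamma + beta

x = np.array([[1.0, 2.0, 3.0]])
# Each row of a softmax output sums to 1.
print(np.allclose(softmax(x).sum(axis=-1), 1.0))
```

The epsilon term in layer_norm guards against division by zero when the variance along the normalized axis is (near) zero.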
For the second part, this PR introduces the linear algebra operator matmul.
All ops are well-tested.