A few comments on the code:

- The ReLU argument: let's make its name more expressive, something like `token_rep_relu`.
- Use `torch.relu` (or `torch.nn.functional.relu`) instead of `torch.nn.ReLU()` for conciseness. Also, my understanding is that `torch.relu` dispatches directly to the C++ binding.
- Maybe apply the ReLU in the `encode` function as well, for consistency?
- I will also go check the docs and comment here if I have questions.
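To make the suggestion concrete, here is a minimal sketch of what the change could look like. The `Encoder` class, its `proj` layer, and the `encode` method are hypothetical stand-ins for the code under review; the point is the `token_rep_relu` flag name and using the functional `torch.relu` in place of an `nn.ReLU()` module:

```python
import torch


class Encoder(torch.nn.Module):
    """Hypothetical encoder used only to illustrate the review comments."""

    def __init__(self, dim: int, token_rep_relu: bool = False):
        super().__init__()
        self.proj = torch.nn.Linear(dim, dim)
        # Expressive flag name instead of a bare `relu` argument.
        self.token_rep_relu = token_rep_relu

    def encode(self, x: torch.Tensor) -> torch.Tensor:
        h = self.proj(x)
        # Functional form: no nn.ReLU() module instance needed,
        # and the activation is applied in encode() for consistency.
        return torch.relu(h) if self.token_rep_relu else h
```

Note that `torch.relu(x)` and `torch.nn.functional.relu(x)` are interchangeable here; the module form `nn.ReLU()` is mainly useful when the activation needs to live inside an `nn.Sequential`.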