Closed — SCIKings closed this 1 week ago
Closing as this is not an issue with TensorLy.
@SCIKings TensorLy-Torch provides PyTorch-based layers and factorizations on top of TensorLy. Have a look at the hooks we provide (e.g. tensor dropout); you could write a similar hook for the activations you are interested in.
I want to apply an activation function to each factor matrix, reconstruct the tensor from the activated factors, and have the reconstruction approximate the original tensor. How should I do this with TensorLy? For example, with tl.decomposition.partial_tucker: w should be approximated by f(w1) f(w2) f(w3), where f() denotes the activation function. I would be grateful for any help.