Hi @ashim95 -- factorized tensors are just that: higher-order tensors stored in factorized form. Tensorized matrices are matrices that have been tensorized (reshaped into a higher-order tensor); that tensorized form is then expressed in factorized form, so you can use them as drop-in replacements for regular matrices.
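For illustration, here is a minimal sketch of the distinction, assuming the `FactorizedTensor` / `TensorizedTensor` factory classes and their `new` constructors (the exact argument names, rank conventions, and accepted factorization strings should be checked against the current tltorch API):

```python
import tltorch

# A factorized tensor: a 3rd-order tensor of shape (8, 8, 8),
# stored directly in Tucker form (rank given as a fraction of parameters).
fact_tensor = tltorch.FactorizedTensor.new((8, 8, 8), rank=0.5, factorization='tucker')

# A tensorized matrix: a 64x64 matrix, first reshaped ("tensorized") to
# (8, 8) x (8, 8) and then factorized, here with BlockTT (TT-matrix / MPO).
tens_matrix = tltorch.TensorizedTensor.new(
    tensorized_shape=((8, 8), (8, 8)), rank=0.5, factorization='blocktt'
)

# The tensorized matrix can stand in for the original dense 64x64 matrix,
# e.g. as the weight of a linear layer.
```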
Block-TT is not the same as Tensor-Train (MPS); it is the equivalent of TTM (which you may also know as MPO).
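Concretely, the difference shows up in the shape of the cores; a rough sketch of the conventions (shapes only, not tied to tltorch internals):

```python
# Tensor-Train (TT / MPS) for a tensor of shape (n1, ..., nd):
#   core k has shape (r_{k-1}, n_k, r_k), with r_0 = r_d = 1.
tt_core_shapes = [(1, 8, 4), (4, 8, 4), (4, 8, 1)]   # example for an 8x8x8 tensor

# TT-matrix / Block-TT (TTM / MPO) for a matrix tensorized to (m1, ..., md) x (n1, ..., nd):
#   core k has shape (r_{k-1}, m_k, n_k, r_k), i.e. each core carries a row AND a column mode.
ttm_core_shapes = [(1, 8, 8, 4), (4, 8, 8, 1)]       # example for a 64x64 matrix
```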
We don't yet have a low-rank matrix factorization module, but you can get this by using a tensorized matrix with CP and a tensorized shape equal to the original shape. However, I guess you mean taking a tensor, reshaping it to a matrix, and decomposing that: it would be great to have this, and it would be awesome if you wanted to open a PR for it!
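In the meantime, a minimal sketch of that workaround in plain PyTorch (reshape the tensor to a matrix, then keep the top singular components; this is just an illustration, not part of the tltorch API, and the helper name is made up):

```python
import torch

def lowrank_matrix_from_tensor(tensor, row_modes, rank):
    """Reshape an arbitrary tensor to a matrix and return a rank-`rank` factorization.

    `row_modes` is the number of leading modes grouped into the matrix rows;
    the remaining modes become the columns.
    """
    rows = int(torch.prod(torch.tensor(tensor.shape[:row_modes])))
    matrix = tensor.reshape(rows, -1)
    U, S, Vh = torch.linalg.svd(matrix, full_matrices=False)
    # Truncate to the requested rank: matrix ~= A @ B with A (rows x rank), B (rank x cols)
    A = U[:, :rank] * S[:rank]
    B = Vh[:rank]
    return A, B

# Example: a (4, 5, 6) tensor flattened to a 20 x 6 matrix, approximated at rank 3
x = torch.randn(4, 5, 6)
A, B = lowrank_matrix_from_tensor(x, row_modes=2, rank=3)
approx = (A @ B).reshape(4, 5, 6)
```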
Closing as inactive.
Hi,
Thanks for creating and maintaining this library. I had a couple of basic questions; it would be great if you could answer them:
1. What is the difference between the files `tltorch/factorized_tensors/factorized_tensors.py` and `tltorch/factorized_tensors/tensorized_matrices.py`? A lot of the code is replicated across these files.
2. Is `BlockTT` the same as Tensor-Train?
3. I know it's trivial to implement, but does the library similarly have a module for low-rank matrix factorization?
Thanks,