tensorly / tensorly

TensorLy: Tensor Learning in Python.
http://tensorly.org

Non-linear tensor decomposition #477

Closed earmingol closed 3 months ago

earmingol commented 1 year ago

Is your feature request related to a problem? Please describe.

There is currently no method in tensorly able to capture non-linear relationships through tensor decomposition.

Describe the solution you'd like

A Variational Auto-Encoder could be used for tensor decomposition to capture non-linear relationships, as described here: https://arxiv.org/pdf/1611.00866.pdf

It would be great if someone with the relevant knowledge could implement this method. The algorithm is described in the manuscript (https://arxiv.org/pdf/1611.00866.pdf). Could this be implemented using the PyTorch backend?

JeanKossaifi commented 1 year ago

What exactly do you mean by non-linear relationships? One could argue tensor factorizations are not linear.

Re. VAE for tensor decomposition, this would indeed be nice to have. Do you have any insights/references on how this compares in practice to a simple gradient-based method optimizing the reconstruction loss, or to other non-Bayesian approaches?

Unfortunately I don't have the bandwidth at the moment, but it should be straightforward to implement directly in TensorLy-Torch, which already provides most of the tools, including basic tensor layers and convenient tensor factorizations that extend PyTorch tensors. I haven't read the paper in detail, but it looks like it's mostly a matter of optimizing a lower bound (a reconstruction term plus a KL divergence). You could use the FactorizedCPTensor from tensorly-torch directly for this and write a custom loss.
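
For reference, a minimal sketch of that gradient-based baseline in plain PyTorch (not the TensorLy-Torch API; the tensor shape, rank, and optimizer settings here are arbitrary, and the squared reconstruction error is where a custom ELBO-style loss would go):

```python
import torch

torch.manual_seed(0)
X = torch.randn(20, 30, 40)   # toy tensor to decompose
rank = 5

# One factor matrix per mode, fitted by gradient descent.
factors = [torch.randn(d, rank, requires_grad=True) for d in X.shape]
opt = torch.optim.Adam(factors, lr=1e-2)

for step in range(2000):
    opt.zero_grad()
    # Multilinear CP reconstruction: sum over rank of outer products.
    X_hat = torch.einsum('ir,jr,kr->ijk', *factors)
    loss = torch.norm(X - X_hat) ** 2  # swap in / extend with custom (e.g. ELBO) terms here
    loss.backward()
    opt.step()
```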

earmingol commented 1 year ago

Thanks! I'll take a look at TensorLy-Torch.

By non-linear relationships I mean obtaining factors that reconstruct the tensor in a non-linear way (i.e., not through the usual sum of rank-1 tensors). That's why the VAE approach is useful: the autoencoder learns latent factors through neural-net training. For that reason I think using the FactorizedCPTensor and modifying the loss function is not enough, because it still relies on the parafac method (please correct me if I'm wrong). If I had experience with this I would implement it myself, but unfortunately I haven't worked in this area and don't understand all the math in the paper.
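
Just to illustrate what I mean by a non-linear reconstruction, something along these lines (a rough sketch only, not the algorithm from the paper; the shapes and the decoder architecture are made up):

```python
import torch
import torch.nn as nn

I, J, K, R = 20, 30, 40, 5
# Latent factor matrices, one per mode (analogous to CP factors).
factors = nn.ParameterList([nn.Parameter(torch.randn(d, R)) for d in (I, J, K)])

# A small neural net maps the concatenated latent vectors of an entry (i, j, k)
# to its reconstructed value, allowing non-linear interactions between factors.
decoder = nn.Sequential(nn.Linear(3 * R, 64), nn.ReLU(), nn.Linear(64, 1))

def reconstruct():
    a = factors[0][:, None, None, :].expand(I, J, K, R)
    b = factors[1][None, :, None, :].expand(I, J, K, R)
    c = factors[2][None, None, :, :].expand(I, J, K, R)
    z = torch.cat([a, b, c], dim=-1)   # (I, J, K, 3R)
    return decoder(z).squeeze(-1)      # (I, J, K), non-linear in the factors
```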

Regarding benchmarking: in Fig. 2 of the paper they compare it to other methods and report lower RMSE than all of them.

JeanKossaifi commented 3 months ago

The FactorizedCPTensor will do what you want: you can plug it directly into your VAE, and it doesn't rely on PARAFAC (you can initialize it from a dense tensor with parafac, but you can also train it from scratch). Closing here since this is probably better suited for TensorLy-Torch, but feel free to reopen if you have more issues!
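
Something along these lines (a rough sketch; the exact TensorLy-Torch names and signatures below are from memory, so please double-check them against the tltorch docs):

```python
import torch
import tltorch

# Trainable CP-factorized tensor created from scratch -- no PARAFAC involved.
cp = tltorch.FactorizedTensor.new((20, 30, 40), rank=5, factorization='CP')
cp.normal_()

# Alternatively, initialize from a dense tensor (this is where parafac is used):
# cp = tltorch.FactorizedTensor.from_tensor(dense_tensor, rank=5, factorization='CP')

target = torch.randn(20, 30, 40)      # stand-in for real data
opt = torch.optim.Adam(cp.parameters(), lr=1e-2)

for _ in range(100):
    opt.zero_grad()
    # cp.to_tensor() reconstructs the full tensor; plug this into your VAE /
    # custom (e.g. ELBO-style) loss instead of the plain squared error below.
    loss = torch.norm(cp.to_tensor() - target) ** 2
    loss.backward()
    opt.step()
```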