sagemath / sage

Main repository of SageMath
https://www.sagemath.org

Meta-ticket: Efficient numerical computations with tensor trains #31991

Open f20dad3f-90ee-447a-8dc6-e89527084234 opened 3 years ago

f20dad3f-90ee-447a-8dc6-e89527084234 commented 3 years ago

Following the progress with Ticket #30307, we propose to add computational functionality for tensors via the tensor train decomposition, an idea originally named Matrix Product States in the quantum physics community and later popularized by Prof. Ivan Oseledets. As a computational tool, it represents tensors in a compressed format that enables tensor operations with cost scaling only linearly in the number of dimensions.

- Affleck et al., Valence bond ground states in isotropic quantum antiferromagnets (https://link.springer.com/article/10.1007/BF01218021)
- Oseledets, Tensor-Train Decomposition, SIAM J. Sci. Comput., 33(5), 2295–2317 (https://epubs.siam.org/doi/abs/10.1137/090752286?journalCode=sjoce3)
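To make the compression idea concrete, here is a minimal NumPy sketch of the TT-SVD algorithm from Oseledets' paper (sequential truncated SVDs of unfoldings). The names `tt_svd` and `tt_to_full` are illustrative only, not part of any existing Sage or ttpy API:

```python
import numpy as np

def tt_svd(a, eps=1e-10):
    """Decompose a full tensor into TT cores via sequential truncated SVDs.

    Each core has shape (r_{k-1}, n_k, r_k); singular values below
    eps * (largest singular value) are truncated at every step.
    """
    shape = a.shape
    d = len(shape)
    cores = []
    r = 1
    c = a
    for k in range(d - 1):
        # Unfold: rows combine the previous rank with the current mode.
        c = c.reshape(r * shape[k], -1)
        u, s, vt = np.linalg.svd(c, full_matrices=False)
        rk = max(1, int(np.sum(s > eps * s[0])))
        cores.append(u[:, :rk].reshape(r, shape[k], rk))
        c = s[:rk, None] * vt[:rk]  # carry the remainder to the next step
        r = rk
    cores.append(c.reshape(r, shape[-1], 1))
    return cores

def tt_to_full(cores):
    """Contract TT cores back into the full tensor (for verification)."""
    full = cores[0]
    for core in cores[1:]:
        full = np.tensordot(full, core, axes=([-1], [0]))
    return full.reshape([core.shape[1] for core in cores])
```

For a tensor with TT ranks bounded by r and mode sizes bounded by n, the cores store O(d n r^2) numbers instead of the n^d entries of the full tensor, which is the source of the linear-in-d scaling mentioned above.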

This functionality is especially useful for high-dimensional tasks, where it mitigates the curse of dimensionality (such as in Riemannian optimization). Numerical packages are currently available in MATLAB, Python, and C++; the MATLAB version is currently the most versatile and robust.

We may consider internalizing this computational library in Sage, as an effort to formalize the currently available implementations. This addition could be either an additional child class under `tensor` that overrides the arithmetic, or a separate module. The proposed backend is currently `tensorflow`, though this may change depending on how the refactoring of `Components` progresses.

See this GitHub repository for an example implementation in MATLAB: https://github.com/oseledets/TT-Toolbox.

Ideally, after this ticket is implemented, users will be able to specify a fixed-rank tensor on a general manifold using `tensor.modules`, specify a ring and a frame, and perform useful, normally computationally intractable tasks using tensor trains as a backend. Based on the progress of this ticket, one may also consider implementing other backends for storing a numerical tensor, such as tensor rings and quantized tensor trains.
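As an illustration of why such tasks become tractable, here is a NumPy sketch of the inner product of two tensors given in TT format, contracted core by core so the full tensors are never formed; `tt_inner` is an illustrative name, not a proposed API:

```python
import numpy as np

def tt_inner(cores_a, cores_b):
    """Inner product of two tensors given as TT cores.

    Cost is linear in the number of dimensions: one small contraction
    per core pair, instead of forming n^d entries.
    """
    v = np.ones((1, 1))  # boundary matrix of shape (rank_a, rank_b)
    for ca, cb in zip(cores_a, cores_b):
        # Contract the running matrix with both cores over the shared
        # physical index i; ranks a, b flow to the next ranks c, d.
        v = np.einsum('ab,aic,bid->cd', v, ca, cb)
    return v.item()
```

For cores of shape `(r, n, r)`, each step costs O(n r^4) (less with an intermediate contraction), so the total is O(d n r^4) rather than exponential in d.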

For alternative tensor decomposition formats, please see the following survey papers:

- Tucker format: (Falcó, Hackbusch, Nouy, Tree-based tensor formats, https://arxiv.org/abs/1810.01262)
- Tensor ring: (https://arxiv.org/abs/1807.02513)
- Quantized tensor train (see section 3.5.1): (https://refubium.fu-berlin.de/handle/fub188/3366)

Other implementations:

- http://tensorly.org/dev/user_guide/tensor_decomposition.html

Tickets:

- #31998 Package `ttpy` (tensor trains)
- #31992 `FiniteRankFreeModuleMorphism`: Add method SVD (singular value decomposition)
- #31997 Topological tensor spaces (modules)
- #29619 tensors should have a sparse iterator
- #30373 Parent methods `tensor` vs. `tensor_product`
- #30235 Add construction methods to `FiniteRankFreeModule` and `CombinatorialFreeModule`
- #30309 Meta-ticket: Unify free module elements API: methods `dict`, `monomial_coefficients`, etc.
- #32034 Graphical representations of tensors

CC: @mkoeppe @egourgoulhon @dimpase

Component: linear algebra

Issue created by migration from https://trac.sagemath.org/ticket/31991

f20dad3f-90ee-447a-8dc6-e89527084234 commented 3 years ago

Description changed:

--- 
+++ 
@@ -1,4 +1,4 @@
-Following the progress with Ticket #30307, we propose to add computational functionalities with tensors via the tensor train decomposition, which is an idea in fact originally named "Density Matrix Renormalization Group (DMRG)" and later popularized by Prof. Ivan Oseledets. 
+Following the progress with Ticket #30307, we propose to add computational functionalities with tensors via the tensor train decomposition, which is an idea originally named "Density Matrix Renormalization Group (DMRG)" and later popularized by Prof. Ivan Oseledets. 

 This functionality is especially useful for high dimensional tasks to prevent the curse of dimensionality (such as in Riemannian optimization), currently supported in MATLAB, Python, and C++. The MATLAB version is currently most versatile and robust.
f20dad3f-90ee-447a-8dc6-e89527084234 commented 3 years ago

Description changed:

--- 
+++ 
@@ -8,4 +8,10 @@

 Ideally after the implementation of this ticket, users can specify a fixed rank tensor for a general manifold using `tensor.modules`, specify a ring and a frame, and perform useful and normally computationally intractable tasks using tensor train as a backend. Based on the progress of this ticket, one may also consider implementing other backends for storing a numerical tensor such as tensor rings and quantized tensor train.

+For alternative tensor decomposition formats, please see the following survey papers:
+
+- Tucker format: (https://arxiv.org/abs/1810.01262)
+- Tensor ring: (https://arxiv.org/abs/1807.02513)
+- Quantized tensor train (see section 3.5.1): (https://refubium.fu-berlin.de/handle/fub188/3366)
+
 For the theory behind why one can reduce a normally exponential (in the number of dimensions) task to a linear task, see (https://epubs.siam.org/doi/abs/10.1137/090752286?journalCode=sjoce3).
mkoeppe commented 3 years ago
comment:5

Is ttpy still being maintained? I tried to install it (using `sage -pip install ttpy`) and ran into a Fortran issue:

    tt/tt-fort/ort.f90:271:22:

      271 |    call dgemv('t',n,r,1.d0,yy,n,x,1, 0.d0,gv,1)

    Error: Type mismatch between actual argument at (1) and actual argument at (2) (REAL(8)/COMPLEX(8)).
mkoeppe commented 3 years ago
comment:6

(This is with gfortran 11.1 on macOS; this may be similar to the many issues that we had when gfortran 10 came along - #29456)
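For this class of gfortran 10+ argument-mismatch errors, a possible (untested here) workaround is to downgrade the mismatch from an error to a warning with gfortran's `-fallow-argument-mismatch` flag; whether ttpy's build honors these environment variables is an assumption:

```shell
# Hypothetical workaround: gfortran >= 10 rejects argument mismatches in
# calls to implicitly-interfaced procedures; this flag makes them warnings.
FFLAGS="-fallow-argument-mismatch" FCFLAGS="-fallow-argument-mismatch" \
    sage -pip install ttpy
```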

f20dad3f-90ee-447a-8dc6-e89527084234 commented 3 years ago
comment:7

Replying to @mkoeppe:

(This is with gfortran 11.1 on macOS; this may be similar to the many issues that we had when gfortran 10 came along - #29456)

I didn't have an issue with downloading since I wasn't using pip but cloned the code directly from GitHub. Based on the commit history, it doesn't seem to be frequently maintained. The MATLAB version is what I now use for my projects. I will update the MATLAB link in the ticket description.

f20dad3f-90ee-447a-8dc6-e89527084234 commented 3 years ago

Description changed:

--- 
+++ 
@@ -4,7 +4,7 @@

 We may consider internalizing this computational library for Sage, as an effort to formalize the currently available implementations. This addition can either be an additional child class under `tensor`, and overwrite the arithmetics; or a separate module. The backend is currently proposed to be TensorFlow, however this may change / be flexible based on how the refactoring of `Components` progresses. 

-One can look at this github page for an example implementation (https://github.com/oseledets/ttpy).
+One can look at this github page for an example implementation in MATLAB (https://github.com/oseledets/TT-Toolbox).

 Ideally after the implementation of this ticket, users can specify a fixed rank tensor for a general manifold using `tensor.modules`, specify a ring and a frame, and perform useful and normally computationally intractable tasks using tensor train as a backend. Based on the progress of this ticket, one may also consider implementing other backends for storing a numerical tensor such as tensor rings and quantized tensor train.
egourgoulhon commented 3 years ago
comment:9

Wow! This looks promising!

mkoeppe commented 3 years ago
comment:10

I have opened an upstream issue for the compilation errors: https://github.com/oseledets/ttpy/issues/82

egourgoulhon commented 3 years ago
comment:11

Replying to @egourgoulhon:

Wow! This looks promising!

To make it clear: the above comment was about the ticket's main goal and was not some ironic statement about the ttpy issue with gfortran ;-)

mkoeppe commented 3 years ago

Description changed:

--- 
+++ 
@@ -10,7 +10,7 @@

 For alternative tensor decomposition formats, please see the following survey papers:

-- Tucker format: (https://arxiv.org/abs/1810.01262)
+- Tucker format: (Falcó, Hackbusch, Nouy, Tree-based tensor formats, https://arxiv.org/abs/1810.01262)
 - Tensor ring: (https://arxiv.org/abs/1807.02513)
 - Quantized tensor train (see section 3.5.1): (https://refubium.fu-berlin.de/handle/fub188/3366)
mkoeppe commented 3 years ago

Description changed:

--- 
+++ 
@@ -14,4 +14,4 @@
 - Tensor ring: (https://arxiv.org/abs/1807.02513)
 - Quantized tensor train (see section 3.5.1): (https://refubium.fu-berlin.de/handle/fub188/3366)

-For the theory behind why one can reduce a normally exponential (in the number of dimensions) task to a linear task, see (https://epubs.siam.org/doi/abs/10.1137/090752286?journalCode=sjoce3).
+For the theory behind why one can reduce a normally exponential (in the number of dimensions) task to a linear task, see Oseledets, Tensor-Train Decomposition, SIAM J. Sci. Comput., 33(5), 2295–2317 (https://epubs.siam.org/doi/abs/10.1137/090752286?journalCode=sjoce3).
mkoeppe commented 3 years ago

Description changed:

--- 
+++ 
@@ -15,3 +15,13 @@
 - Quantized tensor train (see section 3.5.1): (https://refubium.fu-berlin.de/handle/fub188/3366)

 For the theory behind why one can reduce a normally exponential (in the number of dimensions) task to a linear task, see Oseledets, Tensor-Train Decomposition, SIAM J. Sci. Comput., 33(5), 2295–2317 (https://epubs.siam.org/doi/abs/10.1137/090752286?journalCode=sjoce3).
+
+
+**Tickets:**
+
+- #31998 Package `ttpy` (tensor trains)
+- #31992 `FiniteRankFreeModuleMorphism`: Add method SVD (singular value decomposition)
+- #31997 Topological tensor spaces (modules)
+- #29619 tensors should have a sparse iterator
+- #30373 Parent methods `tensor` vs. `tensor_product`
+
mkoeppe commented 3 years ago

Description changed:

--- 
+++ 
@@ -24,4 +24,4 @@
 - #31997 Topological tensor spaces (modules)
 - #29619 tensors should have a sparse iterator
 - #30373 Parent methods `tensor` vs. `tensor_product`
-
+- #30235 Add construction methods to `FiniteRankFreeModule` and `CombinatorialFreeModule`
mkoeppe commented 3 years ago

Description changed:

--- 
+++ 
@@ -25,3 +25,4 @@
 - #29619 tensors should have a sparse iterator
 - #30373 Parent methods `tensor` vs. `tensor_product`
 - #30235 Add construction methods to `FiniteRankFreeModule` and `CombinatorialFreeModule`
+- Meta-ticket: Unify free module elements API: methods `dict`, `monomial_coefficients`, etc.
mkoeppe commented 3 years ago

Description changed:

--- 
+++ 
@@ -25,4 +25,4 @@
 - #29619 tensors should have a sparse iterator
 - #30373 Parent methods `tensor` vs. `tensor_product`
 - #30235 Add construction methods to `FiniteRankFreeModule` and `CombinatorialFreeModule`
-- Meta-ticket: Unify free module elements API: methods `dict`, `monomial_coefficients`, etc.
+- #30309 Meta-ticket: Unify free module elements API: methods `dict`, `monomial_coefficients`, etc.
f20dad3f-90ee-447a-8dc6-e89527084234 commented 3 years ago

Description changed:

--- 
+++ 
@@ -1,8 +1,11 @@
-Following the progress with Ticket #30307, we propose to add computational functionalities with tensors via the tensor train decomposition, which is an idea originally named "Density Matrix Renormalization Group (DMRG)" and later popularized by Prof. Ivan Oseledets. 
+Following the progress with Ticket #30307, we propose to add computational functionalities with tensors via the tensor train decomposition, which is an idea originally named Matrix Product States in the quantum physics community and later popularized by Prof. Ivan Oseledets. As a computational tool, it represents tensors in compressed formats that would enable tensor operations with cost scaling only linearly in the number of dimensions.

-This functionality is especially useful for high dimensional tasks to prevent the curse of dimensionality (such as in Riemannian optimization), currently supported in MATLAB, Python, and C++. The MATLAB version is currently most versatile and robust.
+- Affleck et al., Valence bond ground states in isotropic quantum antiferromagnets (https://link.springer.com/article/10.1007/BF01218021)
+- Oseledets, Tensor-Train Decomposition, SIAM J. Sci. Comput., 33(5), 2295–2317 (https://epubs.siam.org/doi/abs/10.1137/090752286?journalCode=sjoce3).

-We may consider internalizing this computational library for Sage, as an effort to formalize the currently available implementations. This addition can either be an additional child class under `tensor`, and overwrite the arithmetics; or a separate module. The backend is currently proposed to be TensorFlow, however this may change / be flexible based on how the refactoring of `Components` progresses. 
+This functionality is especially useful for high dimensional tasks to prevent the curse of dimensionality (such as in Riemannian optimization). Numerical packages are currently supported in MATLAB, Python, and C++. The MATLAB version is currently most versatile and robust.
+
+We may consider internalizing this computational library for Sage, as an effort to formalize the currently available implementations. This addition can either be an additional child class under `tensor`, and overwrite the arithmetics; or a separate module. The backend is currently proposed to be `tensorflow`, however this may change / be flexible based on how the refactoring of `Components` progresses. 

 One can look at this github page for an example implementation in MATLAB (https://github.com/oseledets/TT-Toolbox).

@@ -14,8 +17,6 @@
 - Tensor ring: (https://arxiv.org/abs/1807.02513)
 - Quantized tensor train (see section 3.5.1): (https://refubium.fu-berlin.de/handle/fub188/3366)

-For the theory behind why one can reduce a normally exponential (in the number of dimensions) task to a linear task, see Oseledets, Tensor-Train Decomposition, SIAM J. Sci. Comput., 33(5), 2295–2317 (https://epubs.siam.org/doi/abs/10.1137/090752286?journalCode=sjoce3).
-

 **Tickets:**
f20dad3f-90ee-447a-8dc6-e89527084234 commented 3 years ago
comment:18

slightly modified description

f20dad3f-90ee-447a-8dc6-e89527084234 commented 3 years ago

Description changed:

--- 
+++ 
@@ -27,3 +27,4 @@
 - #30373 Parent methods `tensor` vs. `tensor_product`
 - #30235 Add construction methods to `FiniteRankFreeModule` and `CombinatorialFreeModule`
 - #30309 Meta-ticket: Unify free module elements API: methods `dict`, `monomial_coefficients`, etc.
+- #32034 Graphical representations of tensors
mkoeppe commented 2 years ago

Description changed:

--- 
+++ 
@@ -17,6 +17,8 @@
 - Tensor ring: (https://arxiv.org/abs/1807.02513)
 - Quantized tensor train (see section 3.5.1): (https://refubium.fu-berlin.de/handle/fub188/3366)

+Other implementations:
+- http://tensorly.org/dev/user_guide/tensor_decomposition.html

 **Tickets:**