Open jofrevalles opened 1 year ago
Merging #26 (dc51736) into master (bff2de5) will increase coverage by 1.56%. The diff coverage is 100.00%.
```diff
@@            Coverage Diff             @@
##           master      #26      +/-   ##
==========================================
+ Coverage   82.67%   84.23%   +1.56%
==========================================
  Files           5        5
  Lines         202      222      +20
==========================================
+ Hits          167      187      +20
  Misses         35       35
```

| Impacted Files | Coverage Δ | |
|---|---|---|
| src/Numerics.jl | `95.71% <100.00%> (+1.71%)` | :arrow_up: |
I need more time to check why we need `P` and the details of the implementation. Since this is not urgent, I will look into it whenever I have some free time.
Summary

This PR adds a new `lu` function for tensors, extending the `LinearAlgebra.lu` function (resolves #3). The new `lu` function returns the LU decomposition of a `Tensor`, where the tensor can be recovered by contracting the permutation tensor `P`, the tensor `L`, and the tensor `U`. The tensors `L` and `U` are reshaped versions of the original lower and upper triangular matrices obtained during the decomposition process, respectively.

This implementation is inspired by the LU decomposition in the `scipy` library, as it returns the permutation tensor `P`, allowing the original tensor `A` to be recovered with the contraction `A = P * L * U`. This contrasts with `LinearAlgebra`, where the permutation vector `p` is returned and the original matrix is recovered with `P' * A = L * U` (where `P'` is the permutation matrix built from `p`).

Please let me know if there are any concerns or issues with extending the `LinearAlgebra` library in this manner. We have also added tests for this new function.
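To make the two conventions concrete, here is a small matrix-level comparison using the standard `LinearAlgebra.lu` on a plain matrix (this sketch does not use the new tensor method; it only illustrates the permutation conventions described above):

```julia
using LinearAlgebra

A = rand(3, 3)
F = lu(A)

# LinearAlgebra's convention: the permutation acts on A, so L * U
# reconstructs the row-permuted A, i.e. P' * A = L * U in the notation above.
@assert F.L * F.U ≈ A[F.p, :]
@assert F.P * A ≈ F.L * F.U      # F.P is the permutation matrix built from F.p

# The scipy-style convention adopted in this PR instead returns the factor
# that reconstructs A directly: A = P * L * U, with P = transpose(F.P) here.
@assert transpose(F.P) * F.L * F.U ≈ A
```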
Example

A usage example of the `lu` function:
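(The example code itself is not included in this excerpt. The following is a minimal sketch of how such a call might look, assuming a Tenet-style `Tensor` constructor with labeled indices, a `left_inds` keyword to select the row group, and a `contract` function; none of these signatures are confirmed by this thread.)

```julia
using Tenet  # assumed package name; the PR extends `LinearAlgebra.lu` for `Tensor`
using LinearAlgebra

# Hypothetical rank-3 tensor with labeled indices (constructor signature assumed)
A = Tensor(rand(2, 2, 2), (:i, :j, :k))

# Assumed keyword `left_inds`: the indices grouped into the matrix "row" leg
# before decomposing, mirroring how `svd` is commonly extended to tensors
P, L, U = lu(A; left_inds = (:i, :j))

# Per the summary above, the original tensor is recovered by contracting
# the permutation tensor P with L and U: A = P * L * U
contract(contract(P, L), U) ≈ A
```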