Closed — timweiland closed this pull request 1 year ago
Merging #794 (5a0843f) into main (384c739) will increase coverage by 0.05%. The diff coverage is 95.94%.
@@ Coverage Diff @@
## main #794 +/- ##
==========================================
+ Coverage 91.23% 91.28% +0.05%
==========================================
Files 217 218 +1
Lines 8133 8207 +74
Branches 1027 1059 +32
==========================================
+ Hits 7420 7492 +72
- Misses 485 487 +2
Partials 228 228
Impacted Files | Coverage Δ |
---|---|
src/probnum/linops/_block.py | 95.83% <95.83%> (ø) |
src/probnum/linops/__init__.py | 100.00% <100.00%> (ø) |
src/probnum/linops/_linear_operator.py | 87.17% <0.00%> (+0.17% :arrow_up:) |
Yes, that's reasonable. If we need non-square blocks in the future, that can be refactored. No need to design too generically at this point.
Thanks for the reviews!
Upon further thought and after discussion with @marvinpfoertner, I decided not to require all blocks to be square.
For example, BlockDiagonalMatrix is useful for implementing independent multi-output kernels. When we consider k(X, X), everything is fine and all blocks are square, but k(x, X) (i.e. the cross-covariances between the input and the training data) will have non-square blocks in general. So there is a good use case for it, I think.
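To illustrate the point above, here is a minimal sketch (using scipy.linalg.block_diag rather than the probnum operator, whose exact constructor signature is not shown in this thread): for an independent multi-output kernel, each cross-covariance block k_i(x, X_i) has shape (1, n_i), so the resulting block-diagonal matrix is built from non-square blocks. The block shapes and the three-output setup are illustrative assumptions.

```python
import numpy as np
import scipy.linalg

rng = np.random.default_rng(0)

# Hypothetical stand-ins for the cross-covariance blocks k_i(x, X_i) of an
# independent kernel with 3 outputs and 4 training points per output:
# each block is (1, 4), i.e. non-square.
blocks = [rng.standard_normal((1, 4)) for _ in range(3)]

# Assemble the block-diagonal cross-covariance; the result is (3, 12),
# which only works if non-square blocks are permitted.
K = scipy.linalg.block_diag(*blocks)
print(K.shape)
```

With square-only blocks this k(x, X) case could not be represented, which is the motivation given above for relaxing the restriction.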
In a Nutshell
Add a custom linear operator BlockDiagonalMatrix for block diagonal structures.

Detailed Description
Adds BlockDiagonalMatrix along with custom implementations of det, trace, eigvals, etc. Note: the tests compare the eigenvalues of linop and matrix for equality. But the eigenvalues may have a different order. For the comparison to be valid, we need to sort the arrays before comparing them.