JuliaSmoothOptimizers / LinearOperators.jl

Linear Operators for Julia

Potential Bugfix #231

Closed · alexjaffray closed this 2 years ago

alexjaffray commented 2 years ago

For externally defined operator types (e.g. `NFFTOp` in MRIReco.jl) the following error is encountered:


```
MethodError: no method matching storage_type(::NFFTOp{ComplexF64})
Closest candidates are:
  storage_type(::LinearOperator) at /home/runner/.julia/packages/LinearOperators/ER4po/src/abstract.jl:168
  storage_type(::AbstractMatrix{T}) where T at /home/runner/.julia/packages/LinearOperators/ER4po/src/abstract.jl:169
  storage_type(::Union{AdjointLinearOperator, TransposeLinearOperator}) at /home/runner/.julia/packages/LinearOperators/ER4po/src/adjtrans.jl:77
  ...
Stacktrace:
  [1] storage_type(A::AdjointLinearOperator{ComplexF64, NFFTOp{ComplexF64}})
    @ LinearOperators ~/.julia/packages/LinearOperators/ER4po/src/adjtrans.jl:77
  [2] *(op1::AdjointLinearOperator{ComplexF64, NFFTOp{ComplexF64}}, op2::NFFTOp{ComplexF64})
    @ LinearOperators ~/.julia/packages/LinearOperators/ER4po/src/operations.jl:118
  [3] *(::AdjointLinearOperator{ComplexF64, NFFTOp{ComplexF64}}, ::NFFTOp{ComplexF64}, ::Vector{ComplexF64})
    @ Base ./operators.jl:560
  [4] testFT(N::Int64)
    @ Main ~/work/MRIReco.jl/MRIReco.jl/test/testOperators.jl:47
```

The proposed change seems to fix the issue, but I was wondering whether restricting the function argument to the concrete `LinearOperator` type was a deliberate design decision, since the issue can also be fixed by locally overloading `LinearOperators.storage_type` for each external operator type.
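
For reference, the local-overload workaround mentioned above would look roughly like this (a sketch only; the `Vector{T}` storage choice is my assumption, not something taken from MRIReco.jl):

```julia
using LinearOperators, MRIReco

# Hypothetical local workaround: declare the storage type of the external
# operator explicitly so that products involving NFFTOp no longer hit the
# MethodError, without changing LinearOperators.jl. Vector{T} is assumed
# here purely for illustration.
LinearOperators.storage_type(op::NFFTOp{T}) where {T} = Vector{T}
```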

alexjaffray commented 2 years ago

looping in @tknopp

codecov[bot] commented 2 years ago

Codecov Report

Merging #231 (7e0ad26) into main (1341c7c) will decrease coverage by 0.10%. The diff coverage is 50.00%.

```diff
@@            Coverage Diff             @@
##             main     #231      +/-   ##
==========================================
- Coverage   97.71%   97.61%   -0.11%     
==========================================
  Files          13       13              
  Lines         964      965       +1     
==========================================
  Hits          942      942              
- Misses         22       23       +1     
```

| Impacted Files | Coverage Δ |
| --- | --- |
| src/abstract.jl | 98.76% <50.00%> (-1.24%) ↓ |


github-actions[bot] commented 2 years ago
| Package name | latest | stable |
| --- | --- | --- |
| CaNNOLeS.jl | | |
| DCISolver.jl | | |
| JSOSolvers.jl | | |
| Krylov.jl | | |
| NLPModels.jl | | |
| NLPModelsModifiers.jl | | |
| PROPACK.jl | | |
| Percival.jl | | |
| QuadraticModels.jl | | |
| SolverTools.jl | | |

geoffroyleconte commented 2 years ago

Hi, I wrote `storage_type(op::LinearOperator) = typeof(op.Mv5)` because, if you create a new type such as `NewLinearOperator <: AbstractLinearOperator` that does not have an `Mv5` field, that definition leads to an error. With your solution the tests passed because I had written a `storage_type` method for the other operator types defined in this package (such as `AdjointLinearOperator`, etc.).

I see three solutions here:

@dpo, which one do you prefer? For info, I think the `storage_type` function is only needed when using the 5-args `mul!` with operators defined through the 3-args `mul!` (which also requires at least an `Mv5` field to work), and when creating operators with `BlockDiagonalOperator`.

alexjaffray commented 2 years ago

> For info, I think the `storage_type` function is only needed when using the 5-args `mul!` with operators defined through the 3-args `mul!` (which also requires at least an `Mv5` field to work), and when creating operators with `BlockDiagonalOperator`.

As far as I can tell, in `src/operations.jl` the `storage_type` function is called whenever two operators are multiplied together, so the `Mv5` field is effectively required regardless.
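
For concreteness, here is a hedged sketch (using the in-package dense wrapper `LinearOperator` rather than `NFFTOp`) of a composition that goes through that same call site:

```julia
using LinearOperators

# `*` between two operators queries storage_type on both operands before
# building the product. Both operands here are in-package types, so the
# product succeeds; an external type without a storage_type method would
# raise the MethodError shown in the log at this point.
A = LinearOperator(rand(ComplexF64, 8, 4))
x = rand(ComplexF64, 4)
y = A' * A * x                        # same pattern as the failing testFT call
LinearOperators.storage_type(A' * A)  # typically Vector{ComplexF64}
```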

geoffroyleconte commented 2 years ago

Oh yes, sorry, I forgot about these calls in `operations.jl`. However, I think operations between operators could still work even if they do not have the `Mv5` field (at least if they are defined with the 5-args `mul!`), as long as `storage_type` is implemented. I have only used the types created in this package until now, but I can try to create a new operator type and see what works and what does not.
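
A minimal sketch of such an experiment (the `ScaleOp` name and fields are invented for illustration; this is not an existing type) might look like:

```julia
using LinearAlgebra, LinearOperators

# Illustrative custom operator with no Mv5 field: it provides a 5-args mul!
# and an explicit storage_type method. Whether composition with in-package
# operators then works out of the box is exactly what needs testing.
struct ScaleOp{T} <: AbstractLinearOperator{T}
  n::Int
  alpha::T
end

Base.size(op::ScaleOp) = (op.n, op.n)
LinearOperators.storage_type(op::ScaleOp{T}) where {T} = Vector{T}

function LinearAlgebra.mul!(y::AbstractVector, op::ScaleOp, x::AbstractVector, α, β)
  y .= α .* op.alpha .* x .+ β .* y
  return y
end
```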

alexjaffray commented 2 years ago

> I can try to create a new operator type and see what works and what does not.

@geoffroyleconte That sounds like a good plan until a way forward is determined. The main thing to test is that an operator without an `Mv5` field is never accidentally queried for it (i.e., that the base `storage_type` method is never called on it).

I think the easiest approach (as you proposed earlier) would be to require a `storage_type` method definition as part of the interface, and to document the interface requirements for people who rely on the package.

alexjaffray commented 2 years ago

@geoffroyleconte did you get a chance to try out a new operator type and play around with it yet?

If not, I think the best thing to do is simply to require that users define a `storage_type` method for their custom operator types, and to document the interface. If you agree with this, feel free to close this PR.

dpo commented 2 years ago

How about something like

```julia
storage_type(op::AbstractLinearOperator) = error("please implement storage_type for $(typeof(op))")
storage_type(op::LinearOperator) = typeof(op.Mv5)
```
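
With that fallback in place, an external type that forgets to implement the method fails with a descriptive message instead of an obscure field access. A quick illustration with a made-up type (assuming the fallback above is defined in the package):

```julia
using LinearOperators

# Made-up zero-field type, just to trigger the suggested fallback.
struct DummyOp{T} <: AbstractLinearOperator{T} end

LinearOperators.storage_type(DummyOp{ComplexF64}())
# ERROR: please implement storage_type for DummyOp{ComplexF64}
```
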
geoffroyleconte commented 2 years ago

Yes, I think that's a good idea.

alexjaffray commented 2 years ago

I agree, @geoffroyleconte, @dpo; I have added it to the commit.

github-actions[bot] commented 2 years ago
| Package name | latest | stable |
| --- | --- | --- |
| CaNNOLeS.jl | | |
| DCISolver.jl | | |
| JSOSolvers.jl | | |
| Krylov.jl | | |
| NLPModels.jl | | |
| NLPModelsModifiers.jl | | |
| PROPACK.jl | | |
| Percival.jl | | |
| QuadraticModels.jl | | |
| SolverTools.jl | | |

dpo commented 2 years ago

Thank you!