JuliaImageRecon / RegularizedLeastSquares.jl


TV version with support for ND-signals #45

Closed · migrosser closed this 1 year ago

migrosser commented 1 year ago

This is my current version of the total variation regularizer that can handle signals of different dimensionalities. In this version, dispatch over the different dimensionalities is performed by using an appropriate gradient operator. Thus, the proximal map can be computed for any signal dimension for which a gradient operator exists.
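To make the dispatch idea concrete, here is a minimal sketch (all names are illustrative, not the API of this PR): the regularizer is written once against a gradient operator `∇`, and the signal dimensionality only enters through the construction of `∇`.

```julia
using SparseArrays

# Hypothetical sketch: the regularizer stores a gradient operator
# that matches the signal's dimensionality.
struct TVRegularizer{Op}
    λ::Float64   # regularization strength
    ∇::Op        # gradient operator for the given signal dimension
end

# Evaluating λ‖∇x‖₁ is dimension-agnostic once `∇` exists; the proximal
# map would likewise be written against `∇` and its adjoint.
tv_value(r::TVRegularizer, x::AbstractVector) = r.λ * sum(abs, r.∇ * x)

# 1D example: a 4×5 forward-difference matrix stands in for the operator.
∇1d = spdiagm(0 => -ones(4), 1 => ones(4))[1:4, 1:5]
r = TVRegularizer(0.1, ∇1d)
tv_value(r, collect(1.0:5.0))   # ≈ 0.4
```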

A suitable implementation of the gradient operator is contained in the package SparsityOperators.jl. However, a drawback of using this implementation directly is that it enlarges the dependency set of RegularizedLeastSquares. Notably, SparsityOperators depends on LinearOperators.jl, a dependency that we just got rid of in RegularizedLeastSquares. To work around this issue, I extracted only the relevant functionality from LinearOperators and SparsityOperators and put it into the folder linearOperators. Of course, this is not a very elegant approach, but it made it easy for me to implement the required gradient operators.

Another limitation is that the gradient operator is currently only implemented for 1- to 3-dimensional signals. In addition, a directional gradient operator is implemented. Here I think it would be nicer to have a completely generic implementation, and it should be possible to obtain one using the macros from Base.Cartesian, roughly along the lines of the sketch below.
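For illustration, here is a rough sketch of what such a generic directional difference could look like with Base.Cartesian (the function name and signature are hypothetical; this is one possible way to write it, not the PR's code):

```julia
using Base.Cartesian

# Hypothetical sketch: forward differences of an N-dimensional array along
# dimension D, generated for arbitrary N via Base.Cartesian macros.
@generated function directional_diff!(out::AbstractArray{T,N},
                                      x::AbstractArray{T,N}, ::Val{D}) where {T,N,D}
    quote
        fill!(out, zero(T))   # boundary entries along dimension D stay zero
        # N nested loops; the range stops one index short along dimension D
        @nloops $N i d -> (d == $D ? (1:size(x, d) - 1) : (1:size(x, d))) begin
            (@nref $N out i) =
                (@nref $N x d -> (d == $D ? i_d + 1 : i_d)) - (@nref $N x i)
        end
        return out
    end
end

# Usage: forward differences of a 3D array along dimension 2.
x = rand(4, 5, 6)
g = similar(x)
directional_diff!(g, x, Val(2))
```

The same pattern would cover any dimensionality without writing a separate method for each N.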

JakobAsslaender commented 1 year ago

Not sure if I screwed this up: I wasn't able to add to this PR and instead created a new PR, #48. LMK what you think of it!

tknopp commented 1 year ago

Ok, I merged #48 into ND-TotalVariation and hope that I successfully resolved the conflicts.

Regarding the LinearOperators dependency, I am currently changing my mind and will probably add it again. On Julia 1.8 we have

```julia
julia> @time using LinearOperators
  0.305119 seconds (235.72 k allocations: 15.608 MiB, 73.06% compilation time)
```

and on Julia 1.9 (upcoming) we have

```julia
julia> @time using LinearOperators
  0.086789 seconds (113.28 k allocations: 8.254 MiB, 10.93% gc time, 9.65% compilation time)
```

I don't think that justifies code duplication.