PTB-MR / mrpro

MR image reconstruction and processing.
https://ptb-mr.github.io/mrpro/
Apache License 2.0

Signature of the signal function / Quantitative Parameters #135

Closed fzimmermann89 closed 5 months ago

fzimmermann89 commented 11 months ago

Today, we had a meeting and discussed the signal functions.

The background was how to implement complex parameters such as M0 (saturation recovery) or A, B (MOLLI).

The following options were discussed:

A. one tensor as input for all signal models. Along the first dimension, different real parameters are concatenated. So, for example: x=[realM0, imagM0, T1]

Advantages:

B. multiple tensors as input. so, for example, InversionRecovery()(m0=tensor(),t1=tensor)

Advantages:

C. a dictionary / container of parameters. So, for example

class qdata(dict):
    ...

x = qdata(t1=tensor, m0=tensor)
InversionRecovery()(x)

Advantage:

D. a separate class for t1, m0, etc.

Advantage:

Results

We reached a consensus among @fzimmermann89 @schuenke @guastinimara @JoHa0811 @koflera to reject A. We decided to postpone the decision whether to go with B, C, or D for now. We decided to move forward for now with B, i.e. separate inputs. We need to see how this influences the regression and chaining functionality. It would be easy to later switch to D, or to optionally support classes (D).
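For illustration, style B (separate tensor inputs) could look like the following minimal sketch. The class and parameter names follow the examples in this thread, and the signal equation is the textbook inversion-recovery model; this is not necessarily mrpro's actual implementation.

```python
import torch


class InversionRecovery(torch.nn.Module):
    """Sketch of style B: each quantitative parameter is a separate tensor."""

    def __init__(self, ti: float) -> None:
        super().__init__()
        self.ti = ti  # inversion time TI in seconds

    def forward(self, m0: torch.Tensor, t1: torch.Tensor) -> torch.Tensor:
        # Standard inversion-recovery signal: S(TI) = M0 * (1 - 2 * exp(-TI / T1))
        return m0 * (1 - 2 * torch.exp(-self.ti / t1))


model = InversionRecovery(ti=0.5)
signal = model(m0=torch.ones(3), t1=torch.ones(3))
```

With separate named arguments, each parameter keeps its own name and dtype, which is what makes the later real/complex decision a purely per-function documentation question.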

Please correct the post if I missed anything important.

fzimmermann89 commented 11 months ago

Random Thought:

B might in the end lead to a lot of **kwargs / untyped dictionary usage.

So we could just make all signal functions take kwargs, type the kwargs as a TypedDict subclass that we call qdata, and pass these dictionaries along.

If I find the time, I will at some point run a couple of mypy experiments to see if this would work.
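A minimal sketch of that experiment, using hypothetical names (plain floats instead of tensors, and a `QData` TypedDict that is not mrpro API):

```python
import math
from typing import TypedDict


class QData(TypedDict):
    """Hypothetical typed container of quantitative parameters."""

    m0: float
    t1: float


def inversion_recovery(ti: float, **params: float) -> float:
    # With PEP 692 (Python 3.12+, or typing_extensions), this could instead be
    # annotated as `**params: Unpack[QData]` so mypy checks the keys per name.
    return params["m0"] * (1 - 2 * math.exp(-ti / params["t1"]))


x: QData = {"m0": 1.0, "t1": 1.0}
signal = inversion_recovery(0.5, **x)
```

The dictionary can then be passed along a chain of signal functions unchanged, which is the appeal of the kwargs idea over fully positional inputs.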

schuenke commented 11 months ago

Thanks for the very nice summary @fzimmermann89

For now, I suggest that we change the signal models in #129 and #130 to style B and decide how to proceed at the next meeting on January 8.

fzimmermann89 commented 11 months ago

I forgot: regarding real/complex, the consensus was:

So it is up to the downstream application to consider whether the parameters make sense as real or complex. We document it for each function.

fzimmermann89 commented 9 months ago

And a new summary, this time from our Slack:

Felix

regarding the typing and signature of operators. currently, my understanding is:

So a general operator should be able to return multiple tensors.

So the operator chaining logic will be something like this:

intermediate = operator1(*args)
if isinstance(intermediate, torch.Tensor):
    out = operator2(intermediate)
else:
    out = operator2(*intermediate)

And the type hints of an operator's forward would be forward(*args: *Tin) -> Tensor | tuple[*Tout], with Tin, Tout being TypeVarTuples.

We could instead make it forward(*args: *Tin) -> tuple[*Tout] and always return a tuple; a linear operator would then be Tensor -> tuple[Tensor,]. I.e. a linear operator would always return a length-1 tuple, but it will always be a tuple.

Not sure if there will be many places where we will need to make the distinction between an operator returning a Tensor and returning a tuple of Tensors that will be unpacked.

What are your opinions?

Christoph Kolbitsch I think all operators should return a tuple of Tensors, otherwise we have to check at the beginning of every forward/adjoint/… whether it is a tuple or not

Patrick Schuenke Input is always a single tensor or multiple tensors (currently just for signal models), no ?! Output would always be a tuple, but for linear operators it would be a tuple of length 1. This means we don't have to check for the return type, but have to deal with the tuple unpacking, which is fine imo. Or did I get it wrong?

Christoph Kolbitsch wouldn’t this mean that something like LinOp1.forward(LinOp2.forward(x)) would not work, because LinOp1.forward would expect a single tensor but LinOp2.forward would return a tuple?

Felix Yes. It would either be LinOp1.forward(*LinOp2.forward(x)) or linop1@linop2(x)

but either way, it would be GeneralOp1.forward(*GeneralOp2.forward(x)), which would make the "always return a tuple, and linear operators return a tuple of len 1" approach a bit more consistent

General operators would take n tensors as input and return a tuple of m tensors. Linear operators would have n = m = 1. All signal models would have m = 1.

Christoph Kolbitsch Ok, thanks for the explanation. I think the “always return a tuple” approach makes the most sense then. If this gets added to the code, can you please also directly add some info about this in the documentation?
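The convention agreed above can be sketched as follows, with an illustrative `Scale` operator (not mrpro's actual API): every operator returns a tuple of tensors, a linear operator a length-1 tuple, so chaining always unpacks with `*` and never needs a runtime type check.

```python
import torch


class Scale(torch.nn.Module):
    """Hypothetical linear operator (n = m = 1): multiply the input by a scalar."""

    def __init__(self, factor: float) -> None:
        super().__init__()
        self.factor = factor

    def forward(self, x: torch.Tensor) -> tuple[torch.Tensor]:
        # Even a single result is wrapped in a tuple, per the convention above.
        return (self.factor * x,)


op1, op2 = Scale(2.0), Scale(3.0)
# Chaining: unpack the inner operator's tuple into the outer operator's inputs.
(out,) = op1(*op2(torch.ones(2)))
```

Because the output is always a tuple, the same `op1(*op2(...))` pattern works unchanged for general operators with m > 1 outputs.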

ckolbPTB commented 5 months ago

So far we are happy with B.