odlgroup / odl

Operator Discretization Library https://odlgroup.github.io/odl/
Mozilla Public License 2.0

can I pass (and process) a complex data sinogram into fbp_op? #938

Closed rshpeley closed 7 years ago

rshpeley commented 7 years ago

I need a GPU-based FBP algorithm to reconstruct from complex sinogram data. I've done this in MATLAB by passing complex floats and got a complex reconstruction from it. I'd like to do this in ODL running on the GPU.

Looking at the ASTRA source code, I don't see a way to use a complex sinogram as input to the reconstruction. It looks like fbp_op breaks out this functionality, but can I use complex data?

rshpeley commented 7 years ago

This is the error I get when attempting with ray_trafo.adjoint(), so I guess not...

/home/shpeley/miniconda3/lib/python3.6/site-packages/numpy/core/numeric.py:482: ComplexWarning: Casting complex values to real discards the imaginary part
  return array(a, dtype, copy=False, order=order)

Any suggestions? Rewrite astra?

kohr-h commented 7 years ago

No, currently you have to manually apply the operators to the real and imaginary parts separately. There would be an obvious way to wrap this in ODL, of course, but we haven't quite gotten around to doing that.

Given your complex sinogram, you can simply get sinogram.real and sinogram.imag as ODL vectors, and there's similar functionality to split the spaces.
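
For illustration, a rough sketch of that splitting, using a placeholder parallel-beam setup (sinogram here just stands in for your measured complex data):

import odl

# placeholder setup, similar to the examples further down in this thread
space = odl.uniform_discr([-1, -1], [1, 1], [128, 128])
geometry = odl.tomo.parallel_beam_geometry(space)
ray_trafo = odl.tomo.RayTransform(space, geometry)

# complex counterpart of the data space, and a placeholder complex sinogram
complex_data_space = ray_trafo.range.complex_space
sinogram = complex_data_space.zero()  # your complex data would go here

# real and imaginary parts are ordinary (real) ODL vectors
sino_real = sinogram.real
sino_imag = sinogram.imag

# apply the real-valued operator to each part separately
reco_real = ray_trafo.adjoint(sino_real)
reco_imag = ray_trafo.adjoint(sino_imag)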

adler-j commented 7 years ago

A full example would be:

import odl

# Create ray transform
space = odl.uniform_discr([-1, -1], [1, 1], [128, 128])
geometry = odl.tomo.parallel_beam_geometry(space)
rt = odl.tomo.RayTransform(space, geometry)

# Create complex data
data_space = rt.range.complex_space
data = data_space.element(lambda x: x[0] + 1j * x[1])

# Apply FBP
fbp = odl.tomo.fbp_op(rt)
image_real = fbp(data.real)
image_imag = fbp(data.imag)

# Visualize
image_real.show('real part')
image_imag.show('imaginary part')

Is this sufficient for you?

adler-j commented 7 years ago

Another option is to create a FBP operator that works on complex values. The easiest way is by using the RealPart, ImagPart and ComplexEmbedding operators as such:

import odl

# Create ray transform
space = odl.uniform_discr([-1, -1], [1, 1], [128, 128])
geometry = odl.tomo.parallel_beam_geometry(space)
rt = odl.tomo.RayTransform(space, geometry)

# Create complex data
data_space = rt.range.complex_space
data = data_space.element(lambda x: x[0] + 1j * x[1])

# Create complex FBP
fbp = odl.tomo.fbp_op(rt)
real_part_op = odl.RealPart(data_space)
imag_part_op = odl.ImagPart(data_space)
complex_embedding = odl.ComplexEmbedding(space)
complex_fbp = (complex_embedding * fbp * real_part_op +
               1j * complex_embedding * fbp * imag_part_op)

# Apply complex FBP
complex_fbp_result = complex_fbp(data)

# Visualize
complex_fbp_result.show('complex result')

rshpeley commented 7 years ago

Hmm... I'm wondering if this would work though. I've had to break out real and imaginary parts before when storing complex arrays to a file, but my concern is that during fbp processing I won't be dealing with the complex plane -- just two separate lines. After fbp, no problem though.

What I need is for the real and imaginary parts fbp processing to be exactly the same as complex fbp given the same algorithms. It appears that with adler-j's algorithm I'd be running two passes, one real and one imaginary. A complex fbp algorithm would run in one pass. At first glance it appears as though it should be the same if the fbp algorithm is fully deterministic. If artefacts are generated differently between runs it is a definite problem.

Anyway, the best way is to try it out. I have the data of a matlab run from a previous experiment and I would expect to get similar results if they did work the same.

Looking ahead a bit, how should I go about getting fbp complex processing into odl and astra? I've done some programming over the years and should be able to handle it.

Thanks for your help!

adler-j commented 7 years ago

Hmm... I'm wondering if this would work though. I've had to break out real and imaginary parts before when storing complex arrays to a file, but my concern is that during fbp processing I won't be dealing with the complex plane -- just two separate lines. After fbp, no problem though.

What I need is for the real and imaginary parts fbp processing to be exactly the same as complex fbp given the same algorithms. It appears that with adler-j's algorithm I'd be running two passes, one real and one imaginary. A complex fbp algorithm would run in one pass. At first glance it appears as though it should be the same if the fbp algorithm is fully deterministic. If artefacts are generated differently between runs it is a definite problem.

Unless I'm misunderstanding what the complex FBP algorithm does, it should be equivalent to running FBP on the real part and on the imaginary part separately. Hence, "on the surface", the complex_fbp example I gave you should be doing exactly what you expect, but I may be missing something.
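
To make that concrete, here is a sketch of a quick numerical sanity check, reusing the same setup as in the complex_fbp example above (it assumes a tomography backend such as ASTRA is installed):

import numpy as np
import odl

space = odl.uniform_discr([-1, -1], [1, 1], [64, 64])
geometry = odl.tomo.parallel_beam_geometry(space)
rt = odl.tomo.RayTransform(space, geometry)
fbp = odl.tomo.fbp_op(rt)

data_space = rt.range.complex_space
data = data_space.element(lambda x: x[0] + 1j * x[1])

# two real passes, recombined afterwards
two_pass = np.asarray(fbp(data.real)) + 1j * np.asarray(fbp(data.imag))

# one pass through the composed complex FBP operator
real_part_op = odl.RealPart(data_space)
imag_part_op = odl.ImagPart(data_space)
complex_embedding = odl.ComplexEmbedding(space)
complex_fbp = (complex_embedding * fbp * real_part_op +
               1j * complex_embedding * fbp * imag_part_op)
one_pass = np.asarray(complex_fbp(data))

# should agree up to floating-point rounding
print(np.max(np.abs(two_pass - one_pass)))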

Anyway, the best way is to try it out. I have the data of a matlab run from a previous experiment and I would expect to get similar results if they did work the same.

Certainly agree, testing is usually the best way to go in these cases.

Looking ahead a bit, how should I go about getting fbp complex processing into odl and astra? I've done some programming over the years and should be able to handle it.

Here there are two options:

Thanks for your help!

No problem. Looking forward to hearing from your progress!

rshpeley commented 7 years ago

I've tried your second option and, with a few mods, got a Shepp-Logan phantom in complex data (two plots, real and imaginary) rendered on the GPU.

However, I need a 'none' filter option for fbp_op. I tried fbp = odl.tomo.fbp_op(rt, filter_type='none') and got this error,

ValueError: unknown `filter_type` (none)

So far it looks good! I should be able to get on to testing against my earlier data as soon as I can get a non-filtered FBP going. I presume the adjoint is not FBP, so I can't use that.

ozanoktem commented 7 years ago

However, I need a 'none' filter option for fbp_op.

I don't understand this statement. If you do not filter the data, then you are not doing filtered backprojection, you are doing plain backprojection. Maybe I'm misunderstanding something. What do you want to do mathematically?

rshpeley commented 7 years ago

I suppose it's in how you want to break it out. I want an analytic reconstruction method, because that is what I'm testing, and for my purposes that method needs to work without a filter. In this case my purpose is to have a way of testing complex reconstruction, and the filter gets in the way.

Another case could be to use a neural network as a filter, and in particular a neural net trained optimally for the real and imaginary components, which would accomplish the type of filter I need. This could be an 'external' filter.

So, the 'none' filter is useful to me for prototyping, and the 'external' filter would accomplish what I need for an FBP that would satisfy your concerns. In this case I think it's justified to retain 'none' as a filter, just like null is a no-op placeholder for other functions.

I didn't say what I'm trying to do mathematically because it's a topic of a paper I'm in the midst of researching. If I can confirm the results I got previously and extend them, I'll publish the paper and post a link to it here if there's any interest for it.

ozanoktem commented 7 years ago

It sounds like you are looking at the backprojection filtering approach then. See also section "2.5 Backprojection Filtration Method" in the book Reconstruction from Integral Data by V. Palamodov, CRC Press, 2016. There you can find the general result for how to do backprojection filtering for the 3D ray transform with a source on an arbitrary curve.

rshpeley commented 7 years ago

Your first link doesn't work... but the book looks very useful. Thanks!

What I'm doing is along the lines of this paper, Fast inline inspection by Neural Network Based Filtered Backprojection: Application to apple inspection.

You'll notice some of the people mentioned in the article are also contributors to the ASTRA toolbox. Daniel Pelt, who created the neural network as a filter in FBP, also handles the Python interface for ASTRA.

Neural nets and convolutional neural nets for the filter function in FBP are finding application in a number of areas, so adding an 'external' filter, or whatever it ends up being called, I think, would be useful for ODL.

ozanoktem commented 7 years ago

Your first link doesn't work... but the book looks very useful. Thanks!

Updated the link.

What I'm doing is along the lines of this paper, Fast inline inspection by Neural Network Based Filtered Backprojection: Application to apple inspection.

You'll notice some of the people mentioned in the article are also contributors to the ASTRA toolbox. Daniel Pelt, who created the neural network as a filter in FBP, also handles the Python interface for ASTRA.

Neural nets and convolutional neural nets for the filter function in FBP are finding application in a number of areas, so adding an 'external' filter, or whatever it ends up being called, I think, would be useful for ODL.

I got it, you want to learn the filter. Well, just do backprojection and add the filter yourself then. Or am I missing something?
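
For what it's worth, a minimal sketch of that suggestion: compose a plain backprojection with whatever filter you provide yourself. Here my_filter is a hypothetical placeholder (just the identity) standing in for a learned filter acting on the sinogram:

import odl

space = odl.uniform_discr([-1, -1], [1, 1], [128, 128])
geometry = odl.tomo.parallel_beam_geometry(space)
ray_trafo = odl.tomo.RayTransform(space, geometry)

# hypothetical placeholder for a (learned) filter acting on the sinogram
my_filter = odl.IdentityOperator(ray_trafo.range)

# plain backprojection composed with the externally supplied filter
reco_op = ray_trafo.adjoint * my_filter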

rshpeley commented 7 years ago

Hmm... I'll need to look into that paper closer, but at the moment I don't have much time.

That could be OK for hacking something together for prototyping, and that's what has been done until now with the ASTRA toolbox. But ideally, to get the best GPU performance, having an operator like fbp_op is eventually going to help package the data stream (generally) to the GPU better, for use cases other than mine as well.

Also, it would be nice to go back to some earlier papers, clean up the code, and have a more refined way to present results for students or researchers. Then again, maybe I'm getting ahead of myself.

adler-j commented 7 years ago

I suppose it's in how you want to break it out. I want an analytic reconstruction method, because that is what I'm testing, and for my purposes that method needs to work without a filter. In this case my purpose is to have a way of testing complex reconstruction, and the filter gets in the way.

There are two parts (often confused) that go into the statement "filtered back projection".

  • First, there is a ramp filter that appears in the analytic reconstruction formula. This is part of any FBP method, and is often called a Ram-Lak filter if used without any other filter. This is the default in fbp_op
  • Second, there is (optionally) a lowpass filtering of the signal. You can enable this by using any of the filters 'Shepp-Logan', 'Cosine', 'Hamming' or 'Hann'.

You can have a look at MATLAB's iradon, which implements the same filters as ODL.

However, I need a 'none' filter option for fbp_op. I tried fbp = odl.tomo.fbp_op(rt, filter_type='none') and got this error

From what I gather, you may be looking for one of two things:

  • Either, you don't want any smoothing filter, in which case you should use filter_type='Ram-Lak', which also happens to be the default.
  • You don't want any modifications of your data at all, simply a backprojection. Then you should use ray_transform.adjoint as is. Note however that this is not an approximate inverse.

Another case could be to use a neural network as a filter, and in particular a neural net trained optimally for the real and imaginary components, which would accomplish the type of filter I need. This could be an 'external' filter.

Very interesting application indeed. How I would handle this myself is not by adding an 'external', but rather by doing a composition of operators.

First, you create a standard FBP operator with a ramp (Ram-Lak) filter:

ray_transform = ...
fbp_op = odl.tomo.fbp_op(ray_transform)

Then, create your custom neural network filter:

custom_filter = MyCustomFilter(...)

Then compose the operators to create a more advanced operator:

fbp_with_custom_filter = fbp_op * custom_filter  # note that '*' means composition, just as for matrices
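
For concreteness, a sketch of what such a custom_filter could look like as an ODL operator. MyCustomFilter below is a hypothetical skeleton that just passes the sinogram through; a real implementation would do the actual filtering (e.g. evaluate a trained network) inside _call:

import odl

class MyCustomFilter(odl.Operator):
    """Hypothetical sinogram filter; this skeleton acts as the identity."""

    def __init__(self, space):
        super(MyCustomFilter, self).__init__(domain=space, range=space,
                                             linear=True)

    def _call(self, sinogram):
        # a real filter (e.g. a trained neural network) would transform
        # the sinogram here instead of returning it unchanged
        return sinogram

space = odl.uniform_discr([-1, -1], [1, 1], [128, 128])
geometry = odl.tomo.parallel_beam_geometry(space)
ray_transform = odl.tomo.RayTransform(space, geometry)
fbp = odl.tomo.fbp_op(ray_transform)

custom_filter = MyCustomFilter(ray_transform.range)
fbp_with_custom_filter = fbp * custom_filter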

I didn't say what I'm trying to do mathematically because it's a topic of a paper I'm in the midst of researching. If I can confirm the results I got previously and extend them, I'll publish the paper and post a link to it here if there's any interest for it.

Sounds very interesting indeed and I'd love to hear more from it.

That could be OK for hacking something together for prototyping, and that's what has been done until now with the ASTRA toolbox. But ideally, to get the best GPU performance, having an operator like fbp_op is eventually going to help package the data stream (generally) to the GPU better, for use cases other than mine as well.

Indeed, and we have done some overhead benchmarking of this (see the examples/tomo/backends folder) and we do indeed have some overhead. Proper GPU-level bindings are, however, on their way and should be available rather soon.

Please keep asking if you have further questions!

rshpeley commented 7 years ago

There are two parts (often confused) that go into the statement "filtered back projection".

  • First, there is a ramp filter that appears in the analytic reconstruction formula. This is part of any FBP method, and is often called a Ram-Lak filter if used without any other filter. This is the default in fbp_op
  • Second, there is (optionally) a lowpass filtering of the signal. You can enable this by using any of the filters 'Shepp-Logan', 'Cosine', 'Hamming' or 'Hann'.

Yes, I realise how the various filters work.

From what I gather, you may be looking for one of two things:

  • Either, you don't want any smoothing filter, in which case you should use filter_type='Ram-Lak', which also happens to be the default.
  • You don't want any modifications of your data at all, simply a backprojection. Then you should use ray_transform.adjoint as is. Note however that this is not an approximate inverse.

I want the second option, but I need an analytic reconstruction because that's what I'm testing. I'm not testing an adjoint. My tests have to do with artefact creation and so I need to test the artefact creation of the analytic backprojection algorithm.

Very interesting application indeed. How I would handle this myself is not by adding an 'external', but rather by doing a composition of operators.

I need the NN filter to be applied in Fourier space. It seems to me a composition of operators won't accomplish this. However, in prototyping without a NN I will use a composition-of-operators filter (not in Fourier space) on the real part of the complex reconstruction as one of my test cases.

The neural network paper I referenced earlier does an analytic reconstruction without a Ram-Lak filter but with the NN filter in its place in Fourier space. The NN learns the filter function on its own. It would be a totally different NN filter if implemented outside of Fourier space.

Btw, the ASTRA GPU FBP code does have a filter option of 'none', but it doesn't appear to be in the CPU code.

I'm just going to hack filtered_back_projection.py myself to get a 'none' option for a filter so I can move ahead with this.

Sounds very interesting indeed and I'd love to hear more from it.

Sure, depending if I can confirm my earlier results from Matlab testing.

adler-j commented 7 years ago

Yes, I realise how the various filters work.

Just making sure we're on the same page here.

I want the second option, but I need an analytic reconstruction because that's what I'm testing.

I guess you need to be a bit more specific about what an analytic reconstruction is then, since the analytic reconstruction formula for the ray transform does include a filter, so any approximation would also, at least approximately, have a filter in it.
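
To spell that out for the standard 2D parallel-beam case: with data p(θ, s) = (Rf)(θ, s) and the Fourier convention \hat{p}(\theta, \sigma) = \int p(\theta, s)\, e^{-i\sigma s}\, ds, the exact inversion formula is

f(x) = \frac{1}{4\pi} \int_0^{2\pi} \mathcal{F}^{-1}\big[\, |\sigma| \, \hat{p}(\theta, \cdot) \,\big]\big(x \cdot \omega(\theta)\big)\, d\theta, \qquad \omega(\theta) = (\cos\theta, \sin\theta),

so the ramp multiplier |σ| (the Ram-Lak filter) is part of the analytic formula itself; dropping it leaves exactly the plain backprojection of the data, i.e. ray_transform.adjoint up to scaling and discretization. (The constant depends on the Fourier convention.)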

If you are simply looking for the adjoint, you can use the adjoint operator as is, i.e. ray_transform.adjoint. Is there any issue with this approach?

I need the NN filter to be applied in Fourier space. It seems to me a composition of operators won't accomplish this.

That is certainly possible. You could do it like this:

ray_transform = ...
fourier_transform = odl.FourierTransform(ray_transform.range)
filter_coeffs = ... # provide your filter coefficients here somehow
filter_in_fft_space = fourier_transform.inverse * filter_coeffs * fourier_transform

When you apply this operator, it is equivalent to this:

filter_in_fft_space(x) == fourier_transform.inverse(filter_coeffs * fourier_transform(x))

which would be the filter applied in Fourier space.

If you want your filter to be one-dimensional, you could do the 1-dimensional FFT by using the axis parameter, see the 2d FBP example.
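
To make that concrete, a hypothetical sketch along those lines: it uses the axes option of the Fourier transform to act only along the detector axis, and a made-up Hann-type window as the filter coefficients (purely illustrative, not a recommended filter):

import numpy as np
import odl

space = odl.uniform_discr([-1, -1], [1, 1], [128, 128])
geometry = odl.tomo.parallel_beam_geometry(space)
ray_transform = odl.tomo.RayTransform(space, geometry)

# 1d Fourier transform along the detector axis of the sinogram
fourier_transform = odl.FourierTransform(ray_transform.range, axes=[1])

# hypothetical filter coefficients: a Hann-type window in the detector
# frequency variable x[1]; the 0 * x[0] term only broadcasts the values
# to the full sinogram shape
freq_max = np.max(np.abs(fourier_transform.range.meshgrid[1]))
filter_coeffs = fourier_transform.range.element(
    lambda x: np.cos(np.pi * x[1] / (2 * freq_max)) ** 2 + 0 * x[0])

# multiply by the coefficients in Fourier space, then transform back
filter_in_fft_space = fourier_transform.inverse * filter_coeffs * fourier_transform

# backprojection of the filtered data
my_custom_fbp = ray_transform.adjoint * filter_in_fft_space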

The neural network paper I referenced earlier does an analytic reconstruction without a Ram-Lak filter but with the NN filter in its place in Fourier space.

This could in ODL be written as:

ray_transform = ...
fourier_transform = odl.FourierTransform(ray_transform.range)
filter_coeffs = ... 
filter_in_fft_space = fourier_transform.inverse * filter_coeffs * fourier_transform

my_custom_fbp = ray_transform.adjoint * filter_in_fft_space

I'm just going to hack filtered_back_projection.py myself to get a 'none' option for a filter so I can move ahead with this.

Well, any reason why simply using ray_transform.adjoint then wouldn't do the same in your case?

Anyway, if you make a Pull Request with your proposed change, we could give it a review and possibly add it to the library. We love any input.