Hi @tvercaut,
I've marked this for working group discussion; maybe you can discuss it in more depth at the meeting and make some proposals. Thanks.
Hi @Nic-Ma, This doesn't need to be prioritised in a working group. We are looking at it for one of our projects and would submit a PR when ready. This issue is meant to keep track of the planned PR and start the ball rolling with @crnbaker. (Similar scenario with #1155 ) Tom
Hi both, happy to take a look at this once I've got my head round the project!
I've implemented a Hilbert transform for 1D tensors using torch.fft.fft() and torch.heaviside(). @tvercaut, are you interested in multidimensional Hilbert transforms too? If so, I could implement the method described here.
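For reference, here's a minimal sketch of that approach (hilbert_1d is just an illustrative name, not a proposed API; the envelope is the magnitude of the returned analytic signal):

```python
import math
import torch

def hilbert_1d(x: torch.Tensor) -> torch.Tensor:
    """Analytic signal of a real tensor along its last axis."""
    n = x.shape[-1]
    # Frequency bins in torch.fft ordering (positive first, then negative).
    f = torch.fft.fftfreq(n, device=x.device)
    # Step function: 0 for negative frequencies, 0.5 at DC, 1 for positive.
    u = torch.heaviside(f, torch.tensor(0.5, device=x.device))
    # Zero the negative frequencies, double the positive ones, and invert.
    return torch.fft.ifft(torch.fft.fft(x, dim=-1) * 2 * u, dim=-1)

# Envelope of a toy modulated pulse:
t = torch.linspace(0, 1, 1000)
rf = torch.sin(2 * math.pi * 50 * t) * torch.exp(-((t - 0.5) ** 2) / 0.01)
envelope = hilbert_1d(rf).abs()
```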
As I'm brand new to MONAI, I'm not sure exactly where this new transform should go. monai.transforms.intensity maybe, inheriting from Transform?
Preferably it should go into monai/networks/layers and inherit from nn.Module, like the Gaussian filter: https://github.com/Project-MONAI/MONAI/blob/4e9164032610b69acdf716765909f5ec393af36a/monai/networks/layers/simplelayers.py#L133
You could then add a transform class that uses the layer, like the Gaussian smoothing transform: https://github.com/Project-MONAI/MONAI/blob/4e9164032610b69acdf716765909f5ec393af36a/monai/transforms/intensity/array.py#L512
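As a rough illustration of that layout (class and parameter names are placeholders, not the eventual MONAI API), the layer side could look something like this, operating on batched tensors:

```python
import torch
import torch.nn as nn

class HilbertTransformLayer(nn.Module):
    """Sketch of a layer computing the analytic signal along one axis.

    Follows the layer convention of batched input:
    [batch, channel, spatial_dim0, spatial_dim1, ...].
    """

    def __init__(self, axis: int = 2) -> None:
        super().__init__()
        self.axis = axis

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        n = x.shape[self.axis]
        f = torch.fft.fftfreq(n, device=x.device)
        # 0 for negative frequencies, 0.5 at DC, 1 for positive frequencies.
        u = torch.heaviside(f, torch.tensor(0.5, device=x.device))
        # Reshape the mask so it broadcasts along the chosen axis only.
        shape = [1] * x.ndim
        shape[self.axis] = n
        spectrum = torch.fft.fft(x, dim=self.axis) * 2 * u.reshape(shape)
        return torch.fft.ifft(spectrum, dim=self.axis)
```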
Thanks, makes sense.
@crnbaker: I think a 1D Hilbert transform is enough for the use cases I have in mind. However, we should be able to apply it to 2D or 3D images by specifying the dimension across which to apply the 1D transform.
Also, a small performance benchmark with respect to scipy (maybe also a jax-based implementation?) would be great. It could be as simple as a notebook with some timeit in the first instance, e.g. as here: https://colab.research.google.com/drive/17pI6xPGLTGcpGBulBrWZeNnBfLAwzqZT?usp=sharing
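For instance, something along these lines (reusing the hilbert_1d sketch above; array sizes and repeat counts are arbitrary) would do as a first comparison:

```python
import timeit
import numpy as np
import scipy.signal
import torch

x_np = np.random.randn(64, 4096).astype(np.float32)
x_t = torch.from_numpy(x_np)

# scipy baseline on the CPU.
t_scipy = timeit.timeit(lambda: scipy.signal.hilbert(x_np, axis=-1), number=50)

# torch version; move x_t to "cuda" (and synchronize) to time the GPU path.
t_torch = timeit.timeit(lambda: hilbert_1d(x_t), number=50)

print(f"scipy: {t_scipy:.3f} s, torch: {t_torch:.3f} s")
```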
Just finalising this now. I've noticed that the other transforms in MONAI/monai/transforms/intensity/array.py return NumPy arrays rather than Tensors. For example, class GaussianSmooth(Transform) mentioned above ends with:
return gaussian_filter(input_data).squeeze(0).detach().numpy()
Is this the behavior we want? Wouldn't it be better to return Tensors in case the transforms are being used as part of a CUDA pipeline?
EDIT: I've just realised that all the transforms in /intensity/array.py are intended to take numpy arrays and operate on the CPU only - presumably that's why it's called array! I will include the Hilbert transform there, accepting and returning NumPy arrays. Is there another location for transforms that operate on Tensors, or would you expect the user to use layers/simplelayers directly? There's not much advantage to using PyTorch over SciPy for this transform on the CPU.
Hi @crnbaker, the user should use layers/simplelayers if they are dealing with torch tensors on GPU. Currently the transforms mainly work with numpy arrays; there's an ongoing discussion about how we support numpy/torch on CPU/GPU for the preprocessing. For now the transforms should at least support numpy; we can refactor these later. By the way, the transforms assume shape [channel, spatial_dim0, spatial_dim1, ...], but the layers assume [batch, channel, spatial_dim0, spatial_dim1, ...]. These assumptions simplify the input format validation.
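To make those shape conventions concrete, a transform could bridge the two by adding and removing the batch dimension around the layer call (a sketch only, reusing the HilbertTransformLayer placeholder from above; the final class name and import may differ):

```python
import torch
from monai.transforms import Transform

class DetectEnvelope(Transform):
    """Sketch: envelope detection on a [channel, spatial_dim0, ...] numpy array."""

    def __init__(self, axis: int = 1) -> None:
        # axis indexes the [channel, spatial...] array; +1 accounts for the
        # batch dimension that the layer expects.
        self.layer = HilbertTransformLayer(axis=axis + 1)

    def __call__(self, img):
        x = torch.as_tensor(img).unsqueeze(0)      # [channel, ...] -> [batch, channel, ...]
        envelope = self.layer(x).abs()             # magnitude of the analytic signal
        return envelope.squeeze(0).cpu().numpy()   # back to a channel-first array
```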
Great, thanks @wyli. I'll submit my pull request soon.
Is your feature request related to a problem? Please describe.
Many ultrasound computing pipelines rely on an envelope detection algorithm. Having such a feature in MONAI would help ultrasound researchers adopt MONAI.
Describe the solution you'd like
A GPU-enabled, ideally differentiable, PyTorch version of scipy.signal.hilbert.
Describe alternatives you've considered
Implementing the Hilbert transform if/when needed by using the PyTorch fft.
Additional context
Adding @crnbaker to the discussion.
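For reference, the scipy behaviour being asked for amounts to envelope detection via the analytic signal:

```python
import numpy as np
from scipy.signal import hilbert

t = np.linspace(0, 1, 1000)
rf = np.sin(2 * np.pi * 50 * t) * np.exp(-((t - 0.5) ** 2) / 0.01)  # toy RF line
envelope = np.abs(hilbert(rf))  # magnitude of the analytic signal
```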