Closed K-lab-X closed 2 years ago
Hello @K-lab-X, I think this result is expected. Density compensation is an approximation for the inverse - it is not the true inverse. Did Pipe report lower errors in his paper?
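To see why a sizeable error is expected, here is a tiny linear-algebra sketch in plain NumPy (not torchkbnufft itself): for a nonuniform DFT matrix A and density weights D, the compensated adjoint AᴴDA only approximates the identity, so a reconstruction error well above machine precision is normal rather than a sign of a bug. The sampling pattern and weights below are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 64
x = np.linspace(-np.pi, np.pi, n, endpoint=False)

# Irregular (nonuniform) frequency samples in [-n/2, n/2].
freqs = np.sort(rng.uniform(-n / 2, n / 2, size=4 * n))
A = np.exp(-1j * np.outer(freqs, x))  # (4n, n) nonuniform DFT matrix

# Crude density compensation: weight each sample by its local spacing.
D = np.diag(np.gradient(freqs))

signal = rng.standard_normal(n)
recon = (A.conj().T @ (D @ (A @ signal))).real  # density-compensated adjoint

# The dcomp adjoint is only defined up to a global scale, so fit the
# best scale before measuring the residual error.
recon *= (recon @ signal) / (recon @ recon)

rel_err = np.linalg.norm(recon - signal) / np.linalg.norm(signal)
print(f"relative error of dcomp adjoint: {rel_err:.3f}")
```

Even with the optimal global scale, the residual does not vanish, because AᴴDA is only an approximation to A⁻¹A.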
Thank you for the quick response! No, I think the paper does not show quantitative results on reconstruction accuracy... but I feel that an error on the order of 10% is unreasonably high and that something is critically wrong. My concern is that all the steps run so fast (<<1 sec) that I wonder whether the tools are really working properly.
I will look into the density compensation function. Meanwhile, I would be very happy if you could let me know how to improve it as a first quick trial.
Basic KB-NUFFT Example: https://colab.research.google.com/github/mmuckley/torchkbnufft/blob/main/notebooks/Basic%20Example.ipynb
Hello @K-lab-X.
In terms of the NUFFT operations themselves, I verified accuracy up to machine precision levels with Prof. Jeff Fessler's MATLAB MIRT package, which has been used by the MRI community for a few decades. There are also unit tests that use data from old versions of the package to make sure we never have a regression. So overall I'm confident in the accuracy of the NUFFT operations.
In terms of Pipe's algorithm I don't have such strong guarantees, but on the other hand there is no demonstration of a bug in this issue, and the performance is generally what you would expect for a non-invertible linear operator (particularly when we aren't even trying to invert it). We would need further evidence to conclude that anything is wrong.
Hi @mmuckley
Thank you for checking! I have just multiplied the density compensation by a factor (the ratio of the average pixel values of the original and output images ... 1/42230), and the new output image now matches the original image very well. I guess I did not use calc_density_compensation_function properly, or something could be wrong with it. I agree that it is rather a compensation issue. Thank you so much for helping.
JFYI, here is the result after multiplying the density compensation by that factor. Upper left: original image. Upper right: reconstructed image. Bottom left: fractional error (difference of the two images divided by the original image, with air fixed at 0). Histogram: pixel values of the original image (red) and of the reconstructed image (blue). It might still not be at machine precision, but it's fairly good.
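For anyone hitting the same scaling issue, the rescaling and fractional-error computation described above can be sketched as follows. The `original` and `recon` arrays here are synthetic placeholders (a fabricated unscaled reconstruction), not the notebook's actual data.

```python
import numpy as np

# Synthetic stand-ins: a small nonnegative image with zero ("air") pixels,
# and a reconstruction that differs from it by a large global scale.
original = np.zeros((8, 8))
original[2:6, 2:6] = np.arange(16, dtype=float).reshape(4, 4) + 1.0
recon = original * 42230.0  # stand-in for an unscaled dcomp reconstruction

# Scale by the ratio of average pixel values (the ~1/42230 factor in the issue).
scale = original.mean() / recon.mean()
recon_scaled = recon * scale

# Fractional error, with air (zero) pixels fixed at 0.
mask = original != 0
frac_err = np.zeros_like(original)
frac_err[mask] = (recon_scaled[mask] - original[mask]) / original[mask]
print(f"scale = {scale:.3e}, max fractional error = {np.abs(frac_err).max():.2e}")
```

On real data the fractional error would not collapse to zero as it does with these synthetic arrays, since the dcomp adjoint also distorts the image shape, not just its scale.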
Forgot to close the issue.
Hello @K-lab-X thanks for providing this data on the issue! If you have any recommendations for modifying the documents or tutorials based on your experiences, feel free to submit a pull request.
Thank you for this sophisticated implementation of the NUFFT in PyTorch. I have a question about the "Basic KB-NUFFT Example".
In the code, I compared the original Shepp-Logan image with the final product of tkbn.KbNufftAdjoint, namely the "sharp image (with Pipe dcomp)" in the last figure, and found that the accuracy is on the order of ~10%.
Is this an expected result (since the notebook's purpose is demonstration)? If so, could you kindly tell me how to improve it to a moderate accuracy, like ~0.1%?
My environment is: macOS 11.3.1, Python 3.8.3, torch 1.10.2, torchvision 0.11.3, jupyter 6.0.3.