Closed · tomdertech closed this issue 7 years ago
Hi Tom,
We are glad you have found HoloPy useful for your work! We are actually in the process of preparing a new release of HoloPy (v3.0) that will hopefully be more intuitive to use and will also introduce a number of new features. Unfortunately, this update won't be backwards compatible, so you will have to make some adjustments to your code in order to use it. Stay tuned over the next few weeks for a release announcement.
The 'spacing' argument represents the distance between pixels in your image/camera, specified in the same units as the light wavelength. By setting it to 1 metre, you have effectively told holopy that your image is many metres across, which is why you need to propagate over such large distances. The results from this calculation probably can't be trusted, because we haven't tested a regime where the propagation distance is so much larger (a factor of roughly 10^7) than the light wavelength. In case you haven't found it yet, this page in the HoloPy documentation may be helpful: http://holopy.readthedocs.io/en/latest/users/recon_tutorial.html
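To make the units point concrete, here is a quick sketch with hypothetical numbers (the 5.3 um pixel pitch and 660 nm wavelength are illustrative assumptions, not values from your setup):

```python
# Hypothetical numbers for illustration (not taken from your setup):
# a 5.3 um camera pixel pitch and a 660 nm laser wavelength.
# HoloPy expects both lengths in the SAME unit, e.g. micrometres.
wavelen_um = 0.660   # 660 nm expressed in micrometres
spacing_um = 5.3     # pixel pitch in micrometres

# Passing spacing=1 when you mean "1 metre" overstates the pixel
# pitch, and hence every transverse length in the image, by:
scale_error = 1e6 / spacing_um   # roughly 1.9e5
```

Sensible reconstruction distances scale accordingly, which is why depths of "tens of metres" start to look necessary.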
Our transfer function is similar to the Rayleigh-Sommerfeld method, but uses a convolution in Fourier space to speed up the calculation. We would welcome the addition of other propagation routines to HoloPy if you end up writing your own.
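For reference, the transfer-function idea can be sketched in a few lines of numpy. This is a minimal illustration of the angular spectrum method, not HoloPy's actual implementation, and all lengths must share one unit:

```python
import numpy as np

def angular_spectrum_propagate(field, spacing, wavelen, d):
    """Propagate a 2-D complex field a distance d using the angular
    spectrum (Fourier-space transfer function) method.
    spacing, wavelen, and d must all be in the same unit."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=spacing)   # spatial frequencies, x
    fy = np.fft.fftfreq(ny, d=spacing)   # spatial frequencies, y
    FX, FY = np.meshgrid(fx, fy)
    # z-component of the wavevector for each plane-wave component;
    # evanescent components (arg <= 0) are simply zeroed out here
    arg = 1.0 - (wavelen * FX) ** 2 - (wavelen * FY) ** 2
    kz = 2 * np.pi / wavelen * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * kz * d) * (arg > 0)
    return np.fft.ifft2(np.fft.fft2(field) * H)
```

As a quick sanity check, propagating a uniform field a distance d just multiplies it by exp(2*pi*i*d/wavelen), since all of its energy sits in the zero-frequency component.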
I hope this information has been helpful, and please do not hesitate to reach out with any more questions. Good luck!
Solomon
@agoldfain, did you ever end up writing a reconstruction routine that might be useful for diverging beams? I seem to recall you working on something like that.
@barkls @vnmanoharan I am so pleased this is still active :-) Would you be so kind as to let me know what v3.0 holds, especially around the problem I am investigating? It would be a shame for me to be writing routines that might be available in the coming weeks. Any implementations for lensfree microscopy, the angular spectrum method, or diverging beams?
From my python code and results, what would you suggest I try? Looking at the images, they do not look 'reconstructed' ... do you agree? My experimental set-up is such that the object is around 2mm (Z2) from my CMOS sensor and the light source is around 60mm away (Z1), therefore the condition Z1/Z2 >> 1 is met.
Many Thanks.
Regards, Tom
Hi Tom,
@vnmanoharan is right. A few years ago I wrote some code to implement a reconstruction for a lensless holographic microscope using one on-axis point source as the illumination. It follows the algorithm described here http://link.springer.com/chapter/10.1007%2F978-3-642-15813-1_1 . I will track down my code, make sure I can still run it, and share it with you (hopefully within a week). I was not planning on adding it to v3.0 because the code is messy and I think the implementation is quite slow.
As to the code you shared in your original post, I would not expect the propagate function in holopy to give reasonable reconstructions of holograms when a diverging reference wave was used. The diffraction integral that must be evaluated to perform a reconstruction is significantly different if the reference wave is diverging.
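One standard bridge between the two cases, valid only in the paraxial regime, is the Fresnel scaling theorem: a point-source hologram is equivalent to a plane-wave hologram of a magnified object at a shorter effective distance. A small sketch (the function name and the mapping of the 60 mm / 2 mm geometry onto z1 and z2 are my own illustration, not something from this thread's code):

```python
def point_source_equivalent(z1, z2):
    """Fresnel scaling theorem (paraxial regime): a hologram recorded
    with a point source a distance z1 before the object, and the
    object a distance z2 before the detector, is equivalent to a
    plane-wave hologram of the object magnified by M, reconstructed
    at the shorter effective distance z_eff."""
    M = (z1 + z2) / z1
    z_eff = z1 * z2 / (z1 + z2)  # equals z2 / M
    return M, z_eff

# With a source ~60 mm and an object ~2 mm from the sensor, the
# source-to-object distance is z1 = 58 mm and z2 = 2 mm:
M, z_eff = point_source_equivalent(58.0, 2.0)
# M comes out close to 1 and z_eff close to z2, i.e. this geometry
# is already near the plane-wave limit.
```

This does not replace the full Kreuzer-style reconstruction, but it suggests that for Z1/Z2 >> 1 the diverging-beam correction is small.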
Just so you know, there may be commercial software for performing hologram reconstructions with diverging beams. There was a company called Resolution Optics that sold point-source holographic microscopes and reconstruction software. I believe the company was a spin-off of the Kreuzer Lab, which developed the algorithm in the paper I linked above. I just searched for Resolution Optics online and it seems their products are now sold by the company 4-Deep: http://4-deep.com
Best, Aaron
@tomdertech The v3 release will not contain any improvements to the reconstruction code, sorry. Its main new capability is a Bayesian inference framework for estimating parameter values from holograms. The major version bump is because HoloPy is switching from python 2 to python 3, moving to a new underlying array library called xarray, and changing some interfaces to benefit from that.
@agoldfain your code might be useful in whatever state ;-) I am going to check out the paper you referenced. 4deep looks very interesting. Keep me posted on what you find. @tdimiduk thank you for the heads up :-)
I have been experimenting with using the holopy library to reconstruct images captured using a "lensfree" microscope arrangement. In the simplest arrangement I record a single image from one light point (axis-normal) and wish to reconstruct at different Z depths. Ultimately, I will want to combine images from multiple light points to create a high resolution hologram for reconstruction.
From looking at the holopy code, most of the steps appear to be present to perform this back propagation to the object plane. Prof. Manoharan raised the point that there is currently no model for diverging incident beams.
Here is a link https://www.dropbox.com/s/ys9dyuyiii2hp1a/holopy.html?dl=0 to my work so far. My observation is that the `depth` argument in the `propagate` method seems to have to be very large, i.e. tens of metres (as you can see from my code). Increasing/decreasing these values somewhat changes the reconstructed image, but I would say it is more of a "filtering" effect than a reconstruction. Furthermore, it seems to be cyclic; that is to say, going from, say, 10 to 20 produces the same images as going from 20 to 30.

I started by wanting to use the 'angular spectrum method' or the 'Rayleigh-Sommerfeld' method to perform the back propagation, but thought that your transfer function might be suitable, even though I do not fully understand the theory.
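One possible contributor to the cyclic effect described above: with Fourier-space (transfer-function) propagation, the sampled phase of the transfer function aliases once the propagation distance exceeds roughly N*dx^2/lambda. This is a common rule of thumb rather than a diagnosis of this specific data, and the numbers below are hypothetical:

```python
def z_max_estimate(n_pixels, spacing, wavelen):
    """Rough upper bound on the propagation distance before the
    sampled transfer-function phase aliases between neighbouring
    frequency samples (all lengths in the same unit)."""
    return n_pixels * spacing ** 2 / wavelen

# e.g. a hypothetical 1024-pixel sensor, 5.3 um pitch, 660 nm light:
z_max = z_max_estimate(1024, 5.3, 0.66)  # ~4.4e4 um, i.e. a few cm
```

Beyond this bound the reconstruction can degrade in ways that look like repeating or "filtering" artifacts rather than focusing.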
I would be interested to hear your thoughts and comments regarding my work and how I might modify holopy to perform my required reconstruction. I am happy to contribute to this code and/or beta test anything related to my problem.
Regards, Tom