microscope-cockpit / cockpit

Cockpit is a microscope graphical user interface. It is a flexible and easy to extend platform aimed at life scientists using bespoke microscopes.
https://microscope-cockpit.org

Online 2D SIM reconstructions. #859

Open iandobbie opened 1 year ago

iandobbie commented 1 year ago

The video-rate 2D SIM system that Marcel built had online 2D reconstruction. The Diamond team have asked if that could be plumbed into the CryoSIM system. I think it might be possible without too many problems.

1) Add a client to the camera DataDevice stack

2) Have a separate machine receiving the data and passing it off to GPU-based fairSIM 2D processing.

3) Have a separate display for online reconstruction, or have a separate "camera" which is the SIM reconstruction output from fairSIM sent back over the standard Pyro connection (a rough sketch of this follows below).
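For option 3, one way to make the reconstruction output look like just another camera source is a small Pyro object that the fairSIM machine pushes reconstructed frames into and the Cockpit side polls. This is only a minimal sketch assuming Pyro5; the object name `fairsim.recon`, the port, and the `put_frame`/`get_latest` methods are all made up for illustration, and this is not the python-microscope Camera interface, which would need the full abc.Camera contract implemented.

```python
# Minimal sketch of a Pyro-exposed "virtual camera" that holds the latest
# fairSIM reconstruction.  Object name, port and method names are hypothetical.
import threading

import Pyro5.api


@Pyro5.api.expose
class ReconReceiver:
    """Receives reconstructed frames and hands out the most recent one."""

    def __init__(self):
        self._lock = threading.Lock()
        self._latest = None  # (shape, dtype, raw bytes)

    def put_frame(self, shape, dtype, data):
        # Called by the fairSIM host after each reconstruction.
        with self._lock:
            self._latest = (shape, dtype, data)

    def get_latest(self):
        # Polled by the Cockpit-side display / pseudo-camera.
        # Note: Pyro's default serpent serializer base64-wraps bytes, so the
        # caller may need serpent.tobytes() to recover the raw buffer.
        with self._lock:
            return self._latest


if __name__ == "__main__":
    daemon = Pyro5.api.Daemon(host="0.0.0.0", port=9123)  # arbitrary port
    uri = daemon.register(ReconReceiver(), objectId="fairsim.recon")
    print("Serving reconstructed frames at", uri)
    daemon.requestLoop()
```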

iandobbie commented 1 year ago

Wrote to Marcel (the author of the online SIM processing) and his reply was:

Concerning the live reconstruction, there are a few pointers I can give, but I think the best approach would be a call with whoever will implement the changes to the code needed to make it work in that setting.

In short, our code does both instrument control and live reconstruction. As you are running Cockpit for the first part, and our code is also very specific to our multi-color system, I would just turn that part off completely and rely on other software to run the instrument. Currently, there might not be an easy way to do this, but there will be soon (see below).

For the reconstruction, two things are needed. First, images need to come in over the network, via a transport format that is quite easy to implement. I think I once wrote a Python wrapper to send those images; I will have to see if I can find or recreate it. Then it should be easy to plug it into Cockpit / python-microscope and stream in the images over the network.
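A minimal sketch of what such a sender could look like is below. The header layout (width, height, then uint16 pixel data), the host name and the port are assumptions for illustration only; the real wire format would have to match whatever Marcel's fairSIM receiver actually expects.

```python
# Minimal sketch of pushing raw camera frames to a remote reconstruction
# host over TCP.  Header layout, host and port are hypothetical.
import socket
import struct

import numpy as np


def send_frame(sock: socket.socket, frame: np.ndarray) -> None:
    """Send one uint16 frame as a small header followed by the pixel data."""
    frame = np.ascontiguousarray(frame, dtype=np.uint16)
    height, width = frame.shape
    header = struct.pack("!II", width, height)  # network byte order
    sock.sendall(header + frame.tobytes())


if __name__ == "__main__":
    # Hypothetical address/port of the machine running the GPU reconstruction.
    with socket.create_connection(("recon-host.example", 32320)) as sock:
        test_frame = np.zeros((512, 512), dtype=np.uint16)
        send_frame(sock, test_frame)
```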

The second part needed is what we call sequence detection. Basically, as raw frames come in, the software has to assemble them (with angle and phase information) into full SIM datasets we can reconstruct. For historical reasons, on the fastSIM we did this based on timestamps, i.e. we knew (for a given exposure time) that images separated by less than a defined interval belong to a single dataset. If you have reliable metadata with your images in python-microscope, that part should be adapted to collect images based on that instead.
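To make the timestamp-based variant concrete, here is a minimal sketch of a frame grouper. The 3 angles x 3 phases set size and the 50 ms gap threshold are assumed values; a metadata-based version would key on angle/phase fields from python-microscope instead of time gaps.

```python
# Minimal sketch of "sequence detection": grouping incoming raw frames into
# complete SIM sets of n_angles * n_phases images, based on timestamp gaps.
from typing import List, Optional

import numpy as np

N_ANGLES, N_PHASES = 3, 3          # assumed 2D SIM pattern count
SET_SIZE = N_ANGLES * N_PHASES
MAX_GAP_S = 0.05                   # gaps larger than this start a new set


class SequenceDetector:
    def __init__(self) -> None:
        self._frames: List[np.ndarray] = []
        self._last_time: Optional[float] = None

    def add(self, frame: np.ndarray, timestamp: float) -> Optional[List[np.ndarray]]:
        """Add one raw frame; return a full SIM set when one is complete."""
        if self._last_time is not None and timestamp - self._last_time > MAX_GAP_S:
            # Large gap: whatever was collected is stale, start a new set.
            self._frames = []
        self._frames.append(frame)
        self._last_time = timestamp
        if len(self._frames) == SET_SIZE:
            full_set, self._frames = self._frames, []
            return full_set        # hand off to the reconstruction
        return None
```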

This is a bit of work, but I think it is quite doable for someone willing to work through the code a bit. Also, we have a (somewhat dormant) project to run live reconstruction on one of our TIRF-SIM systems, which is controlled through python-microscope. So if the Diamond project isn't urgent, most of the work you'll need for this should appear in fairSIM anyway.