microscope-cockpit / cockpit

Cockpit is a microscope graphical user interface. It is a flexible and easy to extend platform aimed at life scientists using bespoke microscopes.
https://microscope-cockpit.org
GNU General Public License v3.0

Need an experiment that will interleave different channels on one camera. #686

Open iandobbie opened 4 years ago

iandobbie commented 4 years ago

The Zaber system has multiple LEDs, so to get multi-channel images we would use a single multi-band filter cube and flash individual LEDs for each acquisition. This needs an experiment which will multiplex multiple channels on a single camera: exposure one uses LED1, take an image on the camera, exposure two uses LED2, then move Z, exposure three is LED1 again, etc.

iandobbie commented 3 years ago

I suggest we have a switch on the experiment interface to interleave channels. The experiment can then take a list of lights and generate an action table with camera triggers and a round-robin list of light triggers, with Z moves at the relevant places, then store the data in a file with the relevant channels/Z set.
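The round-robin scheme described above could be sketched as follows. This is purely illustrative, assuming a simple (time, device, action) action-table format; the names are hypothetical and not the actual Cockpit API:

```python
def build_interleaved_table(lights, exposure_ms, z_positions, z_settle_ms=5.0):
    """Round-robin the given lights against a single camera at each Z
    position, returning a list of (time_ms, device, action) tuples."""
    table = []
    t = 0.0
    for z in z_positions:
        table.append((t, "stage", ("move_z", z)))
        t += z_settle_ms
        for light in lights:  # round-robin over the light sources
            table.append((t, "camera", "trigger"))
            table.append((t, light, ("pulse", exposure_ms)))
            t += exposure_ms
    return table

# Two LEDs, 10 ms exposures, three Z positions.
table = build_interleaved_table(["LED1", "LED2"], 10.0, [0.0, 0.5, 1.0])
```

Each Z step contributes one stage move plus a camera-trigger/light-pulse pair per channel, so the channels end up interleaved within each Z slice rather than acquired as separate stacks.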

iandobbie commented 3 years ago

I have started to implement this in the 1cam-multi-channel branch. So far it generates light pulses of the correct size.

Outstanding issues

  1. Method to enable multi-channel on a single camera in the experiment dialogue
  2. Doesn't take account of the actual camera exposure time. Need to trigger the light for the correct time but expect the camera exposure to stay equal to the maximum exposure time
  3. Deal with storing the data in a file with the correct metadata

I'm sure there will be other issues.

iandobbie commented 3 years ago

I have dealt with part of point 2: I don't take a second image until after the maximum camera exposure time, so we can leave the camera exposure fixed and this should cope with readout time. I noticed a couple of additional factors that need to be taken care of.
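The timing rule above (pulse each light for its own duration, but space camera triggers by the longest exposure plus readout so the camera's exposure setting can stay fixed) can be sketched like this. All names and the readout model are illustrative assumptions, not Cockpit code:

```python
def frame_times(light_exposures_ms, readout_ms):
    """Yield (trigger_time_ms, light, pulse_ms) with a fixed frame
    interval equal to the longest light exposure plus camera readout,
    so the camera exposure stays set to the maximum throughout."""
    frame_interval = max(light_exposures_ms.values()) + readout_ms
    t = 0.0
    for light, pulse_ms in light_exposures_ms.items():
        yield (t, light, pulse_ms)
        t += frame_interval

# A short and a long exposure: both frames are spaced 30 ms apart.
events = list(frame_times({"LED1": 5.0, "LED2": 20.0}, readout_ms=10.0))
```

The point is that the shorter LED1 pulse still occupies a full frame slot, which wastes a little time but avoids reprogramming the camera exposure between channels.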

  1. What do you do if you want multiple channels spread across several cameras?
  2. No option for ordering the lights; they get a random order, or maybe they are forced to be in wavelength order.

iandobbie commented 3 years ago

Latest version of my branch appears to do all the action table stuff. I can't actually run an experiment here, so I will test on some real hardware and then also do the proper image-saving fixes.

iandobbie commented 3 years ago

I am now working on the metadata for the file. The emission wavelengths need to be encoded somewhere, which currently doesn't exist, and then this info needs to be passed to the data saver. Possibly the multi-bandpass filters return the next band at a longer wavelength than the excitation wavelength, although this ignores dyes with a long Stokes shift.

iandobbie commented 3 years ago

Latest push has metadata, but it is not quite correct yet. There are two outstanding issues as far as I can see:

1) The excitation wavelength is always the longest; it needs to be the excitation wavelength used for that image, so it needs to be passed from the experiment module to the data saver.

2) There is no current mapping between the excitation wavelength and the emission band of a multi-bandpass emission filter. This needs to be in a config file, I think.

iandobbie commented 3 years ago

Sorted the excitation wavelength; I just need to map the emission wavelength somewhere and then pass that.

Current suggestion is that we define it in the camera config.

iandobbie commented 3 years ago

The mapping in a camera section of the depot file is defined as `em-map: 488 : 525 561 : 580`.

This means that 488 excitation is mapped to 525 emission, and 561 ex is mapped to 580 em, in the interleaved multi-channel mode.
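A minimal sketch of parsing such an `em-map` value into an excitation-to-emission lookup could look like this. It is an illustration only, not the actual Cockpit parser (which, as noted later in this thread, needs to be made more robust):

```python
def parse_em_map(value):
    """Parse an em-map value like "488 : 525 561 : 580" into a dict
    of {excitation_nm: emission_nm}."""
    # Treat colons as separators so "488:525" and "488 : 525" both work.
    tokens = value.replace(":", " ").split()
    if len(tokens) % 2:
        raise ValueError("em-map needs excitation/emission pairs: %r" % value)
    nums = [int(tok) for tok in tokens]
    # Pair up alternating tokens: even indices are excitation, odd emission.
    return dict(zip(nums[::2], nums[1::2]))

mapping = parse_em_map("488 : 525 561 : 580")
```

One weakness of this whitespace-and-colon format is that a missing number silently shifts every later pair, which is presumably part of why the real parsing was flagged as fragile.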

This appears to work on simulated devices but crashed on the Zaber today. I think this might be a Ximea camera config issue. I will confirm tomorrow.

iandobbie commented 3 years ago

This works on the Zaber, together with the digital Z-stack edits (#691) needed to get the Z stack working on the Zaber. I will test on Danny's system on Friday and pull if it works. The complete edits are isolated in

https://github.com/iandobbie/cockpit/commits/interleave-multichan

iandobbie commented 3 years ago

Pulled and tested on Danny's Aurox system, so I am happy it works and is generalisable.

One issue is that the parsing of the em-map config is not robust and should probably be improved. I will open a separate issue around this.

carandraug commented 3 years ago

I see that this adds an "Interleave all channels on one Camera" checkbox to all experiments, and the base Experiment class now takes a new interleave parameter. However, it seems that most of the actual logic is only implemented on the zStack experiment. Is that correct? If so, should this not either be done in a central place, or maybe the UI changed so that the checkbox is only available for the z stack experiment?

iandobbie commented 2 years ago

I am trying to merge this. I can move the GUI control onto the zStack experiment, but I see no reason why other experiments might not want to use this, and most of the useful experiments are Z stacks underneath anyway, e.g. a plain time lapse.

iandobbie commented 2 years ago

Having looked into this, there is no customization in the generic Z stack, so this would add considerable complication for little benefit, IMHO.

I propose to pull my 1cam-multi-channel branch into master to implement this functionality. One issue is that it changes the default configs, as it adds the interleave experiment parameter, so the config needs to be reset once the software is upgraded.

juliomateoslangerak commented 1 year ago

Hi @iandobbie, does this interleaved version handle a mixed mode? That is, two cameras, one of which has to take two channels. I have a use case for this: I have two cameras separated by a long pass at some 550 nm. I need to take GFP-green on one camera and, on the other one, using a dual band, red and far-red.

iandobbie commented 1 year ago

I haven't thought about this use case, so I suspect not. I don't think it would be hard to add, but it would need some thought and some testing. The critical thing is to match lasers with cameras and emission wavelengths. If you are willing to have the images taken sequentially, then I think all you would need would be some logic to ensure the right camera is triggered with the relevant light source. If you want the two cameras to trigger simultaneously for one image and then just one camera for the second image set, then it would need more thought and code logic.

juliomateoslangerak commented 1 year ago

I would need sequential acquisition in order to minimize crosstalk; at least on my equipment it is not worth doing simultaneous acquisition. What might be interesting is to factor in acquisition order: you might want to interleave the cameras, that is, red_on_cam2, then green_on_cam1, then farRed_on_cam2. The idea is to optimize the timing by giving cam2 time to read out its image while an image is being taken on cam1. I will dive into this if the application comes to need it.
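The camera-interleaving idea above (alternate cameras between consecutive frames so one can read out while the other exposes) amounts to round-robin scheduling over per-camera channel queues. A hypothetical sketch, with illustrative channel and camera names taken from the example:

```python
from collections import defaultdict, deque

def interleave_by_camera(channels):
    """channels: list of (channel, camera) pairs. Return an acquisition
    order that round-robins across cameras, so consecutive frames use
    different cameras wherever possible."""
    queues = defaultdict(deque)
    for chan, cam in channels:
        queues[cam].append(chan)
    order = []
    cams = deque(queues)  # cameras in first-seen order
    while cams:
        cam = cams.popleft()
        order.append((queues[cam].popleft(), cam))
        if queues[cam]:
            cams.append(cam)  # camera still has channels queued
    return order

# Three channels across two cameras, as in the example above.
plan = interleave_by_camera(
    [("red", "cam2"), ("farRed", "cam2"), ("green", "cam1")])
```

With this input the two cam2 channels end up separated by the cam1 frame, reproducing the red_on_cam2, green_on_cam1, farRed_on_cam2 ordering suggested above.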

iandobbie commented 1 year ago

I agree interleaving the cameras seems like an obvious win. I don't think much additional work would be needed, just some way to map the illumination and emission colours to a specific camera. I already have a list which maps excitation to emission for use with multiband filter sets. I seem to remember there was a way to configure the acquisition order too, but I could be misremembering. If I get a chance I'll have a look.

iandobbie commented 1 year ago

This code works and has been used on the Zaber, and I believe on the Aurox system, so it seems silly to let it atrophy. I am keen to merge this into the main branch. I guess I need Tom or Julio to demonstrate that the code doesn't break an existing multi-camera system.

We could then work on Julio's suggestion above to allow the mixed mode, e.g. three channels across two cameras.

iandobbie commented 1 year ago

I pulled it up to date with master, and the only issue was some documentation edits which had obviously been applied differently in the two branches.