cortex-lab / Suite2P

Tools for processing 2P recordings

Deconvolution problems #157

Open horsto opened 5 years ago

horsto commented 5 years ago

Hello! I have a problem with the output from deconvolution (OASIS). I noticed in recent analysis output that my spike matrices are much denser than ones I analyzed about half a year ago. Since then I have updated both the OASIS MATLAB package and the suite2p code. I threshold the deconvolution output at mean + 2 * standard deviation for every cell, which used to give stable results across sessions and animals, i.e. results that look reasonable by eye. Now the same threshold no longer filters the output adequately. Below I show an example of 4 different sessions, analyzed with the old and new code.

The problem is that even if I raise the spike threshold, I cannot clean up the deconvolution results without also discarding spikes that belong to obvious transients.
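The per-cell thresholding described above (mean + 2 * SD of the deconvolved trace) can be sketched as follows. This is an illustration of the reporter's own post-processing, not part of suite2p; the function name and the factor of 2 are taken from the description above.

```python
import numpy as np

def threshold_spikes(spks, n_sd=2.0):
    """Zero out deconvolved amplitudes below mean + n_sd * std, per cell.

    spks : array of shape (n_cells, n_timepoints), deconvolution output.
    Returns an array of the same shape with sub-threshold entries set to 0.
    """
    thr = spks.mean(axis=1, keepdims=True) + n_sd * spks.std(axis=1, keepdims=True)
    return np.where(spks >= thr, spks, 0.0)
```

Note that this threshold is sensitive to the overall density of the deconvolved trace: if the output becomes denser (as reported here), the per-cell mean and SD rise, and the same `n_sd` no longer separates transients from background in the same way.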

I tried reverting the OASIS package to the version I used ~5 months ago. With all suite2p parameters identical (no differences in `ops`), differences are still visible (OLD: old analysis result; NEW: the same cell, analyzed with the old OASIS package and up-to-date suite2p code). There are also apparent differences in the neuropil-corrected traces in the examples above. I also tried the suite2p Python version, which, with the thresholds I usually use, gives good results! Since my pipeline still depends on the MATLAB version for now, I would appreciate any input on what might be going on.

marius10p commented 5 years ago

The last change to spike deconvolution I remember was more than 5 months ago, and it was a change to the running baseline correction. If it's not that, have you checked that your timescale and sampling rate are the same? By the way, the new version looks better to me; the old one seems to have been losing some smaller transients, which is exactly what you'd expect if the baselining was not good.
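For context, a running baseline correction of the kind discussed here can be sketched as a "maximin" filter: smooth the trace, take a rolling minimum, then a rolling maximum, and subtract the result. This is only a generic sketch; the window and smoothing values below are placeholder assumptions, not suite2p's actual parameters.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d, minimum_filter1d, maximum_filter1d

def running_baseline_correct(F, fs, win_s=60.0, sig_s=10.0):
    """Subtract a maximin-style running baseline from fluorescence traces.

    F      : (n_cells, n_timepoints) fluorescence.
    fs     : sampling rate in frames/s.
    win_s  : rolling min/max window in seconds (placeholder value).
    sig_s  : Gaussian smoothing width in seconds (placeholder value).
    """
    win = int(win_s * fs)
    Flow = gaussian_filter1d(F, sig_s * fs, axis=1)  # smooth out transients
    Flow = minimum_filter1d(Flow, win, axis=1)       # track the lower envelope
    Flow = maximum_filter1d(Flow, win, axis=1)       # avoid biting into dips
    return F - Flow
```

A baseline that tracks too slowly (or with the wrong window) will leave small transients sitting near the noise floor, which matches the symptom described above of the old code losing smaller transients.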

The Python version also has running baseline correction, but some of the parameters may be treated differently; for example, the sampling rate is per plane, not cumulative. Other than that, the baselining might be slightly different; I could double-check. It would be good to figure this out, as we'd like the Python and MATLAB versions to be as similar as possible.
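The per-plane vs. cumulative distinction matters because the deconvolution timescale depends on the frame rate each cell actually experiences. A minimal illustration, with made-up numbers (the variable names and the 30 Hz / 3-plane setup are assumptions, not anything from this thread):

```python
import numpy as np

total_scan_rate = 30.0                      # frames/s across all planes (assumption)
nplanes = 3
fs_per_plane = total_scan_rate / nplanes    # each cell is sampled at 10 frames/s

tau = 1.0                                   # indicator decay time in seconds (assumption)
# The per-frame decay of an exponential kernel depends on the per-plane rate;
# using the cumulative rate here would make the kernel decay too slowly.
g = np.exp(-1.0 / (tau * fs_per_plane))
```

If one pipeline interprets the configured rate as cumulative and the other as per-plane, the effective kernel differs by a factor of `nplanes`, which alone can change the density of the deconvolved output.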

horsto commented 5 years ago

Thanks for the fast reply! Timescale / sampling rate are the same, yes. I will do some more tests, hope to get back to you soon.

horsto commented 5 years ago

By the way, I get quite a few warnings during baseline / coefficient estimation: `Warning: Matrix is singular, close to singular or badly scaled. Results may be inaccurate. RCOND = NaN.` That is in https://github.com/cortex-lab/Suite2P/blob/73456d2c2112a2a4ccf056de39098f31a3c161a7/SpikeDetection/wrapperDECONV.m#L79

(This has not changed; it has happened ever since I started using suite2p.)
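`RCOND = NaN` in MATLAB typically means the system being solved is rank-deficient or contains NaNs. A hedged sketch of how one might detect this condition when regressing a cell trace against its neuropil trace; this is illustrative only, and the actual design matrix in `wrapperDECONV.m` may differ:

```python
import numpy as np

def fit_neuropil_coeff(F, Fneu, cond_max=1e10):
    """Regress F on [Fneu, 1] and flag ill-conditioned fits.

    Returns [slope, offset], or None when the system is singular or
    contains non-finite values (the cases MATLAB's RCOND warning flags).
    F, Fneu : 1-D arrays of equal length (cell and neuropil traces).
    """
    A = np.column_stack([Fneu, np.ones_like(Fneu)])
    if not np.all(np.isfinite(A)) or np.linalg.cond(A) > cond_max:
        return None  # e.g. a flat or NaN neuropil trace gives a singular system
    coef, _, _, _ = np.linalg.lstsq(A, F, rcond=None)
    return coef
```

Cells whose regression is flagged this way would receive unreliable coefficients, consistent with the suggestion below that some neurons may be getting poor estimates.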

marius10p commented 5 years ago

Sorry, I didn't see this last one. Could you double-check your neuropil subtraction coefficients? We have not yet implemented an adaptive per-neuron coefficient in the Python version (it's fixed at 0.7). The warnings could mean that some neurons get poor coefficient estimates.
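The fixed-coefficient scheme described above amounts to a one-line correction; the 0.7 value comes from the comment above, but the function name is illustrative:

```python
import numpy as np

def neuropil_correct(F, Fneu, coeff=0.7):
    """Fixed-coefficient neuropil subtraction (per-neuron adaptation not used).

    F, Fneu : arrays of the same shape (cell and neuropil fluorescence).
    """
    return F - coeff * Fneu
```

With an adaptive scheme, a badly conditioned per-neuron fit can produce an extreme coefficient and distort the corrected trace, whereas the fixed 0.7 is stable but less precise. That difference alone could explain some of the divergence in the neuropil-corrected traces reported earlier in this thread.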