NII-Kobayashi / GLMCC

GLMCC: The generalized linear model for spike cross-correlation (Kobayashi et al., Nature Communications, 2019)
MIT License

Setting up #1

Closed: EmiliosIsaias closed this issue 3 years ago

EmiliosIsaias commented 3 years ago

Hi! I am interested in using your method on a set of cortico-subcortical recordings. I tried running the commands on Windows 10 and everything worked until Est_Data.py, which failed with an issue in the subprocess package. I figured out that this issue is due to an incompatibility with Windows 10 and tried running it in an Ubuntu virtual machine instead, which seems to work. However, I ran into another issue, and this one concerns the amount of available data. I saw that the paper defines the minimum criteria for estimating the connectivity matrix in Table 1, but unfortunately our recordings have sparse firing and are at most 1 hour long. Additionally, I would like to split each experiment into the different treatments that I am applying, which makes the recorded activity even sparser.

Do you think that I could still use this method for my sparse data? Do you have a suggestion for running the algorithms on Windows? (I've looked into Cygwin, a Unix-like environment for Windows, but it creates many incompatibilities with my conda environments.)

Thank you very much in advance! Emilio Isaías-Camacho

r-koba84 commented 3 years ago

Thanks for your inquiry.

> Do you think that I could still use this method for my sparse data?

In general, you can use this method on sparse data, but the sparser the data, the more difficult it is to detect connectivity. Note that this limitation is independent of the method. You can also give our web application a try: https://s-shinomoto.com/CONNECT/

> Do you have a suggestion for running the algorithms on Windows?

Could you give me more detail? For example, how you run the code with conda, the versions of conda and Windows, and the error message.

Sincerely yours, Ryota Kobayashi

EmiliosIsaias commented 3 years ago

Thank you for your quick reply!

I will give it a shot in the web interface, thanks!

I am currently using: Windows 10 version 20H2, conda 4.9.2, python 3.9.2, scipy 1.6.1, numpy 1.20.1, matplotlib 3.3.4

The command that I run is `python Est_data.py simulation_data 20 sim`

And the error that I get is shown in the attached screenshot.

Best regards, Emilio Isaías-Camacho

r-koba84 commented 3 years ago

Thanks for your reply.

Please try changing the last lines in the Python code "Est_Data.py":

===== before ==================

```python
cmd = ['rm', "Jpy"+str(T)+".txt"]
proc.check_call(cmd)
```

===== after ==================

```python
import os
os.remove("Jpy"+str(T)+".txt")
```
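
For reference, a minimal cross-platform sketch of that cleanup step (the `T = 20` value is only illustrative, since in Est_Data.py `T` is defined earlier in the script, and the `FileNotFoundError` guard is an extra assumption so the step is skipped if the temporary file was never written):

```python
import os
from contextlib import suppress

T = 20  # illustrative value; in Est_Data.py, T is defined earlier in the script
tmp_file = "Jpy" + str(T) + ".txt"

# os.remove works on both Windows and Linux, unlike shelling out to `rm`.
# The suppress() guard simply skips the step if the file does not exist.
with suppress(FileNotFoundError):
    os.remove(tmp_file)
```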

I hope it works.

Best regards, Ryota Kobayashi

EmiliosIsaias commented 3 years ago

Thanks a lot for looking into this! I'll make the change and test it. Meanwhile, I've gotten my hands on a Linux system to try out the code, and it runs seamlessly on the simulated data! However, when I input my sparse data, the code exits with the error shown in the attached screenshot.

I reviewed the firing rates of the given cells, and the crash occurred in one firing at ~0.15 Hz. I've been thinking that a set of bins with 0 counts in the cross-correlograms might cause issues with the logarithm operation. What do you think?
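
For illustration (this is just my guess, not the GLMCC code itself), here is a minimal sketch of how an empty correlogram bin reproduces that math domain error, and one common way to guard against it:

```python
import math
import numpy as np

# Hypothetical cross-correlogram of a ~0.15 Hz unit: most bins are empty.
counts = np.array([0, 1, 0, 0, 2, 0, 1, 0])

try:
    log_counts = [math.log(c) for c in counts]
except ValueError as err:
    # math.log(0) raises "math domain error", matching the traceback.
    print("ValueError:", err)

# A common workaround is to floor the counts before taking the log,
# e.g. log(max(c, eps)); whether that is appropriate for GLMCC is a
# separate question.
eps = 1e-3
log_counts = [math.log(max(c, eps)) for c in counts]
```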

Best regards, Emilio Isaías-Camacho

r-koba84 commented 3 years ago

Thanks for your report.

I suggest you exclude the neurons that fire too sparsely (< 0.5 Hz). I also agree with you that each bin in the cross-correlogram should contain more than one spike.
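
A minimal sketch of such a pre-filtering step (the function name and the data layout are only assumptions for illustration, not part of GLMCC):

```python
import numpy as np

def filter_by_rate(spike_trains, recording_duration_s, min_rate_hz=0.5):
    """Keep only neurons whose mean firing rate is at least min_rate_hz.

    spike_trains: list of 1-D arrays of spike times in seconds.
    recording_duration_s: total recording length in seconds.
    """
    kept = []
    for spikes in spike_trains:
        rate = len(spikes) / recording_duration_s
        if rate >= min_rate_hz:
            kept.append(np.asarray(spikes))
    return kept

# Example: a 1-hour recording; the second (sparse, ~0.1 Hz) unit is dropped.
duration = 3600.0
trains = [np.arange(0.0, duration, 0.5),   # ~2 Hz
          np.arange(0.0, duration, 10.0)]  # ~0.1 Hz
print(len(filter_by_rate(trains, duration)))  # -> 1
```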

EmiliosIsaias commented 3 years ago

Very well, thanks for your time! I'll post here the cross-correlograms of the cells that caused the math domain error in the logarithm later on.