csn-le / wave_clus

A fast and unsupervised algorithm for spike detection and sorting using wavelets and super-paramagnetic clustering

Subscripted Assignment Dimension Mismatch (error in line 369) #198

Closed sebastianmaruri closed 2 years ago

sebastianmaruri commented 2 years ago

When trying to open a .mat file containing micro electrode recording data using wave_clus I sometimes get the following error message:

Subscripted assignment dimension mismatch.

Error in wave_clus>load_data_button_Callback (line 369) clustering_results(:,2) = classes'; % GUI classes

Error in gui_mainfcn (line 95) feval(varargin{:});

Error in wave_clus (line 63) gui_mainfcn(gui_State, varargin{:});

Error while evaluating UIControl Callback

Does anyone know what might be missing from my file or any other way to fix this? Thanks a lot.

ferchaure commented 2 years ago

That looks like the algorithm isn't finding valid clusters. Can you share the data with me? Maybe after detecting spikes with Get_spikes().

sebastianmaruri commented 2 years ago

Sorry for taking so long. Here is one of the .mat files that caused that error message, along with its detected spikes (using Get_spikes()): micro EEG_sorted_PC_links_2_CRAW_01___Medial.mat_channel21_depth =+2.zip Thanks for your help.

ferchaure commented 2 years ago

Hi Sebastian, you have just 18 spikes; for that reason the algorithm doesn't work. If you try using the batch file Do_clustering you will get a clearer error message.

sebastianmaruri commented 2 years ago

Hi Fer. I get that, but I thought the toolbox was unable to open the file only if it had fewer than 15 spikes (which usually triggers the error at line 230). Is there any difference between the two errors (at line 369 and line 230)? Or would the solution be lowering the detection threshold here as well?

ferchaure commented 2 years ago

They are the same error. A quick fix could be changing that 15 to something like 50. Sometimes SPC can handle small numbers of spikes, but the results won't be great.

Lowering the detection threshold isn't always a good idea... at some point you are just detecting peaks in the background noise and calling them spikes. If the channel has that few spikes, don't use it.
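The guard Fer describes (skip a channel rather than cluster too few spikes) can be sketched like this. This is not the actual wave_clus MATLAB code; the constant and function name are hypothetical, with the 15 → 50 value from the discussion above:

```python
# Hypothetical sketch of a minimum-spike-count guard before clustering.
# wave_clus does a similar check in MATLAB; names here are illustrative.
MIN_SPIKES_FOR_CLUSTERING = 50  # raised from the default of 15, per the thread

def safe_to_cluster(spike_waveforms):
    """Return True only when there are enough detected spikes for SPC
    to produce meaningful clusters; below this, skip the channel
    instead of lowering the detection threshold."""
    return len(spike_waveforms) >= MIN_SPIKES_FOR_CLUSTERING

# A channel like the one attached above, with only 18 spikes, is skipped:
few = [[0.0] * 64 for _ in range(18)]
enough = [[0.0] * 64 for _ in range(80)]
```

With 18 spikes `safe_to_cluster(few)` is False, so the channel is dropped rather than sorted.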

sebastianmaruri commented 2 years ago

Great, thanks for the advice. Do you think a detection threshold of 4 standard deviations is still acceptable, or should I always leave it at 5?

ferchaure commented 2 years ago

4.5 is an acceptable value if you want to see enough multiunit activity or if the recording has low noise. If you only care about nice single units, choose 5.

sebastianmaruri commented 2 years ago

Thanks a lot. Regarding cluster size, you suggested that any cluster with fewer than 50 spikes is no longer acceptable, right?

ferchaure commented 2 years ago

Clustering a recording with only 50 spikes in total is kind of useless. But 50 spikes in a single cluster after clustering could be a sparse neuron in a short recording.

sebastianmaruri commented 2 years ago

This has been really helpful, thank you. I have one final question: are these numbers (detection threshold and spike count) based on the literature or on your experience with spike sorting?

ferchaure commented 2 years ago

Detection threshold: those are the typical values in the literature, keeping in mind that the threshold is relative to a noise estimate based on the MAD (median absolute deviation), not the standard deviation.

Spike count: experience, I guess; the exact number can change with your objective and the length of your recording.
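For reference, the MAD-based noise estimate mentioned above is the one from the wave_clus paper (Quiroga et al. 2004): the noise standard deviation is estimated as median(|x|)/0.6745 on the bandpass-filtered signal, and the threshold is 4-5 times that. A minimal Python sketch (not the wave_clus code itself, just the standard estimator):

```python
import statistics

def mad_noise_std(signal):
    """Estimate the noise standard deviation from the median absolute
    value of the (bandpass-filtered) signal. 0.6745 is the 75th
    percentile of the standard normal distribution, so for Gaussian
    noise median(|x|)/0.6745 is approximately sigma; unlike the plain
    standard deviation, the median is barely inflated by the spikes."""
    return statistics.median(abs(x) for x in signal) / 0.6745

def detection_threshold(signal, k=5.0):
    """Amplitude threshold at k noise-sigmas (k = 4 to 5, as discussed)."""
    return k * mad_noise_std(signal)

# Toy example: low-amplitude noise with two large "spikes" mixed in.
trace = [0.5, -0.7, 0.6, -0.4, 0.7, -0.6, 0.5, -0.5, 30.0, -28.0]
thr = detection_threshold(trace, k=5.0)
```

Because the median ignores the two large samples, the threshold lands just above the noise floor and the spikes cross it; a standard-deviation-based threshold on the same trace would be dragged up by the spikes themselves.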

sebastianmaruri commented 2 years ago

great, thanks