Closed: MBLH closed this issue 8 years ago.
Question about" set-parameters.m" n1 The thresholds (stdmin, stdmax) are proportional to the standard deviation of the noise. (check Unsupervised spike detection and sorting with wavelets and superparamagnetic clustering. R. Quian Quiroga, Z. Nadasdy and Y. Ben-Shaul. Neural Computation)
Question about" set-parameters.m" n2 You are right. Only in some rare cases the use of more inputs could help to discriminate clusters with very similar waveforms.
Question about" set-parameters.m" n3 In the practice, any clustering algorithm needs at least some elements to find a real pattern in the distributions. You can use Wave_clus with only 80 spikes but the results will be better if you have 2000 spikes. The script "Do_clustering" checks that and doesn't process the file if the number of spikes is below a value defined inside that script. In another hand, par.min_clus defines the minimum size of a clusters to accept it as a neuron, the cluster with a smaller size will be considerate as a spurious cluster. If min_clus is too big you could eliminate clusters generated by sparse neurons but if the value is too small a lot of spurious clusters will be taken as neurons.
Question about GUI n1
The plot is fine; if you want to see bigger projections, you can maximise the window (to help a little, the next update will reduce the gap between subplots). In any case, that plot isn't very useful when you are working in ten dimensions. The "zoom in" works: you should select an area inside one of the subplots.
Question about Output file of Get_spikes_pol(1), "spikes".
For now, in the case of polytrodes, we consider a spike to be the concatenation of the spikes in all the channels. 30 data points per channel and 4 channels make a "concatenated spike" of 120 data points. About the units: if we read the raw file, the amplitude is in microvolts; otherwise we don't change the scale of the data.
Question about Output file of Do_clustering_pol(1), "ipermut". Wave_clus randomly chooses which spikes to use for clustering (all of them if you have fewer than par.max_spk). The indices of the chosen elements are in "ipermut".
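As an illustrative sketch of that subsampling (hypothetical Python, not the actual wave_clus code, and using 0-based indices unlike MATLAB's "ipermut"):

```python
import random

def choose_spikes(n_spikes, max_spk, seed=None):
    # If there are more spikes than par.max_spk, cluster a random
    # subset; the returned indices play the role of 'ipermut'.
    # With fewer spikes than max_spk, all spikes are used.
    if n_spikes <= max_spk:
        return list(range(n_spikes))
    return random.Random(seed).sample(range(n_spikes), max_spk)
```

The names `choose_spikes` and `seed` are assumptions for this sketch; wave_clus itself builds a random permutation in MATLAB.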
Question about Output file of Do_clustering_pol(1), "inspk". The variable "inspk" contains the features of the spike shapes used for clustering. There are 40 of them because you have 4 channels and par.dim = 10.
Question about" set-parameters.m" n1 OK, on your power point used in some talk, the definition of threshold is described as T=4 x median { |X|/0.6745} maybe X;probability deviation? "median{ }" means "median of { }"? So this T value takes in between the stdmin, stdmax, you mean? Thank you, I will read the paper you recommend.
Question about" set-parameters.m" n2 OK.
Question about" set-parameters.m" n3 Thank you, I understand it.
Question about GUI n1 After I enlarged the window, the view improved, thank you. And the reason the cluster I magnified apparently disappeared was that I had clicked on empty space in the plot; my mistake, sorry.
"that plot isn't very useful when you are working in ten dimensions. " means that Wave_clus calculates to segregate clusters in 10 dimensions anyway, but to human, it's difficult to recognize with natural sense (because it's not 3 dimensions) so it's useless to check it?
Question about Output file of Get_spikes_pol(1), "spikes". So you don't change the scale of the data? OK. My original raw data was saved in microvolts, but I converted it to volts before loading it into wave_clus, so maybe the unit shown here is volts.
You wrote: "30 data points per channel and 4 channels make a 'concatenated spike' of 120 data points." So I am trying to figure out how the data is stored. One possibility I can imagine, at 10000 Hz, is an interleaved arrangement:
spike(1,1) = spike#1, electrode#1, timepoint#1 (t = 0 s; column 1 = 4x0 + 1)
spike(1,2) = spike#1, electrode#2, timepoint#1 (t = 0 s)
spike(1,3) = spike#1, electrode#3, timepoint#1 (t = 0 s)
spike(1,4) = spike#1, electrode#4, timepoint#1 (t = 0 s)
...
spike(8,59) = spike#8, electrode#3, timepoint#15 (t = 14 x 0.0001 s = 0.0014 s after the start of the spike window; column 59 = 4x14 + 3)
spike(8,60) = spike#8, electrode#4, timepoint#15 (0.0014 s)
spike(8,61) = spike#8, electrode#1, timepoint#16 (0.0015 s)
spike(8,62) = spike#8, electrode#2, timepoint#16 (0.0015 s)
Is that correct, or do you use a different data arrangement in the matrix?
Question about Output file of Do_clustering_pol(1), "ipermut". OK, I understand it.
Question about Output file of Do_clustering_pol(1), "inspk". Thank you, now I see what it means.
Question about" set-parameters.m" n1 In the equation: T=4 x median { |X|/0.6745}
Question about GUI n1 Yes, exactly. It could be useful in odd cases, maybe if some bug appears.
Question about Output file of Get_spikes_pol(1), "spikes". You are right about the scales.
The data arrangement for the "concatenated spikes" is:
spikes(1,1:30) = spike#1, electrode#1 (30 data points)
spikes(1,31:60) = spike#1, electrode#2 (30 data points)
...
spikes(8,61:90) = spike#8, electrode#3 (30 data points)
spikes(8,91:120) = spike#8, electrode#4 (30 data points)
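That block layout can be sketched with a small hypothetical helper (Python, 0-based rows but keeping the 1-based channel numbering used above):

```python
def channel_waveform(spike_row, ch, samples_per_ch=30):
    # Extract one electrode's waveform from a concatenated spike row
    # laid out as [ch1 | ch2 | ch3 | ch4], each block 30 samples.
    start = (ch - 1) * samples_per_ch  # ch is 1-based
    return spike_row[start:start + samples_per_ch]
```

For example, `channel_waveform(row, 3)` returns the samples that sit in MATLAB columns 61:90 of that row.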
Thank you, I understand most of it. One last question:
Question about Output file of Get_spikes_pol(1), "spikes".
In my case, w_pre = 10 datapoints and w_post = 20 datapoints (sampling rate is 10 kHz), so the total is 30 data points per electrode. Which data point corresponds to the "spike" time stamp stored in "index": the 10th time point, or the 11th?
Question about Output file of Get_spikes_pol(1), "spikes". The time stamp corresponds to the 10th point of the spike. But if par.interpolation = 'y' the waveform could be shifted by half a sample.
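To illustrate that alignment (a hypothetical helper, assuming 1-based sample indices and no interpolation):

```python
def spike_window(peak_index, w_pre=10, w_post=20):
    # Sample indices stored for one spike: w_pre points up to and
    # including the peak (the time stamp in 'index'), then w_post
    # points after it -- 30 samples in total with these defaults.
    return list(range(peak_index - w_pre + 1, peak_index + w_post + 1))
```

So the peak sample is the w_pre-th (here 10th) of the stored points, matching the answer above.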
OK, I may ask again if I have more questions in the future, but for now I think I have received all the answers. Thank you very much!
Hi, I am working with wave_clus. I am trying to load my data (a .mat file) using ASCII, and I am facing the following error:
error: at line 284 of 'D:\MONTECAR\SPC\aux1.c': N<=0
Attempted to access ylimit(:,1); index out of bounds because size(ylimit)=[0,0].
Error in plot_spikes (line 357) ymin = min(ylimit(:,1));
Error in wave_clus>load_data_button_Callback (line 786) plot_spikes(handles);
Error in gui_mainfcn (line 95) feval(varargin{:});
Error in wave_clus (line 61) gui_mainfcn(gui_State, varargin{:});
Error while evaluating UIControl Callback
Please guide me. Thank you. Regards
Please @manisankar114, download the latest code from: https://github.com/csn-le/wave_clus
Question about" set-parameters.m" #1 line37; par.stdmin = 5; % minimum threshold for detection line38; par.stdmax = 50; % maximum threshold for detection
What is the numerical unit for those parameters above? Micro volt?
Question about" set-parameters.m" #2 line55; par.inputs = 10; % number of inputs to the clustering line56; par.scales = 4; % number of scales for the wavelet decomposition
I don't know what they are, but usually I don't need to change those values?
Question about" set-parameters.m" #3 line22; par.min_clus = 60; % minimum size of a cluster (default 60) Does it mean that Wave_Clus need at least 60 spikes for clustering?
In the manual, page 3, iv) Clustering, you say that "the optimal temperature is set as the largest temperature for which a cluster with at least min_clus members appears."
To understand this sentence, I would like to know the definition of par.min_clus precisely.
Question about GUI #1 When I click "Plot all projections", the Clusters window that appears looks a little strange, as seen in the attached file. Each cluster looks too small, so it is a little difficult to see how each cluster spreads. But if I click the magnifier icon and then click on one of the clusters, the cluster group disappears. Why does that happen?
Question about Output file of Get_spikes_pol(1), "spikes". The structure of the variable "spikes" is "X x Y double". Along the columns (from top to bottom), the individual spikes are listed one per row, right? Along the rows (from left to right), the amplitude changes around each spike are shown, maybe in volts? If not, what is the numerical unit of this information?
Further, in my case there are 120 columns. Does that mean it contains 120 data points? How is the number of data points defined? In the manual, page 2, ii) Spike detection, you say that "For each spike, w_pre datapoints before and w_post datapoints after the spike peak are stored." In my case, w_pre = 10 datapoints and w_post = 20 datapoints (sampling rate is 10 kHz), so I expected the number of spike data points to be 30, but the result was 120. Do you know why?
Then, is it possible to see which column represents the voltage corresponding to the "spike" with the time stamp?
Question about Output file of Do_clustering_pol(1), "ipermut".
What do those two-digit integers mean?
Question about Output file of Do_clustering_pol(1), "inspk". The structure of the variable "inspk" is "X x Y double", so it looks similar to "spikes", but in my case the number of columns in "inspk" is 40 whereas "spikes" has 120. What does "inspk" describe? Is it just a shortened version of "spikes" (in volts)? Or is each row a set of variables describing the features of the spike shape?
Those above are my current questions to clear up before I can use Wave_clus for our experiment. I hope I can hear from you.
Sincerely, M