I've made some progress. If you're using the ms3 pipeline with a dataset abbreviated as ds1, you can call "kron-run ms3 --prescribed_event_times=/path/to/event_times/event_times.nt01.mda ds1" for each of the raw.nt0x.mda / event_times.nt0x.mda pairs. So, if you had 4 tetrodes in a given recording session, you'd have to call kron-run separately for each of the 4 tetrodes, with each tetrode having its own dataset abbreviation (ds1, ds2, etc.). Now the question is: is this the correct way to organize everything and run the sorting? It seems like overkill to have to add an entry to datasets.txt for every tetrode from every recording session.
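For a four-tetrode session, the calls would look something like this (the paths and the ds1 through ds4 abbreviations are just placeholders for however the datasets are named in datasets.txt):

```bash
# one kron-run call per tetrode, each dataset entry pointing at its own raw.nt0x.mda
kron-run ms3 --prescribed_event_times=/path/to/event_times/event_times.nt01.mda ds1
kron-run ms3 --prescribed_event_times=/path/to/event_times/event_times.nt02.mda ds2
kron-run ms3 --prescribed_event_times=/path/to/event_times/event_times.nt03.mda ds3
kron-run ms3 --prescribed_event_times=/path/to/event_times/event_times.nt04.mda ds4
```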
"It seems like overkill to have to add an entry to datasets.txt for every tetrode from every recording session."
That is a good point. I think this is better handled with a custom pipeline rather than the kron- system. That's where we're headed anyway, since the pipelines are getting pretty complicated and no longer fit the simplicity of kron-run. The simplest way to do the sorting is to make a sequence of processor calls:
    mp-run-process mountainsort.bandpass_filter ...
    mp-run-process mountainsort.whiten ...
    mp-run-process mountainsort.mountainsort3 ...   # here's where the prescribed event times can go
    ...
From this you can build the needed script. I can follow up with more details.
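To make that a bit more concrete, here is a rough sketch for a single tetrode. Take the parameter names (timeseries, timeseries_out, firings_out, samplerate, freq_min, freq_max, detect_sign, adjacency_radius, and especially the event-times input) as assumptions to be checked against each processor's spec rather than as a verified recipe; the kron ms3 pipeline exposes the option as --prescribed_event_times, so the underlying processor should have a matching input.

```bash
#!/bin/bash
# Rough sketch only: parameter names are assumed, not verified against the processor specs.
RAW=/path/to/raw.nt01.mda              # raw timeseries for this tetrode
EVENTS=/path/to/event_times.nt01.mda   # prescribed event times for this tetrode
OUT=/path/to/output/nt01
mkdir -p "$OUT"

# 1. bandpass filter the raw data
mp-run-process mountainsort.bandpass_filter \
  --timeseries="$RAW" --timeseries_out="$OUT/filt.mda" \
  --samplerate=30000 --freq_min=300 --freq_max=6000

# 2. whiten the filtered data
mp-run-process mountainsort.whiten \
  --timeseries="$OUT/filt.mda" --timeseries_out="$OUT/pre.mda"

# 3. sort, passing in the prescribed event times
#    (the exact input name for the event times is a guess)
mp-run-process mountainsort.mountainsort3 \
  --timeseries="$OUT/pre.mda" --firings_out="$OUT/firings.mda" \
  --prescribed_event_times="$EVENTS" \
  --detect_sign=-1 --adjacency_radius=-1
```

Wrapping that in a loop over nt01, nt02, ... would avoid touching datasets.txt at all.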
I'd be interested in more info. For now, I've written a bash script that cycles through all of the raw*.mda files in a given .mda folder. I have sub-directories in the datasets directory named "channel1", "channel2", "channel3", etc., each ready to go with the appropriate params.json file. As it loops through each tetrode, it pauses while you curate the results.
    # create a raw.mda.prv pointer in each per-channel dataset directory
    cd /path/to/.mda/folder/with/snippets
    var=0
    for f in raw*.mda
    do
      var=$((var+1))
      datasetName="channel$var"
      prv-create /path/to/raw.mda/files/$f /path/to/datasets/$datasetName/raw.mda.prv
    done

    # sort each tetrode with its own prescribed event times, then view the results
    var=0
    for f in event_times*.mda
    do
      var=$((var+1))
      cd /path/to/directory/with/datasets.txt/and/pipelines.txt/etc.
      datasetAbbrev="ch$var"
      kron-run ms3 --prescribed_event_times=/path/to/raw.mda/files/$f $datasetAbbrev
      kron-view results ms3 $datasetAbbrev
    done
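A small variant I've been considering (untested sketch): pull the ntrode index out of the filename instead of relying on the loop counter, so each event_times.nt0x.mda is explicitly paired with the matching dataset even if a file is missing or the glob order changes. The ch$idx naming is just my own convention from above.

```bash
cd /path/to/.mda/folder/with/snippets
for f in event_times.nt*.mda
do
  # e.g. event_times.nt03.mda -> "03" -> 3
  idx=$(echo "$f" | sed 's/event_times\.nt\([0-9]*\)\.mda/\1/')
  idx=$((10#$idx))
  datasetAbbrev="ch$idx"
  # run kron from the directory containing datasets.txt / pipelines.txt
  (cd /path/to/directory/with/datasets.txt/and/pipelines.txt/etc. && \
    kron-run ms3 --prescribed_event_times=/path/to/raw.mda/files/$f "$datasetAbbrev" && \
    kron-view results ms3 "$datasetAbbrev")
done
```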
We're working on documentation, so when the next release comes out in the coming weeks we can revisit making a more sensible script for this type of situation.
I'm having some difficulty figuring out how to set up the pipelines.txt file with snippets. In the documentation, it says:
"Manually pass in --prescribed_event_times=/path/to/event_times after the pipeline call. Since each ntrode has its own event_times, you could do this at the command line or each ntrode could have an entry in pipelines.txt."
I've got ds1 set to a single recording session with 8 recording electrodes. The .mda folder for the dataset has the files event_times.nt01.mda, event_times.nt02.mda, ..., raw.nt01.mda, raw.nt02.mda, etc. If I understand the documentation correctly, my pipelines.txt file should then have eight separate entries, one per ntrode, each with its own --prescribed_event_times=/path/to/event_times/event_times.nt0x.mda argument.
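For concreteness, this is roughly what I'm picturing for the pipelines.txt entries; I'm guessing at the format here (an alias, the base pipeline, then extra arguments, as in the stock pipelines.txt example), so please correct me if the syntax is off:

```
ms3_nt01 ms3 --prescribed_event_times=/path/to/event_times/event_times.nt01.mda
ms3_nt02 ms3 --prescribed_event_times=/path/to/event_times/event_times.nt02.mda
...
ms3_nt08 ms3 --prescribed_event_times=/path/to/event_times/event_times.nt08.mda
```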
Let me know if you need any information on the output of trying to run the sorter.