Hi @holawa
1. For this kind of bug that occurs in multiprocessing (n_jobs > 1):
BrokenProcessPool: A process in the process pool was terminated abruptly while the future was running or pending.
the easiest fix is to use n_jobs=1 and get a better trace for the bug.
2. We did a recent fix in the spykingcircus code; if you install spikeinterface from source, maybe this will fix it.
3. Same as 1.
In short, could you try to run the same notebook with n_jobs=1 and repost the error here?
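For example, something like this (a minimal sketch; recording_sub, base_folder and the rest come from your notebook):

# Run single-process so errors surface as a plain traceback
# instead of a BrokenProcessPool.
job_kwargs = dict(n_jobs=1, chunk_duration="1s", progress_bar=True)
recording_saved = recording_sub.save(folder=base_folder / "preprocessed", **job_kwargs)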
Thank you! @samuelgarcia Now I have the latest version! It seems that 'NoneType' object has no attribute 'split' may be the real problem I am hitting.
print(f"SpikeInterface version: {si.__version__}")
--------------------------------------------------
SpikeInterface version: 0.100.0.dev0
I used n_jobs = 1 and got some results.
if (base_folder / "preprocessed_compressed.zarr").is_dir():
    recording_saved = si.read_zarr(base_folder / "preprocessed_compressed.zarr")
else:
    import numcodecs
    compressor = numcodecs.Blosc(cname="zstd", clevel=9, shuffle=numcodecs.Blosc.BITSHUFFLE)
    recording_saved = recording_sub.save(format="zarr", folder=base_folder / "preprocessed_compressed.zarr",
                                         compressor=compressor,
                                         **job_kwargs)
It worked, with no problem!
write_zarr_recording with n_jobs = 1 and chunk_size = 30000
write_zarr_recording: 100%
300/300 [01:16<00:00, 4.16it/s]
In this part, a new error appeared.
sorting_SC2 = si.run_sorter('spykingcircus2', recording_saved,
output_folder=base_folder / 'results_SC2',
verbose=True, job_kwargs=job_kwargs)
It returned
detect peaks using locally_exclusive with n_jobs = 1 and chunk_size = 30000
detect peaks using locally_exclusive: 100%
300/300 [00:22<00:00, 13.47it/s]
We found 126896 peaks in total
We kept 126896 peaks for clustering
extracting features with n_jobs = 1 and chunk_size = 30000
extracting features: 100%
300/300 [00:23<00:00, 12.88it/s]
We found 78 raw clusters, starting to clean with matching...
extract waveforms shared_memory multi buffer with n_jobs = 1 and chunk_size = 30000
extract waveforms shared_memory multi buffer: 100%
300/300 [00:16<00:00, 18.83it/s]
extract waveforms shared_memory multi buffer with n_jobs = 1 and chunk_size = 30000
extract waveforms shared_memory multi buffer: 100%
300/300 [00:17<00:00, 17.15it/s]
Error running spykingcircus2
---------------------------------------------------------------------------
SpikeSortingError Traceback (most recent call last)
Cell In[50], line 1
----> 1 sorting_SC2 = si.run_sorter('spykingcircus2', recording_saved,
2 output_folder=base_folder / 'results_SC2',
3 verbose=True, job_kwargs=job_kwargs)
File ~\Downloads\spikeinterface_main\src\spikeinterface\sorters\runsorter.py:148, in run_sorter(sorter_name, recording, output_folder, remove_existing_folder, delete_output_folder, verbose, raise_error, docker_image, singularity_image, delete_container_files, with_output, **sorter_params)
141 container_image = singularity_image
142 return run_sorter_container(
143 container_image=container_image,
144 mode=mode,
145 **common_kwargs,
146 )
--> 148 return run_sorter_local(**common_kwargs)
File ~\Downloads\spikeinterface_main\src\spikeinterface\sorters\runsorter.py:174, in run_sorter_local(sorter_name, recording, output_folder, remove_existing_folder, delete_output_folder, verbose, raise_error, with_output, **sorter_params)
172 SorterClass.set_params_to_folder(recording, output_folder, sorter_params, verbose)
173 SorterClass.setup_recording(recording, output_folder, verbose=verbose)
--> 174 SorterClass.run_from_folder(output_folder, raise_error, verbose)
175 if with_output:
176 sorting = SorterClass.get_result_from_folder(output_folder, register_recording=True, sorting_info=True)
File ~\Downloads\spikeinterface_main\src\spikeinterface\sorters\basesorter.py:289, in BaseSorter.run_from_folder(cls, output_folder, raise_error, verbose)
286 print(f"{sorter_name} run time {run_time:0.2f}s")
288 if has_error and raise_error:
--> 289 raise SpikeSortingError(
290 f"Spike sorting error trace:\n{log['error_trace']}\n"
291 f"Spike sorting failed. You can inspect the runtime trace in {output_folder}/spikeinterface_log.json."
292 )
294 return run_time
SpikeSortingError: Spike sorting error trace:
Traceback (most recent call last):
File "C:\Users\Melody\Downloads\spikeinterface_main\src\spikeinterface\sorters\basesorter.py", line 254, in run_from_folder
SorterClass._run_from_folder(sorter_output_folder, sorter_params, verbose)
File "C:\Users\Melody\Downloads\spikeinterface_main\src\spikeinterface\sorters\internal\spyking_circus2.py", line 131, in _run_from_folder
labels, peak_labels = find_cluster_from_peaks(
^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Melody\Downloads\spikeinterface_main\src\spikeinterface\sortingcomponents\clustering\main.py", line 42, in find_cluster_from_peaks
labels, peak_labels = method_class.main_function(recording, peaks, params)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Melody\Downloads\spikeinterface_main\src\spikeinterface\sortingcomponents\clustering\random_projections.py", line 215, in main_function
labels, peak_labels = remove_duplicates_via_matching(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Melody\Downloads\spikeinterface_main\src\spikeinterface\sortingcomponents\clustering\clustering_tools.py", line 610, in remove_duplicates_via_matching
spikes, computed = find_spikes_from_templates(
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Melody\Downloads\spikeinterface_main\src\spikeinterface\sortingcomponents\matching\main.py", line 60, in find_spikes_from_templates
spikes = processor.run()
^^^^^^^^^^^^^^^
File "C:\Users\Melody\Downloads\spikeinterface_main\src\spikeinterface\core\job_tools.py", line 359, in run
res = self.func(segment_index, frame_start, frame_stop, worker_ctx)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Melody\Downloads\spikeinterface_main\src\spikeinterface\sortingcomponents\matching\main.py", line 105, in _find_spikes_chunk
with threadpool_limits(limits=1):
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Melody\anaconda3\Lib\site-packages\threadpoolctl.py", line 171, in __init__
self._original_info = self._set_threadpool_limits()
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Melody\anaconda3\Lib\site-packages\threadpoolctl.py", line 268, in _set_threadpool_limits
modules = _ThreadpoolInfo(prefixes=self._prefixes,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Melody\anaconda3\Lib\site-packages\threadpoolctl.py", line 340, in __init__
self._load_modules()
File "C:\Users\Melody\anaconda3\Lib\site-packages\threadpoolctl.py", line 373, in _load_modules
self._find_modules_with_enum_process_module_ex()
File "C:\Users\Melody\anaconda3\Lib\site-packages\threadpoolctl.py", line 485, in _find_modules_with_enum_process_module_ex
self._make_module_from_path(filepath)
File "C:\Users\Melody\anaconda3\Lib\site-packages\threadpoolctl.py", line 515, in _make_module_from_path
module = module_class(filepath, prefix, user_api, internal_api)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Melody\anaconda3\Lib\site-packages\threadpoolctl.py", line 606, in __init__
self.version = self.get_version()
^^^^^^^^^^^^^^^^^^
File "C:\Users\Melody\anaconda3\Lib\site-packages\threadpoolctl.py", line 646, in get_version
config = get_config().split()
^^^^^^^^^^^^^^^^^^
AttributeError: 'NoneType' object has no attribute 'split'
Spike sorting failed. You can inspect the runtime trace in C:\Users\Melody\Desktop\test1\results_SC2/spikeinterface_log.json.
And I continued with tridesclous2:
sorting_TC2 = si.run_sorter('tridesclous2', recording_saved,
output_folder=base_folder / 'results_TC22',
verbose=True, job_kwargs=job_kwargs)
Here's the result:
detect peaks using locally_exclusive: 100%
300/300 [00:23<00:00, 12.46it/s]
We found 92487 peaks in total
We kept 92487 peaks for clustering
extract waveforms shared_memory mono buffer: 100%
300/300 [00:17<00:00, 17.84it/s]
pipeline: 100%
300/300 [00:22<00:00, 13.33it/s]
split_clusters with local_feature_clustering: 0%
0/49 [00:00, ?it/s]
Error running tridesclous2
---------------------------------------------------------------------------
SpikeSortingError Traceback (most recent call last)
Cell In[51], line 1
----> 1 sorting_TC2 = si.run_sorter('tridesclous2', recording_saved,
2 output_folder=base_folder / 'results_TC22',
3 verbose=True, job_kwargs=job_kwargs)
File ~\Downloads\spikeinterface_main\src\spikeinterface\sorters\runsorter.py:148, in run_sorter(sorter_name, recording, output_folder, remove_existing_folder, delete_output_folder, verbose, raise_error, docker_image, singularity_image, delete_container_files, with_output, **sorter_params)
141 container_image = singularity_image
142 return run_sorter_container(
143 container_image=container_image,
144 mode=mode,
145 **common_kwargs,
146 )
--> 148 return run_sorter_local(**common_kwargs)
File ~\Downloads\spikeinterface_main\src\spikeinterface\sorters\runsorter.py:174, in run_sorter_local(sorter_name, recording, output_folder, remove_existing_folder, delete_output_folder, verbose, raise_error, with_output, **sorter_params)
172 SorterClass.set_params_to_folder(recording, output_folder, sorter_params, verbose)
173 SorterClass.setup_recording(recording, output_folder, verbose=verbose)
--> 174 SorterClass.run_from_folder(output_folder, raise_error, verbose)
175 if with_output:
176 sorting = SorterClass.get_result_from_folder(output_folder, register_recording=True, sorting_info=True)
File ~\Downloads\spikeinterface_main\src\spikeinterface\sorters\basesorter.py:289, in BaseSorter.run_from_folder(cls, output_folder, raise_error, verbose)
286 print(f"{sorter_name} run time {run_time:0.2f}s")
288 if has_error and raise_error:
--> 289 raise SpikeSortingError(
290 f"Spike sorting error trace:\n{log['error_trace']}\n"
291 f"Spike sorting failed. You can inspect the runtime trace in {output_folder}/spikeinterface_log.json."
292 )
294 return run_time
SpikeSortingError: Spike sorting error trace:
Traceback (most recent call last):
File "C:\Users\Melody\Downloads\spikeinterface_main\src\spikeinterface\sorters\basesorter.py", line 254, in run_from_folder
SorterClass._run_from_folder(sorter_output_folder, sorter_params, verbose)
File "C:\Users\Melody\Downloads\spikeinterface_main\src\spikeinterface\sorters\internal\tridesclous2.py", line 208, in _run_from_folder
post_split_label, split_count = split_clusters(
^^^^^^^^^^^^^^^
File "C:\Users\Melody\Downloads\spikeinterface_main\src\spikeinterface\sortingcomponents\clustering\split.py", line 92, in split_clusters
is_split, local_labels, peak_indices = res.result()
^^^^^^^^^^^^
File "C:\Users\Melody\Downloads\spikeinterface_main\src\spikeinterface\core\job_tools.py", line 431, in result
return self.f(*self.args)
^^^^^^^^^^^^^^^^^^
File "C:\Users\Melody\Downloads\spikeinterface_main\src\spikeinterface\sortingcomponents\clustering\split.py", line 149, in split_function_wrapper
with threadpool_limits(limits=_ctx["max_threads_per_process"]):
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Melody\anaconda3\Lib\site-packages\threadpoolctl.py", line 171, in __init__
self._original_info = self._set_threadpool_limits()
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Melody\anaconda3\Lib\site-packages\threadpoolctl.py", line 268, in _set_threadpool_limits
modules = _ThreadpoolInfo(prefixes=self._prefixes,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Melody\anaconda3\Lib\site-packages\threadpoolctl.py", line 340, in __init__
self._load_modules()
File "C:\Users\Melody\anaconda3\Lib\site-packages\threadpoolctl.py", line 373, in _load_modules
self._find_modules_with_enum_process_module_ex()
File "C:\Users\Melody\anaconda3\Lib\site-packages\threadpoolctl.py", line 485, in _find_modules_with_enum_process_module_ex
self._make_module_from_path(filepath)
File "C:\Users\Melody\anaconda3\Lib\site-packages\threadpoolctl.py", line 515, in _make_module_from_path
module = module_class(filepath, prefix, user_api, internal_api)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Melody\anaconda3\Lib\site-packages\threadpoolctl.py", line 606, in __init__
self.version = self.get_version()
^^^^^^^^^^^^^^^^^^
File "C:\Users\Melody\anaconda3\Lib\site-packages\threadpoolctl.py", line 646, in get_version
config = get_config().split()
^^^^^^^^^^^^^^^^^^
AttributeError: 'NoneType' object has no attribute 'split'
Spike sorting failed. You can inspect the runtime trace in C:\Users\Melody\Desktop\test1\results_TC22/spikeinterface_log.json.
Thank you again!
For 1, this is good news, but it must be very slow. Maybe you could try a few values of n_jobs. How many cores do you have on your machine? For bugs 2 and 3, I think this is also because internally tdc2 or sc2 do their own multiprocessing, and sometimes that goes wrong in your case.
Thank you! @samuelgarcia
For ①, sadly, if I set any other n_jobs it goes wrong. When I set n_jobs to 2, I get this message:
write_zarr_recording with n_jobs = 2 and chunk_size = 30000
write_zarr_recording: 0%
0/300 [00:01<?, ?it/s]
---------------------------------------------------------------------------
BrokenProcessPool Traceback (most recent call last)
Almost the same as before.
For ② and ③ I have no idea, sorry. Multiprocessing seemed to run well on this computer several weeks ago, and I used to run the 0.99 tutorial with no modification. Is there a good way to solve the multiprocessing problem?
My lab computer uses an Intel i9-10900 and has 10 cores with 64GB of RAM. If I run n_cpus = os.cpu_count(), I get the number 20.
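(A minimal check that the 20 comes from hyperthreading; psutil is in the package list below:)

import os
import psutil

print(os.cpu_count())                   # 20 logical processors
print(psutil.cpu_count(logical=False))  # 10 physical cores on the i9-10900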
Actually this computer has two users; the other user (the administrator) moved to another lab, and I set up my own account (not administrator) on this computer. I set up remote connection on Win10 so I can use it wherever I am.
Everything went well until this week, when I tried to test a new sorter, 'klusta', in Docker, and got a message like no module named 'Spikeinterface'. I was shocked and ran
pip install spikeinterface[full,widgets]
pip install --upgrade spikeinterface[full,widgets]
several times and rebooted the computer. Then I found I could no longer run SpikeTutorials 0.99 well; ①②③ occurred and I felt terrible.
I haven't directly tried the dataset from the tutorial, but using some simulated data and mimicking the tutorial, I am able to get multiprocessing to work. I think @h-mayorquin's comment may be pertinent here, in that on Windows there is an inconsistent need to use the name == main guard, specifically in the SpikeInterface context. I have never needed it (thus far), but I also don't try to run full scripts (I typically take a more cell-by-cell approach).
The other possibility I can think of is that my computer isn't hyperthreaded, so it has 8 physical and 8 logical cores. I'm not sure whether multiprocessing on Windows struggles with logical cores or not. I have administrator privileges, but I don't typically run my Python as admin, so I don't think that should be the problem.
@holawa,
One question I have is how you are running the tutorial: in a Jupyter notebook, or did you try to copy the cells into a standard .py file? I'm not sure how if __name__ == '__main__' could fit into a Jupyter context, but maybe @samuelgarcia could answer that better.
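For reference, in a .py script the guard would look something like this (a minimal sketch; the path and the save call are just placeholders from this thread):

from pathlib import Path
import spikeinterface.full as si

if __name__ == '__main__':
    # On Windows, multiprocessing spawns fresh interpreters that re-import this file,
    # so any code that launches worker processes must live under this guard.
    base_folder = Path('C:/Users/Melody/Desktop/test0')  # placeholder path
    recording_saved = si.load_extractor(base_folder / "preprocessed")
    job_kwargs = dict(n_jobs=4, chunk_duration='1s', progress_bar=True)
    recording_saved.save(folder=base_folder / "preprocessed_copy", **job_kwargs)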
Thank you! @zm711 Your words encouraged me! I have run the tutorial in JupyterLab ever since I learned about SpikeInterface. I usually run the blocks step by step, but I have never run them as .py files. I think I could try this way; my computer is currently running 54GB of data through SpikeInterface with mountainsort4, so I will test this idea as soon as that task is over.
I think running a .py file is a nice approach on Windows, so I wrote a short .py file to try SpykingCircus2 in SpikeInterface. For convenience I had already made a preprocessed file, so I could load the variable recording_saved.
import spikeinterface
import spikeinterface.full as si
import spikeinterface.extractors as se
import spikeinterface.widgets as sw
import probeinterface as pi
import os
from probeinterface.plotting import plot_probe, plot_probe_group
from pathlib import Path
import warnings
warnings.simplefilter("ignore")

if __name__ == '__main__':
    base_folder = Path('C:/Users/Melody/Desktop/test0')
    n_jobs = 1
    job_kwargs = dict(n_jobs=n_jobs, chunk_duration="1s", progress_bar=True)
    if (base_folder / "preprocessed").is_dir():
        recording_saved = si.load_extractor(base_folder / "preprocessed")
    else:
        recording_saved = recording_sub.save(folder=base_folder / "preprocessed")
    print(recording_saved)

    # In SpikeInterface: spykingcircus2
    sorting_SC2 = si.run_sorter('spykingcircus2', recording_saved,
                                output_folder=base_folder / 'results_SC2',
                                verbose=True,
                                job_kwargs=job_kwargs)
It could run; however, it returned:
PS C:\Users\Melody\Desktop\test0> python .\SpykingCircus2.py
C:\Users\Melody\anaconda3\Lib\site-packages\paramiko\transport.py:219: CryptographyDeprecationWarning: Blowfish has been deprecated
"class": algorithms.Blowfish,
BinaryFolderRecording: 49 channels - 30.0kHz - 1 segments - 9,000,000 samples
300.00s (5.00 minutes) - int16 dtype - 841.14 MiB
detect peaks using locally_exclusive with n_jobs = 1 and chunk_size = 30000
detect peaks using locally_exclusive: 100%|############################################| 300/300 [00:22<00:00, 13.48it/s]
We found 126896 peaks in total
We kept 126896 peaks for clustering
extracting features with n_jobs = 1 and chunk_size = 30000
extracting features: 100%|#############################################################| 300/300 [00:21<00:00, 13.88it/s]
We found 78 raw clusters, starting to clean with matching...
extract waveforms shared_memory multi buffer with n_jobs = 1 and chunk_size = 30000
extract waveforms shared_memory multi buffer: 100%|####################################| 300/300 [00:15<00:00, 19.70it/s]
extract waveforms shared_memory multi buffer with n_jobs = 1 and chunk_size = 30000
extract waveforms shared_memory multi buffer: 100%|####################################| 300/300 [00:16<00:00, 18.30it/s]
Error running spykingcircus2
Traceback (most recent call last):
File "C:\Users\Melody\Desktop\test0\SpykingCircus2.py", line 23, in
sorting_SC2 = si.run_sorter('spykingcircus2', recording_saved,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Melody\Downloads\spikeinterface_main\src\spikeinterface\sorters\runsorter.py", line 148, in run_sorter return run_sorter_local(**common_kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Melody\Downloads\spikeinterface_main\src\spikeinterface\sorters\runsorter.py", line 174, in run_sorter_local
SorterClass.run_from_folder(output_folder, raise_error, verbose)
File "C:\Users\Melody\Downloads\spikeinterface_main\src\spikeinterface\sorters\basesorter.py", line 289, in run_from_folder
raise SpikeSortingError(
spikeinterface.sorters.utils.misc.SpikeSortingError: Spike sorting error trace:
Traceback (most recent call last):
File "C:\Users\Melody\Downloads\spikeinterface_main\src\spikeinterface\sorters\basesorter.py", line 254, in run_from_folder
SorterClass._run_from_folder(sorter_output_folder, sorter_params, verbose)
File "C:\Users\Melody\Downloads\spikeinterface_main\src\spikeinterface\sorters\internal\spyking_circus2.py", line 131, in _run_from_folder
labels, peak_labels = find_cluster_from_peaks(
^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Melody\Downloads\spikeinterface_main\src\spikeinterface\sortingcomponents\clustering\main.py", line 42, in find_cluster_from_peaks
labels, peak_labels = method_class.main_function(recording, peaks, params)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Melody\Downloads\spikeinterface_main\src\spikeinterface\sortingcomponents\clustering\random_projections.py", line 215, in main_function
labels, peak_labels = remove_duplicates_via_matching(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Melody\Downloads\spikeinterface_main\src\spikeinterface\sortingcomponents\clustering\clustering_tools.py", line 610, in remove_duplicates_via_matching
spikes, computed = find_spikes_from_templates(
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Melody\Downloads\spikeinterface_main\src\spikeinterface\sortingcomponents\matching\main.py", line 60,
in find_spikes_from_templates
spikes = processor.run()
^^^^^^^^^^^^^^^
File "C:\Users\Melody\Downloads\spikeinterface_main\src\spikeinterface\core\job_tools.py", line 359, in run
res = self.func(segment_index, frame_start, frame_stop, worker_ctx)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Melody\Downloads\spikeinterface_main\src\spikeinterface\sortingcomponents\matching\main.py", line 105, in _find_spikes_chunk
with threadpool_limits(limits=1):
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Melody\anaconda3\Lib\site-packages\threadpoolctl.py", line 171, in __init__
self._original_info = self._set_threadpool_limits()
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Melody\anaconda3\Lib\site-packages\threadpoolctl.py", line 268, in _set_threadpool_limits
modules = _ThreadpoolInfo(prefixes=self._prefixes,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Melody\anaconda3\Lib\site-packages\threadpoolctl.py", line 340, in __init__
self._load_modules()
File "C:\Users\Melody\anaconda3\Lib\site-packages\threadpoolctl.py", line 373, in _load_modules
self._find_modules_with_enum_process_module_ex()
File "C:\Users\Melody\anaconda3\Lib\site-packages\threadpoolctl.py", line 485, in _find_modules_with_enum_process_module_ex
self._make_module_from_path(filepath)
File "C:\Users\Melody\anaconda3\Lib\site-packages\threadpoolctl.py", line 515, in _make_module_from_path
module = module_class(filepath, prefix, user_api, internal_api)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Melody\anaconda3\Lib\site-packages\threadpoolctl.py", line 606, in __init__
self.version = self.get_version()
^^^^^^^^^^^^^^^^^^
File "C:\Users\Melody\anaconda3\Lib\site-packages\threadpoolctl.py", line 646, in get_version
config = get_config().split()
^^^^^^^^^^^^^^^^^^
AttributeError: 'NoneType' object has no attribute 'split'
Spike sorting failed. You can inspect the runtime trace in C:\Users\Melody\Desktop\test0\results_SC2/spikeinterface_log.json.
If I change n_jobs to -1, I get something like this, and the part from
C:\Users\Melody\anaconda3\Lib\site-packages\paramiko\transport.py:219: CryptographyDeprecationWarning: Blowfish has been deprecated
"class": algorithms.Blowfish,
down to
AttributeError: 'NoneType' object has no attribute 'split'
would cycle in my PowerShell.
PS C:\Users\Melody\Desktop\test0> python .\SpykingCircus2.py
C:\Users\Melody\anaconda3\Lib\site-packages\paramiko\transport.py:219: CryptographyDeprecationWarning: Blowfish has been deprecated
"class": algorithms.Blowfish,
BinaryFolderRecording: 49 channels - 30.0kHz - 1 segments - 9,000,000 samples
300.00s (5.00 minutes) - int16 dtype - 841.14 MiB
detect peaks using locally_exclusive with n_jobs = 20 and chunk_size = 30000
C:\Users\Melody\anaconda3\Lib\site-packages\paramiko\transport.py:219: CryptographyDeprecationWarning: Blowfish has been deprecated
"class": algorithms.Blowfish,
Exception in initializer:
Traceback (most recent call last):
File "C:\Users\Melody\anaconda3\Lib\concurrent\futures\process.py", line 235, in _process_worker
initializer(*initargs)
File "C:\Users\Melody\Downloads\spikeinterface_main\src\spikeinterface\core\job_tools.py", line 401, in worker_initializer
with threadpool_limits(limits=max_threads_per_process):
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Melody\anaconda3\Lib\site-packages\threadpoolctl.py", line 171, in __init__
self._original_info = self._set_threadpool_limits()
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Melody\anaconda3\Lib\site-packages\threadpoolctl.py", line 268, in _set_threadpool_limits
modules = _ThreadpoolInfo(prefixes=self._prefixes,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Melody\anaconda3\Lib\site-packages\threadpoolctl.py", line 340, in __init__
self._load_modules()
File "C:\Users\Melody\anaconda3\Lib\site-packages\threadpoolctl.py", line 373, in _load_modules
self._find_modules_with_enum_process_module_ex()
File "C:\Users\Melody\anaconda3\Lib\site-packages\threadpoolctl.py", line 485, in _find_modules_with_enum_process_module_ex
self._make_module_from_path(filepath)
File "C:\Users\Melody\anaconda3\Lib\site-packages\threadpoolctl.py", line 515, in _make_module_from_path
module = module_class(filepath, prefix, user_api, internal_api)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Melody\anaconda3\Lib\site-packages\threadpoolctl.py", line 606, in __init__
self.version = self.get_version()
^^^^^^^^^^^^^^^^^^
File "C:\Users\Melody\anaconda3\Lib\site-packages\threadpoolctl.py", line 646, in get_version
config = get_config().split()
^^^^^^^^^^^^^^^^^^
AttributeError: 'NoneType' object has no attribute 'split'
Sorry, I meant try number 1 with the script. So something like:
job_kwargs = dict(n_jobs=4, chunk_duration='1s', progress_bar=True)

if __name__ == '__main__':
    recording_saved = recording_sub.save(format="zarr", folder=base_folder / "preprocessed_compressed.zarr",
                                         compressor=compressor,
                                         **job_kwargs)
I wanted to test multiprocessing first. It seems like there is a separate multiprocessing/thread-based issue that we also need to think about. But I want to see whether, if we protect your recording.save(), it will work with n_jobs > 1. Could you start with that, and then we can try a sorter after that?
Could you also let us know the version of threadpoolctl you have? You can just type conda list in the environment containing spikeinterface. (Although it appears you are working in base instead of a conda env; is that true?)
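Alternatively, a quick check straight from Python (works for both pip and conda installs):

python -c "import threadpoolctl; print(threadpoolctl.__version__)"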
I'm sorry to report that it failed. I wrote a code block like this and replaced the old one:
job_kwargs = dict(n_jobs=4, chunk_duration='1s', progress_bar=True)

if __name__ == '__main__':
    if (base_folder / "preprocessed_compressed.zarr").is_dir():
        recording_saved = si.read_zarr(base_folder / "preprocessed_compressed.zarr")
    else:
        compressor = numcodecs.Blosc(cname="zstd", clevel=9, shuffle=numcodecs.Blosc.BITSHUFFLE)
        recording_saved = recording_sub.save(format="zarr", folder=base_folder / "preprocessed_compressed.zarr",
                                             compressor=compressor,
                                             **job_kwargs)
The code returned
write_zarr_recording with n_jobs = 4 and chunk_size = 30000
write_zarr_recording: 0%
0/300 [00:01, ?it/s]
---------------------------------------------------------------------------
BrokenProcessPool Traceback (most recent call last)
Cell In[50], line 7
5 else:
6 compressor = numcodecs.Blosc(cname="zstd", clevel=9, shuffle=numcodecs.Blosc.BITSHUFFLE)
----> 7 recording_saved = recording_sub.save(format="zarr", folder=base_folder / "preprocessed_compressed.zarr",
8 compressor=compressor,
9 **job_kwargs)
File ~\Downloads\spikeinterface_main\src\spikeinterface\core\base.py:828, in BaseExtractor.save(self, **kwargs)
826 loaded_extractor = self.save_to_memory(**kwargs)
827 elif format == "zarr":
--> 828 loaded_extractor = self.save_to_zarr(**kwargs)
829 else:
830 loaded_extractor = self.save_to_folder(**kwargs)
File ~\Downloads\spikeinterface_main\src\spikeinterface\core\base.py:1015, in BaseExtractor.save_to_zarr(self, name, folder, storage_options, channel_chunk_size, verbose, zarr_path, **save_kwargs)
1013 save_kwargs["storage_options"] = storage_options
1014 save_kwargs["channel_chunk_size"] = channel_chunk_size
-> 1015 cached = self._save(verbose=verbose, **save_kwargs)
1017 # save properties
1018 prop_group = zarr_root.create_group("properties")
File ~\Downloads\spikeinterface_main\src\spikeinterface\core\baserecording.py:519, in BaseRecording._save(self, format, **save_kwargs)
513 zarr_kwargs["compressor"] = compressor = get_default_zarr_compressor()
514 print(
515 f"Using default zarr compressor: {compressor}. To use a different compressor, use the "
516 f"'compressor' argument"
517 )
--> 519 write_traces_to_zarr(self, **zarr_kwargs, **job_kwargs)
521 # save probe
522 if self.get_property("contact_vector") is not None:
File ~\Downloads\spikeinterface_main\src\spikeinterface\core\core_tools.py:709, in write_traces_to_zarr(recording, zarr_root, zarr_path, storage_options, dataset_paths, channel_chunk_size, dtype, compressor, filters, verbose, auto_cast_uint, **job_kwargs)
705 init_args = (recording, zarr_path, storage_options, dataset_paths, dtype, cast_unsigned)
706 executor = ChunkRecordingExecutor(
707 recording, func, init_func, init_args, verbose=verbose, job_name="write_zarr_recording", **job_kwargs
708 )
--> 709 executor.run()
File ~\Downloads\spikeinterface_main\src\spikeinterface\core\job_tools.py:379, in ChunkRecordingExecutor.run(self)
376 if self.progress_bar:
377 results = tqdm(results, desc=self.job_name, total=len(all_chunks))
--> 379 for res in results:
380 if self.handle_returns:
381 returns.append(res)
File ~\anaconda3\Lib\site-packages\tqdm\notebook.py:254, in tqdm_notebook.__iter__(self)
252 try:
253 it = super(tqdm_notebook, self).__iter__()
--> 254 for obj in it:
255 # return super(tqdm...) will not catch exception
256 yield obj
257 # NB: except ... [ as ...] breaks IPython async KeyboardInterrupt
File ~\anaconda3\Lib\site-packages\tqdm\std.py:1178, in tqdm.__iter__(self)
1175 time = self._time
1177 try:
-> 1178 for obj in iterable:
1179 yield obj
1180 # Update and possibly print the progressbar.
1181 # Note: does not call self.update(1) for speed optimisation.
File ~\anaconda3\Lib\concurrent\futures\process.py:606, in _chain_from_iterable_of_lists(iterable)
600 def _chain_from_iterable_of_lists(iterable):
601 """
602 Specialized implementation of itertools.chain.from_iterable.
603 Each item in *iterable* should be a list. This function is
604 careful not to keep references to yielded objects.
605 """
--> 606 for element in iterable:
607 element.reverse()
608 while element:
File ~\anaconda3\Lib\concurrent\futures\_base.py:619, in Executor.map.<locals>.result_iterator()
616 while fs:
617 # Careful not to keep a reference to the popped future
618 if timeout is None:
--> 619 yield _result_or_cancel(fs.pop())
620 else:
621 yield _result_or_cancel(fs.pop(), end_time - time.monotonic())
File ~\anaconda3\Lib\concurrent\futures\_base.py:317, in _result_or_cancel(***failed resolving arguments***)
315 try:
316 try:
--> 317 return fut.result(timeout)
318 finally:
319 fut.cancel()
File ~\anaconda3\Lib\concurrent\futures\_base.py:456, in Future.result(self, timeout)
454 raise CancelledError()
455 elif self._state == FINISHED:
--> 456 return self.__get_result()
457 else:
458 raise TimeoutError()
File ~\anaconda3\Lib\concurrent\futures\_base.py:401, in Future.__get_result(self)
399 if self._exception:
400 try:
--> 401 raise self._exception
402 finally:
403 # Break a reference cycle with the exception in self._exception
404 self = None
BrokenProcessPool: A process in the process pool was terminated abruptly while the future was running or pending.
As for the earlier binary save, I had left out **job_kwargs, so that one ran well; I'm sorry that I forgot it:
if (base_folder / "preprocessed").is_dir():
    recording_saved = si.load_extractor(base_folder / "preprocessed")
else:
    # recording_saved = recording_sub.save(folder=base_folder / "preprocessed", **job_kwargs)
    recording_saved = recording_sub.save(folder=base_folder / "preprocessed")
I run the file in JupyterLab, using the Python 3.11 env of Anaconda. Here are my lists:
conda list:
PS C:\Users\Melody\Desktop\test0> conda list
# packages in environment at C:\Users\Melody\anaconda3:
#
# Name Version Build Channel
_anaconda_depends 2023.09 py311_mkl_1
abseil-cpp 20211102.0 hd77b12b_0
aiobotocore 2.5.0 py311haa95532_0
aiohttp 3.9.0 py311h2bbff1b_0
aioitertools 0.7.1 pyhd3eb1b0_0
aiosignal 1.2.0 pyhd3eb1b0_0
alabaster 0.7.12 pyhd3eb1b0_0
altair 5.1.2 pypi_0 pypi
anaconda-anon-usage 0.4.2 py311hfc23b7f_0
anaconda-catalogs 0.2.0 py311haa95532_0
anaconda-client 1.12.1 py311haa95532_0
anaconda-cloud-auth 0.1.4 py311haa95532_0
anaconda-navigator 2.5.0 py311haa95532_0
anaconda-project 0.11.1 py311haa95532_0
anyio 3.5.0 py311haa95532_0
aom 3.6.0 hd77b12b_0
appdirs 1.4.4 pyhd3eb1b0_0
archspec 0.2.1 pyhd3eb1b0_0
argon2-cffi 21.3.0 pyhd3eb1b0_0
argon2-cffi-bindings 21.2.0 py311h2bbff1b_0
arrow 1.2.3 py311haa95532_1
arrow-cpp 11.0.0 ha81ea56_2
asciitree 0.3.3 pypi_0 pypi
astroid 2.14.2 py311haa95532_0
astropy 5.3.4 py311hd7041d2_0
asttokens 2.0.5 pyhd3eb1b0_0
async-lru 2.0.4 py311haa95532_0
atomicwrites 1.4.0 py_0
attrs 23.1.0 py311haa95532_0
automat 20.2.0 py_0
autopep8 1.6.0 pyhd3eb1b0_1
aws-c-common 0.6.8 h2bbff1b_1
aws-c-event-stream 0.1.6 hd77b12b_6
aws-checksums 0.1.11 h2bbff1b_2
aws-sdk-cpp 1.8.185 hd77b12b_1
babel 2.11.0 py311haa95532_0
backcall 0.2.0 pyhd3eb1b0_0
backports 1.1 pyhd3eb1b0_0
backports.functools_lru_cache 1.6.4 pyhd3eb1b0_0
backports.tempfile 1.0 pyhd3eb1b0_1
backports.weakref 1.0.post1 py_1
bcrypt 3.2.0 py311h2bbff1b_1
beautifulsoup4 4.12.2 py311haa95532_0
binaryornot 0.4.4 pyhd3eb1b0_1
black 23.11.0 py311haa95532_0
blas 1.0 mkl
bleach 4.1.0 pyhd3eb1b0_0
blosc 1.11.1 pypi_0 pypi
bokeh 3.3.0 py311h746a85d_0
boltons 23.0.0 py311haa95532_0
boost-cpp 1.82.0 h59b6b97_2
botocore 1.29.76 py311haa95532_0
bottleneck 1.3.5 py311h5bb9823_0
brotli 1.0.9 h2bbff1b_7
brotli-bin 1.0.9 h2bbff1b_7
brotli-python 1.0.9 py311hd77b12b_7
bzip2 1.0.8 he774522_0
c-ares 1.19.1 h2bbff1b_0
c-blosc2 2.10.5 h2f4ed9d_0
ca-certificates 2023.08.22 haa95532_0
cbor2 5.5.1 pypi_0 pypi
certifi 2023.11.17 py311haa95532_0
cffi 1.16.0 py311h2bbff1b_0
cfitsio 3.470 h2bbff1b_7
chardet 4.0.0 py311haa95532_1003
charls 2.2.0 h6c2663c_0
charset-normalizer 2.0.4 pyhd3eb1b0_0
click 8.1.7 py311haa95532_0
cloudpickle 2.2.1 py311haa95532_0
clyent 1.2.2 py311haa95532_1
colorama 0.4.6 py311haa95532_0
colorcet 3.0.1 py311haa95532_0
comm 0.1.2 py311haa95532_0
conda 23.10.0 py311haa95532_0
conda-build 3.26.1 py311haa95532_0
conda-content-trust 0.2.0 py311haa95532_0
conda-index 0.3.0 py311haa95532_0
conda-libmamba-solver 23.11.1 py311haa95532_0
conda-pack 0.6.0 pyhd3eb1b0_0
conda-package-handling 2.2.0 py311haa95532_0
conda-package-streaming 0.9.0 py311haa95532_0
conda-repo-cli 1.0.75 py311haa95532_0
conda-token 0.4.0 pyhd3eb1b0_0
conda-verify 3.4.2 py_1
console_shortcut 0.1.1 4
constantly 15.1.0 py311haa95532_0
contourpy 1.2.0 py311h59b6b97_0
cookiecutter 2.5.0 py311haa95532_0
cpython 0.0.6 pypi_0 pypi
cryptography 41.0.3 py311h89fc84f_0
cssselect 1.1.0 pyhd3eb1b0_0
cuda-python 12.3.0 pypi_0 pypi
curl 8.1.1 h2bbff1b_0
cycler 0.11.0 pyhd3eb1b0_0
cython 0.29.36 pypi_0 pypi
cytoolz 0.12.0 py311h2bbff1b_0
daal4py 2023.1.1 py311h30df693_0
dal 2023.1.1 h59b6b97_48682
dask 2023.6.0 py311haa95532_0
dask-core 2023.6.0 py311haa95532_0
datasets 2.12.0 py311haa95532_0
datashader 0.16.0 py311haa95532_0
dav1d 1.2.1 h2bbff1b_0
debugpy 1.6.7 py311hd77b12b_0
decorator 5.1.1 pyhd3eb1b0_0
defusedxml 0.7.1 pyhd3eb1b0_0
diff-match-patch 20200713 pyhd3eb1b0_0
dill 0.3.6 py311haa95532_0
distinctipy 1.2.3 pypi_0 pypi
distributed 2023.6.0 py311haa95532_0
dnspython 2.4.2 pypi_0 pypi
docker 6.1.3 pypi_0 pypi
docstring-to-markdown 0.11 py311haa95532_0
docutils 0.18.1 py311haa95532_3
elephant 1.0.0 pypi_0 pypi
entrypoints 0.4 py311haa95532_0
ephyviewer 1.6.0 pypi_0 pypi
et_xmlfile 1.1.0 py311haa95532_0
executing 0.8.3 pyhd3eb1b0_0
fasteners 0.19 pypi_0 pypi
figurl 0.2.18 pypi_0 pypi
filelock 3.13.1 py311haa95532_0
flake8 6.0.0 py311haa95532_0
flask 2.2.2 py311haa95532_0
fmt 9.1.0 h6d14046_0
fonttools 4.25.0 pyhd3eb1b0_0
fqdn 1.5.1 pypi_0 pypi
freetype 2.12.1 ha860e81_0
frozenlist 1.4.0 py311h2bbff1b_0
frz-jupyterlab-variableinspector 0.1.4 pypi_0 pypi
fsspec 2023.4.0 py311haa95532_0
future 0.18.3 py311haa95532_0
gensim 4.3.0 py311heda8569_0
gflags 2.2.2 ha925a31_0
ghp-import 2.1.0 pypi_0 pypi
giflib 5.2.1 h8cc25b3_3
glob2 0.7 pyhd3eb1b0_0
glog 0.5.0 hd77b12b_0
greenlet 2.0.1 py311hd77b12b_0
grpc-cpp 1.48.2 hfe90ff0_1
h5py 3.9.0 py311h4e0e482_0
hdbscan 0.8.33 pypi_0 pypi
hdf5 1.12.1 h51c971a_3
hdmf 3.11.0 pypi_0 pypi
heapdict 1.0.1 pyhd3eb1b0_0
holoviews 1.18.1 py311haa95532_0
huggingface_hub 0.17.3 py311haa95532_0
hvplot 0.9.0 py311haa95532_0
hyperlink 21.0.0 pyhd3eb1b0_0
icc_rt 2022.1.0 h6049295_2
icu 73.1 h6c2663c_0
idna 3.4 py311haa95532_0
imagecodecs 2023.1.23 py311he6ff3c7_0
imageio 2.31.4 py311haa95532_0
imagesize 1.4.1 py311haa95532_0
imbalanced-learn 0.11.0 py311haa95532_1
importlib-metadata 6.0.0 py311haa95532_0
importlib_metadata 6.0.0 hd3eb1b0_0
incremental 21.3.0 pyhd3eb1b0_0
inflection 0.5.1 py311haa95532_0
iniconfig 1.1.1 pyhd3eb1b0_0
intake 0.6.8 py311haa95532_0
intel-openmp 2023.1.0 h59b6b97_46320
intervaltree 3.1.0 pyhd3eb1b0_0
ipykernel 6.25.0 py311h746a85d_0
ipympl 0.9.3 pypi_0 pypi
ipython 8.15.0 py311haa95532_0
ipython_genutils 0.2.0 pyhd3eb1b0_1
ipywidgets 8.0.4 py311haa95532_0
isoduration 20.11.0 pypi_0 pypi
isort 5.9.3 pyhd3eb1b0_0
itemadapter 0.3.0 pyhd3eb1b0_0
itemloaders 1.0.4 pyhd3eb1b0_1
itsdangerous 2.0.1 pyhd3eb1b0_0
jaraco.classes 3.2.1 pyhd3eb1b0_0
jedi 0.18.1 py311haa95532_1
jellyfish 1.0.1 py311h36a85e1_0
jinja2 3.1.2 py311haa95532_0
jmespath 1.0.1 py311haa95532_0
joblib 1.2.0 py311haa95532_0
jpeg 9e h2bbff1b_1
jq 1.6 haa95532_1
json5 0.9.6 pyhd3eb1b0_0
jsonpatch 1.32 pyhd3eb1b0_0
jsonpointer 2.1 pyhd3eb1b0_0
jsonschema 4.19.2 py311haa95532_0
jsonschema-specifications 2023.7.1 py311haa95532_0
jupyter 1.0.0 py311haa95532_8
jupyter-contrib-core 0.4.2 pypi_0 pypi
jupyter-contrib-nbextensions 0.7.0 pypi_0 pypi
jupyter-highlight-selected-word 0.2.0 pypi_0 pypi
jupyter-lsp 2.2.0 py311haa95532_0
jupyter-nbextensions-configurator 0.6.3 pypi_0 pypi
jupyter_client 8.6.0 py311haa95532_0
jupyter_console 6.6.3 py311haa95532_0
jupyter_core 5.5.0 py311haa95532_0
jupyter_events 0.8.0 py311haa95532_0
jupyter_server 2.10.0 py311haa95532_0
jupyter_server_terminals 0.4.4 py311haa95532_1
jupyterlab 4.0.8 py311haa95532_0
jupyterlab_pygments 0.1.2 py_0
jupyterlab_server 2.25.1 py311haa95532_0
jupyterlab_widgets 3.0.9 py311haa95532_0
kachery-cloud 0.4.3 pypi_0 pypi
kaleido-core 0.2.1 h2bbff1b_0
keyring 23.13.1 py311haa95532_0
kiwisolver 1.4.4 py311hd77b12b_0
krb5 1.20.1 h5b6d351_0
lazy-object-proxy 1.6.0 py311h2bbff1b_0
lazy-ops 0.2.0 pypi_0 pypi
lazy_loader 0.3 py311haa95532_0
lcms2 2.12 h83e58a3_0
lerc 3.0 hd77b12b_0
libaec 1.0.4 h33f27b4_1
libarchive 3.6.2 hb62f4d4_2
libavif 0.11.1 h2bbff1b_0
libboost 1.82.0 h3399ecb_2
libbrotlicommon 1.0.9 h2bbff1b_7
libbrotlidec 1.0.9 h2bbff1b_7
libbrotlienc 1.0.9 h2bbff1b_7
libclang 14.0.6 default_hb5a9fac_1
libclang13 14.0.6 default_h8e68704_1
libcurl 8.1.1 h86230a5_0
libdeflate 1.17 h2bbff1b_1
libevent 2.1.12 h56d1f94_1
libffi 3.4.4 hd77b12b_0
libiconv 1.16 h2bbff1b_2
liblief 0.12.3 hd77b12b_0
libmamba 1.5.3 hcd6fe79_0
libmambapy 1.5.3 py311h77c03ed_0
libpng 1.6.39 h8cc25b3_0
libpq 12.15 h906ac69_1
libprotobuf 3.20.3 h23ce68f_0
libsodium 1.0.18 h62dcd97_0
libsolv 0.7.24 h23ce68f_0
libspatialindex 1.9.3 h6c2663c_0
libssh2 1.10.0 he2ea4bf_2
libthrift 0.15.0 h4364b78_2
libtiff 4.5.1 hd77b12b_0
libwebp 1.3.2 hbc33d0d_0
libwebp-base 1.3.2 h2bbff1b_0
libxml2 2.10.4 h0ad7f3c_1
libxslt 1.1.37 h2bbff1b_1
libzopfli 1.0.3 ha925a31_0
linkify-it-py 2.0.0 py311haa95532_0
llvmlite 0.41.0 py311hf2fb9eb_0
locket 1.0.0 py311haa95532_0
loky 3.4.1 pypi_0 pypi
lxml 4.9.3 py311h09808a7_0
lz4 4.3.2 py311h2bbff1b_0
lz4-c 1.9.4 h2bbff1b_0
lzo 2.10 he774522_2
m2-msys2-runtime 2.5.0.17080.65c939c 3
m2-patch 2.7.5 2
m2w64-libwinpthread-git 5.0.0.4634.697f757 2
markdown 3.4.1 py311haa95532_0
markdown-it-py 2.2.0 py311haa95532_1
markupsafe 2.1.1 py311h2bbff1b_0
mathjax 2.7.5 haa95532_0
matplotlib 3.8.0 py311haa95532_0
matplotlib-base 3.8.0 py311hf62ec03_0
matplotlib-inline 0.1.6 py311haa95532_0
mccabe 0.7.0 pyhd3eb1b0_0
mdit-py-plugins 0.3.0 py311haa95532_0
mdurl 0.1.0 py311haa95532_0
mearec 1.9.0 pypi_0 pypi
meautility 1.5.1 pypi_0 pypi
menuinst 1.4.19 py311h59b6b97_1
mergedeep 1.3.4 pypi_0 pypi
mistune 2.0.4 py311haa95532_0
mkdocs 1.5.3 pypi_0 pypi
mkl 2023.1.0 h6b88ed4_46358
mkl-service 2.4.0 py311h2bbff1b_1
mkl_fft 1.3.8 py311h2bbff1b_0
mkl_random 1.2.4 py311h59b6b97_0
more-itertools 10.1.0 py311haa95532_0
mpi4py 3.1.5 pypi_0 pypi
mpmath 1.3.0 py311haa95532_0
msgpack-python 1.0.3 py311h59b6b97_0
msys2-conda-epoch 20160418 1
mtscomp 1.0.2 pypi_0 pypi
multidict 6.0.4 py311h2bbff1b_0
multipledispatch 0.6.0 py311haa95532_0
multiprocess 0.70.14 py311haa95532_0
munkres 1.1.4 py_0
mypy_extensions 1.0.0 py311haa95532_0
navigator-updater 0.4.0 py311haa95532_1
nbclient 0.8.0 py311haa95532_0
nbconvert 7.10.0 py311haa95532_0
nbformat 5.9.2 py311haa95532_0
neo 0.12.0 pypi_0 pypi
nest-asyncio 1.5.6 py311haa95532_0
networkx 3.1 py311haa95532_0
nltk 3.8.1 py311haa95532_0
nodejs 18.16.0 haa95532_1
notebook 7.0.6 py311haa95532_0
notebook-shim 0.2.3 py311haa95532_0
numba 0.58.1 py311hf62ec03_0
numcodecs 0.12.1 pypi_0 pypi
numexpr 2.8.7 py311h1fcbade_0
numpy 1.26.2 py311hdab7c0b_0
numpy-base 1.26.2 py311hd01c5d8_0
numpydoc 1.5.0 py311haa95532_0
openjpeg 2.4.0 h4fc8c34_0
openpyxl 3.0.10 py311h2bbff1b_0
openssl 3.0.12 h2bbff1b_0
orc 1.7.4 h623e30f_1
overrides 7.4.0 py311haa95532_0
packaging 23.1 py311haa95532_0
pandas 2.1.1 py311hf62ec03_0
pandocfilters 1.5.0 pyhd3eb1b0_0
panel 1.3.1 py311haa95532_0
param 2.0.1 py311haa95532_0
paramiko 2.8.1 pyhd3eb1b0_0
parsel 1.6.0 py311haa95532_0
parso 0.8.3 pyhd3eb1b0_0
partd 1.4.1 py311haa95532_0
pathlib 1.0.1 pyhd3eb1b0_1
pathspec 0.11.2 pypi_0 pypi
patsy 0.5.3 py311haa95532_0
pcre2 10.42 h0ff8eda_0
pep8 1.7.1 py311haa95532_1
pexpect 4.8.0 pyhd3eb1b0_3
phy 2.0b2.dev0 pypi_0 pypi
phylib 2.4.3 pypi_0 pypi
pickleshare 0.7.5 pyhd3eb1b0_1003
pillow 10.0.1 py311h045eedc_0
pip 23.3 py311haa95532_0
pkce 1.0.3 py311haa95532_0
pkginfo 1.9.6 py311haa95532_0
platformdirs 3.10.0 py311haa95532_0
plotly 5.9.0 py311haa95532_0
pluggy 1.0.0 py311haa95532_1
ply 3.11 py311haa95532_0
powershell_shortcut 0.0.1 3
probeinterface 0.2.19 pypi_0 pypi
prometheus_client 0.14.1 py311haa95532_0
prompt-toolkit 3.0.36 py311haa95532_0
prompt_toolkit 3.0.36 hd3eb1b0_0
protego 0.1.16 py_0
psutil 5.9.0 py311h2bbff1b_0
ptyprocess 0.7.0 pyhd3eb1b0_2
pubnub 7.3.1 pypi_0 pypi
pure_eval 0.2.2 pyhd3eb1b0_0
py-cpuinfo 9.0.0 py311haa95532_0
py-lief 0.12.3 py311hd77b12b_0
pyarrow 11.0.0 py311h8a3a540_1
pyasn1 0.4.8 pyhd3eb1b0_0
pyasn1-modules 0.2.8 py_0
pybind11 2.11.1 pypi_0 pypi
pybind11-abi 4 hd3eb1b0_1
pycodestyle 2.10.0 py311haa95532_0
pycosat 0.6.6 py311h2bbff1b_0
pycparser 2.21 pyhd3eb1b0_0
pycryptodomex 3.19.0 pypi_0 pypi
pyct 0.5.0 py311haa95532_0
pycurl 7.45.2 py311he2ea4bf_1
pydantic 1.10.12 py311h2bbff1b_1
pydispatcher 2.0.5 py311haa95532_2
pydocstyle 6.3.0 py311haa95532_0
pyerfa 2.0.0 py311h2bbff1b_0
pyflakes 3.0.1 py311haa95532_0
pygments 2.15.1 py311haa95532_1
pyjwt 2.4.0 py311haa95532_0
pylint 2.16.2 py311haa95532_0
pylint-venv 2.3.0 py311haa95532_0
pyls-spyder 0.4.0 pyhd3eb1b0_0
pymongo 4.6.1 pypi_0 pypi
pynacl 1.5.0 py311h8cc25b3_0
pynwb 2.5.0 pypi_0 pypi
pyodbc 4.0.39 py311hd77b12b_0
pyopengl 3.1.7 pypi_0 pypi
pyopenssl 23.2.0 py311haa95532_0
pyparsing 3.0.9 py311haa95532_0
pyqt 5.15.10 py311hd77b12b_0
pyqt5-sip 12.13.0 py311h2bbff1b_0
pyqtgraph 0.13.3 pypi_0 pypi
pyqtwebengine 5.15.10 py311hd77b12b_0
pysocks 1.7.1 py311haa95532_0
pytables 3.8.0 py311h4671533_3
pytest 7.4.0 py311haa95532_0
python 3.11.5 he1021f5_0
python-dateutil 2.8.2 pyhd3eb1b0_0
python-dotenv 0.21.0 py311haa95532_0
python-fastjsonschema 2.16.2 py311haa95532_0
python-json-logger 2.0.7 py311haa95532_0
python-kaleido 0.2.1 py311haa95532_0
python-libarchive-c 2.9 pyhd3eb1b0_1
python-lmdb 1.4.1 py311hd77b12b_0
python-lsp-black 1.2.1 py311haa95532_0
python-lsp-jsonrpc 1.0.0 pyhd3eb1b0_0
python-lsp-server 1.7.2 py311haa95532_0
python-slugify 5.0.2 pyhd3eb1b0_0
python-snappy 0.6.1 py311hd77b12b_0
python-tzdata 2023.3 pyhd3eb1b0_0
python-xxhash 2.0.2 py311h2bbff1b_1
pytoolconfig 1.2.5 py311haa95532_1
pytz 2023.3.post1 py311haa95532_0
pyviz_comms 2.3.0 py311haa95532_0
pywavelets 1.4.1 py311h2bbff1b_0
pywin32 305 py311h2bbff1b_0
pywin32-ctypes 0.2.0 py311haa95532_1000
pywinpty 2.0.10 py311h5da7b33_0
pyyaml 6.0.1 py311h2bbff1b_0
pyyaml-env-tag 0.1 pypi_0 pypi
pyzmq 25.1.0 py311hd77b12b_0
qdarkstyle 3.0.2 pyhd3eb1b0_0
qstylizer 0.2.2 py311haa95532_0
qt-main 5.15.2 h19c9488_10
qt-webengine 5.15.9 h5bd16bc_7
qtawesome 1.2.2 py311haa95532_0
qtconsole 5.4.2 py311haa95532_0
qtpy 2.4.1 py311haa95532_0
quantities 0.14.1 pypi_0 pypi
queuelib 1.6.2 py311haa95532_0
re2 2022.04.01 hd77b12b_0
referencing 0.30.2 py311haa95532_0
regex 2023.10.3 py311h2bbff1b_0
reproc 14.2.4 hd77b12b_1
reproc-cpp 14.2.4 hd77b12b_1
requests 2.31.0 py311haa95532_0
requests-file 1.5.1 pyhd3eb1b0_0
requests-toolbelt 1.0.0 py311haa95532_0
responses 0.13.3 pyhd3eb1b0_0
rfc3339-validator 0.1.4 py311haa95532_0
rfc3986-validator 0.1.1 py311haa95532_0
rich 13.3.5 py311haa95532_0
rope 1.7.0 py311haa95532_0
rpds-py 0.10.6 py311h062c2fa_0
rtree 1.0.1 py311h2eaa2aa_0
ruamel.yaml 0.17.21 py311h2bbff1b_0
ruamel_yaml 0.17.21 py311h2bbff1b_0
s3fs 2023.4.0 py311haa95532_0
safetensors 0.4.0 py311hcbdf901_0
scikit-image 0.20.0 py311h3513d60_0
scikit-learn 1.3.2 pypi_0 pypi
scikit-learn-intelex 2023.1.1 py311haa95532_0
scipy 1.11.4 py311hc1ccb85_0
scrapy 2.8.0 py311haa95532_0
seaborn 0.12.2 py311haa95532_0
semver 2.13.0 pyhd3eb1b0_0
send2trash 1.8.2 py311haa95532_0
service_identity 18.1.0 pyhd3eb1b0_1
setuptools 68.0.0 py311haa95532_0
simplejson 3.19.2 pypi_0 pypi
sip 6.7.12 py311hd77b12b_0
six 1.16.0 pyhd3eb1b0_1
smart_open 5.2.1 py311haa95532_0
snappy 1.1.9 h6c2663c_0
sniffio 1.2.0 py311haa95532_1
snowballstemmer 2.2.0 pyhd3eb1b0_0
sortedcontainers 2.4.0 pyhd3eb1b0_0
sortingview 0.12.0 pypi_0 pypi
soupsieve 2.5 py311haa95532_0
sphinx 5.0.2 py311haa95532_0
sphinxcontrib-applehelp 1.0.2 pyhd3eb1b0_0
sphinxcontrib-devhelp 1.0.2 pyhd3eb1b0_0
sphinxcontrib-htmlhelp 2.0.0 pyhd3eb1b0_0
sphinxcontrib-jsmath 1.0.1 pyhd3eb1b0_0
sphinxcontrib-qthelp 1.0.3 pyhd3eb1b0_0
sphinxcontrib-serializinghtml 1.1.5 pyhd3eb1b0_0
spikeinterface 0.100.0.dev0 pypi_0 pypi
spikeinterface-gui 0.7.0 pypi_0 pypi
spyder 5.4.3 py311haa95532_1
spyder-kernels 2.4.4 py311haa95532_0
spyking-circus 1.1.0 pypi_0 pypi
spython 0.3.1 pypi_0 pypi
sqlalchemy 2.0.21 py311h2bbff1b_0
sqlite 3.41.2 h2bbff1b_0
stack_data 0.2.0 pyhd3eb1b0_0
statsmodels 0.14.0 py311hd7041d2_0
sympy 1.11.1 py311haa95532_0
tabulate 0.8.10 py311haa95532_0
tbb 2021.8.0 h59b6b97_0
tbb4py 2021.8.0 py311h59b6b97_0
tblib 1.7.0 pyhd3eb1b0_0
tenacity 8.2.2 py311haa95532_0
terminado 0.17.1 py311haa95532_0
text-unidecode 1.3 pyhd3eb1b0_0
textdistance 4.2.1 pyhd3eb1b0_0
threadpoolctl 2.2.0 pyh0d69192_0
three-merge 0.1.1 pyhd3eb1b0_0
tifffile 2023.4.12 py311haa95532_0
tinycss2 1.2.1 py311haa95532_0
tk 8.6.12 h2bbff1b_0
tldextract 3.2.0 pyhd3eb1b0_0
tokenizers 0.13.3 py311h49fca51_0
toml 0.10.2 pyhd3eb1b0_0
tomlkit 0.11.1 py311haa95532_0
toolz 0.12.0 py311haa95532_0
tornado 6.3.3 py311h2bbff1b_0
tqdm 4.65.0 py311h746a85d_0
traitlets 5.7.1 py311haa95532_0
transformers 4.32.1 py311haa95532_0
tridesclous 1.6.8 pypi_0 pypi
truststore 0.8.0 py311haa95532_0
twisted 22.10.0 py311h2bbff1b_0
twisted-iocpsupport 1.0.2 py311h2bbff1b_0
typing-extensions 4.7.1 py311haa95532_0
typing_extensions 4.7.1 py311haa95532_0
tzdata 2023c h04d1e81_0
uc-micro-py 1.0.1 py311haa95532_0
ujson 5.4.0 py311hd77b12b_0
unidecode 1.2.0 pyhd3eb1b0_0
uri-template 1.3.0 pypi_0 pypi
urllib3 1.26.18 py311haa95532_0
utf8proc 2.6.1 h2bbff1b_0
vc 14.2 h21ff451_1
vs2015_runtime 14.27.29016 h5e58377_2
w3lib 1.21.0 pyhd3eb1b0_0
watchdog 2.1.6 py311haa95532_0
wcwidth 0.2.5 pyhd3eb1b0_0
webcolors 1.13 pypi_0 pypi
webencodings 0.5.1 py311haa95532_1
websocket-client 0.58.0 py311haa95532_4
werkzeug 2.2.3 py311haa95532_0
whatthepatch 1.0.2 py311haa95532_0
widgetsnbextension 4.0.5 py311haa95532_0
win_inet_pton 1.1.0 py311haa95532_0
winpty 0.4.3 4
wrapt 1.14.1 py311h2bbff1b_0
xarray 2023.6.0 py311haa95532_0
xlwings 0.29.1 py311haa95532_0
xxhash 0.8.0 h2bbff1b_3
xyzservices 2022.9.0 py311haa95532_1
yaml 0.2.5 he774522_0
yaml-cpp 0.8.0 hd77b12b_0
yapf 0.31.0 pyhd3eb1b0_0
yarl 1.9.3 py311h2bbff1b_0
zarr 2.16.1 pypi_0 pypi
zeromq 4.3.4 hd77b12b_0
zfp 1.0.0 hd77b12b_0
zict 3.0.0 py311haa95532_0
zipp 3.11.0 py311haa95532_0
zlib 1.2.13 h8cc25b3_0
zlib-ng 2.0.7 h2bbff1b_0
zope 1.0 py311haa95532_1
zope.interface 5.4.0 py311h2bbff1b_0
zstandard 0.19.0 py311h2bbff1b_0
zstd 1.5.5 hd43e919_0
pip list:
PS C:\Users\Melody\Desktop\test0> conda activate spike
PS C:\Users\Melody\Desktop\test0> pip list
Package Version Editable project location
--------------------------------- --------------- ------------------------------------------------
aiobotocore 2.5.0
aiohttp 3.9.0
aioitertools 0.7.1
aiosignal 1.2.0
alabaster 0.7.12
altair 5.1.2
anaconda-anon-usage 0.4.2
anaconda-catalogs 0.2.0
anaconda-client 1.12.1
anaconda-cloud-auth 0.1.4
anaconda-navigator 2.5.0
anaconda-project 0.11.1
anyio 3.5.0
appdirs 1.4.4
archspec 0.2.1
argon2-cffi 21.3.0
argon2-cffi-bindings 21.2.0
arrow 1.2.3
asciitree 0.3.3
astroid 2.14.2
astropy 5.3.4
asttokens 2.0.5
async-lru 2.0.4
atomicwrites 1.4.0
attrs 23.1.0
Automat 20.2.0
autopep8 1.6.0
Babel 2.11.0
backcall 0.2.0
backports.functools-lru-cache 1.6.4
backports.tempfile 1.0
backports.weakref 1.0.post1
bcrypt 3.2.0
beautifulsoup4 4.12.2
binaryornot 0.4.4
black 23.11.0
bleach 4.1.0
blosc 1.11.1
bokeh 3.3.0
boltons 23.0.0
botocore 1.29.76
Bottleneck 1.3.5
Brotli 1.0.9
cbor2 5.5.1
certifi 2023.11.17
cffi 1.16.0
chardet 4.0.0
charset-normalizer 2.0.4
click 8.1.7
cloudpickle 2.2.1
clyent 1.2.2
colorama 0.4.6
colorcet 3.0.1
comm 0.1.2
conda 23.10.0
conda-build 3.26.1
conda-content-trust 0.2.0
conda_index 0.3.0
conda-libmamba-solver 23.11.1
conda-pack 0.6.0
conda-package-handling 2.2.0
conda_package_streaming 0.9.0
conda-repo-cli 1.0.75
conda-token 0.4.0
conda-verify 3.4.2
constantly 15.1.0
contourpy 1.2.0
cookiecutter 2.5.0
cPython 0.0.6
cryptography 41.0.3
cssselect 1.1.0
cuda-python 12.3.0
cycler 0.11.0
Cython 0.29.36
cytoolz 0.12.0
daal4py 2023.1.1
dask 2023.6.0
datasets 2.12.0
datashader 0.16.0
debugpy 1.6.7
decorator 5.1.1
defusedxml 0.7.1
diff-match-patch 20200713
dill 0.3.6
distinctipy 1.2.3
distributed 2023.6.0
dnspython 2.4.2
docker 6.1.3
docstring-to-markdown 0.11
docutils 0.18.1
elephant 1.0.0
entrypoints 0.4
ephyviewer 1.6.0
et-xmlfile 1.1.0
executing 0.8.3
fasteners 0.19
fastjsonschema 2.16.2
figurl 0.2.18
filelock 3.13.1
flake8 6.0.0
Flask 2.2.2
fonttools 4.25.0
fqdn 1.5.1
frozenlist 1.4.0
frz-jupyterlab-variableinspector 0.1.4
fsspec 2023.4.0
future 0.18.3
gensim 4.3.0
ghp-import 2.1.0
glob2 0.7
greenlet 2.0.1
h5py 3.9.0
hdbscan 0.8.33
hdmf 3.11.0
HeapDict 1.0.1
holoviews 1.18.1
huggingface-hub 0.17.3
hvplot 0.9.0
hyperlink 21.0.0
idna 3.4
imagecodecs 2023.1.23
imageio 2.31.4
imagesize 1.4.1
imbalanced-learn 0.11.0
importlib-metadata 6.0.0
incremental 21.3.0
inflection 0.5.1
iniconfig 1.1.1
intake 0.6.8
intervaltree 3.1.0
ipykernel 6.25.0
ipympl 0.9.3
ipython 8.15.0
ipython-genutils 0.2.0
ipywidgets 8.0.4
isoduration 20.11.0
isort 5.9.3
itemadapter 0.3.0
itemloaders 1.0.4
itsdangerous 2.0.1
jaraco.classes 3.2.1
jedi 0.18.1
jellyfish 1.0.1
Jinja2 3.1.2
jmespath 1.0.1
joblib 1.2.0
json5 0.9.6
jsonpatch 1.32
jsonpointer 2.1
jsonschema 4.19.2
jsonschema-specifications 2023.7.1
jupyter 1.0.0
jupyter_client 8.6.0
jupyter-console 6.6.3
jupyter-contrib-core 0.4.2
jupyter-contrib-nbextensions 0.7.0
jupyter_core 5.5.0
jupyter-events 0.8.0
jupyter-highlight-selected-word 0.2.0
jupyter-lsp 2.2.0
jupyter-nbextensions-configurator 0.6.3
jupyter_server 2.10.0
jupyter_server_terminals 0.4.4
jupyterlab 4.0.8
jupyterlab-pygments 0.1.2
jupyterlab_server 2.25.1
jupyterlab-widgets 3.0.9
kachery-cloud 0.4.3
kaleido 0.2.1
keyring 23.13.1
kiwisolver 1.4.4
lazy_loader 0.3
lazy-object-proxy 1.6.0
lazy-ops 0.2.0
libarchive-c 2.9
libmambapy 1.5.3
linkify-it-py 2.0.0
llvmlite 0.41.0
lmdb 1.4.1
locket 1.0.0
loky 3.4.1
lxml 4.9.3
lz4 4.3.2
Markdown 3.4.1
markdown-it-py 2.2.0
MarkupSafe 2.1.1
matplotlib 3.8.0
matplotlib-inline 0.1.6
mccabe 0.7.0
mdit-py-plugins 0.3.0
mdurl 0.1.0
MEArec 1.9.0
MEAutility 1.5.1
menuinst 1.4.19
mergedeep 1.3.4
mistune 2.0.4
mkdocs 1.5.3
mkl-fft 1.3.8
mkl-random 1.2.4
mkl-service 2.4.0
more-itertools 10.1.0
mpi4py 3.1.5
mpmath 1.3.0
msgpack 1.0.3
mtscomp 1.0.2
multidict 6.0.4
multipledispatch 0.6.0
multiprocess 0.70.14
munkres 1.1.4
mypy-extensions 1.0.0
navigator-updater 0.4.0
nbclient 0.8.0
nbconvert 7.10.0
nbformat 5.9.2
neo 0.12.0
nest-asyncio 1.5.6
networkx 3.1
nltk 3.8.1
notebook 7.0.6
notebook_shim 0.2.3
numba 0.58.1
numcodecs 0.12.1
numexpr 2.8.7
numpy 1.26.2
numpydoc 1.5.0
openpyxl 3.0.10
overrides 7.4.0
packaging 23.1
pandas 2.1.1
pandocfilters 1.5.0
panel 1.3.1
param 2.0.1
paramiko 2.8.1
parsel 1.6.0
parso 0.8.3
partd 1.4.1
pathlib 1.0.1
pathspec 0.11.2
patsy 0.5.3
pep8 1.7.1
pexpect 4.8.0
phy 2.0b2.dev0
phylib 2.4.3
pickleshare 0.7.5
Pillow 10.0.1
pip 23.3
pkce 1.0.3
pkginfo 1.9.6
platformdirs 3.10.0
plotly 5.9.0
pluggy 1.0.0
ply 3.11
probeinterface 0.2.19
prometheus-client 0.14.1
prompt-toolkit 3.0.36
Protego 0.1.16
psutil 5.9.0
ptyprocess 0.7.0
pubnub 7.3.1
pure-eval 0.2.2
py-cpuinfo 9.0.0
pyarrow 11.0.0
pyasn1 0.4.8
pyasn1-modules 0.2.8
pybind11 2.11.1
pycodestyle 2.10.0
pycosat 0.6.6
pycparser 2.21
pycryptodomex 3.19.0
pyct 0.5.0
pycurl 7.45.2
pydantic 1.10.12
PyDispatcher 2.0.5
pydocstyle 6.3.0
pyerfa 2.0.0
pyflakes 3.0.1
Pygments 2.15.1
PyJWT 2.4.0
pylint 2.16.2
pylint-venv 2.3.0
pyls-spyder 0.4.0
pymongo 4.6.1
PyNaCl 1.5.0
pynwb 2.5.0
pyodbc 4.0.39
PyOpenGL 3.1.7
pyOpenSSL 23.2.0
pyparsing 3.0.9
PyQt5 5.15.10
PyQt5-sip 12.13.0
pyqtgraph 0.13.3
PyQtWebEngine 5.15.6
PySocks 1.7.1
pytest 7.4.0
python-dateutil 2.8.2
python-dotenv 0.21.0
python-json-logger 2.0.7
python-lsp-black 1.2.1
python-lsp-jsonrpc 1.0.0
python-lsp-server 1.7.2
python-slugify 5.0.2
python-snappy 0.6.1
pytoolconfig 1.2.5
pytz 2023.3.post1
pyviz-comms 2.3.0
PyWavelets 1.4.1
pywin32 305.1
pywin32-ctypes 0.2.0
pywinpty 2.0.10
PyYAML 6.0.1
pyyaml_env_tag 0.1
pyzmq 25.1.0
QDarkStyle 3.0.2
qstylizer 0.2.2
QtAwesome 1.2.2
qtconsole 5.4.2
QtPy 2.4.1
quantities 0.14.1
queuelib 1.6.2
referencing 0.30.2
regex 2023.10.3
requests 2.31.0
requests-file 1.5.1
requests-toolbelt 1.0.0
responses 0.13.3
rfc3339-validator 0.1.4
rfc3986-validator 0.1.1
rich 13.3.5
rope 1.7.0
rpds-py 0.10.6
Rtree 1.0.1
ruamel.yaml 0.17.21
ruamel-yaml-conda 0.17.21
s3fs 2023.4.0
safetensors 0.4.0
scikit-image 0.20.0
scikit-learn 1.3.2
scikit-learn-intelex 20230426.121932
scipy 1.11.4
Scrapy 2.8.0
seaborn 0.12.2
semver 2.13.0
Send2Trash 1.8.2
service-identity 18.1.0
setuptools 68.0.0
simplejson 3.19.2
sip 6.7.12
six 1.16.0
smart-open 5.2.1
sniffio 1.2.0
snowballstemmer 2.2.0
sortedcontainers 2.4.0
sortingview 0.12.0
soupsieve 2.5
Sphinx 5.0.2
sphinxcontrib-applehelp 1.0.2
sphinxcontrib-devhelp 1.0.2
sphinxcontrib-htmlhelp 2.0.0
sphinxcontrib-jsmath 1.0.1
sphinxcontrib-qthelp 1.0.3
sphinxcontrib-serializinghtml 1.1.5
spikeinterface 0.100.0.dev0 C:\Users\Melody\Downloads\spikeinterface_main
spikeinterface-gui 0.7.0
spyder 5.4.3
spyder-kernels 2.4.4
spyking-circus 1.1.0
spython 0.3.1
SQLAlchemy 2.0.21
stack-data 0.2.0
statsmodels 0.14.0
sympy 1.11.1
tables 3.8.0
tabulate 0.8.10
TBB 0.2
tblib 1.7.0
tenacity 8.2.2
terminado 0.17.1
text-unidecode 1.3
textdistance 4.2.1
threadpoolctl 2.2.0
three-merge 0.1.1
tifffile 2023.4.12
tinycss2 1.2.1
tldextract 3.2.0
tokenizers 0.13.3
toml 0.10.2
tomlkit 0.11.1
toolz 0.12.0
tornado 6.3.3
tqdm 4.65.0
traitlets 5.7.1
transformers 4.32.1
tridesclous 1.6.8
truststore 0.8.0
Twisted 22.10.0
twisted-iocpsupport 1.0.2
typing_extensions 4.7.1
tzdata 2023.3
uc-micro-py 1.0.1
ujson 5.4.0
Unidecode 1.2.0
uri-template 1.3.0
urllib3 1.26.18
w3lib 1.21.0
watchdog 2.1.6
wcwidth 0.2.5
webcolors 1.13
webencodings 0.5.1
websocket-client 0.58.0
Werkzeug 2.2.3
whatthepatch 1.0.2
wheel 0.38.4
widgetsnbextension 4.0.5
win-inet-pton 1.1.0
wrapt 1.14.1
xarray 2023.6.0
xlwings 0.29.1
xxhash 2.0.2
xyzservices 2022.9.0
yapf 0.31.0
yarl 1.9.3
zarr 2.16.1
zict 3.0.0
zipp 3.11.0
zope.interface 5.4.0
zstandard 0.19.0
PS C:\Users\Melody\Desktop\test0>
No worries. There is a known bug in threadpoolctl 2.x, so it might be worthwhile to update threadpoolctl to 3+ and see if that fixes it. The threadpoolctl 2.x bug can lead to the None not having the attribute split. Could you try to update, then repeat the sorting, and see if that helps?
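For example, something like:

pip install --upgrade "threadpoolctl>=3"
# or, in a conda-managed environment:
conda install "threadpoolctl>=3"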
Really, really, really, thank you! @zm711
I updated threadpoolctl 2.2 to threadpoolctl 3.2 and ① ② ③ all went well. Even if I set n_jobs to other numbers, the code stays stable, and I feel much better!
Thanks @zm711. Maybe we should bump up the minimal version of threadpoolctl. I will take a look at that and then close this.
Maybe the switch in #2218 led to this issue with threadpoolctl coming up. It happened two weeks ago, so the timeline makes sense. But yeah, based on my reading I think it is probably better just to require a minimal version (but if you discover something else in your reading, do let me know!)
Not sure it's related to #2218 since that only affected the PCA computation and quality metrics
You're right. I think it was just an old install. I support @h-mayorquin's idea to just put in a minimum version, so that if someone has an old install we make sure SI doesn't have a buggy threadpoolctl.
Just a note that this error can occur even with threadpoolctl >= 3.2.0 when not enough memory is available (with respect to chunk size). This is something that happens to me sometimes when debugging code on a remote cluster node where I did not request as much memory as when running jobs. The issue would not easily be fixed by job_kwargs optimization, because the total memory and number of cores on the node are not the same as the numbers reserved for the job.
@vncntprvst Thanks for this. If we get enough votes, @samuelgarcia might become magically inspired to work on this :).
I think you raise a good point, though. I think there are two separate problems:
1) Having threadpoolctl < 3 leads to the actual issue here; the None attribute error would be fixed by updating threadpoolctl.
2) If the job_kwargs are not tuned properly, the parallelization will break (as you see), and currently the job_kwargs require the user to tune them rather than having spikeinterface help tune them to appropriate conditions (see the sketch below).
I think we need both to bump the threadpoolctl requirement and to have a job_kwargs optimizer (but that's my opinion, of course).
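For a sense of scale, a rough back-of-the-envelope sketch using the numbers from this thread (49 channels, 30000-sample chunks, int16); actual per-worker usage will be higher once processing buffers are counted:

num_channels = 49
chunk_size = 30_000        # samples per chunk (1 s at 30 kHz)
bytes_per_sample = 2       # int16 traces
n_jobs = 20

chunk_mib = num_channels * chunk_size * bytes_per_sample / 1024**2
print(f"~{chunk_mib:.1f} MiB per chunk, ~{n_jobs * chunk_mib:.1f} MiB in flight with n_jobs={n_jobs}")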
Thanks again.
Dear developers! I am using SpikeInterface 0.99.1 on Windows 10; my computer has 64GB of RAM, and I have met some problems while trying to learn from the tutorial. BrokenProcessPool seems to occur everywhere. Indeed, I suspect it occurs whenever I try to use the variable job_kwargs, whether in .save(), spykingcircus2, or tridesclous2. Actually, several weeks ago I did not meet BrokenProcessPool when I followed the tutorial. I really need your help.
① In the [Official_Tutorial_SI_0.99_Nov23], Chapter 3, Saving and loading SpikeInterface objects:
When I run the code block above, it returns BrokenProcessPool information.
② In the [Official_Tutorial_SI_0.99_Nov23], Chapter 4, Spike sorting, Run internal sorter: 'spykingcircus2':
When I tried to use the installed sorters 'spykingcircus2' and 'tridesclous2', the BrokenProcessPool still occurred. I kept the settings from the tutorial:
When I run the code below
It returns
③ Then I tried the internal sorter 'tridesclous2'.
Everything else remained the same; I just modified my input into
Then I knew something might be wrong; I got these results:
If I run the sorters in Docker, such as kilosort2, mountainsort4, tridesclous, ironclust, etc., they are all good, except that when I run a huge (maybe) dataset in the spyking-circus image (the tutorial data is OK, with no error), I meet the same problem as #895. However, I am just a new learner, and I cannot manage to solve the problem by setting the parameters (just because I don't know where to set them, or which function to use).