flatironinstitute / CaImAn

Computational toolbox for large scale Calcium Imaging Analysis, including movie handling, motion correction, source extraction, spike deconvolution and result visualization.
https://caiman.readthedocs.io
GNU General Public License v2.0

Issue with multiprocessing in VS code interactive window #1151

Closed: JohnStout closed this issue 10 months ago

JohnStout commented 1 year ago

Please fill in the following for any issues

Your setup:

  1. Operating System (Linux, MacOS, Windows):
  2. Hardware type (x86, ARM..) and RAM:
  3. Python Version: 3.11.4
  4. Caiman version: 1.9.15
  5. Which demo exhibits the problem (if applicable): NA
  6. How you installed Caiman (pure conda, conda + compile, colab, ..):
  7. Details:

After reinstalling caiman, my machine (12 cores, n_processes = 11) gets stuck during the motion correction step. It looks like it tries to load the video on each core, but then fails when allocating tasks to the cores? I'm not sure...

Importantly, this only happens when running the code through the VS Code interactive window; it does not occur when running the same code in a Jupyter notebook.
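
For context, the part of the script that hangs follows the cluster setup / motion correction pattern from the demos. Roughly like this (a minimal sketch with placeholder paths, not the exact code in script_cnm.py):

    import caiman as cm
    from caiman.motion_correction import MotionCorrect
    from caiman.source_extraction.cnmf import params

    fnames = ['example_movie.tif']   # placeholder path

    # start the cluster ("local" backend, i.e. a multiprocessing Pool)
    c, dview, n_processes = cm.cluster.setup_cluster(
        backend='local', n_processes=11, single_thread=False)

    # motion correction, distributed over the workers in dview
    opts = params.CNMFParams(params_dict={'fnames': fnames})
    mc = MotionCorrect(fnames, dview=dview, **opts.get_group('motion'))
    mc.motion_correct(save_movie=True)

    cm.stop_server(dview=dview)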

Here is the error text (I've also attached a screenshot):

    Traceback (most recent call last):
      File "/Users/js0403/decode_lab_code/src/decode_lab_code/preprocessing/ophys/script_cnm.py", line 209, in <module>
        c, dview, n_processes = cm.cluster.setup_cluster(
                                ^^^^^^^^^^^^^^^^^^^^^^^^^
      File "/Users/js0403/CaImAn/caiman/cluster.py", line 422, in setup_cluster
        dview = Pool(n_processes, maxtasksperchild=maxtasksperchild)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
      File "/Users/js0403/anaconda3/envs/caiman/lib/python3.11/multiprocessing/context.py", line 119, in Pool
        return Pool(processes, initializer, initargs, maxtasksperchild,
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
      File "/Users/js0403/anaconda3/envs/caiman/lib/python3.11/multiprocessing/pool.py", line 215, in __init__
        self._repopulate_pool()
      File "/Users/js0403/anaconda3/envs/caiman/lib/python3.11/multiprocessing/pool.py", line 306, in _repopulate_pool
        return self._repopulate_pool_static(self._ctx, self.Process,
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
      File "/Users/js0403/anaconda3/envs/caiman/lib/python3.11/multiprocessing/pool.py", line 329, in _repopulate_pool_static
        w.start()
      File "/Users/js0403/anaconda3/envs/caiman/lib/python3.11/multiprocessing/process.py", line 121, in start
        self._popen = self._Popen(self)
                      ^^^^^^^^^^^^^^^^^
      File "/Users/js0403/anaconda3/envs/caiman/lib/python3.11/multiprocessing/context.py", line 300, in _Popen
        return Popen(process_obj)
               ^^^^^^^^^^^^^^^^^^
      File "/Users/js0403/anaconda3/envs/caiman/lib/python3.11/multiprocessing/popen_forkserver.py", line 35, in __init__
        super().__init__(process_obj)
      File "/Users/js0403/anaconda3/envs/caiman/lib/python3.11/multiprocessing/popen_fork.py", line 19, in __init__
        self._launch(process_obj)
      File "/Users/js0403/anaconda3/envs/caiman/lib/python3.11/multiprocessing/popen_forkserver.py", line 42, in _launch
        prep_data = spawn.get_preparation_data(process_obj._name)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
      File "/Users/js0403/anaconda3/envs/caiman/lib/python3.11/multiprocessing/spawn.py", line 158, in get_preparation_data
        _check_not_importing_main()
      File "/Users/js0403/anaconda3/envs/caiman/lib/python3.11/multiprocessing/spawn.py", line 138, in _check_not_importing_main
        raise RuntimeError('''
    RuntimeError:
    An attempt has been made to start a new process before the
    current process has finished its bootstrapping phase.

    This probably means that you are not using fork to start your
    child processes and you have forgotten to use the proper idiom
    in the main module:

        if __name__ == '__main__':
            freeze_support()
            ...

    The "freeze_support()" line can be omitted if the program
    is not going to be frozen to produce an executable.

During handling of the above exception, another exception occurred:

    Traceback (most recent call last):
      File "/Users/js0403/anaconda3/envs/caiman/lib/python3.11/multiprocessing/forkserver.py", line 274, in main
        code = _serve_one(child_r, fds,
               ^^^^^^^^^^^^^^^^^^^^^^^^
      File "/Users/js0403/anaconda3/envs/caiman/lib/python3.11/multiprocessing/forkserver.py", line 313, in _serve_one
        code = spawn._main(child_r, parent_sentinel)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
      File "/Users/js0403/anaconda3/envs/caiman/lib/python3.11/multiprocessing/spawn.py", line 129, in _main
        prepare(preparation_data)
      File "/Users/js0403/anaconda3/envs/caiman/lib/python3.11/multiprocessing/spawn.py", line 240, in prepare
        _fixup_main_from_path(data['init_main_from_path'])
      File "/Users/js0403/anaconda3/envs/caiman/lib/python3.11/multiprocessing/spawn.py", line 291, in _fixup_main_from_path
        main_content = runpy.run_path(main_path,
                       ^^^^^^^^^^^^^^^^^^^^^^^^^
      File "<frozen runpy>", line 291, in run_path
      File "<frozen runpy>", line 98, in _run_module_code
      File "<frozen runpy>", line 88, in _run_code
      File "/Users/js0403/decode_lab_code/src/decode_lab_code/preprocessing/ophys/script_cnm.py", line 214, in <module>
        c, dview, n_processes = cm.cluster.setup_cluster(
                                ^^^^^^^^^^^^^^^^^^^^^^^^^
      File "/Users/js0403/CaImAn/caiman/cluster.py", line 422, in setup_cluster
        dview = Pool(n_processes, maxtasksperchild=maxtasksperchild)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
      File "/Users/js0403/anaconda3/envs/caiman/lib/python3.11/multiprocessing/context.py", line 119, in Pool
        return Pool(processes, initializer, initargs, maxtasksperchild,
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
      File "/Users/js0403/anaconda3/envs/caiman/lib/python3.11/multiprocessing/pool.py", line 215, in __init__
        self._repopulate_pool()
      File "/Users/js0403/anaconda3/envs/caiman/lib/python3.11/multiprocessing/pool.py", line 306, in _repopulate_pool
        return self._repopulate_pool_static(self._ctx, self.Process,
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
      File "/Users/js0403/anaconda3/envs/caiman/lib/python3.11/multiprocessing/pool.py", line 329, in _repopulate_pool_static
        w.start()
      File "/Users/js0403/anaconda3/envs/caiman/lib/python3.11/multiprocessing/process.py", line 121, in start
        self._popen = self._Popen(self)
                      ^^^^^^^^^^^^^^^^^
      File "/Users/js0403/anaconda3/envs/caiman/lib/python3.11/multiprocessing/context.py", line 300, in _Popen
        return Popen(process_obj)
               ^^^^^^^^^^^^^^^^^^
      File "/Users/js0403/anaconda3/envs/caiman/lib/python3.11/multiprocessing/popen_forkserver.py", line 35, in __init__
        super().__init__(process_obj)
      File "/Users/js0403/anaconda3/envs/caiman/lib/python3.11/multiprocessing/popen_fork.py", line 19, in __init__
        self._launch(process_obj)
      File "/Users/js0403/anaconda3/envs/caiman/lib/python3.11/multiprocessing/popen_forkserver.py", line 42, in _launch
        prep_data = spawn.get_preparation_data(process_obj._name)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
      File "/Users/js0403/anaconda3/envs/caiman/lib/python3.11/multiprocessing/spawn.py", line 158, in get_preparation_data
        _check_not_importing_main()
      File "/Users/js0403/anaconda3/envs/caiman/lib/python3.11/multiprocessing/spawn.py", line 138, in _check_not_importing_main
        raise RuntimeError('''
    RuntimeError:
    An attempt has been made to start a new process before the
    current process has finished its bootstrapping phase.

    This probably means that you are not using fork to start your
    child processes and you have forgotten to use the proper idiom
    in the main module:

        if __name__ == '__main__':
            freeze_support()
            ...

    The "freeze_support()" line can be omitted if the program
    is not going to be frozen to produce an executable.
[Attached screenshot: Screenshot 2023-08-14 at 12 24 17 PM]
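
For reference, the "proper idiom" the error message points at is a main guard around whatever creates the pool. Applied to a caiman script it would look roughly like this (a sketch, not the actual contents of script_cnm.py):

    import caiman as cm

    def main():
        # everything that creates or uses the multiprocessing pool lives here
        c, dview, n_processes = cm.cluster.setup_cluster(
            backend='local', n_processes=11, single_thread=False)
        try:
            pass  # motion correction / CNMF-E steps would go here
        finally:
            cm.stop_server(dview=dview)

    if __name__ == '__main__':
        # keeps worker processes (which re-import this file under spawn/forkserver)
        # from re-running the cluster setup at import time
        main()
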
JohnStout commented 1 year ago

Wanted to add that I was using demo_pipeline_CNMFE as a backbone, and I used the "local" backend for cm.cluster.setup_cluster.

pgunn commented 1 year ago

Hello! Right now we don't test with VSCode, but we can still look into some of the details here. A few questions:

1) Does this happen when you run it outside of VSCode, meaning from the command line?
2) This decode_lab_code/preprocessing/ophys/script_cnm.py is derived from one of the CLI demos? Which one? Is there a chance we might be able to give it a look?
3) Does changing n_processes change the result at all?

EricThomson commented 1 year ago

A couple of months ago, I had some weird issues with VS Code multiprocessing in caiman when running .py files, and switched to PyCharm where things worked. I had to set dview to None to get things to work in VS Code (which basically turns off multiprocessing, so I ended up using PyCharm). I didn't explore it deeply enough to create an issue, as I almost always run code in Jupyter and just write back-end stuff in VS Code. But this wouldn't be the first VS Code specific issue (e.g., #1001).
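
Concretely, that workaround is just passing None wherever the demos pass the cluster handle, so everything runs in the current process. A rough sketch (from memory, with placeholder paths; parameter details may differ):

    from caiman.motion_correction import MotionCorrect
    from caiman.source_extraction.cnmf import cnmf, params

    fnames = ['example_movie.tif']   # placeholder path
    opts = params.CNMFParams(params_dict={'fnames': fnames})

    # dview=None instead of the handle from setup_cluster: no worker pool is created
    mc = MotionCorrect(fnames, dview=None, **opts.get_group('motion'))
    mc.motion_correct(save_movie=True)

    # same for the source extraction step
    cnm = cnmf.CNMF(n_processes=1, params=opts, dview=None)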

JohnStout commented 1 year ago

Ahh, I see. It seems to be related to Pool in the multiprocessing module. Given how suddenly this appeared and how specific it is to the interactive window, I'm starting to wonder whether it's caused by an update on their end rather than a caiman issue.
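
(For what it's worth, the traceback shows the forkserver start method being used; one thing that might be worth trying, which I haven't verified, is forcing the 'fork' start method before the cluster is created, since fork doesn't re-import the main script in the children:)

    import multiprocessing
    import caiman as cm

    if __name__ == '__main__':
        # untested idea: 'fork' avoids re-importing this file in the workers,
        # which is what trips the "bootstrapping phase" check
        # (macOS/Linux only, and fork has its own caveats on macOS)
        multiprocessing.set_start_method('fork', force=True)
        c, dview, n_processes = cm.cluster.setup_cluster(backend='local', n_processes=11)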

I was hoping to have some scripts that I could run for preprocessing purposes, then explore the results more in Jupyter. I'll check out Pycharm. Thanks!