mantidproject / mantidimaging

Graphical toolkit for neutron imaging.
https://mantidproject.github.io/mantidimaging
GNU General Public License v3.0

Applying Median Filter fails #2250

Closed MikeSullivan7 closed 3 days ago

MikeSullivan7 commented 4 days ago

Summary

When applying the Median filter with default parameters on a dataset, the operation fails.

Steps To Reproduce

  1. Open MI
  2. Load in a dataset
  3. Open Operations window
  4. Select the Median Filter and apply

Expected Behaviour

The Median filter should be applied to the dataset

Current Behaviour

The operation fails with an error.

Failure Logs

2024-07-03 16:50:34,507 [mantidimaging.gui.dialogs.async_task.task:L53] ERROR: Failed to execute task: cannot pickle '_thread.lock' object
Traceback (most recent call last):
  File "C:\Users\ddb29996\mantidimaging\mantidimaging\gui\dialogs\async_task\task.py", line 50, in run
    self.result = call_with_known_parameters(self.task_function, **self.kwargs)
  File "C:\Users\ddb29996\mantidimaging\mantidimaging\core\utility\func_call.py", line 12, in call_with_known_parameters
    return func(**ka)
  File "C:\Users\ddb29996\mantidimaging\mantidimaging\gui\windows\operations\model.py", line 118, in apply_to_stacks
    self.apply_to_images(stack, progress=progress)
  File "C:\Users\ddb29996\mantidimaging\mantidimaging\gui\windows\operations\model.py", line 130, in apply_to_images
    exec_func(images)
  File "C:\Users\ddb29996\mantidimaging\mantidimaging\core\operations\median_filter\median_filter.py", line 95, in filter_func
    ps.run_compute_func(MedianFilter.compute_function, data.data.shape[0], data.shared_array, params)
  File "C:\Users\ddb29996\mantidimaging\mantidimaging\core\parallel\shared.py", line 108, in run_compute_func
    pu.run_compute_func_impl(worker_func, num_operations, all_data_in_shared_memory, progress)
  File "C:\Users\ddb29996\mantidimaging\mantidimaging\core\parallel\utility.py", line 137, in run_compute_func_impl
    for _ in pm.pool.imap(worker_func, indices_list, chunksize=calculate_chunksize(pm.cores)):
  File "C:\Users\ddb29996\AppData\Local\miniforge3\envs\mantidimaging-dev\lib\multiprocessing\pool.py", line 873, in next
    raise value
  File "C:\Users\ddb29996\AppData\Local\miniforge3\envs\mantidimaging-dev\lib\multiprocessing\pool.py", line 540, in _handle_tasks
    put(task)
  File "C:\Users\ddb29996\AppData\Local\miniforge3\envs\mantidimaging-dev\lib\multiprocessing\connection.py", line 206, in send
    self._send_bytes(_ForkingPickler.dumps(obj))
  File "C:\Users\ddb29996\AppData\Local\miniforge3\envs\mantidimaging-dev\lib\multiprocessing\reduction.py", line 51, in dumps
    cls(buf, protocol).dump(obj)
TypeError: cannot pickle '_thread.lock' object
2024-07-03 16:50:34,668 [mantidimaging.core.parallel.utility:L96] INFO: Not all of the data uses shared memory
2024-07-03 16:50:34,668 [mantidimaging.core.parallel.utility:L140] INFO: Running synchronously on 1 core
2024-07-03 16:50:34,693 [mantidimaging.core.utility.progress_reporting.progress:L241] INFO: Elapsed time: 0 sec.
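The root cause in the traceback can be reproduced outside MantidImaging in a few lines: multiprocessing serialises task arguments with pickle, and any object that holds a threading.Lock refuses to pickle. (The Progress class below is an illustrative stand-in, not the real MantidImaging progress reporter.)

```python
# Minimal reproduction of the underlying pickling failure (illustrative,
# not MantidImaging code). multiprocessing sends task arguments to worker
# processes via pickle, and a threading.Lock cannot be pickled.
import pickle
import threading


class Progress:
    """Stand-in for a progress-reporting object that holds a lock."""

    def __init__(self):
        self.lock = threading.Lock()


params = {"mode": "reflect", "progress": Progress()}

try:
    pickle.dumps(params)
except TypeError as exc:
    print(exc)  # cannot pickle '_thread.lock' object
```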

Screenshot(s)

(Screenshot of the failed operation attached to the original issue; not reproduced here.)

samtygier-stfc commented 4 days ago

Confirmed on main. Switching back to release-2.7.0 I don't see it, so it must be a regression. (Note: you need mamba install pyqtgraph=0.13.3 to run older versions.)

MikeSullivan7 commented 3 days ago

Looks like this is the offending commit, found with git bisect:

da3b42053e50fbce5a8f2d667075a695e8eaf1a9 is the first bad commit
commit da3b42053e50fbce5a8f2d667075a695e8eaf1a9
Author: ashmeigh <ashleymeigh0@gmail.com>
Date:   Wed Mar 6 18:09:25 2024 +0000

    remove cuda out compute function

 .../core/operations/median_filter/median_filter.py | 28 ++++++++++++----------
 1 file changed, 15 insertions(+), 13 deletions(-)

samtygier-stfc commented 3 days ago

Ah, could be:

-        params = {'mode': mode, 'force_cpu': force_cpu}
+        params = {'mode': mode, 'size': size, 'force_cpu': force_cpu, 'progress': progress}

Things in the params dict get passed to the worker processes through pickle. The error is saying there is a lock object that can't be pickled. It's probably in the progress object, which should be passed as a separate progress parameter rather than inside the params dict.
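A minimal sketch of that suggested fix, with hypothetical names (run_compute_func below is a stand-in, not the real MantidImaging function): only the picklable params dict crosses the process boundary, while the progress object is passed as its own argument and never leaves the parent process:

```python
# Sketch of the suggested fix (hypothetical names, not the actual
# MantidImaging API): anything in `params` is pickled for the workers,
# so the lock-holding progress object must be kept out of it.
import pickle
import threading


class Progress:
    """Stand-in for a progress reporter; the real one holds a lock."""

    def __init__(self):
        self._lock = threading.Lock()


def run_compute_func(func, num_operations, params, progress=None):
    """Hypothetical runner: only `params` would be sent to workers."""
    pickle.dumps(params)  # fails fast if params can't cross to workers
    for i in range(num_operations):
        func(i, params)
        if progress is not None:
            pass  # progress updates happen in the parent process only


# Broken:  params = {'mode': ..., 'size': ..., 'force_cpu': ..., 'progress': progress}
# Fixed:   progress is its own argument, kept out of the pickled dict
params = {'mode': 'reflect', 'size': 3, 'force_cpu': True}
run_compute_func(lambda i, p: None, 4, params, progress=Progress())
print("params are picklable")
```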