Is there a way to cleanly halt execution when a large job is running? I am using this in a Jupyter notebook on macOS 10.15.4, and when I interrupt the cell (or one of the processes exits with an error), the cell shows as finished but my CPU and memory are still in use. In fact, there appears to be a memory leak: even after quitting Jupyter and Python entirely, the memory usage of "kernel_task" (the process that was showing high CPU during execution) does not drop.
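For reference, here is a minimal sketch of the kind of cleanup I would expect to need, assuming a `multiprocessing.Pool`-style workload (the `work` and `run_job` names are illustrative, not from my actual code). The idea is that the pool is created in a `with` block so `terminate()`/`join()` run even when the cell is interrupted, but in the notebook the workers still seem to linger:

```python
import multiprocessing as mp

def work(x):
    # Stand-in for one unit of the large job
    return x * x

def run_job(values):
    # The with-block guarantees pool.terminate() and join() are called
    # on exit, so workers should not outlive the cell.
    with mp.Pool(processes=2) as pool:
        try:
            return pool.map(work, values)
        except KeyboardInterrupt:
            # Stop workers immediately on interrupt rather than leaking them
            pool.terminate()
            raise

if __name__ == "__main__":
    print(run_job(range(5)))
```

Even with this structure, interrupting the cell leaves CPU and memory in use, so I am wondering whether the notebook environment prevents the cleanup from running, or whether something else is required.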