Closed: kingjr closed this issue 8 years ago.
That cannot work with joblib AFAIK.
On Fri, Mar 18, 2016 at 3:22 PM Jean-Rémi KING notifications@github.com wrote:
I was thinking of parallelizing the wavelet decomposition functions across wavelets: https://github.com/mne-tools/mne-python/blob/master/mne/time_frequency/tfr.py#L207
However, these functions can be called by other functions in parallel mode: https://github.com/mne-tools/mne-python/blob/master/mne/time_frequency/tfr.py#L428
So we'd end up with something like:

    def bar(): ...
    def foo(): parallel(bar())
    def joe(): parallel(foo())

Of course we can ensure that n_jobs is only > 1 at the highest level, but I was wondering whether the overall structure is OK.
Actually, I just realized that we have our own parallel wrapper, and it doesn't call joblib if n_jobs == 1: https://github.com/mne-tools/mne-python/blob/master/mne/parallel.py#L53
So I guess it should be OK, right?
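For what it's worth, here is a minimal sketch of that layered structure, assuming the parallel_func wrapper behaves as in mne/parallel.py (returns a plain loop when n_jobs == 1); the function names are made up for illustration, not the actual tfr.py code:

    from mne.parallel import parallel_func

    def _one_wavelet(data, wavelet):
        # innermost unit of work (placeholder computation)
        return data * wavelet

    def _decompose(data, wavelets, n_jobs=1):
        # inner level: with n_jobs=1 the wrapper returns p_fun = func and
        # parallel = list, so no joblib pool is created inside a worker
        parallel, p_fun, n_jobs = parallel_func(_one_wavelet, n_jobs)
        return parallel(p_fun(data, w) for w in wavelets)

    def _decompose_all(all_data, wavelets, n_jobs=4):
        # outer level: the only place where n_jobs > 1
        parallel, p_fun, n_jobs = parallel_func(_decompose, n_jobs)
        return parallel(p_fun(d, wavelets, n_jobs=1) for d in all_data)

Since the inner calls are made with n_jobs=1, the nesting never actually stacks two joblib pools.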
Yes that's right. Maybe I misunderstood your question.
OK, I'll wait for https://github.com/mne-tools/mne-python/pull/3034 before trying this, since it will touch the same file.
On a related topic, I just saw that joblib can share objects across workers using memmapping (https://pythonhosted.org/joblib/parallel.html). I've never tried it; are there any disadvantages (does it dump the data to the hard drive)?
We have memmapping support already in MNE for joblib. My impression is that it does not make a big difference: what you gain in memory you lose in time.
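For reference, a sketch of how plain joblib exposes that memmapping through Parallel's max_nbytes / mmap_mode arguments (this bypasses the MNE wrapper, and the threshold here is arbitrary):

    import numpy as np
    from joblib import Parallel, delayed

    def _row_sum(big, i):
        # workers see `big` as a read-only memmap, not a pickled copy
        return big[i].sum()

    big = np.random.randn(1000, 5000)
    # arrays larger than max_nbytes are dumped to a temp file once and
    # memmapped by every worker; that dump/reload is the time cost
    out = Parallel(n_jobs=2, max_nbytes='1M', mmap_mode='r')(
        delayed(_row_sum)(big, i) for i in range(big.shape[0]))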
The advantage is also for the code itself, isn't it? Instead of passing chunks of data around (e.g. if you loop across chunks of freqs, or chunks of times), you can just load and update an initialized array. It would have made the GAT code much more readable... (I should stop talking about the GAT, right?...)
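If it helps, the "update an initialized array" pattern looks roughly like this with a writable memmap as the preallocated output (a sketch following the joblib docs; names and the squaring are made up):

    import os, tempfile
    import numpy as np
    from joblib import Parallel, delayed

    def _fill(output, data, idx):
        # each worker writes its own slice of the shared memmapped output
        output[idx] = data[idx] ** 2

    data = np.random.randn(100, 50)
    fname = os.path.join(tempfile.mkdtemp(), 'out.mmap')
    output = np.memmap(fname, dtype=data.dtype, shape=data.shape, mode='w+')

    Parallel(n_jobs=2)(delayed(_fill)(output, data, i)
                       for i in range(data.shape[0]))
    # `output` now holds the full result, with no per-chunk concatenation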