python-adaptive / adaptive

:chart_with_upwards_trend: Adaptive: parallel active learning of mathematical functions
http://adaptive.readthedocs.io/
BSD 3-Clause "New" or "Revised" License

can't pickle lru_cache function with loky #292

Closed: basnijholt closed this issue 4 years ago

basnijholt commented 4 years ago

The following fails:

from functools import lru_cache
import adaptive
adaptive.notebook_extension()

@lru_cache
def g(x):
    return x

def f(x):
    return g(x)

learner = adaptive.SequenceLearner(f, range(2))
# fails: loky cannot pickle the lru_cache-wrapped g
runner = adaptive.Runner(learner, adaptive.SequenceLearner.done)

runner.live_info()

Related to loky issue: https://github.com/joblib/loky/issues/268

This worked fine with the concurrent.futures.ProcessPoolExecutor.
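
For reference, a minimal sketch of that working setup, assuming adaptive.Runner's executor keyword argument and the same g and f as in the snippet above:

from concurrent.futures import ProcessPoolExecutor
from functools import lru_cache

import adaptive

@lru_cache
def g(x):
    return x

def f(x):
    return g(x)

learner = adaptive.SequenceLearner(f, range(2))
# the stdlib executor serializes f with the standard pickle module, which
# handles module-level lru_cache-wrapped functions by referencing their qualified name
runner = adaptive.Runner(
    learner,
    adaptive.SequenceLearner.done,
    executor=ProcessPoolExecutor(),
)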

basnijholt commented 4 years ago

This is because of https://github.com/cloudpipe/cloudpickle/issues/178 and can be reproduced with

import cloudpickle
from functools import lru_cache

@lru_cache
def g(x):
    return x

dump = cloudpickle.dumps(g)
del g  # mimic a fresh worker process in which g is not defined
g = cloudpickle.loads(dump)  # fails here; see cloudpickle issue 178
g(1)
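
For contrast (not part of the original report), the standard library pickle appears to serialize a module-level lru_cache-wrapped function by reference to its qualified name, which would explain why ProcessPoolExecutor is unaffected:

import pickle
from functools import lru_cache

@lru_cache
def g(x):
    return x

# the standard pickle stores only a reference to __main__.g, so the
# round-trip succeeds as long as g still exists on the loading side
dump = pickle.dumps(g)
assert pickle.loads(dump) is g
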
akhmerov commented 4 years ago

What's the use case for lru_cache'd functions?

basnijholt commented 4 years ago

This occurs in some simulation software I use.

For example

@lru_cache
def make_kwant_syst():
    # build the (expensive) system once and reuse it across calls
    ...
    return syst

def f(x):
    syst = make_kwant_syst()
    return conductance(x, syst)

akhmerov commented 4 years ago

I see. In the meantime you could hack around it by making syst global:

def make_kwant_syst():
    global syst
    try:
        return syst
    except NameError:
        syst = ...
        return syst

basnijholt commented 4 years ago

Also, it seems like this is blocked until at least the release of Python 3.9: https://github.com/cloudpipe/cloudpickle/pull/309#issuecomment-545472136.

Other alternatives include writing your own memoization decorator:

def memoize(f):
    # a plain closure, which cloudpickle serializes by value without issue
    memo = {}
    def helper(x):
        if x not in memo:
            memo[x] = f(x)
        return memo[x]
    return helper

or using the concurrent.futures.ProcessPoolExecutor.
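
A sketch of the first alternative applied to the original reproduction, assuming the memoize decorator above is used in place of lru_cache (note that each loky worker would then hold its own copy of the cache):

import adaptive

def memoize(f):
    memo = {}
    def helper(x):
        if x not in memo:
            memo[x] = f(x)
        return memo[x]
    return helper

@memoize
def g(x):
    return x

def f(x):
    return g(x)

learner = adaptive.SequenceLearner(f, range(2))
runner = adaptive.Runner(learner, adaptive.SequenceLearner.done)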

akhmerov commented 4 years ago

Should we then close this as an upstream bug? It seems like it will require no action either way.

basnijholt commented 4 years ago

I think closing it will suggest that it's fixed. I'd rather leave it open with the "Blocked" label attached.

akhmerov commented 4 years ago

But it's just not our bug; that is the reason to close it.

basnijholt commented 4 years ago

Closing because it is an upstream issue.