This is because of https://github.com/cloudpipe/cloudpickle/issues/178 and can be reproduced with
import cloudpickle
from functools import lru_cache

@lru_cache()
def g(x):
    return x

dump = cloudpickle.dumps(g)
del g
g = cloudpickle.loads(dump)
g(1)
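In the meantime, one possible workaround (just a sketch, not thoroughly tested): pickle the undecorated function, which lru_cache exposes as __wrapped__, and re-apply the decorator after loading:

import cloudpickle
from functools import lru_cache

@lru_cache()
def g(x):
    return x

# Serialize the plain function underneath the cache wrapper...
dump = cloudpickle.dumps(g.__wrapped__)
del g
# ...and re-apply lru_cache after loading; the cache starts out empty.
g = lru_cache()(cloudpickle.loads(dump))
g(1)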
What's the use case for lru_cache'd functions?
This occurs in some simulation software I use.
For example
@lru_cache()
def make_kwant_syst():
    ...
    return syst

def f(x):
    syst = make_kwant_syst()
    return conductance(x, syst)
I see. In the meantime you could hack around it by making syst global:

def make_kwant_syst():
    global syst
    try:
        return syst
    except NameError:
        syst = ...
        return syst
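That way the caching happens through a module-level global rather than through an lru_cache wrapper, so cloudpickle only has to serialize plain functions; each worker process then rebuilds and caches syst on first use.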
Also, it seems like this is blocked until at least the release of Python 3.9: https://github.com/cloudpipe/cloudpickle/pull/309#issuecomment-545472136.
Other alternatives include using your own memoization decorator:
def memoize(f):
    memo = {}
    def helper(x):
        if x not in memo:
            memo[x] = f(x)
        return memo[x]
    return helper
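For what it's worth, a minimal sketch of how it can be used (this simple version only caches a single hashable positional argument, but unlike the lru_cache wrapper, the closure it returns pickles by value with cloudpickle):

import cloudpickle

@memoize
def g(x):
    return x

# Round-trip the memoized function through cloudpickle and call it.
g_restored = cloudpickle.loads(cloudpickle.dumps(g))
g_restored(1)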
Another option is the concurrent.futures.ProcessPoolExecutor.
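For example, something along these lines (assuming f and make_kwant_syst live in a module the worker processes can import, since the stdlib executor pickles functions by reference rather than by value; the input values are just placeholders):

from concurrent.futures import ProcessPoolExecutor

if __name__ == "__main__":
    with ProcessPoolExecutor() as executor:
        # Each worker imports the module, so make_kwant_syst's cache
        # is rebuilt per process and never needs to be pickled.
        results = list(executor.map(f, [0.1, 0.2, 0.3]))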
Should we then close this as an upstream bug? It seems like it will require no action either way.
I think closing it will suggest that it's fixed. I'd rather leave it open with the "Blocked" label attached.
But it's just not our bug; that's the reason to close it.
Closing because it is an upstream issue
The following fails:
Related to loky issue: https://github.com/joblib/loky/issues/268
This worked fine with the concurrent.futures.ProcessPoolExecutor.