import numpy

class MemoizeJac(object):
    """Decorator that caches the value and gradient of the function
    each time it is called."""
    def __init__(self, fun):
        self.fun = fun
        self.jac = None
        self.x = None

    def __call__(self, x, *args):
        self.x = numpy.asarray(x).copy()
        fg = self.fun(x, *args)
        self.jac = fg[1]
        return fg[0]

    def derivative(self, x, *args):
        if self.jac is not None and numpy.all(x == self.x):
            return self.jac
        else:
            self(x, *args)
            return self.jac
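As a quick check of the caching behavior, here is a self-contained sketch; the toy objective f(x) = sum(x**2) and the call counter are my own additions, not part of scipy:

```python
import numpy

class MemoizeJac(object):
    """Decorator that caches the value and gradient of the function
    each time it is called (as in scipy.optimize)."""
    def __init__(self, fun):
        self.fun = fun
        self.jac = None
        self.x = None

    def __call__(self, x, *args):
        self.x = numpy.asarray(x).copy()
        fg = self.fun(x, *args)
        self.jac = fg[1]
        return fg[0]

    def derivative(self, x, *args):
        if self.jac is not None and numpy.all(x == self.x):
            return self.jac
        else:
            self(x, *args)
            return self.jac

ncalls = [0]

def fun_and_grad(x):
    """Toy objective f(x) = sum(x**2) with gradient 2*x."""
    ncalls[0] += 1
    return numpy.sum(x**2), 2 * x

memo = MemoizeJac(fun_and_grad)
x = numpy.array([1.0, 2.0])
f = memo(x)             # one real evaluation; the gradient is cached
g = memo.derivative(x)  # same x: served from the cache, no new call
print(ncalls[0])        # -> 1
```

Note that `__call__` always re-evaluates `fun`; only `derivative` consults the cache.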
This is MemoizeJac from scipy.optimize: when we provide scipy.optimize.minimize with a function that computes the objective and the Jacobian together, scipy caches the values with this decorator. We see that the Jacobian is not recomputed if the local x didn't change. But x may have changed on other processors, leading to a deadlock. If the implementation calls fun(x) before each jac(x), this will not be a problem.
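The failure mode can be simulated without MPI. In the sketch below (my own construction, not scipy or mpi4py code), two MemoizeJac instances stand in for two ranks, and a call counter stands in for a collective operation inside fun; when the ranks disagree on x, their collective-call counts diverge, which is the deadlock:

```python
import numpy

class MemoizeJac(object):  # same class as above
    def __init__(self, fun):
        self.fun = fun
        self.jac = None
        self.x = None
    def __call__(self, x, *args):
        self.x = numpy.asarray(x).copy()
        fg = self.fun(x, *args)
        self.jac = fg[1]
        return fg[0]
    def derivative(self, x, *args):
        if self.jac is not None and numpy.all(x == self.x):
            return self.jac
        self(x, *args)
        return self.jac

def make_fun(counter):
    def fun(x):
        counter[0] += 1  # stands in for a collective MPI call
        return numpy.sum(x**2), 2 * x
    return fun

calls = [[0], [0]]                  # per-"rank" collective-call counts
memos = [MemoizeJac(make_fun(c)) for c in calls]

x_old = numpy.array([1.0, 2.0])
x_new = numpy.array([1.0, 3.0])     # rank 1 received a different x

for m in memos:
    m(x_old)                        # both ranks evaluate: in step

memos[0].derivative(x_old)          # rank 0: cache hit, no collective
memos[1].derivative(x_new)          # rank 1: cache miss, re-enters fun

print(calls[0][0], calls[1][0])     # -> 1 2 : ranks out of step
```

In a real MPI run, rank 1's extra collective inside fun would wait forever for rank 0, which never enters it. If the driver instead calls fun(x) on every rank before each jac(x), every derivative(x) call is a cache hit and the ranks stay synchronized.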