```python
def exp(self):
    x = self.data
    out = Value(math.exp(x), (self, ), 'exp')

    def _backward():
        self.grad += out.data * out.grad # NOTE: in the video I incorrectly used = instead of +=. Fixed here.
    out._backward = _backward

    return out
```
First of all, thank you @karpathy for this amazing repo.
I think the `exp` function in the second half of the first lecture of micrograd has a bug. The gradient should be calculated like this:

```python
self.grad += self.data * out.grad
```
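One way to compare the two candidate expressions is a quick finite-difference check of d/dx e^x, independent of the repo's `Value` class. This is just a standalone sketch (`numerical_derivative` is a helper made up for illustration, not part of micrograd):

```python
import math

def numerical_derivative(f, x, h=1e-6):
    # Central finite difference: (f(x + h) - f(x - h)) / (2h)
    return (f(x + h) - f(x - h)) / (2 * h)

x = 1.5
approx = numerical_derivative(math.exp, x)

# Compare the numerical derivative against both candidate local gradients:
print(f"finite difference:  {approx:.6f}")
print(f"exp(x) (out.data):  {math.exp(x):.6f}")
print(f"x      (self.data): {x:.6f}")
```

Running this for a few values of `x` shows which of `out.data` and `self.data` tracks the numerical derivative of `exp`.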
What do you think?