HIPS / autograd

Efficiently computes derivatives of NumPy code.
MIT License

copy.deepcopy leads to incorrect gradient #555

Open jbellevi opened 4 years ago

jbellevi commented 4 years ago

Consider the following:

from autograd import grad 
import autograd.numpy as np
import copy

def summation_no_copy(x):
    # Sums i * np.sum(x) for i = 1, 2, so the function equals 3 * np.sum(x).
    sum = 0

    for i in range(1, 3):
        sum += i * np.sum(x)
    return sum

def summation_copy(x):
    # Same function, but each term operates on a deepcopy of x.
    sum = 0

    for i in range(1, 3):
        sum += i * np.sum(copy.deepcopy(x))

    return sum

x = np.array([1, 2, 3, 4, 3.5, 920, 0])

grad_copy = grad(summation_copy)
grad_no_copy = grad(summation_no_copy)

print(f'with deepcopy: {grad_copy(x)}')
print(f'without deepcopy: {grad_no_copy(x)}')

The output is:

with deepcopy: [1. 1. 1. 1. 1. 1. 1.]
without deepcopy: [3. 3. 3. 3. 3. 3. 3.]

I'm not sure whether this is a bug or simply unsupported, but it looks like the gradient stops being tracked after the first deepcopy: since the function reduces to 3 * np.sum(x), the expected gradient is 3 for every element, yet with deepcopy only the first (i = 1) term seems to contribute, giving 1. I need the gradient of a more complicated function that requires deepcopy. How can I go about getting it?
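
For anyone hitting this, one possible workaround (a sketch, not something confirmed in this thread) is to wrap the copy in an autograd primitive via autograd.extend's primitive/defvjp, so the tracer sees the copy as an identity-like operation with a pass-through gradient. The traced_deepcopy name below is made up for illustration:

import copy
from autograd import grad
from autograd.extend import primitive, defvjp
import autograd.numpy as np

@primitive
def traced_deepcopy(x):
    # Inside a primitive, x arrives unboxed (a plain ndarray), so deepcopy
    # no longer touches autograd's tracing machinery.
    return copy.deepcopy(x)

# The copy is element-wise the identity, so the VJP passes the upstream
# gradient through unchanged.
defvjp(traced_deepcopy, lambda ans, x: lambda g: g)

def summation_copy_fixed(x):
    total = 0
    for i in range(1, 3):
        total += i * np.sum(traced_deepcopy(x))
    return total

x = np.array([1, 2, 3, 4, 3.5, 920, 0])
print(grad(summation_copy_fixed)(x))  # should print [3. 3. 3. 3. 3. 3. 3.]

Alternatively, avoid calling copy.deepcopy on anything that is part of the traced computation and copy only the untracked parts of your state.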

tylerflex commented 3 months ago

@jbellevi curious if this ever got resolved? I think I encountered the same issue.