HIPS / autograd

Efficiently computes derivatives of NumPy code.
MIT License

`'ArrayBox' object has no attribute 'dot'` when differentiating function containing `x.dot(y)` #608

Open ForceBru opened 1 year ago

ForceBru commented 1 year ago

Looks like `ArrayBox` has no attribute `dot`, so I can't differentiate `x.dot(y)` with respect to `x`:

```python
>>> import autograd, autograd.numpy as np
>>> x = np.array([1., 3, 4])
>>> cs = np.array([0.2, 0.6, 0.1])
>>> autograd.grad(lambda x: x.dot(cs))(x)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/Users/forcebru/.pyenv/versions/3.11.4/lib/python3.11/site-packages/autograd/wrap_util.py", line 20, in nary_f
    return unary_operator(unary_f, x, *nary_op_args, **nary_op_kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/forcebru/.pyenv/versions/3.11.4/lib/python3.11/site-packages/autograd/differential_operators.py", line 28, in grad
    vjp, ans = _make_vjp(fun, x)
               ^^^^^^^^^^^^^^^^^
  File "/Users/forcebru/.pyenv/versions/3.11.4/lib/python3.11/site-packages/autograd/core.py", line 10, in make_vjp
    end_value, end_node = trace(start_node, fun, x)
                          ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/forcebru/.pyenv/versions/3.11.4/lib/python3.11/site-packages/autograd/tracer.py", line 10, in trace
    end_box = fun(start_box)
              ^^^^^^^^^^^^^^
  File "/Users/forcebru/.pyenv/versions/3.11.4/lib/python3.11/site-packages/autograd/wrap_util.py", line 15, in unary_f
    return fun(*subargs, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^
  File "<stdin>", line 1, in <lambda>
AttributeError: 'ArrayBox' object has no attribute 'dot'
```

I managed to fix this by adding `'dot'` to `diff_methods` here: https://github.com/HIPS/autograd/blob/e18f656118d23982bacf33380da3efc09b62cfe3/autograd/numpy/numpy_boxes.py#L56-L64

Now the error is gone, and my code seems to produce correct results. According to NumPy, `numpy.dot` is the equivalent function for the `numpy.ndarray.dot` method, so adding it there also seems to align with what the comment in that file says.


Should `'dot'` indeed be added to `diff_methods`, or is this expected behavior?

Benja-Vera commented 5 months ago

Hi, I had the same problem, and I think you're right, especially because replacing the last line of your code with

```python
autograd.grad(lambda x: cs.dot(x))(x)
```

(i.e., just swapping `x` and `cs` in the function definition) works as expected. In that version, `dot` is called on `cs`, which is a plain `ndarray` rather than an `ArrayBox`, so the method exists.

I do think that `'dot'` should be added to `diff_methods`, but in the meantime my fix was to replace `dot` with NumPy's `@` operator. That works in either case.