Open pavelbrn opened 3 years ago
Definitely no cross product involved... the cross product is a "hack" for getting perpendicular vectors and only applies to 3-D vectors.
I'm guessing what is meant is just the "matrix product": treating x1 as a row vector and (targets - predictions) as a column vector, @ corresponds to a row-times-column matrix product, which is what we want (equivalent to computing the dot product, .dot). Intuitively, the derivative for each weight has a contribution from each data point (n=200), so the dot product, by its summing nature, is the convenient tool for doing this.
See, for example:
import numpy as np
u = np.array([1,3,3])
v = np.array([2,2,3])
print("The dot product between u and v can be computed as...")
print("The sum of the element-wise products", sum(u*v))
print("The matrix product", u@v)
print("Or by calling .dot on one of the vectors", u.dot(v), v.dot(u))
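To connect this to the gradient case, here is a small sketch; the tiny n and random data are stand-ins for illustration (the cheatsheet uses n=200), not its actual values:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5  # stand-in for the cheatsheet's n=200 data points
x1 = rng.random(n)
targets = rng.random(n)
predictions = rng.random(n)

residual = targets - predictions
# Each data point i contributes -x1[i] * residual[i] to d_w1;
# the dot product sums all n contributions in a single call.
d_w1 = -x1.dot(residual)
by_hand = -sum(x1[i] * residual[i] for i in range(n))
print(d_w1, by_hand)  # same value either way
```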
@pavelbrn If you have time to fix this, perhaps you can open a PR with this change:
# Use matrix cross product (*) to simultaneously
# calculate the derivative for each weight
d_w1 = -x1*(targets - predictions)
to
# Use dot product to calculate the derivative for each weight
d_w1 = -x1.dot(targets - predictions)
(I removed the whole "simultaneously" part because it doesn't apply here. The simultaneous way would be to compute d as a 3-D vector, where a matrix-vector product would be useful, but the code shows the cleaner coefficient-by-coefficient approach, so no matrix product is involved.)
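For completeness, a sketch of what that "simultaneous" version could look like; the variable names and random toy data here are illustrative assumptions, not the cheatsheet's code. Stacking the features as columns of a matrix X lets one matrix-vector product produce every weight's derivative at once:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6
x1, x2, x3 = rng.random(n), rng.random(n), rng.random(n)
targets = rng.random(n)
predictions = rng.random(n)

residual = targets - predictions

# Coefficient-by-coefficient, as in the cheatsheet's style:
d_w1 = -x1.dot(residual)
d_w2 = -x2.dot(residual)
d_w3 = -x3.dot(residual)

# "Simultaneous" alternative: stack the features into an (n, 3)
# matrix and compute the whole gradient with one matrix-vector product.
X = np.column_stack([x1, x2, x3])
d_w = -X.T @ residual
print(np.allclose(d_w, [d_w1, d_w2, d_w3]))  # prints True
```

Both give the same numbers; the per-coefficient version is arguably easier to read in a tutorial, which is why the PR keeps it.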
@ivanistheone Thank you for the good explanation! I opened up a new PR with the changes you proposed.
The following comment really confused me:
# Use matrix cross product (*) to simultaneously
# calculate the derivative for each weight
d_w1 = -x1*(targets - predictions)
See https://ml-cheatsheet.readthedocs.io/en/latest/linear_regression.html#id4
In NumPy the cross product method would be np.cross() and not *, and the two give different results. Which is the correct version?
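A quick sketch of how the three products differ on a pair of arbitrary example vectors:

```python
import numpy as np

u = np.array([1, 3, 3])
v = np.array([2, 2, 3])

print(u * v)           # element-wise product: [2 6 9]
print(np.cross(u, v))  # cross product, a perpendicular vector: [ 3  3 -4]
print(u @ v)           # dot product, a scalar: 17
```

So * is element-wise multiplication, np.cross() is something else entirely, and the dot product (u @ v or u.dot(v)) is what the derivative calculation actually needs.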