erodola / DLAI-s2-2021

Teaching material for the course of Deep Learning and Applied AI, 2nd semester 2021, Sapienza University of Rome

linalg - product of "map matrix" and "vector matrix" #7

Open sh3rlock14 opened 3 years ago

sh3rlock14 commented 3 years ago


Slides "03-linalg", 71/72

I'm not getting why the column matrix storing the linear-combination coefficients is denoted Tvj, since the basis vectors of W do not appear anywhere in the formula.

Also, in the lecture it is said that "what we get is the matrix representation of Tu" (probably meaning Tv), but again: if that were so, shouldn't it be T1,j w1 + ... + Tm,j wm?
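
To spell out what I mean (my own notation, not from the slides: T maps V to W, with basis {v_1, ..., v_n} of V and basis {w_1, ..., w_m} of W; the bracketed column is what I'd call the coordinate vector of Tvj):

```latex
T v_j \;=\; T_{1,j}\, w_1 \;+\; T_{2,j}\, w_2 \;+\; \dots \;+\; T_{m,j}\, w_m
\qquad\text{with coordinates}\qquad
[\, T v_j \,] \;=\;
\begin{pmatrix} T_{1,j} \\ T_{2,j} \\ \vdots \\ T_{m,j} \end{pmatrix}
```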

Thanks in advance for the help!

erodola commented 3 years ago

I went through the slides and the video recording to double-check, so let me dispel your doubts: the column vector on that slide is the coordinate representation of Tvj with respect to the basis w1, ..., wm of W. The basis vectors of W do not appear explicitly because, once the basis is fixed, a vector of W is identified with its column of coefficients; written out as a vector of W, you indeed get Tvj = T1,j w1 + ... + Tm,j wm, which is exactly what that column encodes.

I hope this clarifies. If not, feel free to cast your doubts here!

Addendum: this should also clarify our previous statement that matrix representations only need to know how to map the basis vectors, i.e. the Tvj, since for an arbitrary vector c we only need to take a linear combination of the Tvj with the coefficients contained in c.
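
A minimal NumPy sketch of this last point (the matrix and the numbers below are made up, purely for illustration):

```python
import numpy as np

# A plays the role of the matrix representation of a linear map T: R^3 -> R^2.
# The entries are arbitrary, just for illustration.
A = np.array([[1., 2., 0.],
              [3., -1., 4.]])

# The j-th column of A is the image of the j-th basis vector: A @ e_j = "T v_j".
basis_images = [A @ e for e in np.eye(3)]   # rows of np.eye(3) are e_1, e_2, e_3

# An arbitrary vector c, given through its coefficients in the basis of V.
c = np.array([2., -1., 0.5])

# Mapping c only requires the images of the basis vectors:
# take their linear combination with the coefficients contained in c ...
Tc_from_basis = sum(cj * Tvj for cj, Tvj in zip(c, basis_images))

# ... and this coincides with the usual matrix-vector product.
assert np.allclose(Tc_from_basis, A @ c)
print(Tc_from_basis)  # [0. 9.]
```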