Closed: CaffreyR closed this issue 2 years ago.
Hi, in the SAID code they consider one lambda for each named parameter, and they subtract this count from the number of trainable intrinsic parameters; see here https://github.com/rabeehk/compacter/blob/b210eef13f64ff6441186ee5a1cbf031b5918b94/seq2seq/projections/intrinsic.py#L158. So, since there is one lambda per parameter, they decrease the intrinsic dimension by the total number of named parameters. And I think here https://github.com/rabeehk/compacter/blob/b210eef13f64ff6441186ee5a1cbf031b5918b94/seq2seq/projections/intrinsic.py#L184 they allocate the lambdas, so the total number of trainable parameters stays fixed at the value set by the intrinsic dimension.
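A minimal sketch of this accounting (hypothetical helper name, not the exact repo code) could look like:

```python
import torch
import torch.nn as nn

# Hypothetical sketch of the SAID budget split described above.
def allocate_said_parameters(module: nn.Module, intrinsic_dimension: int):
    m = len(list(module.named_parameters()))  # one lambda per named parameter
    d_proj = intrinsic_dimension - m          # budget left for the projection input
    z = nn.Parameter(torch.zeros(d_proj))     # shared low-dimensional vector
    lambdas = nn.Parameter(torch.ones(m))     # per-layer SAID scales
    # Total trainable parameters: d_proj + m == intrinsic_dimension, so the
    # budget set by the intrinsic dimension is preserved.
    return z, lambdas
```

Under this reading, the update for layer i becomes theta_i = theta0_i + lambdas[i] * P(z)_i, while plain DID would use the full intrinsic_dimension-sized z with no lambdas.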
Please note the SAID code is not written by me; it was written by Meta Research, the authors of the paper. Please feel free to also reach out to them with any questions. You are also welcome to ask here, and I will help as much as I can.
Oops!! My carelessness! I misunderstood len(list(module.named_parameters()))! Thanks very much!
Hi @rabeehk, is there any way I can find the original repo so I can post a question there? The paper here mentions the fastfood method, but the code there does not seem to match it. I could not find the H matrix: what is the meaning of GG and divisor? And is BB equal to H*B?
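For context, my rough understanding of the standard fastfood construction, y ∝ H G Π H B x (ignoring the extra scaling matrix S from the fastfood paper), is that H is never materialized: it only shows up through a fast Walsh-Hadamard transform. A minimal sketch, with hypothetical names and my guesses at how they map onto the repo's BB/GG/divisor, would be:

```python
import torch
import torch.nn.functional as F

def fastfood_project(x, D, seed=0):
    """Toy fastfood projection y = H G Pi H B x (hypothetical sketch)."""
    d = x.numel()
    L = 1 << (d - 1).bit_length()                 # pad length: next power of two
    g = torch.Generator().manual_seed(seed)
    B = torch.randint(0, 2, (L,), generator=g).float() * 2 - 1  # random signs ("BB"?)
    Pi = torch.randperm(L, generator=g)                         # permutation ("Pi")
    G = torch.randn(L, generator=g)                             # Gaussian scales ("GG"?)
    divisor = torch.sqrt(L * G.pow(2).sum())                    # normalization ("divisor"?)

    def hadamard(v):
        # O(L log L) fast Walsh-Hadamard transform: this is what replaces
        # an explicit L x L matrix H.
        v, h = v.clone(), 1
        while h < len(v):
            v = v.view(-1, 2 * h)
            a, b = v[:, :h].clone(), v[:, h:].clone()
            v[:, :h], v[:, h:] = a + b, a - b
            v, h = v.view(-1), 2 * h
        return v

    xp = F.pad(x, (0, L - d))                     # zero-pad x to length L
    y = hadamard(G * hadamard(B * xp)[Pi])        # H G Pi H B x
    return (y / divisor)[:D]                      # normalize, crop to D outputs
```

If that reading is right, BB is just the random-sign vector B (not H*B), GG is the Gaussian diagonal G, and divisor is only a normalization constant; H appears implicitly in the two Walsh-Hadamard calls.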
Hi @CaffreyR, AFAIK the original paper has not released the code. I am not sure about the details of the fastfood transform, but maybe you can contact them by email?
Great! Many thanks!
Hi @rabeehk, in your code https://github.com/rabeehk/compacter/blob/b210eef13f64ff6441186ee5a1cbf031b5918b94/seq2seq/projections/intrinsic.py#L174 https://github.com/rabeehk/compacter/blob/b210eef13f64ff6441186ee5a1cbf031b5918b94/seq2seq/projections/intrinsic.py#L175 you use each parameter's size to build the projection. But for SAID, why is the projection the same, given that SAID uses some of the intrinsic parameters as lambdas here https://github.com/rabeehk/compacter/blob/b210eef13f64ff6441186ee5a1cbf031b5918b94/seq2seq/projections/intrinsic.py#L184? For reference, this is the paper of SAID.
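To make the question concrete, here is my reading of those two lines (a toy illustration with guessed names, not the exact repo code): the projection is sized only from each layer's flattened parameter count, so it would come out identical for DID and SAID, with SAID differing only by scaling each projected update with its per-layer lambda afterwards.

```python
import numpy as np
import torch.nn as nn

# Toy stand-in for the wrapped model; in the repo this would be the full
# transformer passed to the intrinsic-dimension wrapper.
module = nn.Sequential(nn.Linear(4, 3), nn.Linear(3, 2))
for name, v in module.named_parameters():
    DD = int(np.prod(v.size()))  # flattened size of this layer's parameter
    # The projection for this layer depends only on DD, hence it would be
    # the same for DID and SAID.
    print(name, "-> fastfood output size:", DD)
```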
Thanks a lot!