Closed win10ogod closed 1 month ago
Can anyone put KAN (attention and MLP) into llama2.c?
I don't plan to work on this, and it's irrelevant to this repo. Please file an issue on that project instead.