hkproj/pytorch-llama
LLaMA 2 implemented from scratch in PyTorch
https://www.youtube.com/watch?v=oM4VmoabDAI
MIT License · 227 stars · 45 forks
Issues (newest first)
#13 · refactor to pass cmd args · honghua · opened 1 week ago · 0 comments
#12 · Fixup · honghua · opened 1 week ago · 0 comments
#11 · No need to use forward method? · nkkbr · opened 3 months ago · 0 comments
#10 · Can I turn off KV cache? · purejomo · opened 4 months ago · 2 comments
#9 · Error in rotary matrix multiplication formula of slide 25 · sanzgadea · opened 5 months ago · 0 comments
#8 · alternative implementation of group-query attention (GQA) · feixyz10 · closed 7 months ago · 5 comments
#7 · [Question] Is this line a bug? · kozer · closed 8 months ago · 1 comment
#6 · what is minimal computer can be used for only inference · Sandy4321 · opened 8 months ago · 0 comments
#5 · will it work on windows OS with only CPU? · Sandy4321 · opened 8 months ago · 0 comments
#4 · an unexpected keyword argument 'rope_theta' · mietekrmd · closed 10 months ago · 3 comments
#3 · Where did the factor 2 go in rotary embedding? · ZhichaoDuan · closed 10 months ago · 2 comments
#2 · Multiplication is not dividing in the RMS norm. · srikant86panda · closed 11 months ago · 2 comments
#1 · what is the license of the code please? · botsbrain · closed 1 year ago · 1 comment