dhakalnirajan / LLaMA-BitNet
LLaMA-BitNet is a repository dedicated to empowering users to train their own BitNet models built upon the LLaMA 2 model, inspired by the paper 'The Era of 1-bit LLMs: All Large Language Models are in 1.58 Bits'.
https://arxiv.org/pdf/2402.17764
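The paper's central idea is to constrain every linear-layer weight to the ternary set {-1, 0, +1} using an absmean scale. As a rough illustration of that quantization step only (this is not this repository's code; the names `BitLinear` and `absmean_quantize` are hypothetical, and the straight-through estimator is an assumption borrowed from common open reimplementations), a minimal PyTorch sketch might look like:

```python
# Minimal sketch of BitNet b1.58-style ternary weight quantization.
# Assumptions: per-tensor absmean scaling and a straight-through estimator;
# the paper's full BitLinear also quantizes activations to 8 bits and applies
# a normalization before the matmul, both omitted here for brevity.
import torch
import torch.nn as nn
import torch.nn.functional as F


def absmean_quantize(w: torch.Tensor, eps: float = 1e-5) -> torch.Tensor:
    """Map weights to {-1, 0, +1} scaled by their mean absolute value."""
    gamma = w.abs().mean()                          # per-tensor absmean scale
    w_ternary = (w / (gamma + eps)).round().clamp(-1, 1)
    return w_ternary * gamma                        # dequantized ternary weights


class BitLinear(nn.Linear):
    """Drop-in nn.Linear variant whose forward pass uses ternary weights."""

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        w = self.weight
        # Straight-through estimator: the forward pass sees quantized weights,
        # while gradients flow unchanged to the full-precision weights.
        w_q = w + (absmean_quantize(w) - w).detach()
        return F.linear(x, w_q, self.bias)


if __name__ == "__main__":
    layer = BitLinear(16, 8, bias=False)
    out = layer(torch.randn(2, 16))
    print(out.shape)  # torch.Size([2, 8])
```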
MIT License · 12 stars · 3 forks
Issues
#3 Inference mode kernel (opened by wx02shi 2 months ago, 1 comment)
#2 Model generates gibberish after training (opened by JustMangler 6 months ago, 1 comment)
#1 Usage example (opened by andreamigliorati 7 months ago, 4 comments)