dhakalnirajan / LLaMA-BitNet

LLaMA-BitNet is a repository dedicated to empowering users to train their own BitNet models built on top of the LLaMA 2 model, inspired by the groundbreaking paper 'The Era of 1-bit LLMs: All Large Language Models are in 1.58 Bits'.
https://arxiv.org/pdf/2402.17764
MIT License
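
The heart of BitNet-style training is replacing the linear layers of the base model with layers whose weights are constrained to the ternary set {-1, 0, +1} (about 1.58 bits per weight). The sketch below is not the repository's actual code; it is a minimal PyTorch illustration, under assumptions drawn from the cited paper, of how such a layer can be implemented with absmean weight quantization, 8-bit absmax activation quantization, and a straight-through estimator so the layer stays trainable.

```python
# Minimal sketch of a 1.58-bit linear layer in the style of the BitNet paper.
# This is an illustrative assumption, not the code shipped in LLaMA-BitNet.
import torch
import torch.nn as nn
import torch.nn.functional as F


def weight_quant(w: torch.Tensor) -> torch.Tensor:
    # Absmean scaling, then round-and-clip to the ternary set {-1, 0, +1}.
    scale = 1.0 / w.abs().mean().clamp(min=1e-5)
    return (w * scale).round().clamp(-1, 1) / scale


def activation_quant(x: torch.Tensor) -> torch.Tensor:
    # Per-token absmax scaling to 8-bit levels in [-128, 127].
    scale = 127.0 / x.abs().max(dim=-1, keepdim=True).values.clamp(min=1e-5)
    return (x * scale).round().clamp(-128, 127) / scale


class BitLinear(nn.Linear):
    """Drop-in replacement for nn.Linear with 1.58-bit (ternary) weights."""

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        w = self.weight
        # Straight-through estimator: quantized values in the forward pass,
        # full-precision gradients in the backward pass.
        x_q = x + (activation_quant(x) - x).detach()
        w_q = w + (weight_quant(w) - w).detach()
        return F.linear(x_q, w_q, self.bias)
```

In practice, a BitNet variant of LLaMA is obtained by swapping the attention and feed-forward projection layers for a layer like this before training; the exact replacement points and hyperparameters are defined by the repository's training scripts.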