ylsdamxssjxxdd / eva

An intuitive LLM application: eva (机体)

Can you support Microsoft bitnet.cpp? #14

Open zhuanyedecainiao opened 3 weeks ago

zhuanyedecainiao commented 3 weeks ago

bitnet.cpp is the official inference framework for 1-bit LLMs (e.g., BitNet b1.58). It offers a suite of optimized kernels that support fast and lossless inference of 1.58-bit models on CPU (with NPU and GPU support coming next).

The first release of bitnet.cpp supports inference on CPUs. bitnet.cpp achieves speedups of 1.37x to 5.07x on ARM CPUs, with larger models seeing larger gains. It also reduces energy consumption by 55.4% to 70.0%, further boosting overall efficiency. On x86 CPUs, speedups range from 2.37x to 6.17x, with energy reductions between 71.9% and 82.2%. Furthermore, bitnet.cpp can run a 100B BitNet b1.58 model on a single CPU at speeds comparable to human reading (5-7 tokens per second), significantly enhancing the potential for running LLMs on local devices. Please refer to the technical report for more details.
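For reference, a minimal Python sketch of what an integration could look like: the host app shells out to bitnet.cpp's CLI for CPU inference. The `run_inference.py` entry point and the `-m`/`-p`/`-n`/`-t` flags are assumptions based on the microsoft/BitNet README, and the model path is hypothetical; both should be checked against the installed release.

```python
# Sketch: drive bitnet.cpp from a host application via its CLI.
# Assumptions: microsoft/BitNet is set up locally and exposes run_inference.py
# with -m/-p/-n/-t flags as described in its README; verify before relying on it.
import subprocess

def bitnet_generate(model_path: str, prompt: str,
                    n_predict: int = 128, threads: int = 4) -> str:
    """Run one prompt through bitnet.cpp and return its raw stdout."""
    cmd = [
        "python", "run_inference.py",
        "-m", model_path,      # path to a quantized BitNet .gguf model
        "-p", prompt,          # prompt text
        "-n", str(n_predict),  # number of tokens to generate
        "-t", str(threads),    # CPU threads to use
    ]
    result = subprocess.run(cmd, capture_output=True, text=True, check=True)
    return result.stdout

if __name__ == "__main__":
    out = bitnet_generate(
        "models/BitNet-b1.58-3B/ggml-model-i2_s.gguf",  # hypothetical local path
        "Explain what a 1.58-bit LLM is in one sentence.",
    )
    print(out)
```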

ylsdamxssjxxdd commented 3 weeks ago

I will test it in the near future, but its response quality is said to be rather mediocre. /(ㄒoㄒ)/~~