charmandercha closed this 3 days ago
Can't be converted. The 1.58-bit BitNet models need to be trained from scratch.
For now, there are only some early test LLMs made in that format, nothing substantial. Maybe in the future it will become more relevant, maybe not, who knows.
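For context, the "1.58-bit" name comes from ternary weights in {-1, 0, 1}, which carry log2(3) ≈ 1.58 bits of information each. Below is a rough sketch of the absmean quantizer described in the BitNet b1.58 paper (the function name and epsilon are illustrative); simply applying this to an existing model's weights post-hoc is not the same as BitNet training, which is why conversion doesn't work:

```python
import numpy as np

def absmean_ternary_quantize(w: np.ndarray) -> tuple[np.ndarray, float]:
    """Illustrative BitNet b1.58-style quantizer: scale weights by their
    mean absolute value, then round and clip to the ternary set {-1, 0, 1}."""
    gamma = np.abs(w).mean() + 1e-8            # absmean scale (epsilon avoids /0)
    w_q = np.clip(np.round(w / gamma), -1, 1)  # ternary weights
    return w_q, gamma

w = np.array([0.9, -0.05, 0.4, -1.2])
w_q, gamma = absmean_ternary_quantize(w)
print(w_q)  # [ 1.  0.  1. -1.]
```

In BitNet this quantization happens during training (quantization-aware), so the model learns weights that survive the rounding; quantizing a normally trained model this way would badly hurt its accuracy.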
There is 1.58-bit fine-tuning now, and there is already a Llama model with that quantization.
I really don't know much about the AI world and its limitations, but if this model could be converted to 1.58-bit, maybe that would make it more accessible?