do-me opened this issue 1 day ago
Running a 405B model on a single PC is super cool! And BitNet can definitely help that.
Unfortunately, there is no plan for now due to resource constraints. Hopefully we will make it in the near future.
We have opened up the discussion section. Enjoy!
In the benchmark graph, there is a 70B model. So, is it supported or not?
Thanks a lot for open sourcing this amazing library! I was wondering whether you tried/are planning to prepare some larger models too, like Llama-3.1-70B/405B. As it seems, there is an actual chance to be able to run Llama-3.1-405B on a single Mac. Also, would you mind opening up the discussions section?
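For context on why 405B on a single Mac seems plausible: ternary (1.58-bit) weights are far smaller than fp16 ones. Here is a rough back-of-envelope sketch of weight storage only — it assumes log2(3) ≈ 1.58 bits per parameter and ignores activations, KV cache, and per-block scale overhead, so treat the numbers as a lower bound, not a measured BitNet figure:

```python
BITS_PER_WEIGHT = 1.58  # ternary {-1, 0, 1} weights: log2(3) ≈ 1.58 bits


def weight_memory_gb(params_billion: float, bits: float = BITS_PER_WEIGHT) -> float:
    """Approximate weight storage in GB (10^9 bytes), weights only."""
    return params_billion * 1e9 * bits / 8 / 1e9


# Published Llama 3.1 parameter counts
for size in (8, 70, 405):
    print(f"Llama-3.1-{size}B: ~{weight_memory_gb(size):.0f} GB")
```

By this estimate the 405B weights come to roughly 80 GB, which would fit in the unified memory of a high-end Mac (128 GB or 192 GB configurations) — compare ~810 GB for the same model in fp16.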