Open zcbenz opened 3 weeks ago
Some of the most popular models provide weights in bfloat16, which unfortunately cannot be loaded on the CPU because `Matmul::eval_cpu` only supports float32.

I know CPU support is not a priority, but it would be great if my code could run on platforms other than mac arm64, even if very slowly.
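As a stopgap, since bfloat16 is just the upper 16 bits of an IEEE-754 float32, weights could be upcast to float32 at load time before hitting the CPU matmul. A minimal pure-Python sketch of that conversion (illustrative only, not MLX code):

```python
import struct

def bf16_to_f32(bf16_bits: int) -> float:
    """Widen a bfloat16 bit pattern to a Python float.

    bfloat16 is the top 16 bits of a float32, so widening
    is just a 16-bit left shift followed by a reinterpret.
    """
    return struct.unpack("<f", struct.pack("<I", bf16_bits << 16))[0]

def f32_to_bf16(x: float) -> int:
    """Narrow a float to a bfloat16 bit pattern.

    Uses round-to-nearest-even when dropping the low
    16 mantissa bits of the float32 representation.
    """
    (bits,) = struct.unpack("<I", struct.pack("<f", x))
    return (bits + 0x7FFF + ((bits >> 16) & 1)) >> 16

print(bf16_to_f32(0x3F80))  # 1.0
```

In practice a loader would apply this cast tensor-wide once, so only memory (not per-op conversion) is the cost of running bfloat16 checkpoints on the float32-only CPU path.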
Maybe it would also be interesting to look at https://github.com/microsoft/BitNet.