microsoft / T-MAC

Low-bit LLM inference on CPU with lookup table
MIT License

How to build ollama with t-mac support? #23

Open goukisun opened 2 months ago

goukisun commented 2 months ago

Thanks for the work. How do I build ollama with T-MAC? Is it enough to replace the llama.cpp submodule, or is there anything else to configure?

kaleid-liner commented 2 months ago

Haven't tested it yet, but it should work by replacing llama.cpp. You may also need to update ollama's build scripts to run our codegen pipeline, though I'm not sure how best to achieve that.
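The suggestion above could be sketched roughly as follows. Everything here is an untested assumption: that ollama vendors llama.cpp as a git submodule at `llm/llama.cpp`, that the T-MAC-patched fork lives at the URL shown, and that ollama's generate scripts need manual edits to pick up the T-MAC kernels. Check the T-MAC README for the actual fork URL and pipeline invocation before trying this.

```shell
# Hypothetical sketch: swap ollama's vendored llama.cpp for the T-MAC fork.
# The submodule path and fork URL below are assumptions, not verified values.
set -euo pipefail

git clone https://github.com/ollama/ollama.git
cd ollama

# Repoint the llama.cpp submodule at the T-MAC-patched fork (placeholder URL).
git submodule set-url llm/llama.cpp https://github.com/kaleid-liner/llama.cpp.git
git submodule update --init --remote llm/llama.cpp

# T-MAC generates its lookup-table kernels with its own codegen pipeline
# before the llama.cpp build; ollama's generate scripts (e.g. the cgo build
# steps under llm/generate/) would likely need editing to run that step and
# pass the matching CMake flags -- this is the unresolved part of the answer.
go generate ./...
go build .
```

The main open question, as noted above, is wiring the T-MAC kernel-generation step into ollama's build scripts rather than the submodule swap itself.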

jianlins commented 2 weeks ago

Has anyone worked on this?