siilats opened 11 months ago
Someone figured it out! https://github.com/fauxpilot/fauxpilot/issues/10#issuecomment-1710600127
The macOS Finder's built-in Compress command does not create a valid zip file (it adds `._*` AppleDouble metadata entries), so use Keka or a Windows VM to re-compress the folder before installing: https://apple.stackexchange.com/questions/414947/how-to-prevent-macos-from-adding-dot-underscore-files-whatever-html-in-zip-a
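As an alternative to Keka, the command-line `zip` tool usually avoids the Finder metadata problem. A minimal sketch (the folder name `model-folder` is just a placeholder): `-X` omits extra file attributes (the source of the `._*` entries) and `-x` skips `.DS_Store` files.

```shell
# Pack a folder with command-line zip instead of Finder's Compress.
# "model-folder" is a placeholder for the directory you need to archive.
mkdir -p model-folder
echo "example" > model-folder/weights.txt
# -r: recurse into the folder; -X: drop extra file attributes; -x: exclude .DS_Store
zip -r -X model-folder.zip model-folder -x "*.DS_Store"
```

The resulting archive contains only the real files, with no `__MACOSX` directory or `._*` companions.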
Note that you need a 4_0 model, and you must follow the manual install instructions for llama-cpp-python to use Metal: https://github.com/abetlen/llama-cpp-python/blob/main/docs/install/macos.md?plain=1
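For reference, the core of that manual install is building llama-cpp-python with the Metal backend enabled via a CMake flag. A hedged sketch based on the linked docs (the exact flag name may differ between llama.cpp versions, so check the page above):

```shell
# Reinstall llama-cpp-python with Metal support compiled in.
# --no-cache-dir forces a rebuild instead of reusing a CPU-only wheel.
CMAKE_ARGS="-DLLAMA_METAL=on" pip install --upgrade --force-reinstall llama-cpp-python --no-cache-dir
```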
It works, but it is far too slow on my M1. I wonder if someone wants to debug whether it's really using Metal.
The CodeGPT integration is a bit more user-friendly; it's available from this branch: https://github.com/carlrobertoh/CodeGPT/tree/178-add-support-for-llama-cpp