psugihara / FreeChat

llama.cpp based AI chat app for macOS
https://www.freechat.run
MIT License
437 stars, 41 forks

How to compile freechat-server #27

Closed: EthanLipnik closed this issue 1 year ago

EthanLipnik commented 1 year ago

I'm trying to integrate LLaMA into a very low-level macOS app and was wondering where freechat-server comes from. It seems to be the best way to run LLaMA inside a macOS app. I tried the compiled server executable from the main llama.cpp repo, but it's half the file size of freechat-server, and when I run it, it fails with:

-[MTLComputePipelineDescriptorInternal setComputeFunction:withType:]:722: failed assertion computeFunction must not be nil.

Is the original freechat-server also open source so I can compile it myself?

psugihara commented 1 year ago

freechat-server is llama.cpp's examples/server. The current release is built at this commit: https://github.com/ggerganov/llama.cpp/commit/1a159553f921a

It's a universal binary that I create by lipo'ing an x86 and an arm build, like this: lipo -create server server_x86 -output freechat-server
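For anyone who wants to reproduce this, here is a rough sketch of the full flow: build llama.cpp's server example once per architecture, then merge the two binaries with lipo. The build directory names and CMake flags below are assumptions for illustration (the exact options vary by llama.cpp revision), not the exact commands FreeChat uses.

```shell
# Hypothetical sketch of producing a universal freechat-server binary.
# Assumes macOS with Xcode command line tools and CMake installed.
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp

# arm64 build (Metal is the default GPU backend on Apple Silicon)
cmake -B build-arm64 -DCMAKE_OSX_ARCHITECTURES=arm64
cmake --build build-arm64 --target server

# x86_64 build for Intel Macs
cmake -B build-x86 -DCMAKE_OSX_ARCHITECTURES=x86_64
cmake --build build-x86 --target server

# Merge the two single-architecture binaries into one universal binary
lipo -create build-arm64/bin/server build-x86/bin/server \
     -output freechat-server

# Sanity check: should report both x86_64 and arm64
lipo -info freechat-server
```

The -create/-output invocation matches the one in the comment above; only the input paths differ depending on where each build places its server binary.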

Can't help with that error, sorry, but you could ask over at the llama.cpp repo.

Thanks for checking out FreeChat.