Closed hassanzadeh closed 11 months ago
Yes, the server is llama.cpp/examples/server running on localhost.
I make a universal binary by compiling it on an ARM machine and an Intel machine, then combining the two builds with:
lipo -create server server_x86 -output freechat-server
Interesting. When you say compile, do you mean running "make -j +[arch related args]" in llama.cpp?
Yes, just plain "make" on each machine.
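The workflow described above can be sketched as the following shell steps. This is a sketch under stated assumptions: a macOS checkout of llama.cpp on each machine with the Xcode command line tools installed, and the binary names (`server`, `server_x86`, `freechat-server`) taken from the thread; the exact `make` output target may differ between llama.cpp versions.

```shell
# On the Apple Silicon (arm64) machine, in the llama.cpp checkout:
make                      # builds the server binary for arm64

# On the Intel (x86_64) machine, in its own llama.cpp checkout:
make                      # builds the server binary for x86_64
# Copy the Intel build over to the ARM machine as "server_x86".

# On one machine, with both single-arch binaries present,
# merge them into a universal (fat) binary:
lipo -create server server_x86 -output freechat-server

# Optionally verify that both architecture slices are present:
lipo -info freechat-server
```

`lipo -info` should list both `x86_64` and `arm64` if the merge succeeded. Note that each `make` must run natively on (or target) its own architecture; `lipo` only stitches the already-built slices together.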
Got it, thanks. This is really interesting work, congrats!
Hey guys, quick question: how is the server executable created? Is it built by the llama.cpp project?