Closed · gsuuon closed this 8 months ago
Start the llama.cpp server the first time a prompt is requested. We also need to add options to the llama.cpp provider for the binary directory; check the previous llama.cpp `./main` provider for an example. A rough sketch is below.
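A minimal Lua sketch of the lazy-start idea, assuming hypothetical `binary_dir` and `model` option names and a `server` binary name (the actual provider options and binary name may differ):

```lua
-- Rough sketch: start the llama.cpp server lazily, on the first prompt.
-- `binary_dir`, `model`, and the `server` binary name are assumptions
-- for illustration, not the provider's actual option names.
local server_job = nil

local function ensure_server(opts)
  if server_job ~= nil then return end -- already running

  local bin = (opts.binary_dir or '.') .. '/server'
  server_job = vim.fn.jobstart({ bin, '--model', opts.model }, {
    on_exit = function()
      server_job = nil -- allow a restart if the server exits
    end,
  })
end

-- The provider's request path would call ensure_server(opts)
-- before sending the first prompt.
```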