smamindl opened 8 months ago
Quick question: how did you start your llama.cpp server? Did you specify -np 3 in the parameters?
@nsarrazin yes, I have specified -np 2
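For context, a launch command along these lines is what the question is about. This is only a sketch: the model path and port are placeholders, and the server binary name has changed across llama.cpp versions (older builds ship `./server`, newer ones `llama-server`):

```shell
# Sketch of a llama.cpp server launch (paths and port are assumptions).
# -np sets the number of parallel slots the server can process at once,
# -c sets the context size shared across those slots.
./server -m ./models/llama-2-7b.Q4_K_S.gguf -c 2048 -np 2 --port 8080
```

With -np 2, two requests can be handled concurrently, but each slot only gets half of the -c context window.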
Likely resolved with my PR! https://github.com/huggingface/chat-ui/pull/867 Check out my branch and see if it helps :heart:
I am using the following .env.local with llama-2-7b.Q4_K_S.gguf and the Llama prompt template.
I am trying to get this to work with chat-ui, but it doesn't: chat-ui freezes. However, the server is receiving requests from the client.
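For anyone hitting the same issue, a minimal .env.local for pointing chat-ui at a local llama.cpp server might look roughly like the fragment below. This is an assumption-heavy sketch, not the poster's actual file: the model name, URL, and endpoint field names are placeholders, and the exact keys accepted under "endpoints" vary between chat-ui versions, so check the chat-ui README for your version.

```
# Hypothetical chat-ui .env.local fragment; field names are assumptions
# and may differ between chat-ui versions. The chatPromptTemplate for
# Llama-style models is omitted here for brevity.
MODELS=`[
  {
    "name": "llama-2-7b",
    "endpoints": [
      {
        "type": "llamacpp",
        "url": "http://localhost:8080"
      }
    ]
  }
]`
```

If the server log shows requests arriving but chat-ui never renders a reply, a mismatch between this config and the server's actual address or response format is a common cause.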