Open ChrisKretschmer opened 1 year ago

I got the app working on Windows, but I'm wondering: is it intended that there is no context between individual queries, the way ChatGPT has?

This uses alpaca.cpp, which has been abandoned. Only the newer llama.cpp, which now supports Alpaca models, supports context memorization.

Hey there, it's simple to turn on context with a few lines of code, but we found that output quality deteriorates considerably on subsequent responses, even on llama.cpp. Feel free to submit a PR if you can get the context feature working well.
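For anyone looking into this: one common way to get ChatGPT-style context without touching llama.cpp internals is to carry the conversation history in the prompt itself, trimming the oldest turns once a rough length budget is exceeded. The sketch below illustrates that bookkeeping only; the `generate()` function and the word-based budget are placeholders, not this app's real API (real code would count tokens with the model's tokenizer).

```python
class ChatContext:
    """Keeps a running transcript and builds each new prompt from it."""

    def __init__(self, max_words: int = 512):
        self.history: list[str] = []   # alternating "User: ..." / "Assistant: ..." turns
        self.max_words = max_words     # crude budget; a real port would count tokens

    def build_prompt(self, user_query: str) -> str:
        turns = self.history + [f"User: {user_query}", "Assistant:"]
        # Drop the oldest turns until the prompt fits the budget,
        # always keeping at least the current query and the cue line.
        while len(" ".join(turns).split()) > self.max_words and len(turns) > 2:
            turns.pop(0)
        return "\n".join(turns)

    def record(self, user_query: str, answer: str) -> None:
        self.history.append(f"User: {user_query}")
        self.history.append(f"Assistant: {answer}")


def generate(prompt: str) -> str:
    # Placeholder for the actual model call (e.g. invoking a llama.cpp binary).
    return "stub answer"


ctx = ChatContext()
query = "What is llama.cpp?"
reply = generate(ctx.build_prompt(query))
ctx.record(query, reply)
# The next prompt now carries the previous exchange as context.
print(ctx.build_prompt("Does it run Alpaca models?"))
```

Note this only restores memory between queries; it does not address the quality degradation mentioned above, which seems to come from the model itself rather than from how the history is assembled.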