h-alice / llamacpp-webui

A Streamlit WebUI for llama.cpp.
MIT License

Fix: Increase max token #4

Closed h-alice closed 5 months ago

h-alice commented 5 months ago

As the title mentions, the maximum number of output tokens has been increased.
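
The PR diff isn't shown here, but a change like this usually amounts to raising a hard ceiling on the sampler's output-token limit and clamping user input against it. A minimal sketch under that assumption; all names (`NEW_CEILING`, `clamp_max_tokens`) are illustrative, not taken from the repository:

```python
# Hypothetical sketch: these names and values are illustrative,
# not from the llamacpp-webui source.
NEW_CEILING = 4096  # assumed raised cap on output tokens


def clamp_max_tokens(requested: int, ceiling: int = NEW_CEILING) -> int:
    """Clamp a user-requested output-token limit to [1, ceiling]."""
    if requested < 1:
        return 1
    return min(requested, ceiling)


print(clamp_max_tokens(8000))  # over the cap, clamped to 4096
print(clamp_max_tokens(256))   # within range, passed through
```

A guard like this keeps a UI slider or text field from sending the backend a generation request larger than the model (or server) can honor.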

h-alice commented 5 months ago

The fix has been tested and works as expected.