TimeSurgeLabs / searchbase

⚡️ Supercharge productivity with your company's intelligent and versatile AI Chatbot! 🤖
GNU Affero General Public License v3.0

Increase response token length #1

Closed · chand1012 closed 1 year ago

chand1012 commented 1 year ago

Steve says the response token counts seem a bit lacking. Try to find a way to effectively increase the response size.

chand1012 commented 1 year ago

This seems to be somewhat remedied when using ChatGPT as your backend thanks to these commits: 611f0e758a4420c3611e8142e61521328d273760 & 09a0e1f529b726f1931dd5fad87f5cac1ac56015
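For reference, the gist of the change is just passing a larger completion budget through to the chat completion call. A rough sketch, not the actual searchbase code: this assumes the pre-1.0 `openai` Python client, and the model name, messages, and `max_tokens` value are placeholders.

```python
# Rough sketch, not the actual searchbase code: raise the completion budget
# on a ChatGPT request. Assumes the pre-1.0 `openai` client and that
# OPENAI_API_KEY is set in the environment; model and values are placeholders.
import openai

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize the indexed documents."},
    ],
    # Cap on the response length; the prompt and this budget share the
    # model's context window.
    max_tokens=1024,
)
print(response.choices[0].message.content)
```

Raising `max_tokens` only helps up to the model's context window, since the prompt and the response share it.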

I need to do some more testing on FastChat. I opted to use the completion endpoint so I wouldn't sacrifice any input tokens, but there's a bug with FastChat-T5 where it doesn't work because it isn't handling token counts properly. For now I'm going to continue on other work and come back to this once the FastChat bug is fixed.
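The accounting that has to hold on that path is roughly the following (hypothetical sketch only; the window size, encoding, and function name are assumptions, and FastChat-T5 uses its own tokenizer rather than tiktoken's `cl100k_base`):

```python
# Hypothetical sketch of the budgeting the completion path has to do:
# count the prompt's tokens and hand whatever is left of the context
# window to the response. Window size, encoding, and names are assumptions,
# not FastChat-T5 internals.
import tiktoken

CONTEXT_WINDOW = 2048  # assumed model context size
encoding = tiktoken.get_encoding("cl100k_base")

def completion_budget(prompt: str, reserve: int = 16) -> int:
    """Tokens left for the response after the prompt (minus a small reserve)."""
    prompt_tokens = len(encoding.encode(prompt))
    return max(CONTEXT_WINDOW - prompt_tokens - reserve, 0)

print(completion_budget("Question: how do I increase response length?"))
```

If that budget comes out near zero, the prompt is eating the whole window and there is almost nothing left for the response.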