aorumbayev / autogpt4all

🛠️ User-friendly bash script for setting up and configuring your LocalAI server with the GPT4All for free! 💸
https://aorumbayev.github.io/autogpt4all/
MIT License

Update token limit to 2048 and set SMART_LLM_MODEL #5

Closed · watcher60 closed this issue 1 year ago

watcher60 commented 1 year ago

Most local LLMs have a context window/token limit of 2048, including the model downloaded by the script. Additionally, I have seen it attempt to use GPT-4 for reasoning, which appears to be the same problem as https://github.com/Significant-Gravitas/Auto-GPT/issues/187.
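A sketch of the two changes being requested, expressed as Auto-GPT-style `.env` settings. This is an illustration under assumptions: the exact variable names (`SMART_LLM_MODEL`, `FAST_LLM_MODEL`, `SMART_TOKEN_LIMIT`, `FAST_TOKEN_LIMIT`) come from Auto-GPT's `.env.template` of that era and may differ across versions, and the model name shown is a placeholder for whatever model the script configures in LocalAI.

```shell
# Hypothetical .env fragment for Auto-GPT pointed at a LocalAI backend.
# Variable names follow Auto-GPT's .env.template; verify against your version.

# Route "smart" (reasoning) calls to the local model instead of GPT-4,
# which otherwise gets selected by default and fails against LocalAI.
SMART_LLM_MODEL=gpt-3.5-turbo   # placeholder: use the model name LocalAI serves
FAST_LLM_MODEL=gpt-3.5-turbo

# Cap both token budgets at the 2048-token context window that most
# local LLMs (including the one the script downloads) support.
SMART_TOKEN_LIMIT=2048
FAST_TOKEN_LIMIT=2048

# Point the OpenAI-compatible client at the LocalAI server.
OPENAI_API_BASE_URL=http://localhost:8080/v1
```

The key point is that without overriding `SMART_LLM_MODEL`, Auto-GPT will still try to call GPT-4 for its reasoning steps even when everything else targets the local server.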