LagPixelLOL / ChatGPTCLIBot

ChatGPT bot in CLI with long-term memory support using embeddings.

llama.cpp #10

Closed. MariasStory closed this issue 1 year ago.

MariasStory commented 1 year ago

Maybe you could add long-term memory support on top of llama.cpp, or use it as a "local back-end".

LagPixelLOL commented 1 year ago

Nope, I'll only focus on OpenAI's models for now, and I want to keep this program as lightweight as possible. A local model like LLaMA is something a lot of people can't run, and it would increase the setup difficulty of this program. So for now I will only use the remote API.

MariasStory commented 1 year ago

Thanks for the answer. You may want to consider a local API such as the one offered by LocalAI.
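
For context, LocalAI exposes an OpenAI-compatible REST API, so pointing an existing OpenAI client at it is mostly a matter of changing the base URL. Below is a minimal sketch (not this bot's actual code) of sending a chat completion request to a local endpoint with libcurl; the port (8080) and model name ("ggml-gpt4all-j") are assumptions and need to match the local deployment.

```cpp
// Minimal sketch: an OpenAI-style chat completion request sent to a
// LocalAI endpoint instead of api.openai.com. Port and model name are
// assumptions; adjust to the local setup.
#include <curl/curl.h>
#include <iostream>
#include <string>

// libcurl write callback: append the response body to a std::string.
static size_t write_cb(char* ptr, size_t size, size_t nmemb, void* userdata) {
    static_cast<std::string*>(userdata)->append(ptr, size * nmemb);
    return size * nmemb;
}

int main() {
    curl_global_init(CURL_GLOBAL_DEFAULT);
    CURL* curl = curl_easy_init();
    if (!curl) return 1;

    std::string response;
    // Same request shape as the OpenAI API; only the host differs.
    const std::string body = R"({
        "model": "ggml-gpt4all-j",
        "messages": [{"role": "user", "content": "Hello from a local back-end!"}]
    })";

    curl_slist* headers = curl_slist_append(nullptr, "Content-Type: application/json");
    curl_easy_setopt(curl, CURLOPT_URL, "http://localhost:8080/v1/chat/completions");
    curl_easy_setopt(curl, CURLOPT_HTTPHEADER, headers);
    curl_easy_setopt(curl, CURLOPT_POSTFIELDS, body.c_str());
    curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, write_cb);
    curl_easy_setopt(curl, CURLOPT_WRITEDATA, &response);

    CURLcode rc = curl_easy_perform(curl);
    if (rc == CURLE_OK) {
        std::cout << response << std::endl;  // raw JSON containing the completion
    } else {
        std::cerr << "request failed: " << curl_easy_strerror(rc) << std::endl;
    }

    curl_slist_free_all(headers);
    curl_easy_cleanup(curl);
    curl_global_cleanup();
    return 0;
}
```

In other words, supporting a local back-end like LocalAI would not require a separate client implementation, only a configurable API base URL, which is part of why it was suggested as an alternative to integrating llama.cpp directly.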