micr0-dev / lexido

A terminal assistant, powered by Generative AI
GNU Affero General Public License v3.0

[Idea | Suggestion] TGPT as backend? #42

Closed xplshn closed 4 months ago

xplshn commented 4 months ago

Couldn't `tgpt -w` be used as a backend? tgpt supports many providers, and the default one is always the most reliable. That way you could delegate all the AI-backend work to tgpt and have your users install both tgpt and lexido, where tgpt is a program that lets you use various AIs without any API keys and lexido is an assistant that uses various AI backends. Maybe this could become a simple flag that lets the user use an external command like tgpt instead of Gemini. The output can be piped using `tgpt -w`, where `-w` means "whole": once the AI's response is complete, it is returned as Markdown or plaintext.
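The external-command idea above could be sketched in Go roughly like this. This is a minimal illustration, not lexido code; it assumes the external tool reads the prompt on stdin and prints the whole reply to stdout (the actual tgpt invocation and flags may differ):

```go
package main

import (
	"bytes"
	"fmt"
	"os/exec"
	"strings"
)

// runExternalBackend pipes the prompt into an external command's stdin and
// returns its full stdout once the command exits (whole-response mode, as
// the `-w` flag suggests for tgpt).
func runExternalBackend(name string, args []string, prompt string) (string, error) {
	cmd := exec.Command(name, args...)
	cmd.Stdin = strings.NewReader(prompt)
	var out bytes.Buffer
	cmd.Stdout = &out
	if err := cmd.Run(); err != nil {
		return "", err
	}
	return out.String(), nil
}

func main() {
	// Stand-in for `tgpt -w`: `cat` simply echoes the piped prompt back,
	// so the example runs without tgpt installed.
	reply, err := runExternalBackend("cat", nil, "list files modified today")
	if err != nil {
		fmt.Println("backend error:", err)
		return
	}
	fmt.Println(reply)
}
```

Swapping `"cat"` for `"tgpt"` with the appropriate args would be the whole integration, assuming tgpt behaves well when run non-interactively.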

micr0-dev commented 4 months ago

I would be open to adding the ability to use something like that, but I would like to keep Gemini as the core since it is way easier to install and set up for new users. So maybe lexido would have a flag or config setting allowing you to use something like tgpt. How familiar are you with Golang or tgpt? Would you be interested in contributing?

xplshn commented 4 months ago

I am familiar with Go but not proficient; that said, I have written a very practical and usable program in it. Anyway, tgpt is very straightforward and easy to use: you just fetch the binary, `chmod +x` it, and run it, without any config files or anything:

[screenshot: tgpt running in a terminal]

micr0-dev commented 4 months ago

I have spent a good few hours trying to get tgpt to work with lexido, but in its current state tgpt really doesn't like to run in a non-interactive environment (i.e. inside of lexido): it only spits out errors without generating anything. I think the easier way forward is to implement some more LLM options manually, as we could look at how tgpt implements the APIs and do the same.

I pushed a branch (feature/tgpt) with what I was able to accomplish, if you would like to take a look at it, but neither a content-stream nor a whole-generation approach works with tgpt.

micr0-dev commented 4 months ago

https://github.com/micr0-dev/lexido/compare/main...feature/tgpt

xplshn commented 4 months ago

Yeah, it was foolish to suggest relying on external programs, sorry. But I like the idea of using the tgpt backends, Phind.com for example: phind.com is reliable and pulls info from the internet, and it can even read docs when it is unsure how a program or piece of code works. There is also support for locally hosted models and the weird yet fun KoboldAI.

micr0-dev commented 4 months ago

Yeah, phind.com looks interesting too, thanks for sharing. I definitely want to add a local LLM option; do you happen to know which might be the best for this? The goal here is to make it as simple to set up as possible.

micr0-dev commented 4 months ago

Wait, so phind.com's models can run locally? Is that what you are saying?

micr0-dev commented 4 months ago

Currently, I am looking into having the user install Ollama and using that; it supports a lot of different local LLMs and is built on llama.cpp.

xplshn commented 4 months ago

Have you looked into https://github.com/Mozilla-Ocho/llamafile ?

micr0-dev commented 4 months ago

Check out v1.3! We now have local LLM support! :partying_face: