Sentdex / TermGPT

Giving LLMs like GPT-4 the ability to plan and execute terminal commands
MIT License

the future #8

Open cryptskii opened 1 year ago

cryptskii commented 1 year ago

"I would like to primarily find an open source model that can yield similar performance to GPT-4 in this realm."

I hear good things about Vicuna being what you describe: https://medium.com/geekculture/list-of-open-sourced-fine-tuned-large-language-models-llm-8d95a2e0dc76

DRUMNICORN commented 1 year ago

Additional for Future:

achraf-oujjir commented 1 year ago

Very interesting. As I was testing TermGPT for installing OpenStack, I ran into a couple of difficulties, which gave me an idea of what a user would want to do with the commands. You can check my pull request or my forked repo. I also intend to add the possibility for the user to ask for an explanation of a command, and I plan to add more functionality in the future.

Free-Radical commented 1 year ago

@Sentdex, great tool. I've been experimenting with piping on the CLI to locally hosted and remote LLMs for a few months. As I write this, my not-so-powerful Ubuntu desktop is running WizardLM-7B (the best small model according to the Hugging Face leaderboard) locally via the llama-cpp-python API server. It's quite easy to do.

The llama-cpp-python API server is fully compatible with the OpenAI/ChatGPT API, so you should be able to build this functionality into your client with ease. If I have extra time, I'll push a patch.
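For reference, a minimal sketch of what that could look like, assuming a llama-cpp-python server already running on its default port (8000) and the pre-1.0 openai Python client; the model path and prompt are illustrative, not TermGPT's actual code:

```python
# Minimal sketch: point the OpenAI client at a locally hosted
# llama-cpp-python server instead of api.openai.com.
# Assumes the server was started with something like:
#   python -m llama_cpp.server --model ./wizardlm-7b.q4_0.bin
# (model filename is illustrative), which serves an OpenAI-compatible
# API at http://localhost:8000/v1 by default.
import openai

openai.api_base = "http://localhost:8000/v1"  # local server, not OpenAI
openai.api_key = "not-needed-locally"         # the local server ignores the key

response = openai.ChatCompletion.create(
    model="local-model",  # model name is largely ignored by the local server
    messages=[{"role": "user", "content": "List the files in /tmp, sorted by size."}],
)
print(response["choices"][0]["message"]["content"])
```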

However, from a design perspective, I would humbly suggest the following:

Keep in mind the old-school Unix spirit of CLI minimalism and piping/chaining commands. Avoid the “everything built-in except the kitchen sink” syndrome. Many tools already exist on the Linux CLI that can be used in conjunction with termGPT, so termGPT should not replicate these tools to avoid becoming bloated or slow.

For example, it would be better to use a terminal-based browser like Links2 or Browsh, which already support JavaScript and can scrape text and dump or pipe it to termGPT, instead of expending effort on this annoying, unrelated problem with Beautiful Soup, which routinely fails because most sites now rely on JavaScript!
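To illustrate, here is a minimal sketch of how a client could accept text piped in from such a browser as extra context; the file name, prompt format, and behaviour are hypothetical, not part of TermGPT:

```python
# Minimal sketch: accept piped text (e.g. from a terminal browser's
# dump output) as extra context for the prompt.
import sys

def read_piped_context() -> str:
    """Return whatever was piped in on stdin, or an empty string if
    stdin is an interactive terminal."""
    if sys.stdin.isatty():
        return ""
    return sys.stdin.read()

if __name__ == "__main__":
    context = read_piped_context()
    prompt = " ".join(sys.argv[1:])
    if context:
        prompt = f"{prompt}\n\nContext scraped from the page:\n{context}"
    print(prompt)  # in a real client this would be sent to the LLM
```

Used as something like `links2 -dump https://example.com | python termgpt_pipe.py "summarize this page"` (script name hypothetical).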

Another example is tldr, which already provides human-friendly summaries of man pages for CLI commands and could be piped to termGPT to provide faster and better examples. Even DuckDuckGo and Google have terminal CLI clients that could be used with termGPT to lighten its load. BTW, AI compute load becomes a huge issue with locally hosted LLMs.
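As a rough sketch of that idea, the snippet below shells out to a tldr client and returns its output so it could be prepended to a prompt as context; the helper name is hypothetical, and it assumes a tldr client is installed and on PATH:

```python
# Minimal sketch: pull the tldr page for a command and hand it to the
# LLM as context, rather than scraping man pages inside termGPT.
import subprocess

def tldr_page(command: str) -> str:
    """Return the tldr summary for `command`, or an empty string if
    the tldr client is missing or has no page for it."""
    try:
        result = subprocess.run(
            ["tldr", command], capture_output=True, text=True, check=True
        )
    except (FileNotFoundError, subprocess.CalledProcessError):
        return ""
    return result.stdout

if __name__ == "__main__":
    print(tldr_page("tar"))
```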

In summary, I suggest keeping termGPT lean and focused on its core function of providing I/O from any LLM (local or remote) while using other existing tools as much as possible. If you follow this philosophy, termGPT may become a de facto standard.

P.S. A client that already does this and is pretty nice (although slightly buggy and written in Go) is https://github.com/charmbracelet/mods; it's well worth checking out for its design philosophy.

Good luck!