rsaryev / talk-codebase

Tool for chatting with your codebase and docs using OpenAI, LlamaCpp, and GPT4All
MIT License
476 stars · 39 forks

Use optional dependency group for local LLMs #54

Open lordmauve opened 4 months ago

lordmauve commented 4 months ago

The dependency footprint of this project is enormous:

$ du -hs venv
5.6G    venv

It also took several minutes to install.

I only want to use hosted models. Could the dependencies for local models be moved into an optional dependency group (e.g. `pip install talk-codebase[local]`)? I imagine the dependencies for hosted models alone would be a few MB.
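
For illustration, here is roughly what I have in mind, assuming a standard PEP 621 `pyproject.toml` (the package names and grouping are illustrative; if the project uses Poetry, its extras mechanism could express the same split):

```toml
# pyproject.toml -- sketch only; package names are illustrative
[project]
name = "talk-codebase"
dependencies = [
    # lightweight deps needed for hosted (OpenAI) models only
    "openai",
    "langchain",
]

[project.optional-dependencies]
# heavy local-model deps live here and are installed only on request
local = [
    "llama-cpp-python",
    "gpt4all",
    "sentence-transformers",
]
```

Hosted-only users would then run `pip install talk-codebase` and get a small install, while local-model users opt in with `pip install talk-codebase[local]`.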