Talk-codebase is a tool that allows you to converse with your codebase using Large Language Models (LLMs) to answer your queries. It supports offline code processing using LlamaCpp and GPT4All without sharing your code with third parties, or you can use OpenAI if privacy is not a concern for you. Please note that talk-codebase is still under development and is recommended for educational purposes, not for production use.
Requirements:
- Python 3.8.1 or higher
- Your project must be in a git repository
Installation:

```sh
pip install talk-codebase
```
After installation, you can chat with your codebase in the current directory by running:

```sh
talk-codebase chat <path>
```
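For example, assuming the tool accepts the current directory as the path argument, an invocation from the repository root might look like this (illustrative; exact argument handling depends on the installed version):

```sh
# Chat about the project in the current working directory
talk-codebase chat .
```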
On first run, you will be asked to select a model type: Local or OpenAI.
OpenAI
If you use the OpenAI model, you need an OpenAI API key, which you can obtain from your OpenAI account. You will then be offered a choice of available models.
Local
If you want some files to be ignored, add them to .gitignore.
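For instance, a .gitignore along these lines would keep build artifacts and secrets out of processing (the entries below are illustrative, not required by the tool):

```
# Excluded from git and, therefore, from talk-codebase processing
node_modules/
build/
.env
*.log
```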
To reset the configuration, run:

```sh
talk-codebase configure
```
You can edit the configuration manually in the ~/.config.yaml file. If you cannot find the configuration file, run the tool: it prints the path to the configuration file at startup.
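The configuration is a plain YAML file. The exact keys depend on the installed version, but a sketch might look roughly like this (all key names and values here are illustrative assumptions, not authoritative):

```yaml
# Illustrative sketch only - actual key names may differ by version
model_type: openai         # or "local"
api_key: YOUR_OPENAI_KEY   # only needed for the OpenAI model
model_name: gpt-3.5-turbo  # chosen chat model
```

Re-running `talk-codebase configure` is the safer way to change settings, since it writes keys the installed version actually expects.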
Supported file types:
- .csv
- .doc
- .docx
- .epub
- .md
- .pdf
- .txt
- source files in popular programming languages