A Code Llama agent that watches your repo for file changes and asks Code Llama to review your code, then responds with code reviews for files that fail and "LGTM" for files that pass.
Example usage
$ python senior-engineer.py --api=http://382e-34-143-193-144.ngrok.io /path/to/your/repo/folder
I built 10x-Senior-Engineer on this live stream on my YouTube channel. It works pretty well.
10x-Senior-Engineer runs in its own process and waits for file changes. As it picks up new file changes, it generates code reviews using the Colab API you run. It keeps a queue of the past 10 files that have been reviewed.
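A minimal sketch of that watch-and-queue loop, assuming a simple mtime-polling approach and a collections.deque with maxlen=10 (the real senior-engineer.py may watch files differently):

import os
import time
from collections import deque

reviewed = deque(maxlen=10)   # only the last 10 reviewed files are kept
mtimes = {}                   # last-seen modification time per file

def request_review(path):
    # placeholder; see the API call sketch further down
    return "LGTM"

def watch(repo_path, poll_seconds=2):
    while True:
        for root, _, files in os.walk(repo_path):
            for name in files:
                path = os.path.join(root, name)
                mtime = os.path.getmtime(path)
                if mtimes.get(path) != mtime:   # new or changed file
                    mtimes[path] = mtime
                    reviewed.append((path, request_review(path)))
        time.sleep(poll_seconds)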
10x-Senior-Engineer currently uses a Flask API running from a Google Colab to prompt Code Llama. If you want to change this to use llama.cpp, please create an issue and I'll try to address it.
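For reference, a review request to that Flask server might look roughly like the snippet below. The /review route and the JSON payload shape are assumptions for illustration; the actual route is whatever the Colab notebook's Flask app defines.

import requests

API_URL = "http://382e-34-143-193-144.ngrok.io"  # the ngrok URL you pass via --api

def request_review(path):
    with open(path) as f:
        source = f.read()
    # hypothetical endpoint and payload; adjust to match the Colab Flask app
    resp = requests.post(f"{API_URL}/review", json={"code": source})
    resp.raise_for_status()
    return resp.text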
Go to the Google Colab, run the server, and copy the ngrok URL. You have to run all of the cells and make sure the Flask server is running properly. An example ngrok URL is shown in the example command below.
$ git clone git@github.com:jawerty/10x-Senior-Engineer
$ cd 10x-Senior-Engineer
$ pip install -r requirements.txt
The arguments/options
usage: senior-engineer.py [-h] [--api API] repository

positional arguments:
  repository  Pass in the source code folder you want to watch

options:
  -h, --help  show this help message and exit
  --api API   Link to the colab ngrok url
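That help text corresponds to an argparse setup roughly like this (a sketch inferred from the usage output above, not copied from senior-engineer.py):

import argparse

parser = argparse.ArgumentParser(prog="senior-engineer.py")
parser.add_argument("repository",
                    help="Pass in the source code folder you want to watch")
parser.add_argument("--api", help="Link to the colab ngrok url")
args = parser.parse_args()
# args.repository is the folder to watch; args.api is the Colab ngrok URL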
Example command
$ python senior-engineer.py --api=http://382e-34-143-193-144.ngrok.io /path/to/your/repo/folder