A code improvement tool utilizing an AI agent swarm.
This project aims to build code improvement software that utilizes an AI agent swarm using locally hosted models. There are three components to this project:
The package can be used as a basic CLI tool to perform refactoring tasks on input code. It accepts input as a string, a file, or a folder of files. This CLI functionality is primarily for testing purposes, as it is not expected to be used directly by end users (and will thus not be made available via PyPI).
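For illustration, the input handling described above might be sketched like this (a hypothetical helper, not the package's actual code — the real logic lives in `utils/cli.py` and may differ):

```python
from pathlib import Path


def collect_sources(target: str) -> list[str]:
    """Gather code to refactor from a raw string, a file, or a folder.

    Hypothetical sketch of the CLI's input handling described above.
    """
    path = Path(target)
    if path.is_dir():
        # Read every file in the folder, in a stable order.
        return [p.read_text() for p in sorted(path.rglob("*")) if p.is_file()]
    if path.is_file():
        return [path.read_text()]
    # Not a path on disk: treat the argument as a literal code string.
    return [target]
```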
The VSCode extension simply provides a command, available via right-clicking in a code window or via the Command Palette, to perform the AI refactoring on specified code. The extension is still in development and is not yet available for public use.
Note: A machine with a good GPU is highly recommended.
Ollama is required to use the package or extension. Follow the instructions on the Ollama website to install it on your machine.
Then run the following commands to retrieve the models and store them locally:

```shell
ollama pull llama3
ollama pull codellama
```
Install the Python dependencies and pre-commit hooks by running the following commands:

```shell
poetry install --no-root
poetry run pre-commit install
```
Then navigate to the `vscode-extension` directory, and create and activate a virtual environment:

```shell
cd vscode-extension
python -m venv venv

# On macOS/Linux
source venv/bin/activate

# On Windows
.\venv\Scripts\activate
```
Install nox in the activated environment, then run nox to install all Python package dependencies (into `vscode-extension/bundled/libs`):

```shell
python -m pip install nox
nox --session setup
```
Note: If this fails, you may have to install Rust:

```shell
# Install Rust using rustup
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
# Follow the on-screen instructions to complete the installation

# Add Cargo's bin directory to your PATH
# For zsh
echo 'export PATH="$HOME/.cargo/bin:$PATH"' >> ~/.zshrc
source ~/.zshrc

# For bash
echo 'export PATH="$HOME/.cargo/bin:$PATH"' >> ~/.bashrc
source ~/.bashrc

# Verify Rust installation
rustc --version
```
Finally, install the Node packages:

```shell
npm install
```
To build the Python package, run the following command in the project root directory:

```shell
poetry build
```

This will create a `.tar.gz` file in the `dist` directory. The package can then be installed via `pip`.
To build the VSCode extension, first ensure the latest version of the Python package is built and stored in the `vscode-extension/bundled/llm-swarm-build` directory. Then run the following commands in the `vscode-extension` directory:

```shell
echo "Updating bundled package dependencies..."
nox --session setup

echo "Building VSCode extension..."
npx vsce package
```

This will create a `.vsix` file that can be installed in VSCode like a regular extension.
For more information on building and developing this extension, refer to the VS Code python tools extension README.
```
├── README.md                      This file.
├── pyproject.toml                 Poetry configuration file for Python packages.
├── python-package
│   ├── src
│   │   └── llm-swarm
│   │       ├── ai
│   │       │   ├── agents.py      Defines the AI agents in the crew.
│   │       │   ├── crew.py        Defines the AI crew.
│   │       │   ├── models.py      Functions to retrieve models via Ollama.
│   │       │   └── tasks.py       Defines the tasks that the AI crew can perform.
│   │       ├── main.py            Main entry point for the Python package.
│   │       └── utils
│   │           └── cli.py         CLI tool for running the Python package.
│   └── tests
│       └── input                  Test input files.
├── research
│   ├── README.md                  Research documentation.
│   └── llm_swarm.ipynb            Jupyter notebook documenting the AI agent swarm.
├── test.sh                        Bash script for testing the Python package.
└── vscode-extension
    ├── README.md
    ├── bundled
    │   ├── llm-swarm-build        Python package builds (llm_swarm-*.tar.gz).
    │   └── tool
    │       └── lsp_server.py      Python code called by extension.
    ├── eslint.config.js           ESLint configuration file.
    ├── noxfile.py                 Configuration file for Nox.
    ├── package.json               Node.js configuration file.
    ├── requirements.in            Python package requirements.
    └── src
        └── extension.ts           Main entry point for the VSCode extension.
```
Note: The `vscode-extension` folder is based on a VSCode Extension Template - only modified files are included in this diagram.
```mermaid
---
title: VSCode Extension Architecture
---
flowchart TB
    subgraph vscode_extension [VSCode IDE]
        code[Current file / Selected code]
        python_package --> code
    end
    code --> python_package
    subgraph python_package [LLM Swarm]
        agent1[Planning Agent]
        agent2[Code Refactor Agent]
        agent3[QA Agent]
        agent4[Revision Agent]
        agent1 --> agent2
        agent2 --> agent3
        agent3 --> agent4
    end
```
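The flow above can be sketched in plain Python. Each stage here is an ordinary function standing in for an LLM-backed agent, so only the hand-off order reflects the diagram; the names and payload fields are illustrative:

```python
# Plain-function stand-ins for the four LLM-backed agents in the
# flowchart above; payload fields are illustrative only.

def planning_agent(code: str) -> dict:
    # Draft a refactoring plan for the input code.
    return {"code": code, "plan": ["rename unclear variables", "add docstrings"]}


def refactor_agent(work: dict) -> dict:
    # Apply the plan to produce a refactored draft.
    work["draft"] = work["code"] + "  # refactored"
    return work


def qa_agent(work: dict) -> dict:
    # Review the draft and record any issues found.
    work["issues"] = [] if work["draft"].strip() else ["empty output"]
    return work


def revision_agent(work: dict) -> str:
    # Emit the final code, falling back to the original if QA flagged issues.
    return work["draft"] if not work["issues"] else work["code"]


def run_swarm(code: str) -> str:
    # Sequential hand-off: Planning -> Refactor -> QA -> Revision.
    return revision_agent(qa_agent(refactor_agent(planning_agent(code))))
```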
Ensure the models are stored locally - see Ollama Setup for instructions.
The Ollama server must be running to use either the package or extension. It can be started either via the Ollama app, or via the terminal command `ollama serve`.
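A quick way to confirm the server is reachable before launching the tools. This helper is an illustration rather than part of the package; it assumes Ollama's default port, 11434, and relies on the server answering a plain GET on its root endpoint:

```python
import urllib.error
import urllib.request


def ollama_is_running(base_url: str = "http://localhost:11434",
                      timeout: float = 2.0) -> bool:
    """Return True if an Ollama server answers at base_url.

    Illustrative helper (not part of this package); assumes Ollama's
    default port, 11434.
    """
    try:
        with urllib.request.urlopen(base_url, timeout=timeout) as resp:
            # The server responds to GET / when it is up.
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False
```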
```shell
# use -h to see args
poetry run cli
```
Alternatively, run the script directly with `poetry run python main.py`.
To run the extension in development, open the project in VSCode and select `Run > Start Debugging` from the top menu. This should open a new VSCode window where you can open some files and run the extension on them.
See the extension README for details on using the extension.
There is a bash script in the root folder that runs the AI crew on a set of test files found in `/python-package/tests/input`. Run it with:

```shell
./test.sh
```