# llm-plugin-generator

LLM plugin to generate plugins for LLM.
## Installation

Install this plugin in the same environment as LLM:

```bash
llm install llm-plugin-generator
```
## Usage

To generate a new LLM plugin, use the `generate-plugin` command:

```bash
llm generate-plugin "Description of your plugin"
```
### Options

- `PROMPT`: description of your plugin (optional)
- `INPUT_FILES`: path(s) to input README or prompt file(s) (optional, multiple allowed)
- `--output-dir`: directory to save generated plugin files (default: current directory)
- `--type`: type of plugin to generate (`default`, `model`, or `utility`)
- `--model`, `-m`: model to use for generation

`--type model` uses a few-shot prompt focused on LLM model plugins, and `--type utility` focuses on utility plugins. Leaving off `--type` uses a default prompt that combines all of them. Picking one of the focused options is recommended, as it should be faster.
### Examples

Basic usage:

```bash
llm generate-plugin "Create a plugin that translates text to emoji" --output-dir ./my-new-plugin --type utility --model gpt-4
```
Using a prompt and input files, generating a plugin from a README.md:

```bash
llm generate-plugin "Few-shot Prompt Generator. Call it llm-few-shot-generator" \
  'files/README.md' --output-dir plugins/Utilities/few-shot-generator \
  --type utility -m claude-3.5-sonnet
```
Using websites or remote files:

```bash
llm generate-plugin "Write an llm-cerebras plugin from these docs: $(curl -s https://raw.githubusercontent.com/irthomasthomas/llm-cerebras/refs/heads/main/.artefacts/cerebras-api-notes.txt)" \
  --output-dir llm-cerebras --type model -m sonnet-3.5
```
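The same remote-docs pattern can be done without shelling out to `curl`, fetching the reference text with the Python standard library and passing it as the prompt. This is a sketch mirroring the example above; the commented-out `subprocess` call shows where the actual CLI invocation would go:

```python
import subprocess
import urllib.request


def fetch_text(url: str) -> str:
    """Download a remote text file, e.g. API notes to base the plugin on."""
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8")


docs_url = (
    "https://raw.githubusercontent.com/irthomasthomas/llm-cerebras/"
    "refs/heads/main/.artefacts/cerebras-api-notes.txt"
)
# prompt = "Write an llm-cerebras plugin from these docs: " + fetch_text(docs_url)
# subprocess.run(
#     ["llm", "generate-plugin", prompt,
#      "--output-dir", "llm-cerebras", "--type", "model", "-m", "sonnet-3.5"],
#     check=True,
# )
```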
This generates a new LLM plugin based on the provided description and/or input files, and saves the files in the specified output directory.
## Development

To set up this plugin locally, first check out the code, then create a new virtual environment:

```bash
cd llm-plugin-generator
python -m venv venv
source venv/bin/activate
```

Now install the dependencies and test dependencies:

```bash
pip install -e '.[test]'
```

To run the tests:

```bash
pytest
```
## Contributing

Contributions to llm-plugin-generator are welcome! Please refer to the GitHub repository for more information on how to contribute.

## License

This project is licensed under the Apache License 2.0. See the LICENSE file for details.