irthomasthomas opened this issue 3 weeks ago
I always get caught out by package data! Try this:
[tool.setuptools.package-data]
llm_plugin_generator = ["**/*.xml"]
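Even with package-data configured so the XML files ship in the wheel, the code still has to locate them relative to the installed package rather than the current working directory. A minimal sketch using the stdlib's importlib.resources (the helper name here is hypothetical, not the plugin's actual code):

```python
from importlib.resources import files

def load_prompt(package: str, name: str) -> str:
    # Read a data file bundled inside an installed package,
    # regardless of where pip placed it on disk.
    # e.g. load_prompt("llm_plugin_generator",
    #                  "few_shot_prompt_llm_plugin_model.xml")
    return files(package).joinpath(name).read_text()
```

This works the same whether the package was installed editable from a checkout or from a wheel on PyPI, which is exactly the difference that bites here.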
@simonw Thanks for the tip but I'm still unable to solve this after a couple more hours trying. For now it only works by cloning the repo. I'm just going to circle back to it later, as I have a lot of ideas for plugins I want to try building with it, instead of spending the rest of the day on packaging. Cheers!
@simonw I can hardly believe it, but my agent was able to solve this! First I had to change the agent script to load only the claude plugin for itself, so that when the new plugin failed it wouldn't get blocked in its own thread. Then I let it run. It went round the block a few times and needed 55 turns! But I only had to prompt it once. I'm not sure it did the best thing, exactly, but it works. It looks like the main change was to move everything into __init__.py.
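The workaround of moving everything into __init__.py sidesteps packaging entirely: if the prompts live as string constants in the Python source, there is no data file to lose. A rough sketch of that approach (these names and contents are illustrative, not the plugin's actual code):

```python
# llm_plugin_generator/__init__.py -- illustrative sketch only.
# Embedding the XML prompt as a module-level constant means it
# always travels with the Python source, from a wheel or a checkout.
FEW_SHOT_PROMPT_MODEL = """\
<prompt>
  <instruction>Write an LLM model plugin.</instruction>
</prompt>
"""

def get_model_prompt() -> str:
    # No filesystem lookup needed at runtime.
    return FEW_SHOT_PROMPT_MODEL
```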
So give it a try:
llm install llm-plugin-generator
llm generate-plugin "Write an llm-cerebras plugin from these docs: $(curl -s https://raw.githubusercontent.com/irthomasthomas/llm-cerebras/refs/heads/main/.artefacts/cerebras-api-notes.txt)" \
--output-dir llm-cerebras --type model -m sonnet-3.5
@simonw
In fact it only took 14 turns to solve. The other 40 were caused by some agent confusion, which I think I have now fixed. The agent prompts the LLM with an error message if no command or final_response is received, and claude was taking this to mean an error in the llm plugin itself. So I've rephrased that message to be clearer.
This plugin works great if you clone the repo and run llm install -e . But when installed from PyPI, it cannot find the XML files.
LLMs tell me to follow this pattern:
llm-plugin-generator/
├── llm_plugin_generator/
│   ├── __init__.py
│   ├── llm_plugin_generator.py
│   ├── few_shot_prompt_llm_plugin_all.xml
│   ├── few_shot_prompt_llm_plugin_model.xml
│   └── few_shot_prompt_llm_plugin_utility.xml
├── pyproject.toml
└── README.md
This way the install succeeds, but the plugin does not show up in llm.
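An install that succeeds while the plugin stays invisible to llm usually points at the entry point rather than the files: llm discovers plugins through the "llm" entry-points group. A pyproject.toml pairing the layout above with both the entry point and the package data might look like this (a sketch, assuming setuptools; version and metadata fields are illustrative):

```toml
[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"

[project]
name = "llm-plugin-generator"
version = "0.1.0"

# llm looks for plugins registered under the "llm" entry-points group.
[project.entry-points.llm]
llm_plugin_generator = "llm_plugin_generator"

# Ship the XML prompt files alongside the Python source in the wheel.
[tool.setuptools.package-data]
llm_plugin_generator = ["*.xml"]
```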