C-Loftus / talon-ai-tools

Query LLMs and AI tools with voice commands
http://colton.place/talon-ai-tools/
MIT License

Adds a model format loop command #41

jaresty closed this 6 months ago

jaresty commented 6 months ago

If you input some desired behavior and then run model format loop on it, the model will try to generate a loop that solves the problem.

C-Loftus commented 6 months ago

I think we should probably change this to format code or something, since changing selected text into code is not specific to iteration or looping. I feel format loop is a bit too specific, and it implies we would also have format class or format switch or other programming constructs.

Alternatively, if you disagree, it also might be time at some point to change the repo so the prompt configuration is stored in an external file outside the current directory, so we don't have merge conflicts. But I'm not exactly sure how to do this at the moment, since we are statically reading in the file for the help menu, and there is no good way to append to a list that is already initialized without creating a capture wrapper over it.

jaresty commented 6 months ago

I do think it would be good to provide some additional programming constructs; I think providing the context that the expected output is a loop helps the LLM give a more targeted response. I'm not sure we need all constructs, but a conditional was one I was thinking would be good (switch is one kind of conditional too).

jaresty commented 6 months ago

A general purpose code one could be good too, though I'm not sure it will do so well at that beyond basic syntax.

jaresty commented 6 months ago

Here's a thought: what if we add both and try them out for a while, and if we discover that format loop and format code are roughly equivalent, we could deprecate format loop? That way we can test them out before making a final decision.

jaresty commented 6 months ago

That said, I do think there's also value in providing some mechanism for users to customize their own prompts too, which I think is what you were hinting at. I don't think the two things are tied together though.

C-Loftus commented 6 months ago

I would prefer to first merge a model convert/format code command, since I think this is relatively niche. GitHub Copilot is really much better at this, given that it can insert into the proper destinations and gain the context of your codebase, and you don't need to select text beforehand. I think format code is OK since it is not too specific, and we already have shell.

It's totally fine if you find format loop to be useful. I just think it is too specific to your setup to merge, since I don't think many people write the instructions down before they generate the code. I would prefer to encourage people to use the copilot integration. I know you said you don't use copilot, which is totally fair, but I think there are some open source alternatives that can be used instead.

It's possible that what we want to do is change the static list into a talon capture, such that the capture is a logical OR of the prompts in the static list and the user's custom prompt list. If you want to look into that, I would merge it as long as it doesn't break anything.
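A minimal sketch of the merge that capture would sit on top of, assuming a simple "spoken form: prompt text" file format kept outside the repo; the names here (BUILTIN_PROMPTS, load_custom_prompts, the file format itself) are hypothetical, not the repo's actual API, and in Talon the merged dict would back the capture rather than be used directly:

```python
# Hypothetical sketch: merge built-in prompts with a user prompt file kept
# outside the repo, so user additions never cause merge conflicts.
from pathlib import Path

# Stand-in for the repo's static prompt list (names are illustrative).
BUILTIN_PROMPTS = {
    "fix grammar": "Fix any grammatical errors in the following text:",
    "shell": "Convert the following description into a shell command:",
}

def load_custom_prompts(path: Path) -> dict[str, str]:
    """Parse a 'spoken form: prompt text' file; missing file means no extras."""
    prompts: dict[str, str] = {}
    if not path.exists():
        return prompts
    for line in path.read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        spoken, _, prompt = line.partition(":")
        if prompt:
            prompts[spoken.strip()] = prompt.strip()
    return prompts

def all_prompts(custom_path: Path) -> dict[str, str]:
    # Custom prompts win on name collisions, mirroring the "logical OR"
    # of the static list with the user's custom list.
    return {**BUILTIN_PROMPTS, **load_custom_prompts(custom_path)}
```

In Talon itself the merged mapping would feed a capture whose rule ORs the two lists, so user prompts become speakable without editing the checked-in file.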

jaresty commented 6 months ago

We discussed on the phone and I think we arrived at a good compromise:

This way it doesn't collide with the model format commands and makes it fairly simple to see and experiment with the code generation ones.

C-Loftus commented 6 months ago

Looks good, thanks. If underlying model changes affect this at all going forward, let me know and we can discuss pinning model versions in more depth.