Closed restlessronin closed 10 months ago
Since there isn't a mock mode, I modified the 'Explain with context' template to remove the initial message (which gets sent automatically when the custom chat is opened). This means the full context is being sent with every message, which may not be the right thing to do. This functionality needs some testing and refinement.
I like it, but it duplicates existing functionality. You can just set the log level to debug and check the rubberduck console output.
I'm cool with adding this, but in general I prefer to have less code whenever possible to reduce the maintenance burden. Let me know if you want to proceed, considering that this is already possible (unless I'm missing something).
Ah. I didn't fully consider that. If there were a "Mock" AIClient, which didn't send an actual request (but did do the logging), then we could inspect the logs easily without ever sending a request.
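To make the idea concrete, a mock client along these lines could log the assembled prompt instead of calling the API. This is a hedged sketch: the `AIClient` interface, method name, and message shape here are assumptions for illustration, not taken from the rubberduck codebase.

```typescript
// Hypothetical message and client shapes (assumed, not rubberduck's actual API):
interface ChatMessage {
  role: string;
  content: string;
}

interface AIClient {
  generateChatCompletion(messages: ChatMessage[]): Promise<string>;
}

// A mock client: logs the exact prompt that would have been sent,
// then returns a canned reply so the chat UI still renders something.
class MockAIClient implements AIClient {
  constructor(private log: (message: string) => void) {}

  async generateChatCompletion(messages: ChatMessage[]): Promise<string> {
    for (const m of messages) {
      this.log(`[${m.role}] ${m.content}`);
    }
    return "(mock response - no request was sent)";
  }
}
```

Swapping this in behind the existing client interface would let us inspect the full prompt in the debug log without spending tokens on a real request.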
But copying and pasting would require opening the log, finding the right lines, and then possibly scrolling while copying, which would be OK for occasional prompt testing.
But if one is doing it a lot (especially for the second use case), there's something nice about getting exactly the right prompt in an editor that you can quickly copy and paste from.
I guess it depends on how much it's getting used. I was hoping to try this out with some complex custom prompts, and having a nice way to get the context so I can just copy / paste it was worth spending a couple of hours working on this PR.
But even for this use case, I would like to put some more work into gathering just the right context to send with the request. Have you considered the tree-sitter based strategies that some of the AI code agents use to put together a good context?
I saw you had a RAG option as well, but I haven't tried that. Does it work well?
Agree, it's easier this way. Re RAG: it's pretty experimental and clunky. Something more fancy like the strategies that you mentioned would be great to see. I have not explored it myself.
I have added a setting `rubberduck.openAI.surfacePromptForPlus` (default `false`) which enables a UI button on the chat title bar to extract the prompt and insert it into a new text editor. The UI button (the `codicon-eye`) is only visible when the setting is true. This enables us