rubberduck-ai / rubberduck-vscode

Use AI-powered code edits, explanations, code generation, error diagnosis, and chat in Visual Studio Code with the official OpenAI API.
https://marketplace.visualstudio.com/items?itemName=Rubberduck.rubberduck-vscode
MIT License

Surface a way to extract the prompt text for use with OpenAI Plus web chat #111

Closed · restlessronin closed this 10 months ago

restlessronin commented 10 months ago

I have added a setting rubberduck.openAI.surfacePromptForPlus (default false) that enables a UI button in the chat title bar to extract the prompt and insert it into a new text editor. The button (the eye codicon) is only visible when the setting is true.

This enables us to:

  1. easily experiment with prompts in the web UI
  2. use the OpenAI Plus web chat instead of the API with a VS Code project. This may be what this issue is asking for. It becomes more useful as the context being passed gets more intelligent and the large-context models are used.
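For reference, a setting-gated title-bar button like this is typically wired up in the extension's package.json contributions. The fragment below is a sketch of that pattern; the command id `rubberduck.showPrompt` and the `view/title` menu location are illustrative, not necessarily the identifiers this PR actually uses:

```json
{
  "contributes": {
    "configuration": {
      "properties": {
        "rubberduck.openAI.surfacePromptForPlus": {
          "type": "boolean",
          "default": false,
          "description": "Show a button that extracts the current prompt into a new editor."
        }
      }
    },
    "menus": {
      "view/title": [
        {
          "command": "rubberduck.showPrompt",
          "when": "config.rubberduck.openAI.surfacePromptForPlus",
          "group": "navigation"
        }
      ]
    }
  }
}
```

The `when` clause uses VS Code's `config.*` context keys, so the button appears and disappears as the setting is toggled, with no extra activation code.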
restlessronin commented 10 months ago

Since there isn't a mock mode, I modified the 'Explain with context' template to remove the initial message (which is sent automatically when the custom chat is opened). This means the full context is sent with every message, which may not be the right thing to do. This functionality needs more testing and refinement.

lgrammel commented 10 months ago

I like it, but it duplicates existing functionality. You can just set the log level to debug and check the rubberduck console output:

(screenshot: Rubberduck console output with the log level set to debug)

I'm cool with adding this, but in general I prefer to have less code whenever possible to reduce the maintenance burden. Let me know if you want to proceed, considering that this is already possible (unless I'm missing something).

restlessronin commented 10 months ago

Ah. I didn't fully consider that. If there were a "Mock" AIClient, which didn't send an actual request (but still did the logging), we could inspect the logs easily without sending out a request first.
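A mock client along those lines could be a thin stand-in for the real one. This is a minimal sketch, assuming the extension's client exposes something like a `generateChatCompletion(prompt)` method; the actual interface name and signature in rubberduck-vscode may differ:

```typescript
// Assumed shape of the extension's AI client (hypothetical, for illustration).
interface AIClient {
  generateChatCompletion(prompt: string): Promise<string>;
}

// Records every fully assembled prompt instead of calling the OpenAI API,
// so the exact prompt can be inspected or copied without sending a request.
class MockAIClient implements AIClient {
  readonly loggedPrompts: string[] = [];

  async generateChatCompletion(prompt: string): Promise<string> {
    // Runs synchronously before any await, so the prompt is captured
    // as soon as the call is made.
    this.loggedPrompts.push(prompt);
    return "[mock response: no request was sent]";
  }
}
```

Swapping this in behind a setting would make prompt inspection free of API calls entirely.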

But copying and pasting would require opening the log, finding the right lines, and then possibly scrolling while copying. That would be OK for occasional prompt testing.

But if one is doing it a lot (especially for the second use case), there's something nice about getting exactly the right prompt into an editor that you can quickly copy and paste from.

I guess it depends on how much it gets used. I was hoping to try this out with some complex custom prompts, and having a convenient way to get the context for copy/paste was worth spending a couple of hours on this PR.

But even for this use case, I would like to put some more work into gathering exactly the right context to send with the request. Have you considered the tree-sitter-based strategies that some of the AI code agents use to put together a good context?

I saw you had a RAG option as well, but I haven't tried that. Does it work well?

lgrammel commented 10 months ago

Agree, it's easier this way. Re RAG: it's pretty experimental and clunky. Something fancier, like the strategies you mentioned, would be great to see. I have not explored it myself.