microsoft / vscode

Visual Studio Code
https://code.visualstudio.com

Chat variables API #212995

Open isidorn opened 5 months ago

isidorn commented 5 months ago

Proposal dts: https://github.com/microsoft/vscode/blob/roblou/chat-references/src/vscode-dts/vscode.proposed.chatVariableResolver.d.ts

Sample: https://github.com/microsoft/vscode-extension-samples/tree/main/chat-sample
Docs: https://code.visualstudio.com/api/extension-guides/chat#variables

Extension authors can subscribe to this issue to get updates about the proposed Variables API. There may still be breaking changes coming, and we will post here whenever a breaking change is made.

We are very interested in feedback about how you might use this API.

rvanider commented 5 months ago

With the latest 1.90.0 and the chatVariableResolver API there appear, from my usage, to be a few oddities.

In general, I would like to have the URI present whenever possible.

For participant provided variables it would be helpful to have the following:

In my particular use case I have many variables (as many as 200-300) per workspace that I want to make available, but the compute effort to provide them is high - using the chat completions is preferable to an interactive selector (such as #file). Effectively, it is a decorated/augmented variant of a DocumentSymbol that I am most interested in being able to relay through the chat reference/variables.

I am hopeful that I can build a user experience where the user effectively "points" to some element (per above) and issues commands such as /review, /revise, /audit, /explain where some of these are simple while others effectively kick off an interactive revision and exploration session.
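A handler for that flow is easy to sketch against the chat participant API; everything below (the participant id, the command names, the prompt assembly) is illustrative rather than a real implementation:

```ts
import * as vscode from 'vscode';

// Sketch of a participant that branches on the slash command the user issued.
// '/review', '/audit' and '/explain' are treated as one-shot requests, while
// '/revise' would start a longer, stateful revision session.
const handler: vscode.ChatRequestHandler = async (request, _context, stream, _token) => {
    switch (request.command) {
        case 'review':
        case 'audit':
        case 'explain':
            stream.progress(`Running ${request.command}...`);
            // resolve the referenced DSL element, build a prompt, stream the answer
            break;
        case 'revise':
            // keep state across turns for an interactive revision/exploration session
            break;
        default:
            stream.markdown('Try /review, /revise, /audit or /explain.');
    }
};

export function activate(context: vscode.ExtensionContext) {
    // 'my-dsl.assistant' is a hypothetical participant id; it and its commands
    // would also need to be declared under contributes.chatParticipants in package.json.
    context.subscriptions.push(vscode.chat.createChatParticipant('my-dsl.assistant', handler));
}
```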

roblourens commented 5 months ago

> selection does not appear in request.references consistently; after it shows up, it is consistently there (sorry, no recreation steps)

If #selection is in the query, it should always show up in request.references. If you see a case where it doesn't, can you open an issue with more details?

> selection: if there is a selection in an active editor and you switch to an editor which does not have a selection, the behaviour appears to become #editor

Yeah, it does have this fallback to #editor (visible code), I'm not sure whether to keep that, but the idea is to have one variable that can mean "the code I'm looking at"

> selection has the range but does not identify the file the selection is from

I think I will change selection to actually report a Location instead of just a string
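For a participant consuming that reference, a small sketch of handling both shapes - the current plain string, and a vscode.Location should that change land (the helper name is made up for illustration):

```ts
import * as vscode from 'vscode';

// Hypothetical helper for a chat handler: render a #selection reference whether
// its value arrives as a bare string (current behaviour) or as a Location.
async function renderSelection(ref: vscode.ChatPromptReference): Promise<string> {
    if (ref.value instanceof vscode.Location) {
        const doc = await vscode.workspace.openTextDocument(ref.value.uri);
        const text = doc.getText(ref.value.range);
        return `Selection from ${ref.value.uri.fsPath}:\n${text}`;
    }
    // Current behaviour: a string with no file identity attached.
    return String(ref.value);
}
```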

> editor would be more helpful if it included the document uri

Same here

> well-defined conventions for id and name, since we are expected to parse the prompt to extract the variables and replace them with the actual context

Not really - the reference in the prompt includes the range where the reference appears in the prompt, and you can use that to replace it or whatever you want. VS Code is responsible for parsing parts of the prompt.
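As a concrete sketch, assuming each reference carries its [start, end) offsets into request.prompt (the range property is optional, so references without one are skipped):

```ts
import * as vscode from 'vscode';

// Replace each variable reference in the user prompt with rendered content,
// using the offsets VS Code supplies on ChatPromptReference. Iterating from the
// end keeps earlier offsets valid after each splice.
function inlineReferences(
    request: vscode.ChatRequest,
    render: (ref: vscode.ChatPromptReference) => string
): string {
    let prompt = request.prompt;
    const withRanges = request.references
        .filter(ref => ref.range !== undefined)
        .sort((a, b) => b.range![0] - a.range![0]);
    for (const ref of withRanges) {
        const [start, end] = ref.range!;
        prompt = prompt.slice(0, start) + render(ref) + prompt.slice(end);
    }
    return prompt;
}
```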

> guidance (and example) on participant specific variables

We will need to add this to the chat sample extension as well.

Interesting use-case, thanks. Is this for data from an existing extension? Would you want your variable to be available to any chat participant, or only your own chat participant?

rvanider commented 5 months ago

It is for an existing in-house extension that manages a custom DSL. There are a couple of different focused participants (development aids, audit and quality) that we want to explore. The participants rely heavily on the original extension for model content.

The variables are really only useful if you are aware of the additional context, so there is little value in cluttering the other participants with a large symbol set. We have played with having the participant prime the conversation with the @workspace participant, since we can make better relevance selections.

Exposure to other participants would be useful, but only if we could delay value rendering until the point of consumption and know which participants we were rendering for, so either generic or specific content could be provided. Right now, it is a bit frustrating that we have to supply a populated value prior to actual use. At one point, it appeared like the API was going to defer resolution to a fully fleshed-out result, but that currently isn't the case. Perhaps I can fake it with a getter, but it doesn't feel quite right.
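The getter idea mentioned above could look roughly like this; it is purely a hypothetical workaround for deferring an expensive computation until something actually reads the value, not part of the API:

```ts
// Hypothetical lazy-value wrapper: the expensive computation only runs the
// first time .value is read, and the result is cached for later reads.
function lazyValue(compute: () => string): { readonly value: string } {
    let cached: string | undefined;
    return {
        get value() {
            cached ??= compute();
            return cached;
        }
    };
}
```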

I will keep an eye on the selection issue and if I see a clear pattern I will create an issue.

Yeah, I missed the range pointing at the prompt content itself - my bad.

Thanks for the work so far, very promising stuff.

isidorn commented 5 months ago

@rvanider thanks for sharing more details. Can you share the company name? If you do not want to disclose it publicly, you can also reach out to inikolic@microsoft.com. I am very curious about your use case. Do you plan to publish your extension to the VS Marketplace, or keep it internal?

> We will need to add this to the chat sample extension as well.

The chat sample had a cat_context variable, but we removed it because the API is still proposed. If you want to see what we had, you can check it out here: https://github.com/microsoft/vscode-extension-samples/commit/cfec264062a99a4ee247bff5a1cda4360bce9301#diff-9e4bad629c61f35e9282721dfb0f4145256183536f2d08424ecc35667193b98e
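For anyone who just wants the gist without reading the diff, here is a rough sketch of that kind of registration, assuming the early registerChatVariableResolver(name, description, resolver) shape from the proposal; the linked .d.ts is the source of truth and may have changed since:

```ts
import * as vscode from 'vscode';

// Rough sketch of a cat_context-style variable under the proposed
// chatVariableResolver API. Requires "enabledApiProposals": ["chatVariableResolver"]
// in package.json, and the exact signature may differ from the current proposal.
export function activate(context: vscode.ExtensionContext) {
    context.subscriptions.push(
        vscode.chat.registerChatVariableResolver('cat_context', 'The state of mind of the cat', {
            resolve(name, _context, _token) {
                if (name === 'cat_context') {
                    const mood = Math.random() > 0.5 ? 'happy' : 'grumpy';
                    return [
                        { level: vscode.ChatVariableLevel.Short, value: mood },
                        { level: vscode.ChatVariableLevel.Full, value: `The cat is feeling ${mood} today.` }
                    ];
                }
            }
        })
    );
}
```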

frblondin commented 5 months ago

There is no way to know what data from a variable is actually sent to the LLM. While the UI will probably expose this better in the future, would it be possible to simply add logs in the meantime, as suggested in #1251?

roblourens commented 5 months ago

The chat participant can reply with a reference which says what data it's using for its prompt. But VS Code shouldn't take a role in this automatically; only the chat participant knows what was actually used.
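Concretely, a participant can do that through the response stream, for example along these lines (a sketch assuming the referenced values resolve to Uri or Location):

```ts
import * as vscode from 'vscode';

// Sketch: report which referenced data actually went into the prompt by
// attaching references to the response stream, then stream the answer.
const handler: vscode.ChatRequestHandler = async (request, _context, stream, _token) => {
    for (const ref of request.references) {
        if (ref.value instanceof vscode.Uri || ref.value instanceof vscode.Location) {
            stream.reference(ref.value); // surfaced in the references section of the response UI
        }
    }
    stream.markdown('...model response built from the data referenced above...');
};
```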