Open · lukka opened 9 months ago
@lukka thanks for filing this issue and for driving the language service integration in chat as we both learn how best to do this.
@jrieken and I just discussed this, and we like the approach of adding #cpp-context. We think it is up to us in VS Code to add intent detection that automatically attaches related variables. Each variable should have a description when it is registered, and we should use these descriptions for intent detection purposes. So in the short term we suggest that you continue going down the #cpp-context variable path, and we will think about how to make this smoother for users.
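For reference, a minimal sketch of registering #cpp-context with a description that intent detection could later key off, written against the proposed chatVariableResolver API as it looked at the time (the API is proposed and may change; `getCppContext` is a hypothetical placeholder for the real language-service query):

```typescript
import * as vscode from 'vscode';

export function activate(context: vscode.ExtensionContext) {
  // The description is the hook VS Code could later use for intent
  // detection, so it should say *when* the variable is relevant.
  context.subscriptions.push(
    vscode.chat.registerChatVariableResolver(
      'cpp-context',
      'C++ language service context (language standard, compiler switches, ' +
        'relevant symbols) for the active document',
      {
        resolve(_name, _context, _token) {
          // Hypothetical helper standing in for the C++ language
          // service query against the active document.
          const value = getCppContext();
          return [{ level: vscode.ChatVariableLevel.Full, value }];
        },
      }
    )
  );
}

// Hypothetical placeholder: the real implementation would pull this
// from the C++ Language Service.
function getCppContext(): string {
  return '/* language standard, flags, types in scope, ... */';
}
```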
@isidorn just checking -- do you plan to have automatic intent detection before moving the chat API out of the proposed state, or will intent detection come later?
Very unlikely, and even if we ship this we would likely allow users to disable it.
@isidorn @jrieken do you have an approximate ETA for landing the "variable pulled by intent detection" feature? E.g., without being exact, does it align with any milestone? It would be helpful to have at least a time frame for such a feature.
Did we consider the proposal in the original message? That is:
One idea is to extend the interaction between the @workspace agent and the language service, by adding a request of the "context" to the currently active language service for the active document, totally in a language-agnostic manner. This would extend the concept of providing additional prompt context to other languages.
My thoughts are:
@benmcmorran @esweet431 FYI
No ETA yet. We are heads-down focused on finalising the Chat and Language Model APIs. Once they are finalised (end of March) we will consider other features, and I expect us to have more details about "variable intent detection" then.
@roblourens @jrieken @isidorn when the variable is mentioned, the context should come from the active document where the selection currently lives. I am currently using vscode.window.activeTextEditor?.document to fetch the right context, and I wonder if it would be beneficial to have that expressed in the ChatVariableContext instead.
@hnrqbaggio @benmcmorran @spebl FYI
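To make the ask concrete, here is a sketch of the current workaround next to a hypothetical extension of ChatVariableContext (the `document` and `selection` fields below do not exist on the real type; they are invented for illustration):

```typescript
import * as vscode from 'vscode';

// Today: the resolver has to reach for global editor state, which can
// change between the user submitting the prompt and the variable
// being resolved.
function currentDocument(): vscode.TextDocument | undefined {
  return vscode.window.activeTextEditor?.document;
}

// Hypothetical: if ChatVariableContext carried the source location,
// the resolver would not depend on global state at all.
interface ChatVariableContextWithLocation {
  prompt: string;
  document?: vscode.TextDocument; // invented field
  selection?: vscode.Selection;   // invented field
}
```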
Problem
The C++ language has evolved through multiple standard versions (C++98, C++11, C++14, C++17, C++23), with differences in syntax, style, and practices. Similarly, actual C++ code can vary significantly based on the libraries and frameworks used and on the target platforms, which offer different APIs and dictate different coding conventions and idioms. Compiler switches (e.g., disabling exceptions) also affect which language features are available.
Experiments with ChatAgent2 APIs
Experiments with a new chat variable, #cpp-context, have been run on C++ codebases by manually asking Copilot Chat various questions. The value of #cpp-context is populated with data pulled from the C++ Language Service, which is aware of that information for (but not limited to) the active .cpp/.h document. The answers have been compared with and without the #cpp-context variable: the results showed higher accuracy, since the provided code snippets and suggestions are applicable and tailored to the underlying C++ codebase. As an example, when #cpp-context is used in a question, the following is provided in the prompt by the #cpp-context chat variable (the content depends on the codebase):
Feature Request
We are open to suggestions about how to let users leverage the #cpp-context variable without typing it in every question. One idea is to extend the interaction between the @workspace agent and the language service, by adding a request of the "context" to the currently active language service for the active document, totally in a language-agnostic manner. This would extend the concept of providing additional prompt context to other languages.
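The idea above could be sketched as a language-agnostic provider interface. Note that nothing below exists in VS Code; the interface and registration function are invented purely to illustrate the proposal:

```typescript
import * as vscode from 'vscode';

// Hypothetical provider: a language extension returns a prompt-ready
// context string for a given document (language standard, compiler
// switches, symbols in scope, ...).
interface ChatContextProvider {
  provideChatContext(
    document: vscode.TextDocument,
    token: vscode.CancellationToken
  ): vscode.ProviderResult<string>;
}

// Hypothetical registration point, analogous to other language
// feature providers. The C++ extension would register for its
// languages; other languages could do the same.
declare function registerChatContextProvider(
  selector: vscode.DocumentSelector,
  provider: ChatContextProvider
): vscode.Disposable;

// @workspace would then query the provider for the active document
// before building the prompt, with no language-specific knowledge.
```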