microsoft / vscode-copilot-release

Feedback on GitHub Copilot Chat UX in Visual Studio Code.
https://marketplace.visualstudio.com/items?itemName=GitHub.copilot-chat

Option to select bigger (gpt4?) model #266

Open · thedch opened this issue 1 year ago

thedch commented 1 year ago

The Copilot Chat extension settings do not seem to allow changing the model. It would be great to optionally select a larger / slower / more expensive model.

[Screenshot: Copilot Chat extension settings, 2023-06-20]

Unrelated, but as others have noted, it would be great to have a better sense of what context is available to the model. Currently it is unclear, which makes it difficult to know how to frame my queries.

knoll3 commented 11 months ago

The chat seems to lose track of its context after a certain number of messages, which would suggest it automatically keeps only the n latest messages in its context. At the same time, though, the quality of the responses seems to degrade as the conversation grows longer and the token count increases, which would suggest it does not clear older messages from its context. I'm not sure which it does.

I would also really like some clarification on what context is being included in the prompts. Should I be clearing my older messages in the conversation to keep the context token length down or not?
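For illustration, a sliding-window strategy like the one hypothesized above might look roughly like the sketch below. This is purely an assumption on my part, not the extension's actual implementation; the `MAX_CONTEXT_TOKENS` budget, the `estimateTokens` heuristic, and the `trimContext` helper are all made up for the example.

```typescript
// Sketch of a sliding-window context strategy: keep only the most recent
// messages that fit within a fixed token budget. Hypothetical values only.

interface ChatMessage {
  role: "user" | "assistant";
  content: string;
}

const MAX_CONTEXT_TOKENS = 4096; // hypothetical budget

// Rough token estimate: ~4 characters per token (a common heuristic).
function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4);
}

// Walk the history from newest to oldest, keeping messages until the
// budget is exhausted; older messages are silently dropped.
function trimContext(history: ChatMessage[]): ChatMessage[] {
  const kept: ChatMessage[] = [];
  let used = 0;
  for (let i = history.length - 1; i >= 0; i--) {
    const cost = estimateTokens(history[i].content);
    if (used + cost > MAX_CONTEXT_TOKENS) break;
    kept.unshift(history[i]);
    used += cost;
  }
  return kept;
}
```

If the extension does something like this, manually deleting old messages would make little difference; if it doesn't, keeping conversations short would matter. Clarification from the team would help.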

RealityMoez commented 11 months ago

@microsoft why was my comment deleted? Could I at least know what my mistake was?