thedch opened 1 year ago
The chat seems to lose track of its context after a certain number of messages, which suggests it may automatically keep only the n
most recent messages in its context. At the same time, though, response quality seems to degrade as the conversation grows longer and the token count increases, which would suggest it does not automatically drop older messages from its context. I'm not sure which it does.
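The two behaviors being speculated about can be sketched side by side. This is purely a hypothetical illustration of the two trimming strategies, not confirmed Copilot behavior; the function names and the word-count stand-in for a tokenizer are my own assumptions.

```python
# Hypothetical sketch of two context-trimming strategies a chat client
# might use. Neither is confirmed Copilot behavior.

def trim_last_n(messages, n):
    """Strategy (a): keep only the n most recent messages."""
    return messages[-n:]

def trim_by_token_budget(messages, budget, count_tokens=lambda m: len(m.split())):
    """Strategy (b): keep the longest suffix of messages whose total
    token count fits within `budget`. A simple word count stands in
    for a real tokenizer here."""
    kept, total = [], 0
    for msg in reversed(messages):          # walk from newest to oldest
        cost = count_tokens(msg)
        if total + cost > budget:
            break
        kept.append(msg)
        total += cost
    return list(reversed(kept))             # restore chronological order

history = [
    "how do I sort a list?",
    "use sorted()",
    "what about in place?",
    "use list.sort()",
]
print(trim_last_n(history, 2))            # strategy (a): 2 newest messages
print(trim_by_token_budget(history, 8))   # strategy (b): newest messages within budget
```

The symptoms described above are consistent with either strategy, which is why it is hard to tell from the outside which one (if any) is in use.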
I would also really appreciate some clarification on what context is included in the prompts. Should I be deleting my older messages in the conversation to keep the context token count down, or not?
@microsoft why did my comment get deleted? Could you at least let me know what my mistake was?
The Copilot Chat extension settings do not seem to allow changing the model. It would be great to be able to optionally select a larger / slower / more expensive model.
Unrelated: as others have noted, it would be great to have a better sense of what context is available to the model. Currently it is unclear, which makes it difficult to know how to frame my queries.