Closed: dhlidongming closed this pull request 1 month ago
@dhlidongming I'd be curious to better understand the use case here before merging this. Is there a reason to prefer this rather than starting new chats?
@sestinj I noticed that after adding the `defaultContext` parameter, each conversation can accumulate a large number of tokens, and users may not be accustomed to opening new chats frequently. Adding this parameter provides an alternative way to control context length, in addition to the `contextLength` parameter.
Ok, that's a very good point! I think there might actually be a better solution here. I hadn't thought about this before, but the defaultContext should probably be placed into the system message to avoid being sent with every chat request. Is this something you would be interested in implementing? Otherwise, I think I'm going to close this in favor of doing that to solve the root of the problem.
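A rough sketch of what that could look like. Note that `ChatMessage` and `buildMessages` here are hypothetical names for illustration, not Continue's actual API: the idea is simply to fold `defaultContext` into the system message once, rather than repeating it with every request.

```typescript
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Hypothetical sketch: append defaultContext to the system message a single
// time, so it is not re-sent as part of every user turn.
function buildMessages(
  defaultContext: string | undefined,
  systemMessage: string,
  history: ChatMessage[],
): ChatMessage[] {
  const system = defaultContext
    ? `${systemMessage}\n\n${defaultContext}`
    : systemMessage;
  return [{ role: "system", content: system }, ...history];
}
```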
Thank you! That’s a great idea, but I don’t have much time to dedicate to it at the moment. I'm looking forward to seeing this new change though.
Description
This pull request introduces a new parameter, numHistory, in the completionOptions. The purpose of this parameter is to specify the number of recent messages to include in the input context when generating completions.
Checklist

- [x] The base branch of this PR is `dev`, rather than `main`
Testing
1. Add the following configuration in `config.json`:

   ```json
   { "completionOptions": { "numHistory": 3 } }
   ```

2. During interactions, the LLM's input context consists of only the most recent 3 messages, as specified by the `numHistory` parameter.
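The trimming behavior described above can be sketched as follows. This is a minimal illustration of the idea, not the PR's actual implementation; `ChatMessage` and `applyNumHistory` are hypothetical names:

```typescript
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Hypothetical sketch of a numHistory option: keep only the most recent
// N messages of the conversation before sending it to the LLM. When the
// option is unset, the full history passes through unchanged.
function applyNumHistory(
  messages: ChatMessage[],
  numHistory?: number,
): ChatMessage[] {
  if (numHistory === undefined || messages.length <= numHistory) {
    return messages;
  }
  return messages.slice(-numHistory);
}
```

With `numHistory: 3`, a five-message history would be cut down to its last three messages.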