mccoysc opened this issue 1 year ago
Please provide more detail.
The request structure for any ChatGPT query is as follows: n user queries (with the role attribute set to "user") plus n system prompts (with the role attribute set to "system"). System prompts are not visible to the user and do not count towards the token limit (ChatGPT has a per-query token limit, where a token is a segmented word in a sentence). If only user queries are submitted, ChatGPT responds using only its model knowledge and the current user query. However, if previous user queries and ChatGPT's responses are submitted along with the new query as system prompts, ChatGPT uses them (i.e. the previous conversation content) when responding, giving the impression that it knows what the user previously asked and how it answered, and avoiding inconsistencies. Therefore, to achieve this effect, all conversation content must be stored locally or elsewhere, and there must be a way to reset the current or a specified conversation, meaning "delete" the stored queries and responses.
System prompts (i.e. content with the "system" role attribute) were designed specifically for supplying prompt information, so there is no need to mix prompt information into the user queries.
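To make the request shape concrete, here is a minimal sketch (in Python, assuming the OpenAI chat completions endpoint and the third-party requests library) of storing the conversation locally, replaying it with each request, and resetting it by deleting the stored queries and responses. The helper names send and reset and the in-memory history list are illustrative, not part of this project; note that the official API normally replays prior model replies with the "assistant" role rather than as system prompts.

```python
import os
import requests

API_URL = "https://api.openai.com/v1/chat/completions"
API_KEY = os.environ["OPENAI_API_KEY"]

# Conversation content stored locally: previous user queries and responses.
history = []

def send(user_query):
    # Replay the stored turns together with the new user query.
    messages = history + [{"role": "user", "content": user_query}]
    resp = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"model": "gpt-3.5-turbo", "messages": messages},
    )
    answer = resp.json()["choices"][0]["message"]["content"]
    # Persist both sides of the exchange so the next request has context.
    history.append({"role": "user", "content": user_query})
    history.append({"role": "assistant", "content": answer})
    return answer

def reset():
    # "Reset" a conversation: delete the stored queries and responses.
    history.clear()
```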
Yes, I have a local branch that supports system prompts.
I will update this issue when the branch is fully ready. The problem is that, in practice, the system prompt does not work well with ChatGPT 3.5. With the v4 engine, system prompts work much better. But I still need v4 API access, so I have not been able to thoroughly test the code and get it ready.
I will keep this open for tracking. Thanks.
Additionally, a toggle should be designed to allow users to choose between conversation mode and single-query mode.
In conversation mode, if I first ask "2+3 equals what?" and then ask "what's my last question?", ChatGPT knows what I said. But in non-conversation mode, ChatGPT never knows what my last question was.
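As a rough illustration of such a toggle, building on the history list and constants from the sketch above (the flag name conversation_mode is hypothetical):

```python
conversation_mode = True  # False = single-query mode

def ask(user_query):
    # In conversation mode the stored turns are replayed with the request;
    # in single-query mode only the current question is sent, so a follow-up
    # like "what's my last question?" cannot be answered.
    context = list(history) if conversation_mode else []
    messages = context + [{"role": "user", "content": user_query}]
    resp = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"model": "gpt-3.5-turbo", "messages": messages},
    )
    answer = resp.json()["choices"][0]["message"]["content"]
    if conversation_mode:
        history.append({"role": "user", "content": user_query})
        history.append({"role": "assistant", "content": answer})
    return answer

# ask("2+3 equals what?")          # expected to answer 5
# ask("what's my last question?")  # answerable only in conversation mode
```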
Yes. I don't think that should be called "conversation mode", but the system prompt will be supported. Please come back to this issue when it is ready.
Do not add only the "system prompt"; add all user queries and ChatGPT responses to the system prompt, as the issue title says.
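A minimal sketch of what the title asks for, assuming the same locally stored history as in the earlier sketch: every stored user query and ChatGPT response is folded into a single system prompt that is sent ahead of the new query. The transcript formatting here is just one possible choice.

```python
def build_messages(user_query):
    # Serialize all stored queries and responses into one system prompt.
    transcript = "\n".join(
        f"{turn['role']}: {turn['content']}" for turn in history
    )
    system_prompt = (
        "Previous conversation between the user and the assistant:\n" + transcript
    )
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_query},
    ]
```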