I was having some long conversations with GPT-4-turbo, which has a 128K context window.
Then I received an error message saying the input is too long. After some debugging, I found that title generation always uses GPT-3.5 (which is great!) but passes in the full input message.
I guess for the purpose of generating the title, simply truncating the input would be enough :-)
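A minimal sketch of what I mean, assuming the conversation is a list of role/content message dicts (the names `MAX_TITLE_INPUT_CHARS` and `truncate_for_title` are just illustrative, not the project's actual API):

```python
# Rough character budget that stays well under GPT-3.5's context window.
# (Hypothetical constant; a token-based limit would be more precise.)
MAX_TITLE_INPUT_CHARS = 4000

def truncate_for_title(messages):
    """Join the conversation and keep only the beginning for title generation.

    The start of a conversation is usually enough to produce a good title,
    so dropping the tail avoids the context-length error entirely.
    """
    text = "\n".join(m["content"] for m in messages)
    return text[:MAX_TITLE_INPUT_CHARS]

# Example: a very long conversation gets cut down before the title call.
messages = [{"role": "user", "content": "x" * 200_000}]
prompt = truncate_for_title(messages)
print(len(prompt))  # stays within the budget
```

A fancier fix could truncate on token counts instead of characters, but even this crude cut would prevent the error.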