Open · kariharju opened this issue 1 month ago
Comment by tonnitommi
Friday Mar 22, 2024 at 05:42 GMT
OpenAI GPT allowed return value size is around 3500 ASCII characters
It doesn't look like GPTs restrict this anymore (at least there is no error), but the GPT will not have access to the entire output if it is long. When asked what it can handle, a GPT responds:
The maximum length of text I can effectively handle in a single interaction depends on the context of the request. For processing and analyzing text within a single response, I work best with segments up to around 4096 tokens (words and punctuation marks). This limit helps ensure accuracy in my responses and analyses.
Note: the Action Server Toolkit for langchain already does truncation in its connector. Here.
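To illustrate the limits discussed above: a common rough heuristic for English text is about 4 ASCII characters per token, which lines up with ~3500 characters fitting inside a ~4096-token budget. The function names and the chars-per-token ratio below are assumptions for illustration, not an exact tokenizer:

```python
# Rough heuristic: ~4 ASCII characters per token for English text.
# This is an assumption for illustration, not a real tokenizer.
def estimate_tokens(text: str, chars_per_token: int = 4) -> int:
    """Estimate how many tokens a piece of text will consume."""
    return max(1, len(text) // chars_per_token)


def exceeds_limit(text: str, max_tokens: int = 4096) -> bool:
    """Check whether an action output likely exceeds the model's budget."""
    return estimate_tokens(text) > max_tokens
```

For a precise count one would use the model's actual tokenizer (e.g. OpenAI's tiktoken library), but a heuristic like this is cheap enough to run on every action response.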
What happened?
The current Action Server @action has no limit on return value size. OpenAI GPT, and large language models in general, work with limited-size tokenized text; OpenAI GPT's allowed return value size is around 3500 ASCII characters. Beyond that, the Action call simply errors out.
This means that actions returning overly large outputs will always fail from the caller's (client's) point of view.
I propose that the Action Server truncate outputs internally. The exact maximum size could be configurable in the Web UI.
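A minimal sketch of the proposed server-side truncation, written as a decorator that could wrap action results before they are returned to the client. The decorator name, the notice text, and the default limit are assumptions for illustration (the 3500-character figure comes from this issue; the real limit would come from the Web UI configuration):

```python
import functools

# From the issue: OpenAI GPT handles roughly 3500 ASCII characters.
# In the proposal, this value would be configurable in the Web UI.
DEFAULT_MAX_CHARS = 3500

# Appended so the model (and the user) can tell the output was cut.
TRUNCATION_NOTICE = "\n[... output truncated ...]"


def truncate_output(max_chars: int = DEFAULT_MAX_CHARS):
    """Hypothetical decorator: cap the size of string results of an action."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            result = func(*args, **kwargs)
            if isinstance(result, str) and len(result) > max_chars:
                # Leave room for the notice so the total stays within the cap.
                keep = max_chars - len(TRUNCATION_NOTICE)
                return result[:keep] + TRUNCATION_NOTICE
            return result
        return wrapper
    return decorator


@truncate_output(max_chars=100)
def big_action() -> str:
    """Stand-in for an @action that returns an oversized payload."""
    return "a" * 1000
```

With this sketch, `big_action()` returns exactly 100 characters ending in the truncation notice, so the client never sees an output larger than the configured cap.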
Reference implementation:
System Info