chew-z opened 1 month ago
I encountered the same error. I found that the failure occurs whenever execution reaches the branch where `adjusted_max_tokens <= 0`. I think the error is caused by the logic in this code:
```python
async def generate_completion_from_openai(
    prompt: str, max_tokens: int = 5000
) -> Optional[str]:
    ...
    adjusted_max_tokens = min(
        max_tokens, 4096 - prompt_tokens - TOKEN_BUFFER
    )
    ...
    response = await openai_client.chat.completions.create(
        model=OPENAI_COMPLETION_MODEL,
        messages=[{"role": "user", "content": chunk}],
        max_tokens=adjusted_max_tokens,  # the error is here
        temperature=0.7,
    )
    ...
```
I am getting a negative `max_tokens` value in the API call. Probably a small bug: when `prompt_tokens + TOKEN_BUFFER` exceeds 4096, the `min()` expression goes negative and that value is passed straight to the API.
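To illustrate, here is a minimal sketch of the failure mode and one possible guard. The 4096 context size is taken from the snippet above; the `TOKEN_BUFFER` value and the `adjust_max_tokens` helper name are my own illustrative assumptions, not from the project:

```python
# Sketch only: MODEL_CONTEXT comes from the snippet above; TOKEN_BUFFER's
# actual value is unknown, so 200 is an assumed placeholder.
MODEL_CONTEXT = 4096
TOKEN_BUFFER = 200

def adjust_max_tokens(max_tokens: int, prompt_tokens: int) -> int:
    """Compute the completion budget, refusing to return a non-positive value."""
    adjusted = min(max_tokens, MODEL_CONTEXT - prompt_tokens - TOKEN_BUFFER)
    if adjusted <= 0:
        # For long prompts the min() expression goes negative; failing fast
        # (or chunking the prompt first) is better than passing a negative
        # max_tokens to the API, which rejects it.
        raise ValueError(
            f"Prompt too long: {prompt_tokens} tokens leave no completion "
            f"room within the {MODEL_CONTEXT}-token context."
        )
    return adjusted

# With a 1000-token prompt: min(5000, 4096 - 1000 - 200) = 2896, fine.
# With a 4000-token prompt: 4096 - 4000 - 200 = -104, which trips the guard.
```

So the fix is probably either to raise (or skip the call) when the budget is non-positive, or to split the prompt into smaller chunks before calling the API.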