Closed mlarhrouch closed 1 year ago
Can the token be set to 4096?
https://github.com/mlarhrouch/azure-pipeline-gpt-pr-review/blob/main/GPTPullRequestReview/review.ts#L86 https://platform.openai.com/docs/guides/chat/managing-tokens
Originally posted by @ChengYen-Tang in https://github.com/mlarhrouch/azure-pipeline-gpt-pr-review/issues/1#issuecomment-1491673498
The max_tokens parameter specifies the maximum number of tokens ChatGPT can use when answering, and the token count from our prompt plus the tokens from ChatGPT's response shouldn't exceed 4096.
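To illustrate the budget described above, here is a minimal TypeScript sketch (hypothetical helper names, not the plugin's actual code) that derives a safe max_tokens value from an estimated prompt size. It assumes a 4096-token context window and uses a crude characters-per-token heuristic; a real implementation would count tokens with a proper tokenizer such as tiktoken.

```typescript
// The model's context window is shared between the prompt and the completion,
// so max_tokens should be at most 4096 minus the prompt's token count.
const CONTEXT_WINDOW = 4096; // gpt-3.5-turbo context limit at the time

// Crude estimate: roughly 4 characters per token for English text.
// This is an approximation only; use a real tokenizer for accuracy.
function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4);
}

// Pick a max_tokens value that keeps prompt + completion within the window.
function completionBudget(prompt: string): number {
  const promptTokens = estimateTokens(prompt);
  return Math.max(0, CONTEXT_WINDOW - promptTokens);
}

const prompt = "Review this pull request diff: ...";
console.log(completionBudget(prompt)); // remaining tokens for the reply
```

The value returned by completionBudget would then be passed as max_tokens in the chat completion request, rather than hard-coding 4096.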
OK, thanks