**Open** · Infracloud-harsh opened this pull request 7 months ago
CodeRabbit
This update enhances the GitHub Actions workflow and README file, introduces a new `splitPrompt` function with its test suite for handling large prompts, and improves the handling of large files in the `codeReview` and `doSummary` functions. The changes aim to optimize performance and maintainability.
| Files | Summary |
|---|---|
| `.github/workflows/openai-review.yml`, `README.md` | Updated the concurrency group configuration in the GitHub Actions workflow and restructured the README file for better organization. |
| `__tests__/tokenizer.test.ts`, `src/tokenizer.ts` | Introduced a new `splitPrompt` function that splits a prompt into multiple pieces based on a maximum token count, along with a comprehensive test suite. |
| `src/inputs.ts`, `src/prompts.ts`, `src/review.ts` | Added a new `fileIndex` property to the `Inputs` class and improved the handling of large files and prompts in the `codeReview` and `doSummary` functions. |
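As a rough illustration of what a `splitPrompt` helper like the one in `src/tokenizer.ts` could look like, here is a minimal sketch. It is not the PR's actual implementation: the real function presumably counts tokens with the project's tokenizer, while this sketch approximates a token count with whitespace-separated words and groups lines so that each piece stays under the limit.

```typescript
// Hypothetical approximation: the real tokenizer.ts likely uses a
// proper encoder; here a "token" is a whitespace-separated word.
function countTokens(text: string): number {
  return text.split(/\s+/).filter(Boolean).length;
}

// Split a prompt into pieces, each at most maxTokens tokens,
// breaking only at line boundaries.
function splitPrompt(maxTokens: number, prompt: string): string[] {
  const pieces: string[] = [];
  let current = "";
  for (const line of prompt.split("\n")) {
    const candidate = current ? `${current}\n${line}` : line;
    if (countTokens(candidate) > maxTokens && current) {
      // Current piece is full; start a new one with this line.
      pieces.push(current);
      current = line;
    } else {
      // A single oversized line is kept whole rather than dropped.
      current = candidate;
    }
  }
  if (current) pieces.push(current);
  return pieces;
}
```

With this sketch, `splitPrompt(3, "a b c\nd e\nf")` yields two pieces, since the first line alone fills the three-token budget.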
🐰 "In the land of code, where logic is king,
We dance with tokens, in a ring.
Large files we split, with a swift rabbit's hop,
Making our codebase, rise to the top! 🎉"
Hello @harjotgill, I have worked on the issue of the prompt limit for long PRs. It works properly for us. Could you please take a look at this PR?
Summary by CodeRabbit

- Added a `splitPrompt` function to handle large prompts that exceed the maximum token limit, improving the handling of large files and prompts.
- Updated the `codeReview` and `doSummary` function signatures for better code readability and maintainability.
- Added a comprehensive test suite for the `splitPrompt` function to ensure its correct functionality.
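To make the large-file handling concrete, here is a hedged sketch of how `codeReview`/`doSummary` might iterate over split pieces, with a `fileIndex`-style counter tracking which piece is in flight. The function name `reviewLargeFile` and the `review` callback are hypothetical; the PR's real code in `src/review.ts` will differ.

```typescript
// Hypothetical: process an oversized file's diff piece by piece.
// `pieces` would come from splitPrompt; `review` stands in for the
// model call that codeReview/doSummary would make per piece.
async function reviewLargeFile(
  pieces: string[],
  review: (piece: string, fileIndex: number) => Promise<string>
): Promise<string[]> {
  const results: string[] = [];
  for (let fileIndex = 0; fileIndex < pieces.length; fileIndex++) {
    // Pieces are reviewed sequentially so each request stays under
    // the token limit and responses arrive in file order.
    results.push(await review(pieces[fileIndex], fileIndex));
  }
  return results;
}
```

Reviewing sequentially (rather than with `Promise.all`) keeps the replies ordered and avoids bursts of concurrent API calls, which matters when the provider rate-limits requests.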