RocketChat / Apps.RC.AI.Programmer

Generate short code snippets in various languages using LLMs within Rocket.Chat. Share directly or create pull requests on GitHub.

Shorten the LLM prompts to comply with token limits #22

Closed RyanbowZ closed 3 months ago

RyanbowZ commented 3 months ago

A user's personalized input might be long enough to exceed the token limits of different LLMs. Thus, we should bound the length of LLM prompts so they comply with each LLM's token limit. Based on research, the token limits are as follows:
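One way to enforce such a boundary could look like the sketch below. Everything here is hypothetical: the model names, the limit values, the `shortenPrompt` helper, and the rough heuristic of ~4 characters per token are all assumptions for illustration, not the limits researched in this issue or the app's actual implementation (which would ideally use each model's real tokenizer).

```typescript
// Hypothetical per-model token limits; placeholder values only,
// not the limits referenced in this issue.
const MODEL_TOKEN_LIMITS: Record<string, number> = {
    'model-a': 4096,
    'model-b': 8192,
};

// Rough heuristic: ~4 characters per token. A real implementation
// should use the target model's tokenizer for an exact count.
const CHARS_PER_TOKEN = 4;

function estimateTokens(text: string): number {
    return Math.ceil(text.length / CHARS_PER_TOKEN);
}

// Truncate a prompt so it fits the model's context window,
// reserving some tokens for the model's reply.
function shortenPrompt(prompt: string, model: string, reservedForReply = 512): string {
    const limit = MODEL_TOKEN_LIMITS[model] ?? 4096; // fall back to a conservative default
    const budget = Math.max(0, limit - reservedForReply);
    if (estimateTokens(prompt) <= budget) {
        return prompt; // already within the budget
    }
    // Keep the start of the prompt and drop the overflow.
    return prompt.slice(0, budget * CHARS_PER_TOKEN);
}
```

A smarter variant might trim the middle of the prompt (keeping the system instructions and the most recent user input) rather than cutting from the end, but simple tail truncation is the minimal version of the boundary described above.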