Closed IgnisDa closed 3 months ago
Right now, well uses `gpt-4o` for all requests. I would prefer to use `gpt-3.5-turbo` since I have a very big codebase and this model has a higher TPM limit.
You can set the `OPENAI_MODEL` environment variable to change this. Perhaps I should make it clearer in the readme, thanks for pointing this one out.
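A minimal sketch of the override, assuming the tool reads `OPENAI_MODEL` from the environment at startup (the variable name comes from this thread; how the tool consumes it is not shown here):

```shell
# Hypothetical: override the default model via an environment variable.
# Assumes the tool checks OPENAI_MODEL at startup.
export OPENAI_MODEL=gpt-3.5-turbo

# Verify the value is visible to child processes before running the tool.
echo "$OPENAI_MODEL"   # gpt-3.5-turbo
```

Exporting (rather than a plain assignment) matters here: only exported variables are inherited by the tool's process.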