Closed MarkRx closed 2 weeks ago
It is correct. It's OpenAI's fault: they made a mess of the names.
`gpt-4-turbo-...` is currently the strongest (and slowest) model they have. `gpt-4o-...` is the mid-level model: lower quality, but faster and less expensive.
Currently the `configuration.toml` defaults have the turbo model in `model` and the 4o (non-turbo) model in `model_turbo`. Is this correct?
https://github.com/Codium-ai/pr-agent/blob/main/pr_agent/settings/configuration.toml#L3-L4