**Closed** · michaelneale closed 1 month ago
Somehow it is always passing in gpt-4o/mini even when calling anthropic providers:
eg in exchange:
```python
def complete(
    self,
    model: str,
    system: str,
    messages: List[Message],
    tools: List[Tool] = [],
    **kwargs: Dict[str, Any],
) -> Tuple[Message, Usage]:
    print("anthropic model", model)
```
It always shows a gpt-4 model despite the config being:
```yaml
anthropic:
  provider: anthropic
  processor: claude-3-5-sonnet-20240620
  accelerator: claude-3-5-sonnet-20240620
  moderator: truncate
  toolkits:
    - name: developer
      requires: {}
```
@baxen @lukealvoeiro, this seems like a regression from the recent refactoring of provider loading?
edit: this is due to truncate.py having the model hard-coded; fix: https://github.com/square/exchange/pull/35
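To make the failure mode concrete, here is a minimal, self-contained sketch of the bug pattern described above (not the actual truncate.py code; `Exchange`, `pick_model_buggy`, and `pick_model_fixed` are illustrative names): a moderator that hard-codes a default model will report that model no matter which provider the config selects, whereas the fix is to thread the configured model through.

```python
from dataclasses import dataclass

# Illustrative stand-in for the configured exchange; the real
# exchange library's types differ.
@dataclass
class Exchange:
    model: str

def pick_model_buggy(exchange: Exchange) -> str:
    # Bug pattern: ignores the configured model and hard-codes a
    # default, so every call sees a gpt-4o model.
    return "gpt-4o-mini"

def pick_model_fixed(exchange: Exchange) -> str:
    # Fix pattern: use the model the exchange was configured with.
    return exchange.model

ex = Exchange(model="claude-3-5-sonnet-20240620")
print(pick_model_buggy(ex))  # gpt-4o-mini, regardless of config
print(pick_model_fixed(ex))  # claude-3-5-sonnet-20240620
```

This matches the symptom in the report: the `complete()` debug print shows a gpt-4 model because the hard-coded value, not the config, reaches the provider call.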
I have an idea about this and will fix it tomorrow morning (Monday PST).