Checked other resources
Example Code
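A minimal sketch of the setup described under Description, assuming the three Bedrock deployments are registered with a litellm Router. Only the three model groups (small, medium, large) and the small deployment ID come from the report; the medium and large model IDs, the prompt, and requesting large via the model keyword are illustrative assumptions.

# Hypothetical reproduction sketch -- model IDs for "medium" and "large" are illustrative.
from litellm import Router
from langchain_community.chat_models import ChatLiteLLMRouter

model_list = [
    {
        "model_name": "small",
        "litellm_params": {"model": "bedrock/mistral.mistral-small-2402-v1:0"},
    },
    {
        "model_name": "medium",
        "litellm_params": {"model": "bedrock/mistral.mistral-large-2402-v1:0"},  # illustrative
    },
    {
        "model_name": "large",
        "litellm_params": {"model": "bedrock/mistral.mistral-large-2407-v1:0"},  # illustrative
    },
]
router = Router(model_list=model_list)

# We ask for the "large" model group here, but the response metadata below
# reports model_group: 'small', i.e. the first entry in model_list was used.
chat = ChatLiteLLMRouter(router=router, model="large")
print(chat.invoke("What is an LLM?"))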
Error Message and Stack Trace (if applicable)
AIMessage(content='\nAn LLM, or Master of Laws, is a postgraduate academic degree pursued by those holding a professional law degree, allowing them to specialize in a particular field of law or gain deeper understanding in multiple legal systems.', additional_kwargs={}, response_metadata={'token_usage': Usage(completion_tokens=48, prompt_tokens=14, total_tokens=62, completion_tokens_details=None, prompt_tokens_details=None), 'model_group': 'small', 'model_group_size': 1, 'deployment': 'bedrock/mistral.mistral-small-2402-v1:0', 'model_info': {'id': 'a97f9695d3aa3def1701731164802172cbc2046d203ed5e1efebdb681bcfd2cc', 'db_model': False}, 'api_base': None, 'caching_groups': None, 'hidden_params': {'custom_llm_provider': 'bedrock', 'region_name': None, 'optional_params': {'temperature': 1.0}, 'model_id': 'a97f9695d3aa3def1701731164802172cbc2046d203ed5e1efebdb681bcfd2cc', 'api_base': None, 'response_cost': 0.000158, 'additional_headers': {}}, 'finish_reason': 'stop'}, id='run-7cf6dba8-919f-4115-a73e-01ee6dffa1bf-0')
Description
The routing to a specific model is not functioning; by default it takes the first model from the model list.
We have configured three models here: small, medium, and large. We configured ChatLiteLLMRouter to use large, but it used the first model in the list, which is small (the response metadata above shows model_group: 'small').
In the source code of litellm_router.py, the model is hard-coded to the first entry of the model_list configuration:
self.model = self.router.model_list[0]["model_name"]
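Because the constructor overwrites whatever model was requested, a possible workaround (a sketch continuing from the example above, under the assumption that the router honors the model group set on the instance) is to set the model group after construction, or to place the desired deployment first in model_list:

# Possible workaround sketch: override the group after construction, since
# __init__ replaces self.model with router.model_list[0]["model_name"].
chat = ChatLiteLLMRouter(router=router)
chat.model = "large"  # route requests to the "large" model group
response = chat.invoke("What is an LLM?")
print(response.response_metadata["model_group"])  # expected to report 'large'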
System Info
python -m langchain_core.sys_info
System Information
Package Information
Optional packages not installed
Other Dependencies