langchain-ai / langchain

🦜🔗 Build context-aware reasoning applications
https://python.langchain.com
MIT License
94.84k stars 15.35k forks

ChatLiteLLMRouter: Routing functionality is not working #28077

Open kjoth opened 1 day ago

kjoth commented 1 day ago

Example Code

from litellm import Router
from langchain_community.chat_models import ChatLiteLLMRouter

model_list = [
    {
        "model_name": "small",  
        "litellm_params": {
            "model": "bedrock/mistral.mistral-small-2402-v1:0", 
        }
    },
    {
        "model_name": "medium",  
        "litellm_params": {
            "model": "bedrock/mistral.mistral-7b-instruct-v0:2", 
        }
    },
    {
        "model_name": "large",  
        "litellm_params": {
            "model": "bedrock/mistral.mixtral-8x7b-instruct-v0:1",
        }
    }
]

router = Router(model_list=model_list)

chat = ChatLiteLLMRouter(model_name="large", router=router)
chat.invoke("Describe LLM")

Error Message and Stack Trace (if applicable)

AIMessage(content='\nAn LLM, or Master of Laws, is a postgraduate academic degree pursued by those holding a professional law degree, allowing them to specialize in a particular field of law or gain deeper understanding in multiple legal systems.', additional_kwargs={}, response_metadata={'token_usage': Usage(completion_tokens=48, prompt_tokens=14, total_tokens=62, completion_tokens_details=None, prompt_tokens_details=None), 'model_group': 'small', 'model_group_size': 1, 'deployment': 'bedrock/mistral.mistral-small-2402-v1:0', 'model_info': {'id': 'a97f9695d3aa3def1701731164802172cbc2046d203ed5e1efebdb681bcfd2cc', 'db_model': False}, 'api_base': None, 'caching_groups': None, 'hidden_params': {'custom_llm_provider': 'bedrock', 'region_name': None, 'optional_params': {'temperature': 1.0}, 'model_id': 'a97f9695d3aa3def1701731164802172cbc2046d203ed5e1efebdb681bcfd2cc', 'api_base': None, 'response_cost': 0.000158, 'additional_headers': {}}, 'finish_reason': 'stop'}, id='run-7cf6dba8-919f-4115-a73e-01ee6dffa1bf-0')
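The misrouting is visible in the response metadata itself: `model_group` is `'small'` even though `'large'` was requested. A minimal check, using values copied from the `AIMessage` above (the helper name here is illustrative, not part of the LangChain API):

```python
# Values below are copied from the response_metadata of the AIMessage above.
# served_as_requested is a hypothetical helper for illustration only.
def served_as_requested(response_metadata, requested_group):
    """Return True if the router dispatched to the requested model group."""
    return response_metadata.get("model_group") == requested_group

response_metadata = {
    "model_group": "small",
    "deployment": "bedrock/mistral.mistral-small-2402-v1:0",
}

print(served_as_requested(response_metadata, "large"))  # False: routed to "small"
```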

Description

Routing to a specific model is not working; the router always uses the first model in the model list regardless of what is requested.

We have configured three models here: small, medium, and large. We configured ChatLiteLLMRouter to use large, but it used the first model in the list, which is small.

In the source file litellm_router.py, the model is hard-coded to the first entry of the model_list configuration:

self.model = self.router.model_list[0]["model_name"]
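Given that selection logic, one possible interim workaround is to order model_list so the deployment you want comes first. A minimal sketch with plain dicts (reorder_model_list is a hypothetical helper, not part of litellm or LangChain):

```python
# Hypothetical workaround: ChatLiteLLMRouter picks model_list[0], so place
# the preferred deployment at the front before constructing the Router.
def reorder_model_list(model_list, preferred):
    """Return a copy of model_list with entries named `preferred` first."""
    preferred_entries = [m for m in model_list if m["model_name"] == preferred]
    others = [m for m in model_list if m["model_name"] != preferred]
    return preferred_entries + others

model_list = [
    {"model_name": "small", "litellm_params": {"model": "bedrock/mistral.mistral-small-2402-v1:0"}},
    {"model_name": "medium", "litellm_params": {"model": "bedrock/mistral.mistral-7b-instruct-v0:2"}},
    {"model_name": "large", "litellm_params": {"model": "bedrock/mistral.mixtral-8x7b-instruct-v0:1"}},
]

reordered = reorder_model_list(model_list, "large")
print(reordered[0]["model_name"])  # large
```

This only sidesteps the bug for a single preferred model per Router instance; the real fix would be for ChatLiteLLMRouter to honor its model_name argument.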

image

System Info

python -m langchain_core.sys_info

System Information

OS: Darwin
OS Version: Darwin Kernel Version 23.6.0: Mon Jul 29 21:14:30 PDT 2024; root:xnu-10063.141.2~1/RELEASE_ARM64_T6000
Python Version: 3.12.3 (v3.12.3:f6650f9ad7, Apr 9 2024, 08:18:47) [Clang 13.0.0 (clang-1300.0.29.30)]

Package Information

langchain_core: 0.3.15
langchain: 0.3.7
langchain_community: 0.3.5
langsmith: 0.1.136
langchain_aws: 0.2.3
langchain_cohere: 0.3.1
langchain_experimental: 0.3.2
langchain_openai: 0.2.2
langchain_text_splitters: 0.3.2

Optional packages not installed

langgraph langserve

Other Dependencies

aiohttp: 3.10.10
async-timeout: Installed. No version info available.
boto3: 1.35.45
cohere: 5.11.1
dataclasses-json: 0.6.7
httpx: 0.27.2
httpx-sse: 0.4.0
jsonpatch: 1.33
numpy: 1.26.4
openai: 1.54.3
orjson: 3.10.7
packaging: 24.1
pandas: 2.2.3
pydantic: 2.9.2
pydantic-settings: 2.6.0
PyYAML: 6.0.2
requests: 2.32.3
requests-toolbelt: 1.0.0
SQLAlchemy: 2.0.35
tabulate: 0.9.0
tenacity: 8.5.0
tiktoken: 0.8.0
typing-extensions: 4.12.2

kjoth commented 1 day ago

@bburgin @baskaryan @mackong Please have a look at the issue.

doncat99 commented 11 hours ago

Yes, I hit the same problem. I have a demo code block in another project: agent-service-toolkit, issue-89. Waiting for a bug fix.