lm-sys / RouteLLM

A framework for serving and evaluating LLM routers - save LLM costs without compromising quality!
Apache License 2.0

Langchain integration #28

Open mathieuisabel opened 4 months ago

mathieuisabel commented 4 months ago

How can RouteLLM be used with LangChain?

iojw commented 4 months ago

Hi! Do you have a specific use case in mind? We don't currently have a LangChain integration, but I assume RouteLLM could be used in place of model calls: instead of always using GPT-4, you would have a router that routes between two models and returns the response.
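As a rough illustration of that idea (this is not RouteLLM's actual API — the scoring function, model callables, and threshold below are all hypothetical), a router can be thought of as a function that scores a prompt and dispatches to either a strong or a weak model:

```python
from typing import Callable

def make_router(
    score: Callable[[str], float],       # hypothetical: estimates query difficulty in [0, 1]
    strong_model: Callable[[str], str],  # e.g. a GPT-4 call
    weak_model: Callable[[str], str],    # e.g. a cheaper model call
    threshold: float = 0.5,
) -> Callable[[str], str]:
    """Return a callable that routes each prompt to one of two models."""
    def route(prompt: str) -> str:
        # Hard prompts go to the strong (expensive) model, easy ones to the weak (cheap) one.
        if score(prompt) >= threshold:
            return strong_model(prompt)
        return weak_model(prompt)
    return route

# Stub models for demonstration; real use would wrap actual API calls.
router = make_router(
    score=lambda p: len(p) / 100,        # toy difficulty heuristic
    strong_model=lambda p: f"strong:{p}",
    weak_model=lambda p: f"weak:{p}",
)
```

The caller only ever sees one model-like callable, which is what makes the drop-in idea work.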

mathieuisabel commented 4 months ago

I would think of it more as a drop-in replacement for the models, i.e. right now you can set `llm` to either ChatOpenAI or ChatAnthropic without having to change the definition of the chain itself:

```python
llm = ChatOpenAI(
    model_name=model_name,
    temperature=0.0,
    openai_api_key=os.getenv('OpenAI__ApiKey'),
    max_retries=3,
    request_timeout=240,
)
```

or

```python
llm = ChatAnthropic(
    model_name=model_name,
    temperature=0.0,
    api_key=os.getenv('Anthropic__ApiKey'),
    max_retries=3,
    timeout=10,
)
```

and then:

Function calling is also a consideration for a drop-in replacement; see #20.

```python
chain = prompt | llm.bind_tools([structure_segment_function])
response = chain.invoke(chain_payload)
```
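For a router to be a true drop-in in a chain like this, it would also have to forward calls such as `bind_tools` to whichever model it ends up selecting. A minimal sketch of that shape (all names here are hypothetical; this mimics, rather than uses, the LangChain interface):

```python
class RoutedLLM:
    """Hypothetical drop-in wrapper: binds tools to both candidate models,
    then routes each invocation to one of them."""

    def __init__(self, strong, weak, choose_strong):
        self.strong = strong
        self.weak = weak
        self.choose_strong = choose_strong  # payload -> bool

    def bind_tools(self, tools):
        # Bind the same tools to both candidates so either can handle the call.
        return RoutedLLM(
            self.strong.bind_tools(tools),
            self.weak.bind_tools(tools),
            self.choose_strong,
        )

    def invoke(self, payload):
        # Dispatch the call to the selected underlying model.
        model = self.strong if self.choose_strong(payload) else self.weak
        return model.invoke(payload)
```

The key design point is that `bind_tools` returns a new wrapper rather than a single bound model, so the routing decision can still happen per-invocation after the tools are attached.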

iojw commented 4 months ago

Yes exactly, a drop-in replacement for models is what I had in mind. We'd be happy to accept any contributions here!