langchain-ai / langchain

🦜🔗 Build context-aware reasoning applications
https://python.langchain.com
MIT License

`.dict()` on runnables in a RunnableSequence does not return all expected variables #19255

Closed SimonStanley1 closed 6 months ago

SimonStanley1 commented 6 months ago

Example Code

from typing import Callable

from langchain_openai import AzureChatOpenAI
from langchain_core.output_parsers import StrOutputParser
from langchain_core.runnables import RunnableLambda, RunnableSequence
from langchain_core.messages import HumanMessage

def langchain_model(prompt_func: Callable) -> RunnableSequence:
    model = AzureChatOpenAI(azure_deployment="gpt-35-16k", temperature=0)
    return RunnableLambda(prompt_func) | model | StrOutputParser()

def prompt_func(_dict: dict) -> list:
    question = _dict.get("question")
    texts = _dict.get("texts")

    text_message = {
        "type": "text",
        "text": (
            "You are a classification system for Procurement Documents. Answer the question based solely on the provided Reference texts.\n"
            "If you can't find an answer, reply exactly like this: 'Sorry, I don't have an answer for your question'\n"
            "Return the answer as a string in the language the question is written in.\n\n"
            "User-provided question:\n"
            f"{question}\n\n"
            "Reference texts:\n"
            f"{texts}"
        ),
    }

    return [HumanMessage(content=[text_message])]

model = langchain_model(prompt_func=prompt_func)

# Inspect the serialized form of each component in the sequence.
for step in model.steps:
    try:
        print(step.dict())
    except Exception:
        print(step)

Error Message and Stack Trace (if applicable)

For the AzureChatOpenAI component, `.dict()` returns only this output:

{'model': 'gpt-3.5-turbo', 'stream': False, 'n': 1, 'temperature': 0.0, '_type': 'azure-openai-chat'}.

Description

This is not enough if you want to log the model with MLflow: the deployment_name is definitely needed and has to be recoverable from the dict. See the related issue: https://github.com/mlflow/mlflow/issues/11439. The expected output would include more details, at least the deployment name.
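As a stopgap until the serialization is fixed upstream, one could merge the missing attribute into the serialized dict before handing it to MLflow. A minimal sketch, using a hypothetical stand-in class in place of the real AzureChatOpenAI instance (which exposes `deployment_name` as an attribute but omits it from `.dict()`, per the output above):

```python
# Hypothetical stand-in for AzureChatOpenAI; its .dict() mirrors the
# truncated output reported in this issue.
class FakeAzureModel:
    deployment_name = "gpt-35-16k"

    def dict(self):
        return {
            "model": "gpt-3.5-turbo",
            "stream": False,
            "n": 1,
            "temperature": 0.0,
            "_type": "azure-openai-chat",
        }

def serialize_with_deployment(model) -> dict:
    """Return model.dict() with the missing deployment_name patched in."""
    d = model.dict()
    # Patch in the field that .dict() currently omits.
    d["deployment_name"] = getattr(model, "deployment_name", None)
    return d

print(serialize_with_deployment(FakeAzureModel()))
```

In practice the same `serialize_with_deployment` helper would be called with the real model instance before logging, so the resulting dict carries the deployment name MLflow needs.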

System Info

System Information

OS: Darwin
Python Version: 3.10.11 (v3.10.11:7d4cc5aa85, Apr 4 2023, 19:05:19) [Clang 13.0.0]

Package Information

langchain_core: 0.1.32
langchain: 0.1.12
langchain_community: 0.0.28
langsmith: 0.1.26
langchain_openai: 0.0.8
langchain_text_splitters: 0.0.1

Packages not installed (Not Necessarily a Problem)

The following packages were not found:

langgraph langserve

SimonStanley1 commented 6 months ago

https://github.com/mlflow/mlflow/issues/11439

liugddx commented 6 months ago

There is already a PR that solves this problem.

SimonStanley1 commented 6 months ago

Perfect, thanks!