langchain-ai / langchain

🦜🔗 Build context-aware reasoning applications
https://python.langchain.com

QAGenerateChain cannot be loaded #4977

Closed · liangz1 closed this issue 1 year ago

liangz1 commented 1 year ago

System Info

langchain==0.0.173

Who can help?

@hwchase17

Reproduction

from langchain.llms import OpenAI
from langchain.evaluation.qa import QAGenerateChain
from langchain.chains.loading import load_chain

# Build a QAGenerateChain and serialize it to YAML.
example_gen_chain = QAGenerateChain.from_llm(OpenAI())
example_gen_chain.save("/Users/liang.zhang/qa_gen_chain.yaml")

# Reloading the saved chain raises a ValidationError (full traceback below).
loaded_chain = load_chain("/Users/liang.zhang/qa_gen_chain.yaml")

Error:

---------------------------------------------------------------------------
ValidationError                           Traceback (most recent call last)
Input In [13], in <cell line: 2>()
      1 from langchain.chains.loading import load_chain
----> 2 loaded_chain = load_chain("/Users/liang.zhang/qa_gen_chain.yaml")

File ~/miniforge3/envs/mlflow-3.8/lib/python3.8/site-packages/langchain/chains/loading.py:449, in load_chain(path, **kwargs)
    447     return hub_result
    448 else:
--> 449     return _load_chain_from_file(path, **kwargs)

File ~/miniforge3/envs/mlflow-3.8/lib/python3.8/site-packages/langchain/chains/loading.py:476, in _load_chain_from_file(file, **kwargs)
    473     config["memory"] = kwargs.pop("memory")
    475 # Load the chain from the config now.
--> 476 return load_chain_from_config(config, **kwargs)

File ~/miniforge3/envs/mlflow-3.8/lib/python3.8/site-packages/langchain/chains/loading.py:439, in load_chain_from_config(config, **kwargs)
    436     raise ValueError(f"Loading {config_type} chain not supported")
    438 chain_loader = type_to_loader_dict[config_type]
--> 439 return chain_loader(config, **kwargs)

File ~/miniforge3/envs/mlflow-3.8/lib/python3.8/site-packages/langchain/chains/loading.py:44, in _load_llm_chain(config, **kwargs)
     42 if "prompt" in config:
     43     prompt_config = config.pop("prompt")
---> 44     prompt = load_prompt_from_config(prompt_config)
     45 elif "prompt_path" in config:
     46     prompt = load_prompt(config.pop("prompt_path"))

File ~/miniforge3/envs/mlflow-3.8/lib/python3.8/site-packages/langchain/prompts/loading.py:30, in load_prompt_from_config(config)
     27     raise ValueError(f"Loading {config_type} prompt not supported")
     29 prompt_loader = type_to_loader_dict[config_type]
---> 30 return prompt_loader(config)

File ~/miniforge3/envs/mlflow-3.8/lib/python3.8/site-packages/langchain/prompts/loading.py:115, in _load_prompt(config)
    113 config = _load_template("template", config)
    114 config = _load_output_parser(config)
--> 115 return PromptTemplate(**config)

File ~/miniforge3/envs/mlflow-3.8/lib/python3.8/site-packages/pydantic/main.py:342, in pydantic.main.BaseModel.__init__()

ValidationError: 1 validation error for PromptTemplate
output_parser
  Can't instantiate abstract class BaseOutputParser with abstract methods parse (type=type_error)
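
Reading the traceback, the failure appears to come from prompt deserialization: the saved prompt config includes an output_parser entry, but _load_prompt ends up handing PromptTemplate the abstract BaseOutputParser class rather than the concrete parser the chain's prompt uses, so pydantic cannot instantiate it.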

Expected behavior

Saving and reloading the chain should round-trip without errors.
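
Possible workaround

Until a fix lands, a minimal workaround sketch (assuming the chain only needs its default prompt, which from_llm attaches) is to rebuild the chain from its class instead of round-tripping through save()/load_chain():

from langchain.llms import OpenAI
from langchain.evaluation.qa import QAGenerateChain

# Rebuild the chain directly; from_llm re-attaches the default prompt and its
# output parser, so no deserialization of the saved YAML is involved.
example_gen_chain = QAGenerateChain.from_llm(OpenAI())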

dev2049 commented 1 year ago

I believe #4987 should fix this

liangz1 commented 1 year ago

@dev2049 Yes, thank you for the quick fix!