langchain-ai / langchain

I started getting a TypeError from load_summarize_chain; code that previously ran is now broken. #26715

Open dpurandare opened 1 month ago

dpurandare commented 1 month ago


Example Code

```python
from langchain.chains.summarize import load_summarize_chain
from langchain_community.chat_models import ChatOpenAI  # imports reconstructed from the environment below

# map_prompt_template, combine_prompt_template, and docs are defined elsewhere in the notebook
llm = ChatOpenAI(temperature=0, model="gpt-3.5-turbo")
# or: llm = ChatOpenAI(temperature=0)

chain = load_summarize_chain(
    llm,
    chain_type="map_reduce",
    map_prompt=map_prompt_template,
    combine_prompt=combine_prompt_template,
)

output = chain.invoke(docs[0])
# or (deprecated call style): output = chain(docs[0])
```

Error Message and Stack Trace (if applicable)


```
TypeError                                 Traceback (most recent call last)
Cell In[25], line 3
      1 # output = chain(docs[0])
      2 # from itertools import chain
----> 3 output = chain.invoke(docs[0])

File ~/cuda118/lib/python3.12/site-packages/langchain/chains/base.py:166, in Chain.invoke(self, input, config, **kwargs)
    164 except BaseException as e:
    165     run_manager.on_chain_error(e)
--> 166     raise e
    167 run_manager.on_chain_end(outputs)
    169 if include_run_info:

File ~/cuda118/lib/python3.12/site-packages/langchain/chains/base.py:156, in Chain.invoke(self, input, config, **kwargs)
    153 try:
    154     self._validate_inputs(inputs)
    155     outputs = (
--> 156         self._call(inputs, run_manager=run_manager)
    157         if new_arg_supported
    158         else self._call(inputs)
    159     )
    161 final_outputs: Dict[str, Any] = self.prep_outputs(
    162     inputs, outputs, return_only_outputs
    163 )
...
--> 378     overall_token_usage[k] += v
    379 else:
    380     overall_token_usage[k] = v

TypeError: unsupported operand type(s) for +=: 'OpenAIObject' and 'OpenAIObject'
```

Description

This code was running previously; I have used the same pattern in two or three different places and it worked fine.

Further investigation shows that the failure occurs in the method `_combine_llm_outputs`:

```python
def _combine_llm_outputs(self, llm_outputs: List[Optional[dict]]) -> dict:
```

I checked the parameter llm_outputs passed to it and the value is as follows:

```
[{'token_usage': <OpenAIObject at 0x769d8ea08410> JSON: {
     "completion_tokens": 453,
     "completion_tokens_details": {"reasoning_tokens": 0},
     "prompt_tokens": 862,
     "total_tokens": 1315
   },
  'model_name': 'gpt-3.5-turbo', 'system_fingerprint': None},
 {'token_usage': <OpenAIObject at 0x769d88f0ad00> JSON: {
     "completion_tokens": 449,
     "completion_tokens_details": {"reasoning_tokens": 0},
     "prompt_tokens": 856,
     "total_tokens": 1305
   },
  'model_name': 'gpt-3.5-turbo', 'system_fingerprint': None},
 {'token_usage': <OpenAIObject at 0x769d88f0a260> JSON: {
     "completion_tokens": 354,
     "completion_tokens_details": {"reasoning_tokens": 0},
     "prompt_tokens": 653,
     "total_tokens": 1007
   },
  'model_name': 'gpt-3.5-turbo', 'system_fingerprint': None}]
```

The merge over token_usage fails because some of the values (here completion_tokens_details) are nested OpenAIObject mappings rather than numbers, so += is not supported on them. I have not found a solution for this yet.
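For illustration, here is a minimal sketch of the failing pattern and of what a recursive merge could look like. This is hypothetical code, not the actual LangChain source; plain dicts stand in for the OpenAIObject values shown in the dump above:

```python
# Data shaped like the token_usage entries from the dump above,
# with plain dicts standing in for OpenAIObject.
token_usages = [
    {"completion_tokens": 453, "prompt_tokens": 862, "total_tokens": 1315,
     "completion_tokens_details": {"reasoning_tokens": 0}},
    {"completion_tokens": 449, "prompt_tokens": 856, "total_tokens": 1305,
     "completion_tokens_details": {"reasoning_tokens": 0}},
]

# Shallow merge, as in the traceback: breaks on the nested mapping.
overall_token_usage = {}
try:
    for usage in token_usages:
        for k, v in usage.items():
            if k in overall_token_usage:
                overall_token_usage[k] += v  # dict += dict raises here
            else:
                overall_token_usage[k] = v
except TypeError as e:
    print(e)  # unsupported operand type(s) for +=: 'dict' and 'dict'

# Hypothetical shape of a fix (not the library's actual patch):
# merge recursively so nested mappings are handled.
def merge_token_usage(acc: dict, new: dict) -> dict:
    for k, v in new.items():
        if isinstance(v, dict):
            acc[k] = merge_token_usage(acc.get(k, {}), v)
        else:
            acc[k] = acc.get(k, 0) + v
    return acc

merged = {}
for usage in token_usages:
    merge_token_usage(merged, usage)
print(merged)
# {'completion_tokens': 902, 'prompt_tokens': 1718, 'total_tokens': 2620,
#  'completion_tokens_details': {'reasoning_tokens': 0}}
```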

System Info

Not using yarn and Node. Linux deep-B650 6.8.0-40-generic #40-Ubuntu SMP PREEMPT_DYNAMIC Fri Jul 5 10:34:03 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux

Python environment: absl-py==2.1.0 accelerate==0.30.0 aiohttp==3.9.5 aiosignal==1.3.1 annotated-types==0.6.0 anyio==4.3.0 argon2-cffi==23.1.0 argon2-cffi-bindings==21.2.0 arrow==1.3.0 asttokens==2.4.1 astunparse==1.6.3 async-lru==2.0.4 attrs==23.2.0 azure-core==1.30.2 azure-storage-blob==12.22.0 Babel==2.15.0 beautifulsoup4==4.12.3 bertviz==1.4.0 bitsandbytes==0.39.0 bleach==6.1.0 boto3==1.34.98 botocore==1.34.98 certifi==2024.2.2 cffi==1.16.0 charset-normalizer==3.3.2 comm==0.2.2 contourpy==1.2.1 cryptography==43.0.0 cycler==0.12.1 dataclasses-json==0.6.7 datasets==2.21.0 debugpy==1.8.1 decorator==5.1.1 defusedxml==0.7.1 dill==0.3.8 diskcache==5.6.3 distro==1.9.0 dnspython==2.6.1 evaluate==0.4.2 executing==2.0.1 fastjsonschema==2.19.1 filelock==3.13.1 flatbuffers==24.3.25 fonttools==4.51.0 fqdn==1.5.1 frozenlist==1.4.1 fsspec==2024.2.0 gast==0.5.4 gguf==0.10.0 google-pasta==0.2.0 greenlet==3.0.3 grpcio==1.64.1 h11==0.14.0 h5py==3.11.0 httpcore==1.0.5 httpx==0.27.0 huggingface-hub==0.24.6 idna==3.7 install==1.3.5 ipykernel==6.29.4 ipython==8.24.0 ipywidgets==8.1.2 isodate==0.6.1 isoduration==20.11.0 jedi==0.19.1 Jinja2==3.1.3 jmespath==1.0.1 joblib==1.4.2 json5==0.9.25 jsonlines==4.0.0 jsonpatch==1.33 jsonpointer==2.4 jsonschema==4.22.0 jsonschema-specifications==2023.12.1 jupyter-events==0.10.0 jupyter-lsp==2.2.5 jupyter_client==8.6.1 jupyter_core==5.7.2 jupyter_server==2.14.0 jupyter_server_terminals==0.5.3 jupyterlab==4.1.8 jupyterlab_pygments==0.3.0 jupyterlab_server==2.27.1 jupyterlab_widgets==3.0.10 keras==3.5.0 kiwisolver==1.4.5 lamini-configuration==0.8.3 langchain==0.2.6 langchain-community==0.2.6 langchain-core==0.2.10 langchain-text-splitters==0.2.2 langsmith==0.1.82 libclang==18.1.1 llama_cpp_python==0.2.89 Markdown==3.6 markdown-it-py==3.0.0 MarkupSafe==2.1.5 marshmallow==3.21.3 matplotlib==3.8.4 matplotlib-inline==0.1.7 mdurl==0.1.2 mistune==3.0.2 ml-dtypes==0.3.2 mpmath==1.3.0 multidict==6.0.5 multiprocess==0.70.16 mypy-extensions==1.0.0 namex==0.0.8 nbclient==0.10.0 nbconvert==7.16.4 nbformat==5.10.4 nest-asyncio==1.6.0 networkx==3.2.1 notebook_shim==0.2.4 numpy==1.26.3 nvidia-cublas-cu11==11.11.3.6 nvidia-cuda-cupti-cu11==11.8.87 nvidia-cuda-nvrtc-cu11==11.8.89 nvidia-cuda-runtime-cu11==11.8.89 nvidia-cuda-runtime-cu12==12.5.39 nvidia-cudnn-cu11==8.7.0.84 nvidia-cufft-cu11==10.9.0.58 nvidia-curand-cu11==10.3.0.86 nvidia-cusolver-cu11==11.4.1.48 nvidia-cusparse-cu11==11.7.5.86 nvidia-nccl-cu11==2.20.5 nvidia-nvtx-cu11==11.8.86 openai==0.27.7 opt-einsum==3.3.0 optree==0.11.0 orjson==3.10.5 overrides==7.7.0 packaging==24.0 pandas==2.2.2 pandocfilters==1.5.1 parso==0.8.4 patsy==0.5.6 pdf2text==1.0.0 pdfminer==20191125 pexpect==4.9.0 pillow==10.2.0 platformdirs==4.2.1 plotly==5.22.0 plotly-express==0.4.1 prometheus_client==0.20.0 prompt-toolkit==3.0.43 protobuf==4.25.3 psutil==5.9.8 ptyprocess==0.7.0 pure-eval==0.2.2 pyarrow==16.0.0 pyarrow-hotfix==0.6 pycparser==2.22 pycryptodome==3.20.0 pydantic==2.7.1 pydantic_core==2.18.2 Pygments==2.18.0 pymongo==4.9.1 pyparsing==3.1.2 pypdf==4.2.0 PyPDF2==3.0.1 python-dateutil==2.9.0.post0 python-json-logger==2.0.7 pytz==2024.1 PyYAML==6.0.1 pyzmq==26.0.3 referencing==0.35.1 regex==2024.4.28 requests==2.32.3 rfc3339-validator==0.1.4 rfc3986-validator==0.1.1 rich==13.7.1 rpds-py==0.18.0 s3transfer==0.10.1 safetensors==0.4.3 scikit-learn==1.4.2 scipy==1.13.0 seaborn==0.13.2 Send2Trash==1.8.3 sentence-transformers==2.7.0 sentencepiece==0.2.0 setuptools==70.0.0 six==1.16.0 sniffio==1.3.1 soupsieve==2.5 SQLAlchemy==2.0.31 stack-data==0.6.3 
statsmodels==0.14.2 sympy==1.12 tenacity==8.2.3 tensorboard==2.17.1 tensorboard-data-server==0.7.2 tensorflow==2.17.0 tensorrt==10.1.0 tensorrt-cu12==10.1.0 tensorrt-cu12-bindings==10.1.0 tensorrt-cu12-libs==10.1.0 termcolor==2.4.0 terminado==0.18.1 tf_keras==2.17.0 threadpoolctl==3.5.0 tinycss2==1.3.0 tokenizers==0.19.1 torch==2.3.0+cu118 torchaudio==2.3.0+cu118 torchvision==0.18.0+cu118 tornado==6.4 tqdm==4.66.4 traitlets==5.14.3 transformers==4.44.2 types-python-dateutil==2.9.0.20240316 typing-inspect==0.9.0 typing_extensions==4.9.0 tzdata==2024.1 uri-template==1.3.0 urllib3==2.2.1 wcwidth==0.2.13 webcolors==1.13 webencodings==0.5.1 websocket-client==1.8.0 Werkzeug==3.0.3 wheel==0.43.0 widgetsnbextension==4.0.10 wrapt==1.16.0 xxhash==3.4.1 yarl==1.9.4

efriis commented 1 month ago

pip install -U langchain-openai should fix it as long as you're using the non-deprecated versions!
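For reference, a migrated version of the snippet from the issue might look like the following. This is a sketch assuming langchain-openai is installed alongside a 1.x openai client; map_prompt_template, combine_prompt_template, and docs are the objects from the original report:

```python
# Sketch of the suggested fix: swap the deprecated community ChatOpenAI
# for the langchain-openai integration (pip install -U langchain-openai).
from langchain_openai import ChatOpenAI
from langchain.chains.summarize import load_summarize_chain

llm = ChatOpenAI(temperature=0, model="gpt-3.5-turbo")

chain = load_summarize_chain(
    llm,
    chain_type="map_reduce",
    map_prompt=map_prompt_template,      # as defined in the original report
    combine_prompt=combine_prompt_template,
)

output = chain.invoke(docs[0])
```

With the 1.x client, token_usage values come back as plain Python numbers and dicts rather than OpenAIObject instances, which should avoid the += failure in `_combine_llm_outputs`.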

efriis commented 1 month ago

same issue here: https://github.com/langchain-ai/langchain/issues/26606#issuecomment-2357400608