langchain-ai / langchain-google


IndexError: list index out of range #177

Closed duob-ai closed 5 months ago

duob-ai commented 5 months ago

I'm using a simple RAG chain (sample code below). When using ChatVertexAI as my LLM, I get the following error, but only for certain question prompts: some prompts work fine while others throw the IndexError, and the same prompt always reproduces the error below. Switching the LLM to AzureAI makes the problem go away.

**Error Log**

    IndexError('list index out of range')
    Traceback (most recent call last):
      File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 1979, in _atransform_stream_with_config
        chunk: Output = await asyncio.create_task(  # type: ignore[call-arg]
      File "/usr/local/lib/python3.11/site-packages/langchain_core/tracers/log_stream.py", line 237, in tap_output_aiter
        async for chunk in output:
      File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 2853, in _atransform
        async for output in final_pipeline:
      File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 4734, in atransform
        async for item in self.bound.atransform(
      File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 2883, in atransform
        async for chunk in self._atransform_stream_with_config(
      File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 1979, in _atransform_stream_with_config
        chunk: Output = await asyncio.create_task(  # type: ignore[call-arg]
      File "/usr/local/lib/python3.11/site-packages/langchain_core/tracers/log_stream.py", line 237, in tap_output_aiter
        async for chunk in output:
      File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 2853, in _atransform
        async for output in final_pipeline:
      File "/usr/local/lib/python3.11/site-packages/langchain_core/output_parsers/transform.py", line 60, in atransform
        async for chunk in self._atransform_stream_with_config(
      File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 1979, in _atransform_stream_with_config
        chunk: Output = await asyncio.create_task(  # type: ignore[call-arg]
      File "/usr/local/lib/python3.11/site-packages/langchain_core/tracers/log_stream.py", line 237, in tap_output_aiter
        async for chunk in output:
      File "/usr/local/lib/python3.11/site-packages/langchain_core/output_parsers/transform.py", line 38, in _atransform
        async for chunk in input:
      File "/usr/local/lib/python3.11/site-packages/langchain_core/utils/aiter.py", line 97, in tee_peer
        item = await iterator.__anext__()
      File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 1333, in atransform
        async for output in self.astream(final, config, **kwargs):
      File "/usr/local/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 315, in astream
        raise e
      File "/usr/local/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 293, in astream
        async for chunk in self._astream(
      File "/usr/local/lib/python3.11/site-packages/langchain_google_vertexai/chat_models.py", line 768, in _astream
        message = _parse_response_candidate(chunk.candidates[0], streaming=True)
      File "/usr/local/lib/python3.11/site-packages/langchain_google_vertexai/chat_models.py", line 325, in _parse_response_candidate
        first_part = response_candidate.content.parts[0]
    IndexError: list index out of range
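The last two frames show `_parse_response_candidate` indexing `response_candidate.content.parts[0]` on a streamed candidate whose `parts` list is empty, which is what raises the IndexError. As an illustration only (a hypothetical helper, not the library's actual code), the kind of guard that avoids it looks like this:

    # Hypothetical helper illustrating the failure mode; not langchain-google-vertexai's code.
    def parse_candidate_text(response_candidate) -> str:
        parts = response_candidate.content.parts
        if not parts:
            # A streamed candidate can arrive with an empty parts list
            # (for example when the response is blocked or the chunk carries
            # no content); indexing parts[0] then raises IndexError.
            return ""
        return parts[0].text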

**Sample Code**

    # Vertex AI (excerpt; build_answer_chain is a hypothetical wrapper name for the
    # function this snippet was taken from)
    from langchain_google_vertexai import ChatVertexAI

    def build_answer_chain():
        llm = ChatVertexAI(model_name="gemini-1.5-pro-preview-0409")
        retriever = get_retriever()    # project-specific helper
        answer_chain = create_chain(   # project-specific helper
            llm,
            retriever,
        )
        return answer_chain
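For reference, the error above is raised while the chain is consumed via async streaming (the `astream` frames in the traceback). A minimal sketch of such a call, assuming the hypothetical `build_answer_chain` wrapper above and a plain-string input (the real input schema depends on `create_chain`):

    import asyncio

    async def main() -> None:
        answer_chain = build_answer_chain()
        # astream drives the async streaming code path shown in the traceback.
        async for chunk in answer_chain.astream("example question"):
            print(chunk, end="", flush=True)

    asyncio.run(main())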

**Library Versions**
langchain 0.1.16
langchain-google-vertexai 1.0.1

**Maybe related**
https://github.com/langchain-ai/langchain/issues/17800
lkuligin commented 5 months ago

I added retries that check for empty generations too; please take a look and see whether it solves the problem.

In the case of streaming, there is also a new flag that checks for empty generations, but it essentially breaks streaming and waits until the full response is generated.
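As a rough illustration of the first point, a caller-side retry on empty generations could look like the sketch below; `invoke_with_empty_retry` is a hypothetical helper, not the API added to langchain-google-vertexai:

    from langchain_core.language_models.chat_models import BaseChatModel
    from langchain_core.messages import AIMessage, BaseMessage

    # Hypothetical caller-side sketch of "retry on empty generations";
    # not the actual implementation shipped in langchain-google-vertexai.
    def invoke_with_empty_retry(
        llm: BaseChatModel, messages: list[BaseMessage], max_attempts: int = 3
    ) -> BaseMessage:
        result: BaseMessage = AIMessage(content="")
        for _ in range(max_attempts):
            result = llm.invoke(messages)
            if result.content:  # non-empty generation, stop retrying
                break
        return result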

lkuligin commented 5 months ago

It should be fixed with the recent release. Closing, but please feel free to re-open if you still observe the issue.

duob-ai commented 5 months ago

@lkuligin I updated my project to the latest release. It is now throwing a new error for the same prompts I had trouble with before:

Tracing with LangSmith:

TypeError("Additional kwargs key is_blocked already exists in left dict and value has unsupported type <class 'bool'>.")Traceback (most recent call last):

File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 1980, in _atransform_stream_with_config chunk: Output = await asyncio.create_task( # type: ignore[call-arg] ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

File "/usr/local/lib/python3.11/site-packages/langchain_core/tracers/log_stream.py", line 237, in tap_output_aiter async for chunk in output:

File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 2853, in _atransform async for output in final_pipeline:

File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 4748, in atransform async for item in self.bound.atransform(

File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 2883, in atransform async for chunk in self._atransform_stream_with_config(

File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 1980, in _atransform_stream_with_config chunk: Output = await asyncio.create_task( # type: ignore[call-arg] ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

File "/usr/local/lib/python3.11/site-packages/langchain_core/tracers/log_stream.py", line 237, in tap_output_aiter async for chunk in output:

File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 2853, in _atransform async for output in final_pipeline:

File "/usr/local/lib/python3.11/site-packages/langchain_core/output_parsers/transform.py", line 60, in atransform async for chunk in self._atransform_stream_with_config(

File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 1980, in _atransform_stream_with_config chunk: Output = await asyncio.create_task( # type: ignore[call-arg] ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

File "/usr/local/lib/python3.11/site-packages/langchain_core/tracers/log_stream.py", line 237, in tap_output_aiter async for chunk in output:

File "/usr/local/lib/python3.11/site-packages/langchain_core/output_parsers/transform.py", line 38, in _atransform async for chunk in input:

File "/usr/local/lib/python3.11/site-packages/langchain_core/utils/aiter.py", line 97, in tee_peer item = await iterator.anext() ^^^^^^^^^^^^^^^^^^^^^^^^^^

File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 1334, in atransform async for output in self.astream(final, config, **kwargs):

File "/usr/local/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 319, in astream raise e

File "/usr/local/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 312, in astream generation += chunk

File "/usr/local/lib/python3.11/site-packages/langchain_core/outputs/chat_generation.py", line 74, in add generation_info = merge_dicts( ^^^^^^^^^^^^

File "/usr/local/lib/python3.11/site-packages/langchain_core/utils/_merge.py", line 40, in merge_dicts raise TypeError(

TypeError: Additional kwargs key is_blocked already exists in left dict and value has unsupported type <class 'bool'>.
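For context, `merge_dicts` (the last frame, from `langchain_core.utils._merge`) is what combines `generation_info` across streamed chunks, and in the langchain-core versions involved here it refuses to merge a duplicated key whose value is a bool such as `is_blocked`. A minimal sketch of the failing merge under that assumption (exact behaviour depends on the installed langchain-core version):

    from langchain_core.utils._merge import merge_dicts

    left = {"is_blocked": False}
    right = {"is_blocked": False}

    # With the langchain-core version in use here, merging a duplicated bool key
    # raises TypeError("Additional kwargs key is_blocked already exists ..."),
    # which is what surfaces when two streamed chunks both carry is_blocked.
    merge_dicts(left, right)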