langchain-ai / langchain

🦜🔗 Build context-aware reasoning applications
https://python.langchain.com
MIT License

Amazon Bedrock streaming not working with stop sequence #12926

Closed sibijr closed 1 year ago

sibijr commented 1 year ago

Issue you'd like to raise.

Bedrock Streaming support was added in the PR

Bedrock streaming support was added in the PR, but there is an issue when streaming is enabled with a stop sequence. In the code below, a trailing comma is added after self.provider_stop_sequence_key_name_map.get(provider), which turns the dict key into a tuple and causes:

"TypeError('keys must be str, int, float, bool or None, not tuple')"

line 35, in _prepare_input_and_invoke_stream
    body = json.dumps(input_body)

{'temperature': 0, 'max_tokens_to_sample': 4048, ('stop_sequences',): ['<observation>'], 'prompt': '\n\nHuman: 
        if stop:
            if provider not in self.provider_stop_sequence_key_name_map:
                raise ValueError(
                    f"Stop sequence key name for {provider} is not supported."
                )

            # stop sequence from _generate() overrides
            # stop sequences in the class attribute
            _model_kwargs[
                # the trailing comma below makes the key a one-element tuple
                self.provider_stop_sequence_key_name_map.get(provider),
            ] = stop

Suggested fix

Remove the trailing comma after self.provider_stop_sequence_key_name_map.get(provider) in the _prepare_input_and_invoke_stream method.

For example:

_model_kwargs[
    self.provider_stop_sequence_key_name_map.get(provider)
] = stop

This will resolve the TypeError and allow the stop sequence to be passed correctly when streaming.
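The root cause is easy to reproduce outside LangChain. The sketch below (variable names are illustrative, not taken from the library) shows that a trailing comma inside a subscript makes Python build a one-element tuple, which json.dumps rejects as a dict key, while the plain string key serializes fine:

```python
import json

model_kwargs = {"temperature": 0, "max_tokens_to_sample": 4048}
key = "stop_sequences"

# Buggy form: the trailing comma inside the subscript creates a
# one-element tuple, so the dict key becomes ('stop_sequences',).
model_kwargs[key,] = ["<observation>"]
assert ("stop_sequences",) in model_kwargs

try:
    json.dumps(model_kwargs)
except TypeError as err:
    print(err)  # keys must be str, int, float, bool or None, not tuple

# Fixed form: without the comma the key is the plain string,
# and serialization succeeds.
del model_kwargs[key,]
model_kwargs[key] = ["<observation>"]
print(json.dumps(model_kwargs))
```

This is ordinary Python tuple syntax, not anything Bedrock-specific, which is why the one-character fix above is sufficient.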

Who can help?

cc @3coins @baskaryan @mukitmomi


Reproduction

Enable Bedrock streaming with a stop sequence. json.dumps(input_body) in _prepare_input_and_invoke_stream then raises the TypeError shown above, because input_body contains the malformed tuple key ('stop_sequences',).

Expected behavior

{'temperature': 0, 'max_tokens_to_sample': 4048, 'stop_sequences': [''], 'prompt': '\n\nHuman:

i.e., stop_sequences passed as a str key.

sibijr commented 1 year ago

https://github.com/langchain-ai/langchain/commit/2f83350eace3f7f809fe5a3bd75550b6ee0c6d02

The issue appears to be fixed in the latest version by the commit above.