Bedrock streaming support was added recently, but there is an issue when streaming is enabled together with a stop sequence. In the code below, a trailing comma is left after `self.provider_stop_sequence_key_name_map.get(provider)`, so the dictionary key becomes a one-element tuple, which causes:

```
TypeError('keys must be str, int, float, bool or None, not tuple')
  line 35, in _prepare_input_and_invoke_stream
    body = json.dumps(input_body)
```

The request body that fails to serialize looks like this (prompt truncated):

```
{'temperature': 0, 'max_tokens_to_sample': 4048, ('stop_sequences',): ['<observation>'], 'prompt': '\n\nHuman:
```
```python
if stop:
    if provider not in self.provider_stop_sequence_key_name_map:
        raise ValueError(
            f"Stop sequence key name for {provider} is not supported."
        )

    # stop sequence from _generate() overrides
    # stop sequences in the class attribute
    _model_kwargs[
        self.provider_stop_sequence_key_name_map.get(provider),
    ] = stop
```
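The trailing comma turns the subscript into a tuple expression, and `json.dumps` only accepts `str`, `int`, `float`, `bool` or `None` keys. A standalone sketch of the mechanism (the names here are illustrative, not LangChain code):

```python
import json

_model_kwargs = {"temperature": 0, "max_tokens_to_sample": 4048}

# The trailing comma makes the subscript a tuple expression, so the
# key stored is ('stop_sequences',) rather than 'stop_sequences'.
_model_kwargs["stop_sequences",] = ["<observation>"]
print(list(_model_kwargs))  # ['temperature', 'max_tokens_to_sample', ('stop_sequences',)]

# json.dumps rejects tuple keys, reproducing the reported error:
# TypeError: keys must be str, int, float, bool or None, not tuple
json.dumps(_model_kwargs)
```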
Suggested fix
Remove the trailing comma after `self.provider_stop_sequence_key_name_map.get(provider)` in the `_prepare_input_and_invoke_stream` method. This will resolve the `TypeError` and allow the stop sequence to be passed correctly when streaming.
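For example, the assignment becomes:

```python
# stop sequence from _generate() overrides
# stop sequences in the class attribute
_model_kwargs[
    self.provider_stop_sequence_key_name_map.get(provider)
] = stop
```

With the comma gone, the key is the plain string returned by the map lookup and `json.dumps(input_body)` succeeds.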
Who can help?
cc @3coins @baskaryan @mukitmomi
Information
[ ] The official example notebooks/scripts
[ ] My own modified scripts
Related Components
[X] LLMs/Chat Models
[ ] Embedding Models
[ ] Prompts / Prompt Templates / Prompt Selectors
[ ] Output Parsers
[ ] Document Loaders
[ ] Vector Stores / Retrievers
[ ] Memory
[X] Agents / Agent Executors
[ ] Tools / Toolkits
[ ] Chains
[ ] Callbacks/Tracing
[ ] Async
Reproduction
Enable Bedrock streaming with a stop sequence; one way to do this is sketched below.
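A minimal sketch of the setup, assuming the `Bedrock` LLM wrapper from `langchain.llms`, an Anthropic model id, and AWS credentials already configured (the model id, kwargs and prompt here are illustrative):

```python
from langchain.llms import Bedrock

llm = Bedrock(
    model_id="anthropic.claude-v2",  # illustrative model id
    streaming=True,
    model_kwargs={"temperature": 0, "max_tokens_to_sample": 4048},
)

# Passing a stop sequence while streaming is enabled goes through
# _prepare_input_and_invoke_stream and hits the tuple-key bug.
llm("\n\nHuman: say hi\n\nAssistant:", stop=["<observation>"])
```

Invoking the model this way fails with the error below.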
"TypeError('keys must be str, int, float, bool or None, not tuple')"
line 35, in _prepare_input_and_invoke_stream
body = json.dumps(input_body)
{'temperature': 0, 'max_tokens_to_sample': 4048, ('stop_sequences',): ['<observation>'], 'prompt': '\n\nHuman:
System Info
Expected behavior
The request body should be built with `stop_sequences` as a plain `str` key, for example:

```
{'temperature': 0, 'max_tokens_to_sample': 4048, 'stop_sequences': ['<observation>'], 'prompt': '\n\nHuman:
```
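With a string key, serialization works as expected; a quick check (values are illustrative):

```python
import json

input_body = {
    "temperature": 0,
    "max_tokens_to_sample": 4048,
    "stop_sequences": ["<observation>"],
    "prompt": "\n\nHuman: say hi\n\nAssistant:",
}
print(json.dumps(input_body))  # serializes without raising
```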