run-llama / llama_index

LlamaIndex is a data framework for your LLM applications
https://docs.llamaindex.ai
MIT License

[Bug]: Add claude 3.5 sonnet support to Bedrock InvokeAPI #14624

Closed nfrnunes closed 2 months ago

nfrnunes commented 3 months ago

Bug Description

There is a missing comma that prevents the use of streaming for the Claude 3 Haiku and Claude 3.5 Sonnet models:

from llama_index.llms.bedrock.utils import STREAMING_MODELS
STREAMING_MODELS

{'amazon.titan-text-express-v1', 'amazon.titan-tg1-large', 'anthropic.claude-3-5-sonnet-20240620-v1:0anthropic.claude-3-haiku-20240307-v1:0', 'anthropic.claude-3-opus-20240229-v1:0', 'anthropic.claude-3-sonnet-20240229-v1:0', 'anthropic.claude-instant-v1', 'anthropic.claude-v1', 'anthropic.claude-v2', 'anthropic.claude-v2:1', 'meta.llama2-13b-chat-v1', 'mistral.mistral-7b-instruct-v0:2', 'mistral.mistral-large-2402-v1:0', 'mistral.mixtral-8x7b-instruct-v0:1'}

ValueError: Model anthropic.claude-3-5-sonnet-20240620-v1:0 does not support streaming
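For context, a minimal way to trigger the error (a sketch, assuming valid AWS credentials and the llama-index-llms-bedrock package installed; the prompt text is illustrative):

from llama_index.llms.bedrock import Bedrock

# The integration checks the model id against STREAMING_MODELS before
# streaming; the fused set entry makes that lookup fail.
llm = Bedrock(model="anthropic.claude-3-5-sonnet-20240620-v1:0")
for chunk in llm.stream_complete("Hello"):  # raises the ValueError above
    print(chunk.delta, end="")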

Version

v0.10.0

Steps to Reproduce

from llama_index.llms.bedrock.utils import STREAMING_MODELS
STREAMING_MODELS

{'amazon.titan-text-express-v1', 'amazon.titan-tg1-large', 'anthropic.claude-3-5-sonnet-20240620-v1:0anthropic.claude-3-haiku-20240307-v1:0', 'anthropic.claude-3-opus-20240229-v1:0', 'anthropic.claude-3-sonnet-20240229-v1:0', 'anthropic.claude-instant-v1', 'anthropic.claude-v1', 'anthropic.claude-v2', 'anthropic.claude-v2:1', 'meta.llama2-13b-chat-v1', 'mistral.mistral-7b-instruct-v0:2', 'mistral.mistral-large-2402-v1:0', 'mistral.mixtral-8x7b-instruct-v0:1'}

Relevant Logs/Tracebacks

from llama_index.llms.bedrock.utils import STREAMING_MODELS
STREAMING_MODELS

{'amazon.titan-text-express-v1',
'amazon.titan-tg1-large',
'anthropic.claude-3-5-sonnet-20240620-v1:0anthropic.claude-3-haiku-20240307-v1:0',
'anthropic.claude-3-opus-20240229-v1:0',
'anthropic.claude-3-sonnet-20240229-v1:0',
'anthropic.claude-instant-v1',
'anthropic.claude-v1',
'anthropic.claude-v2',
'anthropic.claude-v2:1',
'meta.llama2-13b-chat-v1',
'mistral.mistral-7b-instruct-v0:2',
'mistral.mistral-large-2402-v1:0',
'mistral.mixtral-8x7b-instruct-v0:1'}
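Because the two ids are fused into a single string, neither passes a membership check on its own, which is why streaming fails for both Haiku and Sonnet 3.5:

from llama_index.llms.bedrock.utils import STREAMING_MODELS

# Both lookups return False: the set contains the glued string
# 'anthropic.claude-3-5-sonnet-20240620-v1:0anthropic.claude-3-haiku-20240307-v1:0'
# rather than the two separate ids.
print("anthropic.claude-3-5-sonnet-20240620-v1:0" in STREAMING_MODELS)  # False
print("anthropic.claude-3-haiku-20240307-v1:0" in STREAMING_MODELS)     # False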
dosubot[bot] commented 3 months ago

Hello @nfrnunes! I'm here to assist you with any bugs, questions, or contributions. Let's work together to resolve any issues you encounter.

To fix the missing comma in the STREAMING_MODELS set, add a comma between "anthropic.claude-3-5-sonnet-20240620-v1:0" and "anthropic.claude-3-haiku-20240307-v1:0".

Here is the corrected code:

STREAMING_MODELS = {
    "amazon.titan-tg1-large",
    "amazon.titan-text-express-v1",
    "anthropic.claude-instant-v1",
    "anthropic.claude-v1",
    "anthropic.claude-v2",
    "anthropic.claude-v2:1",
    "anthropic.claude-3-5-sonnet-20240620-v1:0",
    "anthropic.claude-3-haiku-20240307-v1:0",  # Added comma here
    "anthropic.claude-3-opus-20240229-v1:0",
    "anthropic.claude-3-sonnet-20240229-v1:0",
    "meta.llama2-13b-chat-v1",
    "mistral.mistral-7b-instruct-v0:2",
    "mistral.mixtral-8x7b-instruct-v0:1",
    "mistral.mistral-large-2402-v1:0",
}

This separates the Claude 3 Haiku and Claude 3.5 Sonnet model ids correctly and prevents the ValueError you encountered [1].

To continue talking to Dosu, mention @dosu.

logan-markewich commented 2 months ago

This has since been added. Update with:

pip install -U llama-index-llms-bedrock
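After upgrading, a quick sanity check (a sketch; exact set contents may differ between releases):

from llama_index.llms.bedrock.utils import STREAMING_MODELS

# Both ids should now be present as separate entries.
assert "anthropic.claude-3-5-sonnet-20240620-v1:0" in STREAMING_MODELS
assert "anthropic.claude-3-haiku-20240307-v1:0" in STREAMING_MODELS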