run-llama / llama_index

LlamaIndex is a data framework for your LLM applications
https://docs.llamaindex.ai
MIT License
33.46k stars 4.69k forks

[Bug]: Bedrock Integration with Claude3 fails #11739

Open ayseozgun opened 4 months ago

ayseozgun commented 4 months ago

Bug Description

I am trying to run the Claude 3 Sonnet model with the LlamaIndex Bedrock integration, but the code is not running. Is Claude 3 not available yet in LlamaIndex? The code I am trying to run is below:

```python
from llama_index.llms.bedrock import Bedrock

llm = Bedrock(model="anthropic.claude-3-sonnet-20240229-v1:0", client=bedrock_runtime)
```

My env:

```
llama-index 0.10.17
llama-index-agent-openai 0.1.5
llama-index-cli 0.1.8
llama-index-core 0.10.17
llama-index-embeddings-openai 0.1.6
llama-index-indices-managed-llama-cloud 0.1.3
llama-index-legacy 0.9.48
llama-index-llms-anthropic 0.1.5
llama-index-llms-bedrock 0.1.3
llama-index-llms-openai 0.1.7
llama-index-multi-modal-llms-openai 0.1.4
llama-index-program-openai 0.1.4
llama-index-question-gen-openai 0.1.3
llama-index-readers-file 0.1.8
llama-index-readers-llama-parse 0.1.3
llama-index-vector-stores-chroma 0.1.5
llama-index-vector-stores-postgres 0.1.2
llama-parse 0.3.7
llamaindex-py-client 0.1.13
```

Can you please help? Thanks

Version

0.10.17

Steps to Reproduce

```python
from llama_index.llms.bedrock import Bedrock

llm = Bedrock(model="anthropic.claude-3-sonnet-20240229-v1:0", client=bedrock_runtime)
```

Relevant Logs/Tracebacks

No response

dosubot[bot] commented 4 months ago

Based on the information you've provided, it seems like you're trying to use the Claude 3 Sonnet model with the LlamaIndex Bedrock integration. The model name you're using, "anthropic.claude-3-sonnet-20240229-v1:0", might not be correct or might not be available in the integration.

In the LlamaIndex Bedrock integration, the Claude 3 Sonnet model is referred to as "anthropic.claude-v2:1". You can find this information in the CHAT_ONLY_MODELS dictionary in the llama-index-integrations/llms/llama-index-llms-bedrock/llama_index/llms/bedrock/utils.py file.

Here's how you can modify your code:

```python
from llama_index.llms.bedrock import Bedrock

llm = Bedrock(model="anthropic.claude-v2:1", client=bedrock_runtime)
```

Also, make sure that your environment is set up to handle the specific requirements of chat-based models. This includes starting prompts with "Human:" and ending with "Assistant:", as indicated in the comments above the CHAT_ONLY_MODELS dictionary.
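For illustration only, the "Human:"/"Assistant:" completion format the comment refers to looks roughly like this (a hypothetical helper; the real integration builds the prompt internally):

```python
def claude_completion_prompt(user_text: str) -> str:
    # Claude completion-style prompts start with "Human:" and end
    # with "Assistant:" so the model knows where to continue.
    return f"\n\nHuman: {user_text}\n\nAssistant:"

prompt = claude_completion_prompt("What is LlamaIndex?")
print(prompt)
```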

If you're still facing issues, it would be helpful to have more specific details about the errors or problems you're encountering. This will allow for more targeted assistance.


#### About Dosu

This response is meant to be useful and save you time. It is **not meant to be a precise solution**, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot.
brycecf commented 4 months ago

Same thing here. I have a PR open that will (hopefully) be merged soon to fix this.

ayseozgun commented 3 months ago

Is there any update? @brycecf

danielftapiar commented 3 months ago

Tried with the latest version (llama-index==0.10.20) but it still doesn't work.

ayseozgun commented 3 months ago

Yes, same here.

ayseozgun commented 3 months ago

`ValueError: context_size argument not provided and model provided refers to a non-foundation model. Please specify the context_size`
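For context, this error comes from the integration's model-to-context-window lookup: a model ID the installed version does not recognize raises unless `context_size` is passed explicitly. A simplified sketch of that logic (names and values here are illustrative, not the real table, which lives in the Bedrock integration's `utils.py`):

```python
from typing import Optional

# Illustrative stand-in for the integration's model metadata table.
KNOWN_CONTEXT_SIZES = {
    "anthropic.claude-v2:1": 200_000,
}

def resolve_context_size(model: str, context_size: Optional[int] = None) -> int:
    if context_size is not None:
        return context_size  # an explicit value always wins
    if model not in KNOWN_CONTEXT_SIZES:
        raise ValueError(
            "context_size argument not provided and model provided "
            "refers to a non-foundation model. Please specify the context_size"
        )
    return KNOWN_CONTEXT_SIZES[model]
```

This is why passing `context_size=` explicitly, as suggested later in the thread, sidesteps the lookup for model IDs the installed version does not know about.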

Aniket-ByteIQ commented 3 months ago

Did you find any fix for that? @ayseozgun

ayseozgun commented 3 months ago

No, I could not solve it. @Aniket-ByteIQ

Aniket-ByteIQ commented 3 months ago

Hey @ayseozgun, it seems like you're using older versions of llama-index-llms-bedrock (0.1.3) and llama-index-llms-anthropic (0.1.5). Try upgrading them:

- `pip install --upgrade llama-index-llms-bedrock==0.1.5`
- `pip install --upgrade llama-index-llms-anthropic==0.1.7`

These versions support the Claude 3 models. Hope it helps.

Aniket-ByteIQ commented 3 months ago

> tried with the latest version llama-index==0.10.20 but still doesn't work

@danielftapiar Try upgrading to the newer version: `pip install --upgrade llama-index-llms-bedrock==0.1.5`

```python
from llama_index.llms.bedrock import Bedrock

llm = Bedrock(
    model="anthropic.claude-3-sonnet-20240229-v1:0",
    context_size=context_size,
    # ... other arguments as needed
)
```

This should work

ayseozgun commented 3 months ago

Should I specify the `context_size` parameter? @Aniket-ByteIQ

Aniket-ByteIQ commented 3 months ago

> Should i specify context_size parameter? @Aniket-ByteIQ

@ayseozgun Not necessary, it was just an example. Let me know if it worked.

ayseozgun commented 3 months ago

```
llama-index==0.10.20
llama-index-llms-bedrock==0.1.5
llama-index-llms-anthropic==0.1.7
llama-index-vector-stores-pinecone==0.1.1
llama-index-readers-database==0.1.2
```

These are my current versions; should I update the other packages as well? @Aniket-ByteIQ
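As an aside, a quick way to list which versions are actually installed in an environment (a generic sketch using only the standard library; the package names are taken from the list above):

```python
from importlib import metadata

def installed_versions(packages):
    # Map each package name to its installed version, or None if absent.
    versions = {}
    for pkg in packages:
        try:
            versions[pkg] = metadata.version(pkg)
        except metadata.PackageNotFoundError:
            versions[pkg] = None
    return versions

print(installed_versions([
    "llama-index",
    "llama-index-llms-bedrock",
    "llama-index-llms-anthropic",
]))
```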

Aniket-ByteIQ commented 3 months ago

@ayseozgun Those versions should work with Claude 3. Are you still having issues loading the LLM? Can you share the error you're seeing?

brycecf commented 2 months ago

This should be resolved now. I use Claude 3 Sonnet regularly with the LlamaIndex Bedrock integration.

@ayseozgun can you confirm please so this can be closed?