ayseozgun opened 4 months ago
Based on the information you've provided, it seems you're trying to use the Claude 3 Sonnet model with the LlamaIndex Bedrock integration. The model name you're using, "anthropic.claude-3-sonnet-20240229-v1:0", may not be available in your installed version of the integration.
In the LlamaIndex Bedrock integration, the newest supported Claude chat model is "anthropic.claude-v2:1". You can find this in the CHAT_ONLY_MODELS dictionary in the llama-index-integrations/llms/llama-index-llms-bedrock/llama_index/llms/bedrock/utils.py file.
Here's how you can modify your code:
```python
from llama_index.llms.bedrock import Bedrock

llm = Bedrock(model="anthropic.claude-v2:1", client=bedrock_runtime)
```
Also, make sure that your environment is set up to handle the specific requirements of chat-based models. This includes starting prompts with "Human:" and ending with "Assistant:", as noted in the comments above the CHAT_ONLY_MODELS dictionary.
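As a rough illustration of that prompt format (the helper name below is an assumption for illustration, not the integration's actual API), wrapping a raw prompt for a Claude completion-style model looks like:

```python
# Hypothetical sketch: Anthropic completion-style models expect prompts in
# "\n\nHuman: ...\n\nAssistant:" turns. This helper is NOT part of
# llama-index; it only illustrates the format mentioned in the comments
# above CHAT_ONLY_MODELS.

def wrap_claude_prompt(user_prompt: str) -> str:
    """Wrap a raw prompt in the Human:/Assistant: turns Claude expects."""
    return f"\n\nHuman: {user_prompt}\n\nAssistant:"

print(wrap_claude_prompt("What is LlamaIndex?"))
```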
If you're still facing issues, it would be helpful to have more specific details about the errors or problems you're encountering. This will allow for more targeted assistance.
Same thing here. I have a PR open that will (hopefully) be merged soon to fix this.
Is there any update? @brycecf
Tried with the latest version (llama-index==0.10.20), but it still doesn't work.
yes, same here.
```
ValueError: `context_size` argument not provided and `model` provided refers to a non-foundation model. Please specify the context_size
```
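This error appears because the integration looks up the model's context window in a table of known foundation models, and older versions' tables predate the Claude 3 model IDs. A simplified, hypothetical sketch of that logic (the dictionary, function name, and sizes below are assumptions for illustration, not llama-index's actual code):

```python
# Hypothetical sketch of why the ValueError is raised: the installed
# version's context-size table predates Claude 3, so lookups for the new
# model ID fail unless context_size is passed explicitly.
KNOWN_CONTEXT_SIZES = {
    "anthropic.claude-v2:1": 200000,  # illustrative values only
    "anthropic.claude-v2": 100000,
}

def resolve_context_size(model, context_size=None):
    """Return an explicit context_size, or look the model up in the table."""
    if context_size is not None:
        return context_size
    if model not in KNOWN_CONTEXT_SIZES:
        raise ValueError(
            "`context_size` argument not provided and `model` provided "
            "refers to a non-foundation model. Please specify the context_size"
        )
    return KNOWN_CONTEXT_SIZES[model]

# An unknown Claude 3 ID works only when context_size is given explicitly:
print(resolve_context_size("anthropic.claude-3-sonnet-20240229-v1:0", 200000))
```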
Did you find any fix for that? @ayseozgun
No, I could not solve it. @Aniket-ByteIQ
Hey @ayseozgun, it seems you're using older versions of llama-index-llms-bedrock (0.1.3) and llama-index-llms-anthropic (0.1.5).
Try upgrading them to the newer versions:
```
pip install --upgrade llama-index-llms-bedrock==0.1.5
pip install --upgrade llama-index-llms-anthropic==0.1.7
```
These versions support the Claude 3 models.
Hope it helps
Tried with the latest version (llama-index==0.10.20), but it still doesn't work. @danielftapiar
Try upgrading this package to the newer version:
```
pip install --upgrade llama-index-llms-bedrock==0.1.5
```
```python
from llama_index.llms.bedrock import Bedrock

llm = Bedrock(model="anthropic.claude-3-sonnet-20240229-v1:0", context_size=context_size, ..)
```
This should work.
Should I specify the context_size parameter? @Aniket-ByteIQ
@ayseozgun Not necessary, it was just an example. Let me know if it works.
These are my current versions:
```
llama-index==0.10.20
llama-index-llms-bedrock==0.1.5
llama-index-llms-anthropic==0.1.7
llama-index-vector-stores-pinecone==0.1.1
llama-index-readers-database==0.1.2
```
Should I update the other packages as well? @Aniket-ByteIQ
@ayseozgun These versions should work with Claude 3. Are you still having trouble loading the LLM? Can you share the error you're seeing?
This should be resolved now. I use Claude Sonnet regularly with the LlamaIndex Bedrock integration.
@ayseozgun can you confirm please so this can be closed?
Bug Description
I am trying to run the Claude 3 Sonnet model with the LlamaIndex Bedrock integration, but the code is not running. Is Claude 3 not available yet in LlamaIndex? The code I am trying to run is below:
```python
from llama_index.llms.bedrock import Bedrock

llm = Bedrock(model="anthropic.claude-3-sonnet-20240229-v1:0", client=bedrock_runtime)
```
My env:
```
llama-index 0.10.17
llama-index-agent-openai 0.1.5
llama-index-cli 0.1.8
llama-index-core 0.10.17
llama-index-embeddings-openai 0.1.6
llama-index-indices-managed-llama-cloud 0.1.3
llama-index-legacy 0.9.48
llama-index-llms-anthropic 0.1.5
llama-index-llms-bedrock 0.1.3
llama-index-llms-openai 0.1.7
llama-index-multi-modal-llms-openai 0.1.4
llama-index-program-openai 0.1.4
llama-index-question-gen-openai 0.1.3
llama-index-readers-file 0.1.8
llama-index-readers-llama-parse 0.1.3
llama-index-vector-stores-chroma 0.1.5
llama-index-vector-stores-postgres 0.1.2
llama-parse 0.3.7
llamaindex-py-client 0.1.13
```
Can you please help? Thanks
Version
0.10.17
Steps to Reproduce
```python
from llama_index.llms.bedrock import Bedrock

llm = Bedrock(model="anthropic.claude-3-sonnet-20240229-v1:0", client=bedrock_runtime)
```
Relevant Logs/Tracebacks
No response