docker / genai-stack

Langchain + Docker + Neo4j + Ollama
Creative Commons Zero v1.0 Universal

I'm unable to use the pdf_bot using AWS claudev2 #145


SAGE-Rebirth commented 8 months ago

```
ValueError: Error raised by inference endpoint: An error occurred (AccessDeniedException) when calling the InvokeModel operation: You don't have access to the model with the specified model ID
```

```
File "/usr/local/lib/python3.11/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 535, in _run_script
    exec(code, module.__dict__)
File "/app/pdf_bot.py", line 95, in <module>
    main()
File "/app/pdf_bot.py", line 72, in main
    vectorstore = Neo4jVector.from_texts(
File "/usr/local/lib/python3.11/site-packages/langchain_community/vectorstores/neo4j_vector.py", line 679, in from_texts
    embeddings = embedding.embed_documents(list(texts))
File "/usr/local/lib/python3.11/site-packages/langchain_community/embeddings/bedrock.py", line 169, in embed_documents
    response = self._embedding_func(text)
File "/usr/local/lib/python3.11/site-packages/langchain_community/embeddings/bedrock.py", line 150, in _embedding_func
    raise ValueError(f"Error raised by inference endpoint: {e}")
```

I'm facing this error when running the pdf_bot app. I have been granted access to the claudev2 model on AWS, yet it still shows the error above.
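For what it's worth, the traceback points at the Bedrock *embeddings* call, so the denied model may be the embedding model rather than claudev2 itself. A rough boto3 probe (my own sketch, not part of genai-stack; the model ids are assumptions: claudev2 presumably maps to `anthropic.claude-v2`, and LangChain's `BedrockEmbeddings` defaults to `amazon.titan-embed-text-v1`) could look like this:

```python
# Rough probe (my own sketch): check whether this AWS identity is allowed
# to call InvokeModel for the models the stack needs. Model ids below are
# assumptions, not something confirmed by genai-stack.
MODEL_IDS = ["anthropic.claude-v2", "amazon.titan-embed-text-v1"]

def check_model_access(model_id: str, region: str = "us-east-1") -> bool:
    """Return True if InvokeModel is permitted for model_id."""
    import boto3  # requires `pip install boto3` and configured credentials
    from botocore.exceptions import ClientError

    client = boto3.client("bedrock-runtime", region_name=region)
    try:
        # The empty body is invalid on purpose: a ValidationException still
        # proves we got past the access check; AccessDeniedException does not.
        client.invoke_model(modelId=model_id, body=b"{}")
        return True
    except ClientError as exc:
        return exc.response["Error"]["Code"] != "AccessDeniedException"

if __name__ == "__main__":
    for mid in MODEL_IDS:
        print(mid, "OK" if check_model_access(mid) else "ACCESS DENIED")
```

If the Titan embedding model prints `ACCESS DENIED`, it would need to be enabled separately in the Bedrock console's model-access page, even when Claude is already enabled.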

[P.S. I'm using Windows, and as discussed in #123 I've made some changes to my docker-compose.yml.]

Here's a copy of the .env file:

```
OLLAMA_BASE_URL=http://host.docker.internal:11434
NEO4J_URI=neo4j://database:7687
NEO4J_USERNAME=neo4j
NEO4J_PASSWORD=password
LLM=claudev2
EMBEDDING_MODEL=aws
AWS_ACCESS_KEY_ID=[provided, but I can't share this]
AWS_SECRET_ACCESS_KEY=[provided, but I can't share this]
AWS_DEFAULT_REGION=us-east-1
```
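To rule out the container simply not receiving the credentials, a minimal stdlib check (my own illustration, not part of genai-stack; it only tests that the AWS variables from the file above are set and non-empty) can be run inside the container:

```python
# Minimal stdlib sketch: verify the Bedrock-related variables are present
# in the environment before starting the stack. Variable names match the
# .env file above; the check itself is an illustration, not genai-stack code.
import os

REQUIRED_FOR_BEDROCK = [
    "AWS_ACCESS_KEY_ID",
    "AWS_SECRET_ACCESS_KEY",
    "AWS_DEFAULT_REGION",
]

def missing_bedrock_vars(env=None):
    """Return the names of required AWS variables that are unset or empty."""
    if env is None:
        env = os.environ
    return [name for name in REQUIRED_FOR_BEDROCK if not env.get(name)]

if __name__ == "__main__":
    missing = missing_bedrock_vars()
    print("missing:", missing or "none")
```

If anything is reported missing inside the pdf_bot container, the docker-compose changes may not be passing the .env values through.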
