docker / genai-stack

Langchain + Docker + Neo4j + Ollama
Creative Commons Zero v1.0 Universal

Build failure on genai-stack-api-1 container #169

Closed dmoore44 closed 2 months ago

dmoore44 commented 2 months ago

Standing up the genai-stack fails with both docker compose up and docker compose up --build - specifically, the genai-stack-api-1 container fails to start.

Here's the truncated output when trying to build...

Attaching to api-1, bot-1, database-1, front-end-1, loader-1, pdf_bot-1, pull-model-1
database-1    | Installing Plugin 'apoc' from /var/lib/neo4j/labs/apoc-*-core.jar to /var/lib/neo4j/plugins/apoc.jar
pull-model-1  | pulling ollama model llama2 using http://host.docker.internal:11434
pull-model-1  | 2024/07/10 16:48:39 ERROR invalid setting OLLAMA_MODELS="" error="$HOME is not defined"
database-1    | Applying default values for plugin apoc to neo4j.conf
pull-model-1 exited with code 0
database-1    | Changed password for user 'neo4j'. IMPORTANT: this change will only take effect if performed before the database is started for the first time.
database-1    | 2024-07-10 16:48:42.261+0000 INFO  Starting...
database-1    | 2024-07-10 16:48:42.691+0000 INFO  This instance is ServerId{c5d127a3} (c5d127a3-f598-463b-bbc7-28c5b9ff0bd2)
database-1    | 2024-07-10 16:48:43.212+0000 INFO  ======== Neo4j 5.11.0 ========
database-1    | 2024-07-10 16:48:45.603+0000 INFO  Bolt enabled on 0.0.0.0:7687.
database-1    | [main] INFO org.eclipse.jetty.server.Server - jetty-10.0.15; built: 2023-04-11T17:25:14.480Z; git: 68017dbd00236bb7e187330d7585a059610f661d; jvm 17.0.8.1+1
database-1    | [main] INFO org.eclipse.jetty.server.handler.ContextHandler - Started o.e.j.s.h.MovedContextHandler@20f1cf40{/,null,AVAILABLE}
database-1    | [main] INFO org.eclipse.jetty.server.session.DefaultSessionIdManager - Session workerName=node0
database-1    | [main] INFO org.eclipse.jetty.server.handler.ContextHandler - Started o.e.j.s.ServletContextHandler@27447e9f{/db,null,AVAILABLE}
database-1    | [main] INFO org.eclipse.jetty.webapp.StandardDescriptorProcessor - NO JSP Support for /browser, did not find org.eclipse.jetty.jsp.JettyJspServlet
database-1    | [main] INFO org.eclipse.jetty.server.handler.ContextHandler - Started o.e.j.w.WebAppContext@3982d73d{/browser,jar:file:/var/lib/neo4j/lib/neo4j-browser-5.11.0.jar!/browser,AVAILABLE}
database-1    | [main] INFO org.eclipse.jetty.server.handler.ContextHandler - Started o.e.j.s.ServletContextHandler@e2d12b8{/,null,AVAILABLE}
database-1    | [main] INFO org.eclipse.jetty.server.AbstractConnector - Started http@781ce680{HTTP/1.1, (http/1.1)}{0.0.0.0:7474}
database-1    | [main] INFO org.eclipse.jetty.server.Server - Started Server@6d9f0836{STARTING}[10.0.15,sto=0] @4085ms
database-1    | 2024-07-10 16:48:45.976+0000 INFO  Remote interface available at http://localhost:7474/
database-1    | 2024-07-10 16:48:45.978+0000 INFO  id: B025B0B9629A845A42FB19E64043504DE0DABC75B9A0ED58237717C96A518991
database-1    | 2024-07-10 16:48:45.978+0000 INFO  name: system
database-1    | 2024-07-10 16:48:45.979+0000 INFO  creationDate: 2024-07-09T21:55:19.279Z
database-1    | 2024-07-10 16:48:45.979+0000 INFO  Started.
bot-1         | 
bot-1         | Collecting usage statistics. To deactivate, set browser.gatherUsageStats to false.
bot-1         | 
loader-1      | 
loader-1      | Collecting usage statistics. To deactivate, set browser.gatherUsageStats to false.
loader-1      | 
pdf_bot-1     | 
pdf_bot-1     | Collecting usage statistics. To deactivate, set browser.gatherUsageStats to false.
pdf_bot-1     | 
pdf_bot-1     | 
pdf_bot-1     |   You can now view your Streamlit app in your browser.
bot-1         | 
loader-1      | 
pdf_bot-1     | 
bot-1         |   You can now view your Streamlit app in your browser.
loader-1      |   You can now view your Streamlit app in your browser.
pdf_bot-1     |   URL: http://0.0.0.0:8503
bot-1         | 
loader-1      | 
pdf_bot-1     | 
bot-1         |   URL: http://0.0.0.0:8501
loader-1      |   URL: http://0.0.0.0:8502
bot-1         | 
loader-1      | 
api-1         | /usr/local/lib/python3.11/site-packages/langchain_core/_api/deprecation.py:139: LangChainDeprecationWarning: The class `HuggingFaceEmbeddings` was deprecated in LangChain 0.2.2 and will be removed in 0.3.0. An updated version of the class exists in the langchain-huggingface package and should be used instead. To use it run `pip install -U langchain-huggingface` and import as `from langchain_huggingface import HuggingFaceEmbeddings`.
api-1         |   warn_deprecated(
api-1         | Embedding: Using SentenceTransformer
api-1         | LLM: Using Ollama: llama2
api-1         | Traceback (most recent call last):
api-1         |   File "/usr/local/bin/uvicorn", line 8, in <module>
api-1         |     sys.exit(main())
api-1         |              ^^^^^^
api-1         |   File "/usr/local/lib/python3.11/site-packages/click/core.py", line 1157, in __call__
api-1         |     return self.main(*args, **kwargs)
api-1         |            ^^^^^^^^^^^^^^^^^^^^^^^^^^
api-1         |   File "/usr/local/lib/python3.11/site-packages/click/core.py", line 1078, in main
api-1         |     rv = self.invoke(ctx)
api-1         |          ^^^^^^^^^^^^^^^^
api-1         |   File "/usr/local/lib/python3.11/site-packages/click/core.py", line 1434, in invoke
api-1         |     return ctx.invoke(self.callback, **ctx.params)
api-1         |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
api-1         |   File "/usr/local/lib/python3.11/site-packages/click/core.py", line 783, in invoke
api-1         |     return __callback(*args, **kwargs)
api-1         |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^
api-1         |   File "/usr/local/lib/python3.11/site-packages/uvicorn/main.py", line 410, in main
api-1         |     run(
api-1         |   File "/usr/local/lib/python3.11/site-packages/uvicorn/main.py", line 577, in run
api-1         |     server.run()
api-1         |   File "/usr/local/lib/python3.11/site-packages/uvicorn/server.py", line 65, in run
api-1         |     return asyncio.run(self.serve(sockets=sockets))
api-1         |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
api-1         |   File "/usr/local/lib/python3.11/asyncio/runners.py", line 190, in run
api-1         |     return runner.run(main)
api-1         |            ^^^^^^^^^^^^^^^^
api-1         |   File "/usr/local/lib/python3.11/asyncio/runners.py", line 118, in run
api-1         |     return self._loop.run_until_complete(task)
api-1         |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
api-1         |   File "uvloop/loop.pyx", line 1517, in uvloop.loop.Loop.run_until_complete
api-1         |   File "/usr/local/lib/python3.11/site-packages/uvicorn/server.py", line 69, in serve
api-1         |     await self._serve(sockets)
api-1         |   File "/usr/local/lib/python3.11/site-packages/uvicorn/server.py", line 76, in _serve
api-1         |     config.load()
api-1         |   File "/usr/local/lib/python3.11/site-packages/uvicorn/config.py", line 434, in load
api-1         |     self.loaded_app = import_from_string(self.app)
api-1         |                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
api-1         |   File "/usr/local/lib/python3.11/site-packages/uvicorn/importer.py", line 19, in import_from_string
api-1         |     module = importlib.import_module(module_str)
api-1         |              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
api-1         |   File "/usr/local/lib/python3.11/importlib/__init__.py", line 126, in import_module
api-1         |     return _bootstrap._gcd_import(name[level:], package, level)
api-1         |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
api-1         |   File "<frozen importlib._bootstrap>", line 1204, in _gcd_import
api-1         |   File "<frozen importlib._bootstrap>", line 1176, in _find_and_load
api-1         |   File "<frozen importlib._bootstrap>", line 1147, in _find_and_load_unlocked
api-1         |   File "<frozen importlib._bootstrap>", line 690, in _load_unlocked
api-1         |   File "<frozen importlib._bootstrap_external>", line 940, in exec_module
api-1         |   File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
api-1         |   File "/app/api.py", line 52, in <module>
api-1         |     rag_chain = configure_qa_rag_chain(
api-1         |                 ^^^^^^^^^^^^^^^^^^^^^^^
api-1         |   File "/app/chains.py", line 148, in configure_qa_rag_chain
api-1         |     kg = Neo4jVector.from_existing_index(
api-1         |          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
api-1         |   File "/usr/local/lib/python3.11/site-packages/langchain_community/vectorstores/neo4j_vector.py", line 1216, in from_existing_index
api-1         |     raise ValueError(
api-1         | ValueError: The provided embedding function and vector index dimensions do not match.
api-1         | Embedding function dimension: 384
api-1         | Vector index dimension: 4096
api-1 exited with code 1
Gracefully stopping... (press Ctrl+C again to force)
dependency failed to start: container genai-stack-api-1 exited (1)
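The traceback ends in a simple dimension guard inside langchain's Neo4jVector.from_existing_index. A minimal sketch of that kind of check (not the library's actual code) shows why the container exits: the embedding model's output size must equal the dimension the Neo4j vector index was created with.

```python
def check_dimensions(embedding_dim: int, index_dim: int) -> None:
    """Raise if the embedding function and vector index dimensions differ,
    mirroring the ValueError seen in the api-1 log above."""
    if embedding_dim != index_dim:
        raise ValueError(
            "The provided embedding function and vector index dimensions do not match.\n"
            f"Embedding function dimension: {embedding_dim}\n"
            f"Vector index dimension: {index_dim}"
        )

# In this stack: the default SentenceTransformer embedder emits 384-dim
# vectors, but the existing index was built for llama2's 4096-dim
# embeddings, so the check raises and uvicorn never starts.
```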
mingtaoloreal commented 2 months ago

The same problem occurred for me. Waiting for a solution.

dmoore44 commented 2 months ago

@mingtaoloreal Got it - take a look at chains.py (lines 29-56 are the relevant ones). If you're using Ollama to serve your model, you'll need to change your EMBEDDING_MODEL value to ollama in your .env file.

The default EMBEDDING_MODEL (SentenceTransformer, 384 dimensions) conflicts with the dimension of the vector index created for the LLM's embeddings (4096 for llama2).
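As a sketch, assuming the standard genai-stack .env keys and a llama2 setup like the one in the log above, the relevant lines would look like:

```shell
# .env (fragment) - align the embedder with the model Ollama is serving
LLM=llama2               # model pulled by pull-model-1
EMBEDDING_MODEL=ollama   # Ollama embeddings (4096-dim) instead of the
                         # default SentenceTransformer (384-dim)
```

Note that if a vector index was already created with the old dimension, you may also need to drop it (or reset the Neo4j volume) so it gets recreated at the new dimension.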