Closed: MikePos1581 closed this issue 11 months ago.
Do you have the logs of the pull-model service?
Hi, where would I find those? I looked in .docker, but the only log I can see is the devenv.log that I attached earlier as "docker genaiinstall fails.txt". I looked at pull_model.Dockerfile and wonder whether the error relates to line 33? As I'm using Windows, I will be using gpt-3.5 or gpt-4 instead of Ollama.
You shouldn't need to pull anything if you're using GPT-3.5/4. Are you sure you wrote the model name correctly in the .env?
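For GPT-4 I'd expect the .env to look roughly like this (variable names as in the repo's env.example, from memory; the key and password here are placeholders, so double-check against your copy):

  LLM=gpt-4
  EMBEDDING_MODEL=openai
  OPENAI_API_KEY=sk-your-key-here
  NEO4J_URI=neo4j://database:7687
  NEO4J_USERNAME=neo4j
  NEO4J_PASSWORD=password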
To see the logs, I only know about the CLI: docker logs <complete name of container>.
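If you're not sure of the exact container name, something like this should find it (names follow the compose project, so yours may differ from the repository-* names later in this thread):

  docker ps -a --format "{{.Names}}"     # list every container name, including stopped ones
  docker logs repository-pull-model-1    # then dump that container's logs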
This is what I have in the .env file:
Seems fine to me, what about the logs?
I couldn't find any other logs apart from the one I attached earlier. I:
- created the dev environment
- created my .env file
- deleted the container and image
- started the dev environment again
It's almost as if the container is still being built from the originally downloaded files and ignoring my .env file.
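On the off chance that stale images or volumes are the problem, it might be worth forcing a clean rebuild from the CLI rather than relying on the dev environment (commands assume Docker Compose v2; use docker-compose with older installs):

  docker compose down --volumes          # stop the stack and remove its volumes
  docker compose build --no-cache        # rebuild the images without the build cache
  docker compose up --force-recreate     # recreate the containers so the current .env is picked up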
OK, progress of sorts. Instead of using Docker Desktop I used the CLI command docker-compose up --build, and this seems to skip pulling the Ollama model:
Attaching to repository-api-1, repository-bot-1, repository-database-1, repository-front-end-1, repository-loader-1, repository-pdf_bot-1, repository-pull-model-1
repository-pull-model-1 | pulling ollama model gpt-4 using http://host.docker.internal:11434
repository-pull-model-1 | OLLAMA model only pulled if both LLM and OLLAMA_BASE_URL are set and the LLM model is not gpt
repository-pull-model-1 exited with code 0
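That second log line matches the intended behaviour: the pull is skipped for GPT models. Roughly, the check amounts to something like this (a shell sketch of the condition described in the log message, not the actual pull_model script):

  if [ -n "$LLM" ] && [ -n "$OLLAMA_BASE_URL" ] && [ "${LLM#gpt}" = "$LLM" ]; then
      ollama pull "$LLM"    # pull only when LLM and OLLAMA_BASE_URL are set and LLM is not a gpt-* model
  else
      echo "OLLAMA model only pulled if both LLM and OLLAMA_BASE_URL are set and the LLM model is not gpt"
  fi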
However, now there's a failure with repository-api:
repository-api-1 | File "/usr/local/lib/python3.11/site-packages/langchain/vectorstores/neo4j_vector.py", line 743, in from_existing_index
repository-api-1 | raise ValueError(
repository-api-1 | ValueError: The provided embedding function and vector index dimensions do not match.
repository-api-1 | Embedding function dimension: 1536
repository-api-1 | Vector index dimension: 384
dependency failed to start: container repository-api-1 exited (1)
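For what it's worth, 1536 is the dimension of OpenAI's embeddings, while 384 is typical of the small sentence-transformer models, so the vector index in Neo4j was created with a different embedding model than the one the api container is using now. The existing index configuration can be inspected from cypher-shell, roughly like this (a sketch; assumes Neo4j 5.x and the neo4j/password credentials from the stack's .env):

  docker exec -it repository-database-1 cypher-shell -u neo4j -p password \
    "SHOW INDEXES YIELD name, type, options WHERE type = 'VECTOR' RETURN name, options.indexConfig"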
Did you try recreating the database from scratch (recreating any Neo4j volumes/containers)?
Looks like you created the vector embeddings with a different embedding model. I had the same issue when switching from the sentence-transformer embeddings to Ollama.
I decided it was easier to delete the neo4j database.
You can cd into the data folder that Docker mounted into the root of the genai-stack checkout. Find the neo4j folder and delete the neo4j database, which is just a folder.
Restart the database container.
It'll recreate the default DB using the settings in your .env. Then go ahead and re-import.
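Something like the following, assuming the compose file mounts ./data from the repo root into the database container (paths are from memory, so check your compose file and back the folder up first):

  docker compose down                                          # stop the stack
  rm -rf ./data/databases/neo4j ./data/transactions/neo4j      # remove the default DB and its transaction logs
  docker compose up --build                                    # the database container recreates an empty neo4j DB on start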
Hi, I tried deleting neo4j from \wsl.localhost\docker-desktop\tmp\docker-desktop-root\containers\services\01-docker\rootfs\data\databases and then re-running docker-compose up --build. This time the database container seemed to start up and recreate a neo4j folder in that location, but it could not connect to the neo4j database.
I have put my original neo4j folder back in place. I found a way to install Ollama on Windows from https://www.jeremymorgan.com/blog/generative-ai/how-to-run-llm-local-windows/, then changed my .env file to this and ran docker-compose up --build. It seems to work with that combination.
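The gist of the change is to point OLLAMA_BASE_URL at the Ollama instance running on Windows and use a non-GPT model, roughly along these lines (illustrative values, not an exact copy of my file):

  LLM=llama2
  EMBEDDING_MODEL=sentence_transformer
  OLLAMA_BASE_URL=http://host.docker.internal:11434
  NEO4J_URI=neo4j://database:7687
  NEO4J_USERNAME=neo4j
  NEO4J_PASSWORD=password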
Docker Desktop v4.52.2 on Windows 11. Error creating a dev environment:
Full log attached: docker genaiinstall fails.txt
Any ideas please? Thanks, Mike