neo4j-labs / llm-graph-builder

Neo4j graph construction from unstructured data using LLMs
https://neo4j.com/labs/genai-ecosystem/llm-graph-builder/
Apache License 2.0
2.05k stars · 304 forks

neo4j-nvl packages not found #261

Closed — pythinker closed this issue 4 months ago

pythinker commented 4 months ago

While running docker-compose up --build, I get this error message:

ERROR [build 4/7] RUN yarn add @neo4j-nvl/interaction-handlers @neo4j-nvl/core @neo4j-nvl/react error An unexpected error occurred: "https://registry.yarnpkg.com/@neo4j-nvl%2finteraction-handlers: Not found"

Are the neo4j-nvl/interaction-handlers, neo4j-nvl/core, and neo4j-nvl/react private packages?

kartikpersistent commented 4 months ago

Hi @pythinker, as it is a private library, we can't expose it publicly. We will update the README about this.

nickknyc commented 4 months ago

Well, the README is not updated... What is the workaround, please?

cip22 commented 4 months ago

I have the same issue.

kartikpersistent commented 4 months ago

It is going to be public soon.

msenechal commented 4 months ago

For transparency and clarity: this neo4j-nvl package is the library used for the graph rendering visualization (the same one used in other Neo4j tools), and it was an internal (private) library. As Kartik mentioned, it will be public (very) soon. Once it is public, we will update the source to use the public package, which will fix this issue. (Once done, we will also update this issue so you get notified.)

swanandbagve commented 4 months ago

Until then, is there any workaround to get the setup complete?

ernesttan1976 commented 4 months ago

I'm having the same error.

msenechal commented 4 months ago

Hi all, the lib for the graph visualization should become public soon; in the meantime, I provided a small fix here

Which essentially:

swanandbagve commented 4 months ago

I am still getting this error despite downloading from the branch. Has anyone's error been fixed? If yes, please help.

 => CACHED [llm-graph-builder-tmp-issue-261-frontend stage-1 1/3] FROM docker.io/library/nginx:alpine@sha256:fdbf 0.0s
 => CANCELED [llm-graph-builder-tmp-issue-261-backend 4/4] RUN apt-get update && apt-get install -y libgl1-m 11.0s
 => CACHED [llm-graph-builder-tmp-issue-261-frontend build 2/7] WORKDIR /app 0.0s
 => [llm-graph-builder-tmp-issue-261-frontend build 3/7] COPY package.json .npmrc yarn.lock ./ 0.8s
 => ERROR [llm-graph-builder-tmp-issue-261-frontend build 4/7] RUN yarn add @neo4j-nvl/interaction-handlers @neo4 8.4s

[llm-graph-builder-tmp-issue-261-frontend build 4/7] RUN yarn add @neo4j-nvl/interaction-handlers @neo4j-nvl/core @neo4j-nvl/react:

0 6.571 yarn add v1.22.19

0 6.701 error An unexpected error occurred: "Failed to replace env in config: ${NPM_TOKEN}".

0 6.702 info If you think this is a bug, please open a bug report with the information provided in "/app/yarn-error.log".

0 6.702 info Visit https://yarnpkg.com/en/docs/cli/add for documentation about this command.


failed to solve: executor failed running [/bin/sh -c yarn add @neo4j-nvl/interaction-handlers @neo4j-nvl/core @neo4j-nvl/react]: exit code: 1

C:\neo4j\llm-graph-builder-tmp-issue-261\llm-graph-builder-tmp-issue-261>

msenechal commented 4 months ago

Hi @swanandbagve, seeing your log, it seems that you are deploying through Docker. (The fix I made only worked if you run through npm, i.e. npm run dev; apologies if I wasn't clear.) I've updated that same branch to remove the few bits that were still present in the Dockerfile and in other places that would prevent npm run build from succeeding. I've tested both yarn run build and docker build; both work fine now. Let me know if you are still facing the issue.
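For background, yarn's "Failed to replace env in config: ${NPM_TOKEN}" error is raised when an .npmrc entry references an environment variable that is not set at build time. A minimal sketch of filtering out such entries before building (the helper name is an assumption, not part of the repo's actual fix):

```python
# Sketch: drop .npmrc lines that reference NPM_TOKEN so that yarn's
# env-variable substitution no longer aborts the build.
# Helper name is an assumption, not the repo's code.

def strip_npm_token(npmrc_text: str) -> str:
    """Return the .npmrc content without any line mentioning NPM_TOKEN."""
    kept = [line for line in npmrc_text.splitlines() if "NPM_TOKEN" not in line]
    return "\n".join(kept) + "\n"
```

Alternatively, exporting a dummy NPM_TOKEN before the build may get past the substitution step, though the subsequent install would still fail as long as the packages themselves are private.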

swanandbagve commented 4 months ago

Hi @msenechal, I get this error now when trying to start the docker container from the test branch you gave as a workaround. I am not using GCP nor Google Colab, so I am not sure why there is this reference to Google Cloud :(

frontend | 2024/05/07 16:36:27 [notice] 1#1: start worker process 30
backend | 2024-05-07 16:36:36,283 - Load pretrained SentenceTransformer: all-MiniLM-L6-v2
backend | 2024-05-07 16:36:38,445 - Use pytorch device_name: cpu
backend | 2024-05-07 16:36:38,449 - Embedding: Using SentenceTransformer , Dimension:384
backend | 2024-05-07 16:36:38,523 - Authentication failed using Compute Engine authentication due to unavailable metadata server.
backend | Traceback (most recent call last):
backend |   File "/usr/local/lib/python3.10/site-packages/google/cloud/aiplatform/initializer.py", line 305, in project
backend |     self._set_project_as_env_var_or_google_auth_default()
backend |   File "/usr/local/lib/python3.10/site-packages/google/cloud/aiplatform/initializer.py", line 93, in _set_project_as_env_var_or_google_auth_default
backend |     credentials, project = google.auth.default()
backend |   File "/usr/local/lib/python3.10/site-packages/google/auth/_default.py", line 691, in default
backend |     raise exceptions.DefaultCredentialsError(_CLOUD_SDK_MISSING_CREDENTIALS)
backend | google.auth.exceptions.DefaultCredentialsError: Your default credentials were not found. To set up Application Default Credentials, see https://cloud.google.com/docs/authentication/external/set-up-adc for more information.
backend |
backend | The above exception was the direct cause of the following exception:
backend |
backend | Traceback (most recent call last):
backend |   File "/usr/local/lib/python3.10/site-packages/vertexai/_model_garden/_model_garden_models.py", line 289, in from_pretrained
backend |     return _from_pretrained(interface_class=cls, model_name=model_name)
backend |   File "/usr/local/lib/python3.10/site-packages/vertexai/_model_garden/_model_garden_models.py", line 206, in _from_pretrained
backend |     model_info = _get_model_info(
backend |   File "/usr/local/lib/python3.10/site-packages/vertexai/_model_garden/_model_garden_models.py", line 122, in _get_model_info
backend |     _publisher_models._PublisherModel(  # pylint: disable=protected-access
backend |   File "/usr/local/lib/python3.10/site-packages/google/cloud/aiplatform/_publisher_models.py", line 63, in __init__
backend |     super().__init__(project=project, location=location, credentials=credentials)
backend |   File "/usr/local/lib/python3.10/site-packages/google/cloud/aiplatform/base.py", line 556, in __init__
backend |     self.project = project or initializer.global_config.project
backend |   File "/usr/local/lib/python3.10/site-packages/google/cloud/aiplatform/initializer.py", line 308, in project
backend |     raise GoogleAuthError(project_not_found_exception_str) from exc
backend | google.auth.exceptions.GoogleAuthError: Unable to find your project. Please provide a project ID by:
backend | - Passing a constructor argument
backend | - Using vertexai.init()
backend | - Setting project using 'gcloud config set project my-project'
backend | - Setting a GCP environment variable
backend | - To create a Google Cloud project, please follow guidance at https://developers.google.com/workspace/guides/create-project
backend |
backend | The above exception was the direct cause of the following exception:
backend |
backend | Traceback (most recent call last):
backend |   File "/usr/local/bin/uvicorn", line 8, in <module>
backend |     sys.exit(main())
backend |   File "/usr/local/lib/python3.10/site-packages/click/core.py", line 1157, in __call__
backend |     return self.main(*args, **kwargs)
backend |   File "/usr/local/lib/python3.10/site-packages/click/core.py", line 1078, in main
backend |     rv = self.invoke(ctx)
backend |   File "/usr/local/lib/python3.10/site-packages/click/core.py", line 1434, in invoke
backend |     return ctx.invoke(self.callback, **ctx.params)
backend |   File "/usr/local/lib/python3.10/site-packages/click/core.py", line 783, in invoke
backend |     return __callback(*args, **kwargs)
backend |   File "/usr/local/lib/python3.10/site-packages/uvicorn/main.py", line 416, in main
backend |     run(
backend |   File "/usr/local/lib/python3.10/site-packages/uvicorn/main.py", line 587, in run
backend |     server.run()
backend |   File "/usr/local/lib/python3.10/site-packages/uvicorn/server.py", line 61, in run
backend |     return asyncio.run(self.serve(sockets=sockets))
backend |   File "/usr/local/lib/python3.10/asyncio/runners.py", line 44, in run
backend |     return loop.run_until_complete(main)
backend |   File "/usr/local/lib/python3.10/asyncio/base_events.py", line 649, in run_until_complete
backend |     return future.result()
backend |   File "/usr/local/lib/python3.10/site-packages/uvicorn/server.py", line 68, in serve
backend |     config.load()
backend |   File "/usr/local/lib/python3.10/site-packages/uvicorn/config.py", line 467, in load
backend |     self.loaded_app = import_from_string(self.app)
backend |   File "/usr/local/lib/python3.10/site-packages/uvicorn/importer.py", line 21, in import_from_string
backend |     module = importlib.import_module(module_str)
backend |   File "/usr/local/lib/python3.10/importlib/__init__.py", line 126, in import_module
backend |     return _bootstrap._gcd_import(name[level:], package, level)
backend |   File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
backend |   File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
backend |   File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
backend |   File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
backend |   File "<frozen importlib._bootstrap_external>", line 883, in exec_module
backend |   File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
backend |   File "/code/score.py", line 40, in <module>
backend |     add_routes(app,ChatVertexAI(), path="/vertexai")
backend |   File "/usr/local/lib/python3.10/site-packages/langchain_google_vertexai/chat_models.py", line 517, in __init__
backend |     super().__init__(**kwargs)
backend |   File "/usr/local/lib/python3.10/site-packages/pydantic/v1/main.py", line 339, in __init__
backend |     values, fields_set, validation_error = validate_model(__pydantic_self__.__class__, data)
backend |   File "/usr/local/lib/python3.10/site-packages/pydantic/v1/main.py", line 1100, in validate_model
backend |     values = validator(cls, values)
backend |   File "/usr/local/lib/python3.10/site-packages/langchain_google_vertexai/chat_models.py", line 567, in validate_environment
backend |     values["client"] = model_cls.from_pretrained(generative_model_name)
backend |   File "/usr/local/lib/python3.10/site-packages/vertexai/_model_garden/_model_garden_models.py", line 291, in from_pretrained
backend |     raise auth_exceptions.GoogleAuthError(credential_exception_str) from e
backend | google.auth.exceptions.GoogleAuthError: Unable to authenticate your request.
backend | Depending on your runtime environment, you can complete authentication by:
backend | - if in local JupyterLab instance: !gcloud auth login
backend | - if in Colab:
backend |     from google.colab import auth
backend |     auth.authenticate_user()
backend | - if in service account or other: please follow guidance in https://cloud.google.com/docs/authentication
backend exited with code 1

msenechal commented 4 months ago

Hi @swanandbagve, that is because our app supports Gemini/VertexAI (which requires you to set up the GCP env). If you don't need the Gemini/VertexAI integration and only want to play with OpenAI, you can simply comment out this line:

In score.py , line 40:

add_routes(app,ChatVertexAI(), path="/vertexai")

Don't forget to:
  • Change your LLM_MODELS env variable to remove Gemini: export LLM_MODELS="Diffbot,OpenAI GPT 3.5,OpenAI GPT 4"
  • Rebuild the docker images (docker-compose build)

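The comment-out suggestion above can also be expressed as a conditional registration, so the VertexAI route is only mounted when GCP credentials are actually configured. This is a sketch under assumptions: the helper name and the env-var check are mine, and add_routes / ChatVertexAI are stood in by parameters rather than the repo's actual imports.

```python
import os


def register_vertexai_route(app, add_routes, chat_model_factory) -> bool:
    """Mount the /vertexai route only when GCP credentials are configured.

    Hypothetical helper: `add_routes` and `chat_model_factory` stand in for
    langserve's add_routes and ChatVertexAI from the original score.py.
    Returns True when the route was registered, False when it was skipped.
    """
    if not os.environ.get("GOOGLE_APPLICATION_CREDENTIALS"):
        # No credentials: skip registration, avoiding the GoogleAuthError
        # that otherwise crashes the backend at startup.
        return False
    add_routes(app, chat_model_factory(), path="/vertexai")
    return True
```

With this guard in place of the unconditional call, the backend starts cleanly on machines without a GCP setup, and the route appears automatically once credentials are provided.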
ernesttan1976 commented 4 months ago

May I know how I can avoid using Diffbot and use OpenAI instead?

prakriti-solankey commented 4 months ago
  • Change your LLM_MODELS env variable to remove Gemini: export LLM_MODELS="Diffbot,OpenAI GPT 3.5,OpenAI GPT 4"
  • Rebuild the docker images (docker-compose build)
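For reference, LLM_MODELS is a comma-separated list, so enabling or removing a provider is just a matter of which names appear in it. A sketch of how such a variable could be parsed (the function name is an assumption; the default mirrors the export above):

```python
import os


def enabled_llm_models(default: str = "Diffbot,OpenAI GPT 3.5,OpenAI GPT 4") -> list[str]:
    """Split the comma-separated LLM_MODELS env var into model names.

    Hypothetical helper illustrating the assumed format; falls back to
    `default` when the variable is unset.
    """
    raw = os.environ.get("LLM_MODELS", default)
    return [name.strip() for name in raw.split(",") if name.strip()]
```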

msenechal commented 4 months ago

Hi all, FYI: the Neo4j Visualization Library has been released. I've made the switch, and it has been merged on the DEV branch: https://github.com/neo4j-labs/llm-graph-builder/pull/299