Closed pythinker closed 4 months ago
Hi @pythinker, as it is a private library, we can't expose it publicly. We will update the README about it.
Well, the README is not updated... What is the workaround, please?
I have the same issue.
It is going to be public soon.
For transparency and clarity: this neo4j-nvl package is the library used for the graph rendering visualization (the same one used in other Neo4j tools), and it was an internal (private) library. As Kartik mentioned, it will be public (very) soon. Once it is public, we will update the source to use the public package, and that will fix this issue. (Once done, we will also update this issue so you get notified.)
Until then, is it possible to have any workaround to get the setup complete?
I'm having the same error.
Hi all, the lib for the graph visualization should become public soon. In the meantime, I provided a small fix here.
Which essentially:
[llm-graph-builder-tmp-issue-261-frontend build 4/7] RUN yarn add @neo4j-nvl/interaction-handlers @neo4j-nvl/core @neo4j-nvl/react:
0 6.571 yarn add v1.22.19
0 6.701 error An unexpected error occurred: "Failed to replace env in config: ${NPM_TOKEN}".
0 6.702 info If you think this is a bug, please open a bug report with the information provided in "/app/yarn-error.log".
0 6.702 info Visit https://yarnpkg.com/en/docs/cli/add for documentation about this command.
failed to solve: executor failed running [/bin/sh -c yarn add @neo4j-nvl/interaction-handlers @neo4j-nvl/core @neo4j-nvl/react]: exit code: 1
C:\neo4j\llm-graph-builder-tmp-issue-261\llm-graph-builder-tmp-issue-261>
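For context on the error itself: yarn fails at this step because the repo's .npmrc (which pointed at the private registry) references an `${NPM_TOKEN}` environment variable that isn't set inside the Docker build. A rough sketch of the substitution yarn performs on .npmrc lines (this is an illustration of the failure mode, not yarn's actual code):

```python
import os
import re

def expand_npmrc(line: str) -> str:
    """Mimic yarn/npm ${VAR} expansion in .npmrc lines. Raises when VAR is
    unset -- the same condition behind the build error
    'Failed to replace env in config: ${NPM_TOKEN}'."""
    def substitute(match: "re.Match[str]") -> str:
        name = match.group(1)
        if name not in os.environ:
            raise KeyError(f"Failed to replace env in config: ${{{name}}}")
        return os.environ[name]
    return re.sub(r"\$\{([^}]+)\}", substitute, line)
```

Note that exporting a dummy `NPM_TOKEN` (or removing the .npmrc line) only gets you past the substitution; at the time of this thread the packages themselves were still private, so the install would then fail with a 404 instead.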
Hi @swanandbagve, seeing your log, it seems that you are deploying through Docker. (The fix I made only worked if you run through npm: npm run dev; apologies if I wasn't clear.) I've updated that same branch to remove the few bits that were still present in the Dockerfile and in other places that would prevent npm run build from succeeding. I've tested both yarn run build and docker build; both work fine now. Let me know if you are still facing the issue.
Hi @msenechal ,
I get this error now when trying to start the Docker container from the test branch you gave as a workaround. I am not using GCP or Google Colab, so I'm not sure why there is a reference to Google Cloud :(
frontend | 2024/05/07 16:36:27 [notice] 1#1: start worker process 30
backend | 2024-05-07 16:36:36,283 - Load pretrained SentenceTransformer: all-MiniLM-L6-v2
backend | 2024-05-07 16:36:38,445 - Use pytorch device_name: cpu
backend | 2024-05-07 16:36:38,449 - Embedding: Using SentenceTransformer , Dimension:384
backend | 2024-05-07 16:36:38,523 - Authentication failed using Compute Engine authentication due to unavailable metadata server.
backend | Traceback (most recent call last):
backend | File "/usr/local/lib/python3.10/site-packages/google/cloud/aiplatform/initializer.py", line 305, in project
backend | self._set_project_as_env_var_or_google_auth_default()
backend | File "/usr/local/lib/python3.10/site-packages/google/cloud/aiplatform/initializer.py", line 93, in _set_project_as_env_var_or_google_auth_default
backend | credentials, project = google.auth.default()
backend | File "/usr/local/lib/python3.10/site-packages/google/auth/_default.py", line 691, in default
backend | raise exceptions.DefaultCredentialsError(_CLOUD_SDK_MISSING_CREDENTIALS)
backend | google.auth.exceptions.DefaultCredentialsError: Your default credentials were not found. To set up Application Default Credentials, see https://cloud.google.com/docs/authentication/external/set-up-adc for more information.
backend |
backend | The above exception was the direct cause of the following exception:
backend |
backend | Traceback (most recent call last):
backend | File "/usr/local/lib/python3.10/site-packages/vertexai/_model_garden/_model_garden_models.py", line 289, in from_pretrained
backend | return _from_pretrained(interface_class=cls, model_name=model_name)
backend | File "/usr/local/lib/python3.10/site-packages/vertexai/_model_garden/_model_garden_models.py", line 206, in _from_pretrained
backend | model_info = _get_model_info(
backend | File "/usr/local/lib/python3.10/site-packages/vertexai/_model_garden/_model_garden_models.py", line 122, in _get_model_info
backend | _publisher_models._PublisherModel( # pylint: disable=protected-access
backend | File "/usr/local/lib/python3.10/site-packages/google/cloud/aiplatform/_publisher_models.py", line 63, in __init__
backend | super().__init__(project=project, location=location, credentials=credentials)
backend | File "/usr/local/lib/python3.10/site-packages/google/cloud/aiplatform/base.py", line 556, in __init__
backend | self.project = project or initializer.global_config.project
backend | File "/usr/local/lib/python3.10/site-packages/google/cloud/aiplatform/initializer.py", line 308, in project
backend | raise GoogleAuthError(project_not_found_exception_str) from exc
backend | google.auth.exceptions.GoogleAuthError: Unable to find your project. Please provide a project ID by:
backend | - Passing a constructor argument
backend | - Using vertexai.init()
backend | - Setting project using 'gcloud config set project my-project'
backend | - Setting a GCP environment variable
backend | - To create a Google Cloud project, please follow guidance at https://developers.google.com/workspace/guides/create-project
backend |
backend | The above exception was the direct cause of the following exception:
backend |
backend | Traceback (most recent call last):
backend | File "/usr/local/bin/uvicorn", line 8, in
backend | [...]
backend | - !gcloud auth login
backend | - if in Colab:
backend |     from google.colab import auth
backend |     auth.authenticate_user()
backend | - if in service account or other: please follow guidance in https://cloud.google.com/docs/authentication
backend exited with code 1
Hi @swanandbagve, that is because in our app we support Gemini/VertexAI (which requires you to set up the GCP env). If you don't need the Gemini/VertexAI integration and only want to play with OpenAI, you can simply comment out this line:
In score.py, line 40:
Don't forget to:
- Change your LLM models env variable to remove Gemini: export LLM_MODELS="Diffbot,OpenAI GPT 3.5,OpenAI GPT 4"
- Rebuild the Docker images: docker-compose build

May I know how I can avoid using Diffbot and use OpenAI instead?
Hi all, FYI, the Neo4j Visualization Library (neo4j-nvl) has been released. I've made the switch, and this has been merged on the DEV branch: https://github.com/neo4j-labs/llm-graph-builder/pull/299
While running docker-compose up --build, I get this error message:
ERROR [build 4/7] RUN yarn add @neo4j-nvl/interaction-handlers @neo4j-nvl/core @neo4j-nvl/react error An unexpected error occurred: "https://registry.yarnpkg.com/@neo4j-nvl%2finteraction-handlers: Not found"
Are @neo4j-nvl/interaction-handlers, @neo4j-nvl/core, and @neo4j-nvl/react private packages?
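For anyone hitting this 404: the packages are public as of PR #299, so a stale branch or a leftover .npmrc still pointing at a private registry is the usual cause. A small sketch (helper names are my own, not part of the project) of the registry lookup yarn performs, which you can use to confirm the scoped packages now resolve publicly:

```python
import urllib.error
import urllib.parse
import urllib.request

def registry_url(package: str, registry: str = "https://registry.yarnpkg.com") -> str:
    """Scoped package names are URL-encoded in registry lookups:
    the '/' in '@neo4j-nvl/core' becomes %2F, as seen in the error message."""
    return f"{registry}/{urllib.parse.quote(package, safe='@')}"

def package_is_public(package: str) -> bool:
    """True if the registry returns 200 for the package metadata; a 404 here
    is the same 'Not found' condition from the build log."""
    try:
        with urllib.request.urlopen(registry_url(package), timeout=10) as resp:
            return resp.status == 200
    except urllib.error.HTTPError:
        return False
```

Calling, e.g., package_is_public("@neo4j-nvl/core") requires network access, so treat it as a one-off diagnostic rather than something to run in CI.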