SciPhi-AI / R2R

The Elasticsearch for RAG. Build, scale, and deploy state-of-the-art Retrieval-Augmented Generation applications
https://r2r-docs.sciphi.ai/
MIT License

How to run enrich-graph with local llm? #989

Closed: smilefufu closed this issue 2 weeks ago

smilefufu commented 2 weeks ago

I'm using this kg config:

[kg]
provider = "neo4j"
batch_size = 1
max_entities = 10
max_relations = 20
kg_extraction_prompt = "zero_shot_ner_kg_extraction"

  [kg.kg_extraction_config]
  model = "ollama/sciphi/triplex"
  temperature = 1
  top_p = 1
  max_tokens_to_sample = 1_024
  stream = false
  add_generation_kwargs = { }

  [kg.kg_enrichment_settings]
  generation_config = { model = "ollama/llama3.1" }

When I run r2r enrich-graph, I get an error like this:

Traceback (most recent call last):
  File "/opt/homebrew/Caskroom/miniconda/base/envs/r2r/bin/r2r", line 8, in <module>
    sys.exit(main())
             ^^^^^^
  File "/opt/homebrew/Caskroom/miniconda/base/envs/r2r/lib/python3.12/site-packages/cli/cli.py", line 47, in main
    cli()
  File "/opt/homebrew/Caskroom/miniconda/base/envs/r2r/lib/python3.12/site-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/Caskroom/miniconda/base/envs/r2r/lib/python3.12/site-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
         ^^^^^^^^^^^^^^^^
  File "/opt/homebrew/Caskroom/miniconda/base/envs/r2r/lib/python3.12/site-packages/click/core.py", line 1688, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/Caskroom/miniconda/base/envs/r2r/lib/python3.12/site-packages/click/core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/Caskroom/miniconda/base/envs/r2r/lib/python3.12/site-packages/click/core.py", line 783, in invoke
    return __callback(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/Caskroom/miniconda/base/envs/r2r/lib/python3.12/site-packages/click/decorators.py", line 45, in new_func
    return f(get_current_context().obj, *args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/Caskroom/miniconda/base/envs/r2r/lib/python3.12/site-packages/cli/commands/restructure.py", line 14, in enrich_graph
    response = client.enrich_graph()
               ^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/Caskroom/miniconda/base/envs/r2r/lib/python3.12/site-packages/sdk/client.py", line 224, in sync_wrapper
    result = asyncio.get_event_loop().run_until_complete(
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/Caskroom/miniconda/base/envs/r2r/lib/python3.12/site-packages/nest_asyncio.py", line 98, in run_until_complete
    return f.result()
           ^^^^^^^^^^
  File "/opt/homebrew/Caskroom/miniconda/base/envs/r2r/lib/python3.12/asyncio/futures.py", line 203, in result
    raise self._exception.with_traceback(self._exception_tb)
  File "/opt/homebrew/Caskroom/miniconda/base/envs/r2r/lib/python3.12/asyncio/tasks.py", line 314, in __step_run_and_handle_result
    result = coro.send(None)
             ^^^^^^^^^^^^^^^
  File "/opt/homebrew/Caskroom/miniconda/base/envs/r2r/lib/python3.12/site-packages/sdk/restructure.py", line 25, in enrich_graph
    return await client._make_request("POST", "enrich_graph", json=data)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/Caskroom/miniconda/base/envs/r2r/lib/python3.12/site-packages/sdk/client.py", line 144, in _make_request
    await handle_request_error_async(response)
  File "/opt/homebrew/Caskroom/miniconda/base/envs/r2r/lib/python3.12/site-packages/sdk/client.py", line 69, in handle_request_error_async
    raise R2RException(
sdk.models.R2RException: {"detail":{"message":"Graph enrichment failed: litellm.APIConnectionError: APIConnectionError: OpenAIException - Connection error.","error_type":"R2RException"}}

Looks like it wasn't using ollama/llama3.1...
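(Editor's note: since the failure surfaces as a litellm `APIConnectionError`, one quick thing to rule out is whether the Ollama server is reachable at all. A minimal sketch; `http://localhost:11434` is Ollama's default API base and is an assumption here — adjust it if your server listens elsewhere.)

```python
# Probe an HTTP endpoint and report whether anything is listening there.
import urllib.error
import urllib.request


def is_reachable(base_url: str, timeout: float = 3.0) -> bool:
    """Return True if the server answers at all, False on connection failure."""
    try:
        urllib.request.urlopen(base_url, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        # The server responded, just with an HTTP error status; it is up.
        return True
    except (urllib.error.URLError, OSError):
        # Connection refused, DNS failure, or timeout: nothing reachable.
        return False


if __name__ == "__main__":
    # Assumption: Ollama's default port; change if configured differently.
    print(is_reachable("http://localhost:11434"))
```

If this prints `False`, the error above is expected regardless of which model name the config specifies.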

emrgnt-cmplxty commented 2 weeks ago

@shreyaspimpalgaonkar has put a fix into our latest release.