rahulnyk / knowledge_graph

Convert any text to a graph of knowledge. This can be used for Graph Augmented Generation or Knowledge Graph based QnA

error trying to regenerate #9

Open gautambak opened 10 months ago

gautambak commented 10 months ago

Hi Rahul,

Thank you for this wonderful project and I'm excited to play around with it.

If I set regenerate to False, the code works, but when I try to regenerate using my own PDF and data, I get the following error:

ERROR ### Here is the buggy response: None

An error occurred: HTTPConnectionPool(host='localhost', port=11434): Max retries exceeded with url: /api/generate (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7caf1e449180>: Failed to establish a new connection: [Errno 111] Connection refused'))

ERROR ### Here is the buggy response: None

An error occurred: HTTPConnectionPool(host='localhost', port=11434): Max retries exceeded with url: /api/generate (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7caf1e44a9e0>: Failed to establish a new connection: [Errno 111] Connection refused'))

ERROR ### Here is the buggy response: None


ValueError                                Traceback (most recent call last)
<ipython-input> in <cell line: 4>()
      3
      4 if regenerate:
----> 5     concepts_list = df2Graph(df, model='zephyr:latest')
      6     dfg1 = graph2Df(concepts_list)
      7     if not os.path.exists(outputdirectory):

1 frames
/usr/local/lib/python3.10/dist-packages/numpy/core/overrides.py in concatenate(*args, **kwargs)

ValueError: need at least one array to concatenate

-- if it helps, I'm using Google Colab and pulled your repo into my instance.

rahulnyk commented 10 months ago

Hey Gautam, I think this is because of the missing Ollama server.

When you set regenerate to False, it just reads the existing CSV file. When you set it to True, it tries to re-extract concepts with the LLM. For that, it needs an Ollama server hosted locally and the Zephyr model pulled locally.
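That also explains the ValueError at the bottom of the trace: every /api/generate call failed, so there were no extracted concepts left to combine. A minimal sketch of the failure mode (the variable names here are illustrative, not the repo's actual internals):

```python
import numpy as np

# Each failed LLM call yields None instead of an array of extracted concepts.
chunk_results = [None, None, None]

# After dropping the failures, nothing is left to combine...
arrays = [r for r in chunk_results if r is not None]

try:
    combined = np.concatenate(arrays)
except ValueError as e:
    # ...which is exactly the error in the traceback above.
    print(e)  # need at least one array to concatenate
```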

Step 1: Install Ollama from https://ollama.ai
Step 2: Run ollama run zephyr in your terminal. This will pull the zephyr model to your local machine and start the Ollama server.
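Once the server is up, a quick sanity check from Python helps before flipping regenerate back to True. This is only a sketch; it assumes Ollama's default port 11434 (a running server answers plain HTTP on its root URL):

```python
import urllib.request
import urllib.error

def ollama_is_up(base_url: str = "http://localhost:11434",
                 timeout: float = 2.0) -> bool:
    """Return True if something answers HTTP on the Ollama port."""
    try:
        with urllib.request.urlopen(base_url, timeout=timeout) as resp:
            return 200 <= resp.status < 300
    except (urllib.error.URLError, OSError):
        # Connection refused (like the Errno 111 above) lands here.
        return False

print(ollama_is_up())  # False until `ollama serve` is running
```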

gautambak commented 10 months ago

Hey Rahul,

Thank you for that, I'm still getting errors:

/content# ollama pull zephyr
Error: could not connect to ollama server, run 'ollama serve' to start it
/content# ollama serve
2023/11/26 03:54:06 images.go:779: total blobs: 0
2023/11/26 03:54:06 images.go:786: total unused blobs removed: 0
2023/11/26 03:54:06 routes.go:777: Listening on 127.0.0.1:11434 (version 0.1.11)

[GIN] 2023/11/26 - 03:54:20 | 404 | 507.407µs | 127.0.0.1 | POST "/api/generate"
[GIN] 2023/11/26 - 03:54:20 | 404 |  70.048µs | 127.0.0.1 | POST "/api/generate"
(the same 404 line repeats about twenty more times)

[0] 0:ollama*
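For what it's worth, the log above shows the server listening (but total blobs: 0) while every /api/generate call returns 404, which usually means no model has been pulled yet. A hedged sketch for checking which models are local, using Ollama's /api/tags endpoint (assuming the default port):

```python
import json
import urllib.request

def model_names(tags_payload: dict) -> list:
    """Extract model names from an Ollama /api/tags response body."""
    return [m["name"] for m in tags_payload.get("models", [])]

def local_models(base_url: str = "http://localhost:11434") -> list:
    """List the models the local Ollama server has pulled."""
    with urllib.request.urlopen(f"{base_url}/api/tags", timeout=2.0) as resp:
        return model_names(json.load(resp))

# An empty list here means `ollama pull zephyr` still needs to run.
```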

rahulnyk commented 10 months ago

I am confused. Do you have an Ollama server running before you run ollama pull zephyr?

In any case, this problem pertains to the Ollama server on your machine. If it is running properly, I am sure the code will run. Please refer to the Ollama documentation.

gautambak commented 10 months ago

OK, I'll try again. I am not running this locally, I'm trying to run it on Colab, so that might be the problem.
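Colab can work, but ollama serve blocks the cell it runs in, so the server has to be started in the background before pulling the model. A rough sketch, assuming the ollama binary is already installed in the Colab VM (e.g. via the install script from https://ollama.ai):

```python
import shutil
import socket
import subprocess
import time

def wait_for_port(host: str, port: int, timeout: float = 30.0) -> bool:
    """Poll until something accepts TCP connections on host:port."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, port), timeout=1.0):
                return True
        except OSError:
            time.sleep(0.5)
    return False

if shutil.which("ollama"):
    # Launch the server in the background so the cell does not block.
    subprocess.Popen(["ollama", "serve"])
    # Pull the model only once the server is actually accepting connections.
    if wait_for_port("127.0.0.1", 11434):
        subprocess.run(["ollama", "pull", "zephyr"], check=True)
```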

nestorDario commented 4 months ago

Good morning. Can I run this in Colab? How do I use Ollama there?