Mintplex-Labs / anything-llm

The all-in-one Desktop & Docker AI application with built-in RAG, AI agents, and more.
https://anythingllm.com
MIT License
22.46k stars · 2.27k forks

[BUG]: Unable to get "Live web search and browsing" agent working using google custom search engine. Error: getaddrinfo ENOTFOUND - http errno: -3008 #1595

Closed tugzii closed 3 months ago

tugzii commented 3 months ago

How are you running AnythingLLM?

AnythingLLM desktop app

What happened?

Running Ollama locally/manually on Windows 11 Pro (I manually built Ollama to support my RX 6750 XT):

.\ollama.exe run llama3:8b-instruct-q8_0

Chat works fine. Embedded documents work. The embedded scraper works.

Set up the "Live web search and browsing" agent using Google Custom Search (set up a new API key and confirmed via a browser URL that it worked).

Run: @agent Can you scrape useanything.com and tell me the key features

[AgentHandler] Attached websocket plugin to Agent cluster
[AgentHandler] Attached chat-history plugin to Agent cluster
[AgentHandler] Attaching user and default agent to Agent cluster.
[AgentHandler] Attached rag-memory plugin to Agent cluster
[AgentHandler] Attached document-summarizer plugin to Agent cluster
[AgentHandler] Attached web-scraping plugin to Agent cluster
[AgentHandler] Attached web-browsing plugin to Agent cluster
fetch failed TypeError: fetch failed
    at Object.fetch (node:internal/deps/undici/undici:11457:11)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async post (C:\Users\sshar\AppData\Local\Programs\anythingllm-desktop\resources\backend\node_modules\ollama\dist\shared\ollama.05b0f34c.cjs:81:20)
    at async Ollama.processStreamableRequest (C:\Users\sshar\AppData\Local\Programs\anythingllm-desktop\resources\backend\node_modules\ollama\dist\shared\ollama.05b0f34c.cjs:183:22)
    at async #handleFunctionCallChat (C:\Users\sshar\AppData\Local\Programs\anythingllm-desktop\resources\backend\server.js:20839:26)
    at async OllamaProvider.functionCall (C:\Users\sshar\AppData\Local\Programs\anythingllm-desktop\resources\backend\server.js:20668:26)
    at async OllamaProvider.complete (C:\Users\sshar\AppData\Local\Programs\anythingllm-desktop\resources\backend\server.js:20859:40)
    at async AIbitat.handleExecution (C:\Users\sshar\AppData\Local\Programs\anythingllm-desktop\resources\backend\server.js:22536:28)
    at async AIbitat.reply (C:\Users\sshar\AppData\Local\Programs\anythingllm-desktop\resources\backend\server.js:22526:25)
    at async AIbitat.chat (C:\Users\sshar\AppData\Local\Programs\anythingllm-desktop\resources\backend\server.js:22377:19)
  { cause: Error: getaddrinfo ENOTFOUND http
        at GetAddrInfoReqWrap.onlookup [as oncomplete] (node:dns:108:26)
      { errno: -3008, code: 'ENOTFOUND', syscall: 'getaddrinfo', hostname: 'http' } }
[AgentHandler] End

Are there known steps to reproduce?

No response

tugzii commented 3 months ago

Note:

When setting up Google Custom Search I generated the API key under the following section (not sure if this is correct, but I didn't see another way to generate one):

Programmatic Access

Custom Search JSON API (limit of 10,000 queries per day).

tugzii commented 3 months ago

Note:

I was running AnythingLLM as a "standard user" when I submitted this issue, but have since confirmed the same error occurs if I run AnythingLLMDesktop.exe from an "admin" terminal.

shatfield4 commented 3 months ago

Do you have another device you could try testing this on using the same API keys? Also when you ask it to scrape/tell you key features, does the agent chat show a message similar to @agent: Searching on Google for "useanything.com key features"?

It sounds like it may be your firewall blocking the Google search endpoints. I have tried to replicate this using an invalid search engine ID and API key and cannot get the same error, so I don't think your keys are wrong.
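One way to rule the keys in or out is to hit the Custom Search JSON API endpoint directly; a sketch, assuming placeholder values for the key and engine ID:

```shell
# Hypothetical values; substitute your own API key and search engine ID.
API_KEY="YOUR_API_KEY"
CX="YOUR_ENGINE_ID"
URL="https://www.googleapis.com/customsearch/v1?key=${API_KEY}&cx=${CX}&q=test"
echo "$URL"
# curl "$URL"   # uncomment to send the request; a JSON body with "items" means the key works
```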

tugzii commented 3 months ago

I'm yet to try on another device.

However, what I have done since is the following:

1) Uninstalled from my user account (using the supplied exe and then deleting directories) and installed using my admin account.

2) Before enabling the "Live web search and browsing" agent I ran the following chat

@agent Can you scrape useanything.com and tell me the key features

Result: Same error

Text from Chat window (not cmd terminal)

Agent @agent invoked. Swapping over to agent chat. Type /exit to exit agent execution loop early. Could not respond to message. fetch failed

Call stack:

[AgentHandler] Attached websocket plugin to Agent cluster
[AgentHandler] Attached chat-history plugin to Agent cluster
[AgentHandler] Attaching user and default agent to Agent cluster.
[AgentHandler] Attached rag-memory plugin to Agent cluster
[AgentHandler] Attached document-summarizer plugin to Agent cluster
[AgentHandler] Attached web-scraping plugin to Agent cluster
fetch failed TypeError: fetch failed
    at async AIbitat.reply (C:\Users\xxx\AppData\Local\Programs\anythingllm-desktop\resources\backend\server.js:22526:25)
    at async AIbitat.chat (C:\Users\xxx\AppData\Local\Programs\anythingllm-desktop\resources\backend\server.js:22377:19)
  { cause: Error: getaddrinfo ENOTFOUND http
        at GetAddrInfoReqWrap.onlookup [as oncomplete] (node:dns:108:26)
      { errno: -3008, code: 'ENOTFOUND', syscall: 'getaddrinfo', hostname: 'http' } }

3) Enabled the "Live web search and browsing" agent and ran the following chat

@agent Can you scrape useanything.com and tell me the key features

Result: Same (similar) error

but included ... [AgentHandler] Attached web-scraping plugin to Agent cluster [AgentHandler] Attached web-browsing plugin to Agent cluster

4) Signed up to Serper, copied over the API key, and got the exact same error.

5) Tried all of the above with my Windows Firewall disabled and got the exact same error.

timothycarambat commented 3 months ago

@tugzii and to confirm: you can send a chat (just a normal chat, without @agent) to the workspace with no issue? This error comes from the app being unable to connect to the Ollama instance you are using for chatting. If it is broken in agents, it should also be broken in regular chatting.

There is a slight difference in how Ollama is called in regular chat vs agent chat (the fetch API vs the Ollama SDK), but the connection information provided should be all the same.
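The two call paths described above can be sketched roughly as follows (function names are illustrative, not AnythingLLM's actual internals):

```javascript
// Regular chat: a plain fetch() against Ollama's REST endpoint.
async function chatViaFetch(host, model, messages) {
  const res = await fetch(`${host}/api/chat`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ model, messages, stream: false }),
  });
  return (await res.json()).message;
}

// Agent chat: the `ollama` JS SDK, which normalizes the `host` string
// itself -- so a host-parsing problem can surface here even when the
// plain fetch() path works.
async function chatViaSdk(host, model, messages) {
  const { Ollama } = await import('ollama'); // assumes the SDK is installed
  const client = new Ollama({ host });
  return (await client.chat({ model, messages })).message;
}
```

Both paths ultimately hit the same `/api/chat` endpoint with the same connection settings, which is why a failure in only one of them points at how the host string is handled rather than at the Ollama server.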

tugzii commented 3 months ago

I'm new to Ollama and AnythingLLM but yes, chat seems to be connecting to my local instance of Ollama to answer me.

As mentioned in the initial post, I had to build the Ollama code using the following instructions and run it manually from the cmd prompt (i.e. not as a service; I never downloaded and installed the original Ollama using their installer):

https://github.com/ollama/ollama/issues/3107#issuecomment-2144741411

See attached screenshots to confirm:

(screenshots: admin_anythingLLM_1, admin_anythingLLM_2, admin_anythingLLM_3)

tugzii commented 3 months ago

I officially installed Ollama (for Windows) and copied my compiled files to the official install location (instead of running from C:\Documents\Ollama):

C:\Users\[username]\AppData\Local\Programs\Ollama

The errors above went away and I did get a response.

However, the response was not good: even though it stated that it had scraped the website, it gave me data from 2022.

Closing this bug report for now as the above issue no longer appears (but I may raise a separate issue for the stale scrape data).