v2rockets / Loyal-Elephie

Your Trusty Memory-enabled AI Companion - Multilingual RAG chatbot optimized for local LLMs | OpenAI API Compatible

Searching isn't working (Windows) #12

Open | NatchoApps opened this issue 2 weeks ago

NatchoApps commented 2 weeks ago

I'm running locally using Ollama and Llama3.

When I ask it for info on past discussions, I either get an error or nonsense, such as it telling me I spoke to it in 2022 about things I did not.

I put a text file in the notes folder, and it looks like it gets processed:

    Move: ../md_website/notes\aboutme.md to ../md_website/notes\about-me.md
    Change: ../md_website/notes\about-me.md

But when I ask about it, I get this:

    2024-06-15 08:36:44 INFO: ::1:3777 - "POST /v1/chat/completions HTTP/1.1" 500
    ERROR: Exception in ASGI application
    Traceback (most recent call last):
      File "C:\Python311\Lib\site-packages\httpx\_transports\default.py", line 69, in map_httpcore_exceptions
        yield
      File "C:\Python311\Lib\site-packages\httpx\_transports\default.py", line 233, in handle_request
        resp = self._pool.handle_request(req)
      File "C:\Python311\Lib\site-packages\httpcore\_sync\connection_pool.py", line 216, in handle_request
        raise exc from None
      File "C:\Python311\Lib\site-packages\httpcore\_sync\connection_pool.py", line 196, in handle_request
        response = connection.handle_request(
      File "C:\Python311\Lib\site-packages\httpcore\_sync\connection.py", line 99, in handle_request
        raise exc
      File "C:\Python311\Lib\site-packages\httpcore\_sync\connection.py", line 76, in handle_request
        stream = self._connect(request)
      File "C:\Python311\Lib\site-packages\httpcore\_sync\connection.py", line 122, in _connect
        stream = self._network_backend.connect_tcp(**kwargs)
      File "C:\Python311\Lib\site-packages\httpcore\_backends\sync.py", line 205, in connect_tcp
        with map_exceptions(exc_map):
      File "C:\Python311\Lib\contextlib.py", line 155, in __exit__
        self.gen.throw(typ, value, traceback)
      File "C:\Python311\Lib\site-packages\httpcore\_exceptions.py", line 14, in map_exceptions
        raise to_exc(exc) from exc
    httpcore.ConnectError: [WinError 10061] No connection could be made because the target machine actively refused it

The above exception was the direct cause of the following exception:

Traceback (most recent call last): File "C:\Python311\Lib\site-packages\openai_base_client.py", line 880, in _request response = self._client.send( ^^^^^^^^^^^^^^^^^^ File "C:\Python311\Lib\site-packages\httpx_client.py", line 914, in send response = self._send_handling_auth( ^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Python311\Lib\site-packages\httpx_client.py", line 942, in _send_handling_auth response = self._send_handling_redirects( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Python311\Lib\site-packages\httpx_client.py", line 979, in _send_handling_redirects response = self._send_single_request(request) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Python311\Lib\site-packages\httpx_client.py", line 1015, in _send_single_request response = transport.handle_request(request) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Python311\Lib\site-packages\httpx_transports\default.py", line 232, in handle_request with map_httpcore_exceptions(): File "C:\Python311\Lib\contextlib.py", line 155, in exit self.gen.throw(typ, value, traceback) File "C:\Python311\Lib\site-packages\httpx_transports\default.py", line 86, in map_httpcore_exceptions raise mapped_exc(message) from exc httpx.ConnectError: [WinError 10061] No connection could be made because the target machine actively refused it

The above exception was the direct cause of the following exception:

Traceback (most recent call last): File "C:\Users\Main\AppData\Roaming\Python\Python311\site-packages\uvicorn\protocols\http\httptools_impl.py", line 435, in run_asgi result = await app( # type: ignore[func-returns-value] ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Users\Main\AppData\Roaming\Python\Python311\site-packages\uvicorn\middleware\proxy_headers.py", line 78, in call return await self.app(scope, receive, send) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Python311\Lib\site-packages\fastapi\applications.py", line 1054, in call await super().call(scope, receive, send) File "C:\Python311\Lib\site-packages\starlette\applications.py", line 116, in call await self.middleware_stack(scope, receive, send) File "C:\Python311\Lib\site-packages\starlette\middleware\errors.py", line 186, in call raise exc File "C:\Python311\Lib\site-packages\starlette\middleware\errors.py", line 164, in call await self.app(scope, receive, _send) File "C:\Python311\Lib\site-packages\starlette\middleware\cors.py", line 83, in call await self.app(scope, receive, send) File "C:\Python311\Lib\site-packages\starlette\middleware\exceptions.py", line 62, in call await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send) File "C:\Python311\Lib\site-packages\starlette_exception_handler.py", line 55, in wrapped_app raise exc File "C:\Python311\Lib\site-packages\starlette_exception_handler.py", line 44, in wrapped_app await app(scope, receive, sender) File "C:\Python311\Lib\site-packages\starlette\routing.py", line 746, in call await route.handle(scope, receive, send) File "C:\Python311\Lib\site-packages\starlette\routing.py", line 288, in handle await self.app(scope, receive, send) File "C:\Python311\Lib\site-packages\starlette\routing.py", line 75, in app await wrap_app_handling_exceptions(app, request)(scope, receive, send) File "C:\Python311\Lib\site-packages\starlette_exception_handler.py", line 55, in wrapped_app raise exc File "C:\Python311\Lib\site-packages\starlette_exception_handler.py", line 44, in wrapped_app await app(scope, receive, sender) File "C:\Python311\Lib\site-packages\starlette\routing.py", line 70, in app response = await func(request) ^^^^^^^^^^^^^^^^^^^ File "C:\Python311\Lib\site-packages\fastapi\routing.py", line 299, in app raise e File "C:\Python311\Lib\site-packages\fastapi\routing.py", line 294, in app raw_response = await run_endpoint_function( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Python311\Lib\site-packages\fastapi\routing.py", line 191, in run_endpoint_function return await dependant.call(*values) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Users\Main\Loyal-Elephie\backend\memory_server.py", line 278, in create_chat_completion completion_or_chunks = client.chat.completions.create( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Python311\Lib\site-packages\openai_utils_utils.py", line 271, in wrapper return func(args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^ File "C:\Python311\Lib\site-packages\openai\resources\chat\completions.py", line 643, in create return self._post( ^^^^^^^^^^^ File "C:\Python311\Lib\site-packages\openai_base_client.py", line 1091, in post return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Python311\Lib\site-packages\openai_base_client.py", line 852, in request return self._request( ^^^^^^^^^^^^^^ File "C:\Python311\Lib\site-packages\openai_base_client.py", line 899, in _request return self._retry_request( ^^^^^^^^^^^^^^^^^^^^ File 
"C:\Python311\Lib\site-packages\openai_base_client.py", line 961, in _retry_request return self._request( ^^^^^^^^^^^^^^ File "C:\Python311\Lib\site-packages\openai_base_client.py", line 899, in _request return self._retry_request( ^^^^^^^^^^^^^^^^^^^^ File "C:\Python311\Lib\site-packages\openai_base_client.py", line 961, in _retry_request return self._request( ^^^^^^^^^^^^^^ File "C:\Python311\Lib\site-packages\openai_base_client.py", line 908, in _request raise APIConnectionError(request=request) from err openai.APIConnectionError: Connection error.

▲ Next.js 14.2.4
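
The `ConnectError: [WinError 10061]` above means the backend's outgoing request was refused, i.e. nothing is listening on the host/port it is calling, so the request never reaches Llama3 at all. A quick way to confirm whether the endpoint is reachable from the same machine is sketched below; the base URL is an assumption (Ollama's default OpenAI-compatible endpoint, `http://localhost:11434/v1`), so substitute whatever your settings.py actually uses.

```python
# Connectivity sanity check -- not part of Loyal-Elephie, just a probe.
# The base_url below is an assumption (Ollama's default OpenAI-compatible
# endpoint); replace it with the URL configured in your settings.py.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

try:
    models = client.models.list()
    print("Endpoint reachable. Models:", [m.id for m in models.data])
except Exception as exc:
    print("Endpoint NOT reachable:", exc)
```

If this prints the same WinError 10061, the problem is on the Ollama/port side rather than inside Loyal-Elephie.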

v2rockets commented 2 weeks ago

Looks like a connection problem. What port is your local LLM running on, and what are the URLs in your settings.py?
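
For reference, when the model is served by Ollama its OpenAI-compatible API is normally at `http://localhost:11434/v1`. Below is a minimal sketch of the values that have to line up; the variable names are illustrative placeholders, not necessarily the actual names in Loyal-Elephie's settings.py, so match them against your own file.

```python
# Illustrative placeholders only -- compare against the real entries in
# backend/settings.py. The essential point is that the base URL must be the
# host/port your local LLM server is actually listening on.
LLM_BASE_URL = "http://localhost:11434/v1"  # Ollama's default OpenAI-compatible endpoint
LLM_API_KEY = "ollama"                      # Ollama ignores the key, but the client needs a non-empty one
LLM_MODEL_NAME = "llama3"                   # must match a model you have pulled in Ollama
```

WinError 10061 is what Windows returns when the configured base URL points at a port where no server is listening, for example Ollama not running or a base URL still pointing at another server's port.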