nashsu / FreeAskInternet

FreeAskInternet is a completely free, PRIVATE and LOCALLY running search aggregator and answer generator using MULTIPLE LLMs, with no GPU required. The user asks a question; the system runs a multi-engine search, feeds the combined search results to an LLM, and generates an answer based on those results. It's all FREE to use.
Apache License 2.0

Using ollama, but no reply #61

Open taozhiyuai opened 7 months ago

taozhiyuai commented 7 months ago

The ollama model:

```
hiyu@603e5f4a42f1 FreeAskInternet % ollama list
NAME                     ID            SIZE   MODIFIED
qwen:32b-chat-v1.5-fp16  bb930ce340f6  65 GB  6 days ago
```

In FreeAskInternet's settings I use the LAN IP; with or without a trailing / after v1, the result is the same: no reply.
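As an aside, whether a trailing slash after `/v1` matters depends entirely on how the client joins the endpoint path onto the base URL. With Python's standard `urljoin`, for example, the two spellings resolve to different URLs (a minimal illustration, using a made-up LAN IP):

```python
from urllib.parse import urljoin

# Without a trailing slash, the last path segment ("v1") gets replaced:
print(urljoin("http://192.168.1.10:11434/v1", "chat/completions"))
# -> http://192.168.1.10:11434/chat/completions

# With a trailing slash, the relative path is appended under /v1:
print(urljoin("http://192.168.1.10:11434/v1/", "chat/completions"))
# -> http://192.168.1.10:11434/v1/chat/completions
```

The openai Python client normalizes the base URL to end with a slash internally, so both spellings should behave the same there; that is consistent with the observation that neither form works here, since (as the log below shows) the backend cannot reach the server at all.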

[Screenshot 2024-04-12 19:31:31]

The docker logs are as follows:

```
2024-04-12 19:29:19 backend-1 | INFO: 172.19.0.9:57854 - "POST /api/search/get_search_refs HTTP/1.0" 200 OK
2024-04-12 19:29:19 freeaskinternet-ui-1 | 192.168.65.1 - - [12/Apr/2024:11:29:19 +0000] "POST /api/search/get_search_refs HTTP/1.1" 200 36 "http://localhost:3031/search/3UUj9QvUHU6iccgTKjizqa?query=hi&model=gpt3.5&ask_type=llm&model_token=" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/17.4.1 Safari/605.1.15" "-"
2024-04-12 19:29:19 backend-1 | INFO: 172.19.0.9:57860 - "POST /api/search/stream/5rh2zteIRiDWNAHzXdcNJd HTTP/1.0" 200 OK
2024-04-12 19:29:21 backend-1 | ERROR: Exception in ASGI application
2024-04-12 19:29:21 backend-1 | Traceback (most recent call last):
2024-04-12 19:29:21 backend-1 |   File "/usr/local/lib/python3.9/site-packages/starlette/responses.py", line 265, in __call__
2024-04-12 19:29:21 backend-1 |     await wrap(partial(self.listen_for_disconnect, receive))
2024-04-12 19:29:21 backend-1 |   File "/usr/local/lib/python3.9/site-packages/starlette/responses.py", line 261, in wrap
2024-04-12 19:29:21 backend-1 |     await func()
2024-04-12 19:29:21 backend-1 |   File "/usr/local/lib/python3.9/site-packages/starlette/responses.py", line 238, in listen_for_disconnect
2024-04-12 19:29:21 backend-1 |     message = await receive()
2024-04-12 19:29:21 backend-1 |   File "/usr/local/lib/python3.9/site-packages/uvicorn/protocols/http/h11_impl.py", line 535, in receive
2024-04-12 19:29:21 backend-1 |     await self.message_event.wait()
2024-04-12 19:29:21 backend-1 |   File "/usr/local/lib/python3.9/asyncio/locks.py", line 226, in wait
2024-04-12 19:29:21 backend-1 |     await fut
2024-04-12 19:29:21 freeaskinternet-ui-1 | 192.168.65.1 - - [12/Apr/2024:11:29:21 +0000] "POST /api/search/stream/5rh2zteIRiDWNAHzXdcNJd HTTP/1.1" 200 5 "http://localhost:3031/search/3UUj9QvUHU6iccgTKjizqa?query=hi&model=gpt3.5&ask_type=llm&model_token=" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/17.4.1 Safari/605.1.15" "-"
2024-04-12 19:29:21 backend-1 | asyncio.exceptions.CancelledError: Cancelled by cancel scope ffffaa5493d0
2024-04-12 19:29:21 backend-1 |
2024-04-12 19:29:21 backend-1 | During handling of the above exception, another exception occurred:
2024-04-12 19:29:21 backend-1 |
2024-04-12 19:29:21 backend-1 |   + Exception Group Traceback (most recent call last):
2024-04-12 19:29:21 backend-1 |   |   File "/usr/local/lib/python3.9/site-packages/uvicorn/protocols/http/h11_impl.py", line 407, in run_asgi
2024-04-12 19:29:21 backend-1 |   |     result = await app(  # type: ignore[func-returns-value]
2024-04-12 19:29:21 backend-1 |   |   File "/usr/local/lib/python3.9/site-packages/uvicorn/middleware/proxy_headers.py", line 69, in __call__
2024-04-12 19:29:21 backend-1 |   |     return await self.app(scope, receive, send)
2024-04-12 19:29:21 backend-1 |   |   File "/usr/local/lib/python3.9/site-packages/fastapi/applications.py", line 1054, in __call__
2024-04-12 19:29:21 backend-1 |   |     await super().__call__(scope, receive, send)
2024-04-12 19:29:21 backend-1 |   |   File "/usr/local/lib/python3.9/site-packages/starlette/applications.py", line 123, in __call__
2024-04-12 19:29:21 backend-1 |   |     await self.middleware_stack(scope, receive, send)
2024-04-12 19:29:21 backend-1 |   |   File "/usr/local/lib/python3.9/site-packages/starlette/middleware/errors.py", line 186, in __call__
2024-04-12 19:29:21 backend-1 |   |     raise exc
2024-04-12 19:29:21 backend-1 |   |   File "/usr/local/lib/python3.9/site-packages/starlette/middleware/errors.py", line 164, in __call__
2024-04-12 19:29:21 backend-1 |   |     await self.app(scope, receive, _send)
2024-04-12 19:29:21 backend-1 |   |   File "/usr/local/lib/python3.9/site-packages/starlette/middleware/cors.py", line 93, in __call__
2024-04-12 19:29:21 backend-1 |   |     await self.simple_response(scope, receive, send, request_headers=headers)
2024-04-12 19:29:21 backend-1 |   |   File "/usr/local/lib/python3.9/site-packages/starlette/middleware/cors.py", line 148, in simple_response
2024-04-12 19:29:21 backend-1 |   |     await self.app(scope, receive, send)
2024-04-12 19:29:21 backend-1 |   |   File "/usr/local/lib/python3.9/site-packages/starlette/middleware/exceptions.py", line 65, in __call__
2024-04-12 19:29:21 backend-1 |   |     await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
2024-04-12 19:29:21 backend-1 |   |   File "/usr/local/lib/python3.9/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
2024-04-12 19:29:21 backend-1 |   |     raise exc
2024-04-12 19:29:21 backend-1 |   |   File "/usr/local/lib/python3.9/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
2024-04-12 19:29:21 backend-1 |   |     await app(scope, receive, sender)
2024-04-12 19:29:21 backend-1 |   |   File "/usr/local/lib/python3.9/site-packages/starlette/routing.py", line 756, in __call__
2024-04-12 19:29:21 backend-1 |   |     await self.middleware_stack(scope, receive, send)
2024-04-12 19:29:21 backend-1 |   |   File "/usr/local/lib/python3.9/site-packages/starlette/routing.py", line 776, in app
2024-04-12 19:29:21 backend-1 |   |     await route.handle(scope, receive, send)
2024-04-12 19:29:21 backend-1 |   |   File "/usr/local/lib/python3.9/site-packages/starlette/routing.py", line 297, in handle
2024-04-12 19:29:21 backend-1 |   |     await self.app(scope, receive, send)
2024-04-12 19:29:21 backend-1 |   |   File "/usr/local/lib/python3.9/site-packages/starlette/routing.py", line 77, in app
2024-04-12 19:29:21 backend-1 |   |     await wrap_app_handling_exceptions(app, request)(scope, receive, send)
2024-04-12 19:29:21 backend-1 |   |   File "/usr/local/lib/python3.9/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
2024-04-12 19:29:21 backend-1 |   |     raise exc
2024-04-12 19:29:21 backend-1 |   |   File "/usr/local/lib/python3.9/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
2024-04-12 19:29:21 backend-1 |   |     await app(scope, receive, sender)
2024-04-12 19:29:21 backend-1 |   |   File "/usr/local/lib/python3.9/site-packages/starlette/routing.py", line 75, in app
2024-04-12 19:29:21 backend-1 |   |     await response(scope, receive, send)
2024-04-12 19:29:21 backend-1 |   |   File "/usr/local/lib/python3.9/site-packages/starlette/responses.py", line 265, in __call__
2024-04-12 19:29:21 backend-1 |   |     await wrap(partial(self.listen_for_disconnect, receive))
2024-04-12 19:29:21 backend-1 |   |   File "/usr/local/lib/python3.9/site-packages/anyio/_backends/_asyncio.py", line 678, in __aexit__
2024-04-12 19:29:21 backend-1 |   |     raise BaseExceptionGroup(
2024-04-12 19:29:21 backend-1 |   | exceptiongroup.ExceptionGroup: unhandled errors in a TaskGroup (1 sub-exception)
2024-04-12 19:29:21 backend-1 |   +-+---------------- 1 ----------------
2024-04-12 19:29:21 backend-1 |     | Traceback (most recent call last):
2024-04-12 19:29:21 backend-1 |     |   File "/usr/local/lib/python3.9/site-packages/httpx/_transports/default.py", line 69, in map_httpcore_exceptions
2024-04-12 19:29:21 backend-1 |     |     yield
2024-04-12 19:29:21 backend-1 |     |   File "/usr/local/lib/python3.9/site-packages/httpx/_transports/default.py", line 233, in handle_request
2024-04-12 19:29:21 backend-1 |     |     resp = self._pool.handle_request(req)
2024-04-12 19:29:21 backend-1 |     |   File "/usr/local/lib/python3.9/site-packages/httpcore/_sync/connection_pool.py", line 216, in handle_request
2024-04-12 19:29:21 backend-1 |     |     raise exc from None
2024-04-12 19:29:21 backend-1 |     |   File "/usr/local/lib/python3.9/site-packages/httpcore/_sync/connection_pool.py", line 196, in handle_request
2024-04-12 19:29:21 backend-1 |     |     response = connection.handle_request(
2024-04-12 19:29:21 backend-1 |     |   File "/usr/local/lib/python3.9/site-packages/httpcore/_sync/connection.py", line 99, in handle_request
2024-04-12 19:29:21 backend-1 |     |     raise exc
2024-04-12 19:29:21 backend-1 |     |   File "/usr/local/lib/python3.9/site-packages/httpcore/_sync/connection.py", line 76, in handle_request
2024-04-12 19:29:21 backend-1 |     |     stream = self._connect(request)
2024-04-12 19:29:21 backend-1 |     |   File "/usr/local/lib/python3.9/site-packages/httpcore/_sync/connection.py", line 122, in _connect
2024-04-12 19:29:21 backend-1 |     |     stream = self._network_backend.connect_tcp(**kwargs)
2024-04-12 19:29:21 backend-1 |     |   File "/usr/local/lib/python3.9/site-packages/httpcore/_backends/sync.py", line 213, in connect_tcp
2024-04-12 19:29:21 backend-1 |     |     sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
2024-04-12 19:29:21 backend-1 |     |   File "/usr/local/lib/python3.9/contextlib.py", line 137, in __exit__
2024-04-12 19:29:21 backend-1 |     |     self.gen.throw(typ, value, traceback)
2024-04-12 19:29:21 backend-1 |     |   File "/usr/local/lib/python3.9/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions
2024-04-12 19:29:21 backend-1 |     |     raise to_exc(exc) from exc
2024-04-12 19:29:21 backend-1 |     | httpcore.ConnectError: [Errno 111] Connection refused
2024-04-12 19:29:21 backend-1 |     |
2024-04-12 19:29:21 backend-1 |     | The above exception was the direct cause of the following exception:
2024-04-12 19:29:21 backend-1 |     |
2024-04-12 19:29:21 backend-1 |     | Traceback (most recent call last):
2024-04-12 19:29:21 backend-1 |     |   File "/usr/local/lib/python3.9/site-packages/openai/_base_client.py", line 931, in _request
2024-04-12 19:29:21 backend-1 |     |     response = self._client.send(
2024-04-12 19:29:21 backend-1 |     |   File "/usr/local/lib/python3.9/site-packages/httpx/_client.py", line 914, in send
2024-04-12 19:29:21 backend-1 |     |     response = self._send_handling_auth(
2024-04-12 19:29:21 backend-1 |     |   File "/usr/local/lib/python3.9/site-packages/httpx/_client.py", line 942, in _send_handling_auth
2024-04-12 19:29:21 backend-1 |     |     response = self._send_handling_redirects(
2024-04-12 19:29:21 backend-1 |     |   File "/usr/local/lib/python3.9/site-packages/httpx/_client.py", line 979, in _send_handling_redirects
2024-04-12 19:29:21 backend-1 |     |     response = self._send_single_request(request)
2024-04-12 19:29:21 backend-1 |     |   File "/usr/local/lib/python3.9/site-packages/httpx/_client.py", line 1015, in _send_single_request
2024-04-12 19:29:21 backend-1 |     |     response = transport.handle_request(request)
2024-04-12 19:29:21 backend-1 |     |   File "/usr/local/lib/python3.9/site-packages/httpx/_transports/default.py", line 233, in handle_request
2024-04-12 19:29:21 backend-1 |     |     resp = self._pool.handle_request(req)
2024-04-12 19:29:21 backend-1 |     |   File "/usr/local/lib/python3.9/contextlib.py", line 137, in __exit__
2024-04-12 19:29:21 backend-1 |     |     self.gen.throw(typ, value, traceback)
2024-04-12 19:29:21 backend-1 |     |   File "/usr/local/lib/python3.9/site-packages/httpx/_transports/default.py", line 86, in map_httpcore_exceptions
2024-04-12 19:29:21 backend-1 |     |     raise mapped_exc(message) from exc
2024-04-12 19:29:21 backend-1 |     | httpx.ConnectError: [Errno 111] Connection refused
2024-04-12 19:29:21 backend-1 |     |
2024-04-12 19:29:21 backend-1 |     | The above exception was the direct cause of the following exception:
2024-04-12 19:29:21 backend-1 |     |
2024-04-12 19:29:21 backend-1 |     | Traceback (most recent call last):
2024-04-12 19:29:21 backend-1 |     |   File "/usr/local/lib/python3.9/site-packages/starlette/responses.py", line 261, in wrap
2024-04-12 19:29:21 backend-1 |     |     await func()
2024-04-12 19:29:21 backend-1 |     |   File "/usr/local/lib/python3.9/site-packages/starlette/responses.py", line 250, in stream_response
2024-04-12 19:29:21 backend-1 |     |     async for chunk in self.body_iterator:
2024-04-12 19:29:21 backend-1 |     |   File "/usr/local/lib/python3.9/site-packages/starlette/concurrency.py", line 65, in iterate_in_threadpool
2024-04-12 19:29:21 backend-1 |     |     yield await anyio.to_thread.run_sync(_next, as_iterator)
2024-04-12 19:29:21 backend-1 |     |   File "/usr/local/lib/python3.9/site-packages/anyio/to_thread.py", line 56, in run_sync
2024-04-12 19:29:21 backend-1 |     |     return await get_async_backend().run_sync_in_worker_thread(
2024-04-12 19:29:21 backend-1 |     |   File "/usr/local/lib/python3.9/site-packages/anyio/_backends/_asyncio.py", line 2144, in run_sync_in_worker_thread
2024-04-12 19:29:21 backend-1 |     |     return await future
2024-04-12 19:29:21 backend-1 |     |   File "/usr/local/lib/python3.9/site-packages/anyio/_backends/_asyncio.py", line 851, in run
2024-04-12 19:29:21 backend-1 |     |     result = context.run(func, *args)
2024-04-12 19:29:21 backend-1 |     |   File "/usr/local/lib/python3.9/site-packages/starlette/concurrency.py", line 54, in _next
2024-04-12 19:29:21 backend-1 |     |     return next(iterator)
2024-04-12 19:29:21 backend-1 |     |   File "/app/server.py", line 211, in generator
2024-04-12 19:29:21 backend-1 |     |     for token in free_ask_internet.chat(prompt=prompt,model=model,llm_auth_token=llm_auth_token,llm_base_url=llm_base_url,using_custom_llm=using_custom_llm,stream=True):
2024-04-12 19:29:21 backend-1 |     |   File "/app/free_ask_internet.py", line 196, in chat
2024-04-12 19:29:21 backend-1 |     |     for chunk in openai.chat.completions.create(
2024-04-12 19:29:21 backend-1 |     |   File "/usr/local/lib/python3.9/site-packages/openai/_utils/_utils.py", line 275, in wrapper
2024-04-12 19:29:21 backend-1 |     |     return func(*args, **kwargs)
2024-04-12 19:29:21 backend-1 |     |   File "/usr/local/lib/python3.9/site-packages/openai/resources/chat/completions.py", line 667, in create
2024-04-12 19:29:21 backend-1 |     |     return self._post(
2024-04-12 19:29:21 backend-1 |     |   File "/usr/local/lib/python3.9/site-packages/openai/_base_client.py", line 1213, in post
2024-04-12 19:29:21 backend-1 |     |     return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
2024-04-12 19:29:21 backend-1 |     |   File "/usr/local/lib/python3.9/site-packages/openai/_base_client.py", line 902, in request
2024-04-12 19:29:21 backend-1 |     |     return self._request(
2024-04-12 19:29:21 backend-1 |     |   File "/usr/local/lib/python3.9/site-packages/openai/_base_client.py", line 955, in _request
2024-04-12 19:29:21 backend-1 |     |     return self._retry_request(
2024-04-12 19:29:21 backend-1 |     |   File "/usr/local/lib/python3.9/site-packages/openai/_base_client.py", line 1026, in _retry_request
2024-04-12 19:29:21 backend-1 |     |     return self._request(
2024-04-12 19:29:21 backend-1 |     |   File "/usr/local/lib/python3.9/site-packages/openai/_base_client.py", line 955, in _request
2024-04-12 19:29:21 backend-1 |     |     return self._retry_request(
2024-04-12 19:29:21 backend-1 |     |   File "/usr/local/lib/python3.9/site-packages/openai/_base_client.py", line 1026, in _retry_request
2024-04-12 19:29:21 backend-1 |     |     return self._request(
2024-04-12 19:29:21 backend-1 |     |   File "/usr/local/lib/python3.9/site-packages/openai/_base_client.py", line 965, in _request
2024-04-12 19:29:21 backend-1 |     |     raise APIConnectionError(request=request) from err
2024-04-12 19:29:21 backend-1 |     | openai.APIConnectionError: Connection error.
2024-04-12 19:29:21 backend-1 |     +------------------------------------
2024-04-12 19:30:35 llm-freegpt35-1 | Error refreshing session ID, retrying in 1 minute...
2024-04-12 19:30:35 llm-freegpt35-1 | If this error persists, your country may not be supported yet.
2024-04-12 19:30:35 llm-freegpt35-1 | If your country was the issue, please consider using a U.S. VPN.
2024-04-12 19:32:36 llm-freegpt35-1 | Error refreshing session ID, retrying in 1 minute...
2024-04-12 19:32:36 llm-freegpt35-1 | If this error persists, your country may not be supported yet.
2024-04-12 19:32:36 llm-freegpt35-1 | If your country was the issue, please consider using a U.S. VPN.
```
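The key line in the trace is `httpcore.ConnectError: [Errno 111] Connection refused`: the backend container never reaches the configured LLM endpoint at all. One quick way to check reachability from the backend's point of view is a plain TCP probe (a hypothetical helper, not part of the project):

```python
import socket

def can_connect(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# e.g. run can_connect("192.168.1.10", 11434) from inside the backend container
# (IP is illustrative). False means ollama is not listening on any interface
# the container can reach -- matching the Connection refused above.
```

If this returns False for the LAN IP but ollama works from the host itself, ollama is most likely bound only to 127.0.0.1, which is what the reply below addresses.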

taozhiyuai commented 7 months ago

I am not using a VPN.

taozhiyuai commented 7 months ago

I entered the keys for kimi and chatglm, but the model list is still greyed out; clicking it doesn't open the list, so I can't select a model. It only shows gpt 3.5.

taozhiyuai commented 7 months ago

I did not run `export OLLAMA_HOST=0.0.0.0 ollama serve`.

Is the above error related to this?

I'm on a MAC; ollama currently puts an icon in the tray automatically and runs as a server by itself, and open webui etc. all work fine without `ollama serve`.

Do I have to change the host to run your project? Will it affect other apps? When I run dify in docker, I also just use the LAN IP, without changing HOST.

miaoxiaolv commented 6 months ago

> I did not run `export OLLAMA_HOST=0.0.0.0 ollama serve`.
>
> Is the above error related to this?
>
> I'm on a MAC; ollama currently puts an icon in the tray automatically and runs as a server by itself, and open webui etc. all work fine without `ollama serve`.
>
> Do I have to change the host to run your project? Will it affect other apps? When I run dify in docker, I also just use the LAN IP, without changing the H

OLLAMA_HOST does need to be set; otherwise ollama only listens on 127.0.0.1 by default. I've verified that it has to be set even when everything is deployed on the same machine as Ollama.
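For reference, these are the usual ways to make ollama listen on all interfaces (as described in the Ollama FAQ; restart the ollama app or service afterwards):

```shell
# When launching ollama manually from a terminal:
OLLAMA_HOST=0.0.0.0 ollama serve

# For the macOS tray app, set the variable via launchctl, then restart the app:
launchctl setenv OLLAMA_HOST "0.0.0.0"
```

Note also that on Docker Desktop for macOS, containers typically reach the host via `host.docker.internal` rather than `localhost`, so the base URL may need to use that name (or the LAN IP, once ollama is bound to 0.0.0.0).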

taozhiyuai commented 6 months ago

How exactly should it be set, then?