NoneJou072 / robochain

A simulation framework based on ROS2 and LLMs (such as GPT) for robot-interaction tasks in the era of large models
MIT License

Why do I still get openai.APIConnectionError: Connection error when using an offline large model? #21

Open andrewwjh opened 1 week ago

andrewwjh commented 1 week ago

```
Human: pick up the red block
You:
INFO:openai._base_client:Retrying request to /chat/completions in 0.790899 seconds
INFO:openai._base_client:Retrying request to /chat/completions in 1.967744 seconds
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/httpx/_transports/default.py", line 69, in map_httpcore_exceptions
    yield
  File "/usr/local/lib/python3.8/dist-packages/httpx/_transports/default.py", line 233, in handle_request
    resp = self._pool.handle_request(req)
  File "/usr/local/lib/python3.8/dist-packages/httpcore/_sync/connection_pool.py", line 216, in handle_request
    raise exc from None
  File "/usr/local/lib/python3.8/dist-packages/httpcore/_sync/connection_pool.py", line 196, in handle_request
    response = connection.handle_request(
  File "/usr/local/lib/python3.8/dist-packages/httpcore/_sync/connection.py", line 99, in handle_request
    raise exc
  File "/usr/local/lib/python3.8/dist-packages/httpcore/_sync/connection.py", line 76, in handle_request
    stream = self._connect(request)
  File "/usr/local/lib/python3.8/dist-packages/httpcore/_sync/connection.py", line 122, in _connect
    stream = self._network_backend.connect_tcp(**kwargs)
  File "/usr/local/lib/python3.8/dist-packages/httpcore/_backends/sync.py", line 213, in connect_tcp
    sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
  File "/usr/lib/python3.8/contextlib.py", line 131, in __exit__
    self.gen.throw(type, value, traceback)
  File "/usr/local/lib/python3.8/dist-packages/httpcore/_exceptions.py", line 14, in map_exceptions
    raise to_exc(exc) from exc
httpcore.ConnectError: [Errno 101] Network is unreachable
```

The above exception was the direct cause of the following exception:

```
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/openai/_base_client.py", line 962, in _request
    response = self._client.send(
  File "/usr/local/lib/python3.8/dist-packages/httpx/_client.py", line 914, in send
    response = self._send_handling_auth(
  File "/usr/local/lib/python3.8/dist-packages/httpx/_client.py", line 942, in _send_handling_auth
    response = self._send_handling_redirects(
  File "/usr/local/lib/python3.8/dist-packages/httpx/_client.py", line 979, in _send_handling_redirects
    response = self._send_single_request(request)
  File "/usr/local/lib/python3.8/dist-packages/httpx/_client.py", line 1015, in _send_single_request
    response = transport.handle_request(request)
  File "/usr/local/lib/python3.8/dist-packages/httpx/_transports/default.py", line 233, in handle_request
    resp = self._pool.handle_request(req)
  File "/usr/lib/python3.8/contextlib.py", line 131, in __exit__
    self.gen.throw(type, value, traceback)
  File "/usr/local/lib/python3.8/dist-packages/httpx/_transports/default.py", line 86, in map_httpcore_exceptions
    raise mapped_exc(message) from exc
httpx.ConnectError: [Errno 101] Network is unreachable
```

The above exception was the direct cause of the following exception:

```
Traceback (most recent call last):
  File "/home/wu/gpt_ws/install/gpt_client/lib/gpt_client/gpt_client", line 11, in <module>
    load_entry_point('gpt-client', 'console_scripts', 'gpt_client')()
  File "/home/wu/gpt_ws/build/gpt_client/gpt_client/examples/client_retrieval_gpt.py", line 117, in main
    result = gpt_node.ask(question)  # Ask a question
  File "/home/wu/gpt_ws/build/gpt_client/gpt_client/examples/client_retrieval_gpt.py", line 96, in ask
    return self.gpt.ask(question)
  File "/home/wu/gpt_ws/build/gpt_client/gpt_client/examples/client_retrieval_gpt.py", line 58, in ask
    result_dict = self.conversation(question)
  File "/usr/local/lib/python3.8/dist-packages/langchain_core/_api/deprecation.py", line 168, in warning_emitting_wrapper
    return wrapped(*args, **kwargs)
  File "/usr/local/lib/python3.8/dist-packages/langchain/chains/base.py", line 383, in __call__
    return self.invoke(
  File "/usr/local/lib/python3.8/dist-packages/langchain/chains/base.py", line 166, in invoke
    raise e
  File "/usr/local/lib/python3.8/dist-packages/langchain/chains/base.py", line 156, in invoke
    self._call(inputs, run_manager=run_manager)
  File "/usr/local/lib/python3.8/dist-packages/langchain/chains/retrieval_qa/base.py", line 145, in _call
    answer = self.combine_documents_chain.run(
  File "/usr/local/lib/python3.8/dist-packages/langchain_core/_api/deprecation.py", line 168, in warning_emitting_wrapper
    return wrapped(*args, **kwargs)
  File "/usr/local/lib/python3.8/dist-packages/langchain/chains/base.py", line 605, in run
    return self(kwargs, callbacks=callbacks, tags=tags, metadata=metadata)[
  File "/usr/local/lib/python3.8/dist-packages/langchain_core/_api/deprecation.py", line 168, in warning_emitting_wrapper
    return wrapped(*args, **kwargs)
  File "/usr/local/lib/python3.8/dist-packages/langchain/chains/base.py", line 383, in __call__
    return self.invoke(
  File "/usr/local/lib/python3.8/dist-packages/langchain/chains/base.py", line 166, in invoke
    raise e
  File "/usr/local/lib/python3.8/dist-packages/langchain/chains/base.py", line 156, in invoke
    self._call(inputs, run_manager=run_manager)
  File "/usr/local/lib/python3.8/dist-packages/langchain/chains/combine_documents/base.py", line 137, in _call
    output, extra_return_dict = self.combine_docs(
  File "/usr/local/lib/python3.8/dist-packages/langchain/chains/combine_documents/stuff.py", line 244, in combine_docs
    return self.llm_chain.predict(callbacks=callbacks, **inputs), {}
  File "/usr/local/lib/python3.8/dist-packages/langchain/chains/llm.py", line 317, in predict
    return self(kwargs, callbacks=callbacks)[self.output_key]
  File "/usr/local/lib/python3.8/dist-packages/langchain_core/_api/deprecation.py", line 168, in warning_emitting_wrapper
    return wrapped(*args, **kwargs)
  File "/usr/local/lib/python3.8/dist-packages/langchain/chains/base.py", line 383, in __call__
    return self.invoke(
  File "/usr/local/lib/python3.8/dist-packages/langchain/chains/base.py", line 166, in invoke
    raise e
  File "/usr/local/lib/python3.8/dist-packages/langchain/chains/base.py", line 156, in invoke
    self._call(inputs, run_manager=run_manager)
  File "/usr/local/lib/python3.8/dist-packages/langchain/chains/llm.py", line 127, in _call
    response = self.generate([inputs], run_manager=run_manager)
  File "/usr/local/lib/python3.8/dist-packages/langchain/chains/llm.py", line 139, in generate
    return self.llm.generate_prompt(
  File "/usr/local/lib/python3.8/dist-packages/langchain_core/language_models/chat_models.py", line 677, in generate_prompt
    return self.generate(prompt_messages, stop=stop, callbacks=callbacks, **kwargs)
  File "/usr/local/lib/python3.8/dist-packages/langchain_core/language_models/chat_models.py", line 534, in generate
    raise e
  File "/usr/local/lib/python3.8/dist-packages/langchain_core/language_models/chat_models.py", line 524, in generate
    self._generate_with_cache(
  File "/usr/local/lib/python3.8/dist-packages/langchain_core/language_models/chat_models.py", line 749, in _generate_with_cache
    result = self._generate(
  File "/usr/local/lib/python3.8/dist-packages/langchain_openai/chat_models/base.py", line 549, in _generate
    response = self.client.create(messages=message_dicts, **params)
  File "/usr/local/lib/python3.8/dist-packages/openai/_utils/_utils.py", line 277, in wrapper
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.8/dist-packages/openai/resources/chat/completions.py", line 643, in create
    return self._post(
  File "/usr/local/lib/python3.8/dist-packages/openai/_base_client.py", line 1250, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
  File "/usr/local/lib/python3.8/dist-packages/openai/_base_client.py", line 931, in request
    return self._request(
  File "/usr/local/lib/python3.8/dist-packages/openai/_base_client.py", line 986, in _request
    return self._retry_request(
  File "/usr/local/lib/python3.8/dist-packages/openai/_base_client.py", line 1063, in _retry_request
    return self._request(
  File "/usr/local/lib/python3.8/dist-packages/openai/_base_client.py", line 986, in _request
    return self._retry_request(
  File "/usr/local/lib/python3.8/dist-packages/openai/_base_client.py", line 1063, in _retry_request
    return self._request(
  File "/usr/local/lib/python3.8/dist-packages/openai/_base_client.py", line 996, in _request
    raise APIConnectionError(request=request) from err
openai.APIConnectionError: Connection error.
```
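The innermost error, `[Errno 101] Network is unreachable`, is raised at the raw socket layer before any HTTP exchange happens, so the API key and the model choice are not involved; the machine simply has no route to the endpoint. A minimal stdlib sketch (the helper names are illustrative, not part of this repo) that reproduces the same TCP connect step and classifies the failure:

```python
import errno
import socket


def classify_connect_error(exc: OSError) -> str:
    """Map common socket errno values from a failed connect() to a hint."""
    hints = {
        errno.ENETUNREACH: "no route to host: check default gateway / proxy settings",  # [Errno 101]
        errno.ECONNREFUSED: "host reachable but nothing listening on that port",        # [Errno 111]
        errno.ETIMEDOUT: "packets silently dropped on the way: likely a firewall",      # [Errno 110]
    }
    return hints.get(exc.errno, f"unhandled socket error: {exc}")


def probe(host: str, port: int = 443, timeout: float = 3.0) -> str:
    """Attempt a raw TCP connect, the same step that failed inside httpcore."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return "tcp connect ok"
    except OSError as exc:
        return classify_connect_error(exc)
```

Running something like `probe("api.openai.com")` before launching the client distinguishes a routing/proxy problem from an application-level one.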

andrewwjh commented 6 days ago

After setting up a proxy, I got this instead: openai.RateLimitError: Error code: 429 - {'error': {'message': 'You exceeded your current quota, please check your plan and billing details. For more information on this error, read the docs: https://platform.openai.com/docs/guides/error-codes/api-errors.', 'type': 'insufficient_quota', 'param': None, 'code': 'insufficient_quota'}}

NoneJou072 commented 5 days ago

hi, @andrewwjh. It looks like you are running client_gpt, not the offline model, so a network proxy is required.
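For a genuinely offline setup, the client has to be pointed at a local OpenAI-compatible endpoint instead of api.openai.com; otherwise every call leaves the machine and needs a proxy. A stdlib-only sketch of what such a request looks like (the localhost URL and model name below are assumptions for illustration, not values taken from this repo):

```python
import json
from urllib import request


def build_chat_request(base_url: str, model: str, messages: list) -> request.Request:
    """Assemble an OpenAI-compatible /chat/completions request for a local server."""
    url = base_url.rstrip("/") + "/chat/completions"
    body = json.dumps({"model": model, "messages": messages}).encode("utf-8")
    return request.Request(
        url,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


# Example: a llama.cpp / vLLM style server assumed to be listening locally.
req = build_chat_request(
    "http://localhost:8000/v1",  # hypothetical offline endpoint
    "llama-2-7b-chat",           # hypothetical local model name
    [{"role": "user", "content": "pick up the red block"}],
)
```

With the official `openai` Python client (and libraries built on it, such as langchain-openai), the usual way to get the same effect is to set the client's base URL, e.g. via the `OPENAI_BASE_URL` environment variable, so that no traffic is sent to api.openai.com at all.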

NoneJou072 commented 5 days ago

> After setting up a proxy, I got this instead: openai.RateLimitError: Error code: 429 - {'error': {'message': 'You exceeded your current quota, please check your plan and billing details. For more information on this error, read the docs: https://platform.openai.com/docs/guides/error-codes/api-errors.', 'type': 'insufficient_quota', 'param': None, 'code': 'insufficient_quota'}}

That error means your OpenAI account has run out of quota.
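A 429 carrying `insufficient_quota` describes a billing state, not transient rate limiting, so retrying or slowing down cannot fix it. A small sketch (the helper name is illustrative) that distinguishes the two cases from the error payload:

```python
import json


def quota_exhausted(error_body: str) -> bool:
    """Return True if a 429 body reports spent quota rather than a burst limit."""
    try:
        err = json.loads(error_body).get("error", {})
    except json.JSONDecodeError:
        return False
    # 'insufficient_quota' means the account balance is gone and retries are
    # pointless; a plain 'rate_limit_exceeded' only means "slow down and retry".
    return err.get("code") == "insufficient_quota"
```

Only the second case is worth wrapping in retry/backoff logic; the first requires topping up the account or switching to an offline model.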

andrewwjh commented 5 days ago

Then how should I use llama_client instead? And why, when EMBEDDING_MODEL and VECTOR_STORE are both set to 1, is gpt_client still the one that runs?