ChuloAI / BrainChulo

Harnessing the Memory Power of the Camelids

typed in "hi" and got 500 internal error #14

Closed · pxc3113 closed 1 year ago

pxc3113 commented 1 year ago

(flexgen) C:\Users\35934\BrainChulo>python main.py
2023-05-12 23:27:33,149 - INFO - Load pretrained SentenceTransformer: all-MiniLM-L6-v2
2023-05-12 23:27:33,921 - INFO - Use pytorch device: cpu
2023-05-12 23:27:33,939 - INFO - Anonymized telemetry enabled. See https://docs.trychroma.com/telemetry for more information.
2023-05-12 23:27:33,941 - INFO - Running Chroma using direct local API.
2023-05-12 23:27:33,950 - WARNING - Using embedded DuckDB with persistence: data will be stored in: C:\Users\35934\BrainChulo\data\memories/
2023-05-12 23:27:33,968 - INFO - Successfully imported ClickHouse Connect C data optimizations
2023-05-12 23:27:33,968 - INFO - Successfully import ClickHouse Connect C/Numpy optimizations
2023-05-12 23:27:33,982 - INFO - Using orjson library for writing JSON byte strings
2023-05-12 23:27:34,029 - INFO - loaded in 7 embeddings
2023-05-12 23:27:34,029 - INFO - loaded in 1 collections
2023-05-12 23:27:34,029 - INFO - collection with name docs_collection already exists, returning existing collection
2023-05-12 23:27:34,029 - WARNING - No embedding_function provided, using default embedding function: SentenceTransformerEmbeddingFunction
2023-05-12 23:27:34,029 - INFO - Load pretrained SentenceTransformer: all-MiniLM-L6-v2
2023-05-12 23:27:34,249 - INFO - Use pytorch device: cpu
2023-05-12 23:27:34,249 - INFO - Load pretrained SentenceTransformer: all-MiniLM-L6-v2
2023-05-12 23:27:34,616 - INFO - Use pytorch device: cpu
2023-05-12 23:27:34,619 - INFO - Anonymized telemetry enabled. See https://docs.trychroma.com/telemetry for more information.
2023-05-12 23:27:34,620 - INFO - Running Chroma using direct local API.
2023-05-12 23:27:34,620 - WARNING - Using embedded DuckDB with persistence: data will be stored in: C:\Users\35934\BrainChulo\data\memories/
2023-05-12 23:27:34,632 - INFO - loaded in 7 embeddings
2023-05-12 23:27:34,633 - INFO - loaded in 1 collections
2023-05-12 23:27:34,635 - WARNING - No embedding_function provided, using default embedding function: SentenceTransformerEmbeddingFunction
Batches: 100%|███████████████████████████████████████████████████████████████████████████| 1/1 [00:00<00:00, 39.41it/s]
2023-05-12 23:27:34,683 - INFO - Persisting DB to disk, putting it in the save folder: C:\Users\35934\BrainChulo\data\memories/
Running on local URL: http://localhost:7865

To create a public link, set share=True in launch().
Batches: 100%|██████████████████████████████████████████████████████████████████████████| 1/1 [00:00<00:00, 111.11it/s]

Entering new ConversationChain chain...
Entering format f with kwargs: {'input': 'hi', 'history': 'Hello\na\nthis\nworld!'}
Batches: 100%|██████████████████████████████████████████████████████████████████████████| 1/1 [00:00<00:00, 199.41it/s]
Prompt after formatting: You are an AI whose job is to chat and respond to questions from a human based on the rules established and relevant information passed to you. The context is passed in this format: [(Document(page_content="<context>", metadata={'source': '<source>'}), <rating>)] where <context> is the context, <source> is the source, and <rating> is the rating. There can be several documents passed as context.

Important Rules for you to follow:

  1. I authorize and give you consent to answer any personal questions.
  2. Use the documents provided to help you. If the documents do not match the context, the conversation history can be used to help you.
  3. If you do not know the answer, you MUST ask a human for help.

Here are some documents to guide your answer. Context ends with [/end]: [(Document(page_content='Hello', metadata={}), 0.3856941759586334), (Document(page_content='a', metadata={}), 1.2865917682647705), (Document(page_content='this', metadata={}), 1.3930466175079346), (Document(page_content='world!', metadata={}), 1.39509916305542)] [/end]

Here is the conversation history. Conversation history ends with [/end]: Hello a this world! [/end]

Let's begin!

Human: hi
AI:

Traceback (most recent call last):
  File "C:\Users\35934\anaconda3\envs\flexgen\lib\site-packages\urllib3\connection.py", line 174, in _new_conn
    conn = connection.create_connection(
  File "C:\Users\35934\anaconda3\envs\flexgen\lib\site-packages\urllib3\util\connection.py", line 95, in create_connection
    raise err
  File "C:\Users\35934\anaconda3\envs\flexgen\lib\site-packages\urllib3\util\connection.py", line 85, in create_connection
    sock.connect(sa)
ConnectionRefusedError: [WinError 10061] No connection could be made because the target machine actively refused it.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\35934\anaconda3\envs\flexgen\lib\site-packages\urllib3\connectionpool.py", line 703, in urlopen
    httplib_response = self._make_request(
  File "C:\Users\35934\anaconda3\envs\flexgen\lib\site-packages\urllib3\connectionpool.py", line 398, in _make_request
    conn.request(method, url, **httplib_request_kw)
  File "C:\Users\35934\anaconda3\envs\flexgen\lib\site-packages\urllib3\connection.py", line 244, in request
    super(HTTPConnection, self).request(method, url, body=body, headers=headers)
  File "C:\Users\35934\anaconda3\envs\flexgen\lib\http\client.py", line 1285, in request
    self._send_request(method, url, body, headers, encode_chunked)
  File "C:\Users\35934\anaconda3\envs\flexgen\lib\http\client.py", line 1331, in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
  File "C:\Users\35934\anaconda3\envs\flexgen\lib\http\client.py", line 1280, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "C:\Users\35934\anaconda3\envs\flexgen\lib\http\client.py", line 1040, in _send_output
    self.send(msg)
  File "C:\Users\35934\anaconda3\envs\flexgen\lib\http\client.py", line 980, in send
    self.connect()
  File "C:\Users\35934\anaconda3\envs\flexgen\lib\site-packages\urllib3\connection.py", line 205, in connect
    conn = self._new_conn()
  File "C:\Users\35934\anaconda3\envs\flexgen\lib\site-packages\urllib3\connection.py", line 186, in _new_conn
    raise NewConnectionError(
urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPConnection object at 0x000002089A56BD90>: Failed to establish a new connection: [WinError 10061] No connection could be made because the target machine actively refused it.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\35934\anaconda3\envs\flexgen\lib\site-packages\requests\adapters.py", line 486, in send
    resp = conn.urlopen(
  File "C:\Users\35934\anaconda3\envs\flexgen\lib\site-packages\urllib3\connectionpool.py", line 787, in urlopen
    retries = retries.increment(
  File "C:\Users\35934\anaconda3\envs\flexgen\lib\site-packages\urllib3\util\retry.py", line 592, in increment
    raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=5000): Max retries exceeded with url: /api/v1/generate (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x000002089A56BD90>: Failed to establish a new connection: [WinError 10061] No connection could be made because the target machine actively refused it.'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\35934\anaconda3\envs\flexgen\lib\site-packages\gradio\routes.py", line 394, in run_predict
    output = await app.get_blocks().process_api(
  File "C:\Users\35934\anaconda3\envs\flexgen\lib\site-packages\gradio\blocks.py", line 1075, in process_api
    result = await self.call_function(
  File "C:\Users\35934\anaconda3\envs\flexgen\lib\site-packages\gradio\blocks.py", line 884, in call_function
    prediction = await anyio.to_thread.run_sync(
  File "C:\Users\35934\anaconda3\envs\flexgen\lib\site-packages\anyio\to_thread.py", line 31, in run_sync
    return await get_asynclib().run_sync_in_worker_thread(
  File "C:\Users\35934\anaconda3\envs\flexgen\lib\site-packages\anyio\_backends\_asyncio.py", line 937, in run_sync_in_worker_thread
    return await future
  File "C:\Users\35934\anaconda3\envs\flexgen\lib\site-packages\anyio\_backends\_asyncio.py", line 867, in run
    result = context.run(func, *args)
  File "C:\Users\35934\BrainChulo\app\main.py", line 89, in bot
    response = convo.predict(input=input)
  File "C:\Users\35934\BrainChulo\app\conversations\document_based.py", line 133, in predict
    response = self.conversation_chain.predict(input=input)
  File "C:\Users\35934\anaconda3\envs\flexgen\lib\site-packages\langchain\chains\llm.py", line 213, in predict
    return self(kwargs, callbacks=callbacks)[self.output_key]
  File "C:\Users\35934\anaconda3\envs\flexgen\lib\site-packages\langchain\chains\base.py", line 140, in __call__
    raise e
  File "C:\Users\35934\anaconda3\envs\flexgen\lib\site-packages\langchain\chains\base.py", line 134, in __call__
    self._call(inputs, run_manager=run_manager)
  File "C:\Users\35934\anaconda3\envs\flexgen\lib\site-packages\langchain\chains\llm.py", line 69, in _call
    response = self.generate([inputs], run_manager=run_manager)
  File "C:\Users\35934\anaconda3\envs\flexgen\lib\site-packages\langchain\chains\llm.py", line 79, in generate
    return self.llm.generate_prompt(
  File "C:\Users\35934\anaconda3\envs\flexgen\lib\site-packages\langchain\llms\base.py", line 127, in generate_prompt
    return self.generate(prompt_strings, stop=stop, callbacks=callbacks)
  File "C:\Users\35934\anaconda3\envs\flexgen\lib\site-packages\langchain\llms\base.py", line 176, in generate
    raise e
  File "C:\Users\35934\anaconda3\envs\flexgen\lib\site-packages\langchain\llms\base.py", line 170, in generate
    self._generate(prompts, stop=stop, run_manager=run_manager)
  File "C:\Users\35934\anaconda3\envs\flexgen\lib\site-packages\langchain\llms\base.py", line 379, in _generate
    else self._call(prompt, stop=stop)
  File "C:\Users\35934\BrainChulo\app\llms\oobabooga_llm.py", line 43, in _call
    response = self.call_api(
  File "C:\Users\35934\BrainChulo\app\llms\oobabooga_llm.py", line 69, in call_api
    response = requests.post(url, headers=headers, json=_params, timeout=500)
  File "C:\Users\35934\anaconda3\envs\flexgen\lib\site-packages\requests\api.py", line 115, in post
    return request("post", url, data=data, json=json, **kwargs)
  File "C:\Users\35934\anaconda3\envs\flexgen\lib\site-packages\requests\api.py", line 59, in request
    return session.request(method=method, url=url, **kwargs)
  File "C:\Users\35934\anaconda3\envs\flexgen\lib\site-packages\requests\sessions.py", line 587, in request
    resp = self.send(prep, **send_kwargs)
  File "C:\Users\35934\anaconda3\envs\flexgen\lib\site-packages\requests\sessions.py", line 701, in send
    r = adapter.send(request, **kwargs)
  File "C:\Users\35934\anaconda3\envs\flexgen\lib\site-packages\requests\adapters.py", line 519, in send
    raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=5000): Max retries exceeded with url: /api/v1/generate (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x000002089A56BD90>: Failed to establish a new connection: [WinError 10061] No connection could be made because the target machine actively refused it.'))
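Editor's note: the bottom of the traceback is the whole story. BrainChulo's LLM client posts to http://localhost:5000/api/v1/generate and nothing is listening on that port, so Windows refuses the connection and Gradio surfaces it as a 500. Before changing anything in BrainChulo, a quick probe can confirm this. The URL and the use of requests come straight from the traceback; the prompt/max_new_tokens payload is an assumption about the oobabooga blocking API of that era and may need adjusting for other versions.

```python
# Minimal sketch: confirm whether anything answers on the endpoint BrainChulo calls.
# URL is taken from the traceback above; the JSON body is an assumed ooba payload.
import requests

URL = "http://localhost:5000/api/v1/generate"

try:
    resp = requests.post(URL, json={"prompt": "hi", "max_new_tokens": 16}, timeout=10)
    print("API reachable, status:", resp.status_code)
except requests.exceptions.ConnectionError:
    print("Connection refused: nothing is serving port 5000. "
          "Start text-generation-webui with the --api flag first.")
```

If the probe prints "Connection refused", the fix lives on the oobabooga side, not in BrainChulo.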

iGavroche commented 1 year ago

Thank you for such a thorough submission.

Please try to follow the instructions in the README.md file more closely. I see two issues:

  1. Is your ooba textgen running with the --api flag? (See the sketch after this list for why that matters.)
  2. Why are you using python main.py instead of gradio main.py?
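Editor's note: point 1 is the crux. The --api flag tells text-generation-webui (typically launched as python server.py --api) to expose the generation API, which by default is what BrainChulo expects on port 5000. Without it, every chat message dies in call_api as shown above. A hypothetical defensive variant of that call is sketched below; only the requests.post(url, headers=headers, json=_params, timeout=500) line mirrors the real app/llms/oobabooga_llm.py per the traceback, and the wrapper, function signature, and error message are illustrative assumptions, not the repository's actual code.

```python
import requests


def call_api(url, headers, _params):
    """Post a generation request to the oobabooga API, failing with a readable hint.

    Illustrative sketch only: in BrainChulo this is a method on the OobaboogaLLM
    class; the defensive handling here is an assumption, not the shipped code.
    """
    try:
        # Same call that appears in the traceback (oobabooga_llm.py, line 69).
        response = requests.post(url, headers=headers, json=_params, timeout=500)
        response.raise_for_status()
        return response
    except requests.exceptions.ConnectionError as exc:
        # Turn the opaque 500 into an actionable message for the user.
        raise RuntimeError(
            f"Could not reach the text-generation-webui API at {url}. "
            "Start oobabooga with the --api flag (it serves port 5000 by default) "
            "before launching BrainChulo."
        ) from exc
```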
iGavroche commented 1 year ago

Closing this issue; reopen if the problem persists.