Problem Description

An issue I have been stuck on for two weeks; I hope the maintainers can take a look.
Environment: a Windows VM with 4 CPU cores and 23 GB of RAM. The services start successfully, but after typing "hello" in the chat and pressing Enter, the server reports "ERROR: Exception in ASGI application". (Judging from the detailed traceback below, it looks like the request went to the "OpenAI API" rather than to the /chat/chat endpoint.)
Steps to Reproduce

1. Partial log from a successful startup:
==============================Langchain-Chatchat Configuration==============================
操作系统:Windows-10-10.0.17763-SP0.
python版本:3.10.13 | packaged by Anaconda, Inc. | (main, Sep 11 2023, 13:24:38) [MSC v.1916 64 bit (AMD64)]
项目版本:v0.2.7
You can now view your Streamlit app in your browser.
langchain版本:0.0.335. fastchat版本:0.2.32
当前使用的分词器:ChineseRecursiveTextSplitter
当前启动的LLM模型:['chatglm2-6b'] @ cpu
{'device': 'cpu',
 'host': '127.0.0.1',
 'infer_turbo': False,
 'model_path': 'C:\conda3\big_models\THUDM_chatglm2-6b',
 'port': 20002}
当前Embbedings模型: m3e-base @ cpu
URL: http://127.0.0.1:8501
服务端运行信息:
    OpenAI API Server: http://127.0.0.1:20000/v1
    Chatchat API Server: http://127.0.0.1:7861
    Chatchat WEBUI Server: http://127.0.0.1:8501
==============================Langchain-Chatchat Configuration==============================
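To narrow the problem down, it can help to take the Streamlit WEBUI out of the picture and call the Chatchat API Server listed above directly. The sketch below is hypothetical debugging code, not part of the project; it only builds the request, since actually sending it requires the running server at 127.0.0.1:7861. The payload fields mirror the `received input message` entry in the log further down.

```python
import json
import urllib.request

# Hypothetical debugging sketch: call the Chatchat API server directly,
# bypassing the Streamlit WEBUI. Payload mirrors the "received input
# message" entry in the request log.
payload = {
    "query": "hello",
    "history": [],
    "model_name": "chatglm2-6b",
    "prompt_name": "default",
    "stream": True,
    "temperature": 0.7,
    "max_tokens": None,
}
req = urllib.request.Request(
    "http://127.0.0.1:7861/chat/chat",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req, timeout=300) would stream the answer once
# the server from the log above is running; it is not sent here.
```

If a direct call like this also stalls and times out, the WEBUI layer can be ruled out and the problem is in the API/model-worker chain.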
2. Open the Chatchat WEBUI Server at http://127.0.0.1:8501, type "hello", and press Enter to start a conversation. After waiting a few minutes, the error is raised. Log:

{'base_url': 'http://127.0.0.1:7861', 'timeout': 300.0, 'proxies': {'all://127.0.0.1': None, 'all://localhost': None, 'http://127.0.0.1': None, 'http://': None, 'https://': None, 'all://': None, 'http://localhost': None}}
{'timeout': 300.0, 'proxies': {'all://127.0.0.1': None, 'all://localhost': None, 'http://127.0.0.1': None, 'http://': None, 'https://': None, 'all://': None, 'http://localhost': None}}
2023-11-19 21:22:21,779 - _client.py[line:1013] - INFO: HTTP Request: POST http://127.0.0.1:20001/list_models "HTTP/1.1 200 OK"
INFO:     127.0.0.1:51527 - "POST /llm_model/list_running_models HTTP/1.1" 200 OK
2023-11-19 21:22:21,795 - _client.py[line:1013] - INFO: HTTP Request: POST http://127.0.0.1:7861/llm_model/list_running_models "HTTP/1.1 200 OK"
{'timeout': 300.0, 'proxies': {'all://127.0.0.1': None, 'all://localhost': None, 'http://127.0.0.1': None, 'http://': None, 'https://': None, 'all://': None, 'http://localhost': None}}
2023-11-19 21:22:22,275 - _client.py[line:1013] - INFO: HTTP Request: POST http://127.0.0.1:20001/list_models "HTTP/1.1 200 OK"
INFO:     127.0.0.1:51527 - "POST /llm_model/list_running_models HTTP/1.1" 200 OK
2023-11-19 21:22:22,288 - _client.py[line:1013] - INFO: HTTP Request: POST http://127.0.0.1:7861/llm_model/list_running_models "HTTP/1.1 200 OK"
INFO:     127.0.0.1:51527 - "POST /llm_model/list_config_models HTTP/1.1" 200 OK
2023-11-19 21:22:22,301 - _client.py[line:1013] - INFO: HTTP Request: POST http://127.0.0.1:7861/llm_model/list_config_models "HTTP/1.1 200 OK"
{'timeout': 300.0, 'proxies': {'all://127.0.0.1': None, 'all://localhost': None, 'http://127.0.0.1': None, 'http://': None, 'https://': None, 'all://': None, 'http://localhost': None}}
2023-11-19 21:22:22,613 - _client.py[line:1013] - INFO: HTTP Request: POST http://127.0.0.1:20001/list_models "HTTP/1.1 200 OK"
INFO:     127.0.0.1:51527 - "POST /llm_model/list_running_models HTTP/1.1" 200 OK
2023-11-19 21:22:22,623 - _client.py[line:1013] - INFO: HTTP Request: POST http://127.0.0.1:7861/llm_model/list_running_models "HTTP/1.1 200 OK"
{'base_url': 'http://127.0.0.1:7861', 'timeout': 300.0, 'proxies': {'all://127.0.0.1': None, 'all://localhost': None, 'http://127.0.0.1': None, 'http://': None, 'https://': None, 'all://': None, 'http://localhost': None}}
{'timeout': 300.0, 'proxies': {'all://127.0.0.1': None, 'all://localhost': None, 'http://127.0.0.1': None, 'http://': None, 'https://': None, 'all://': None, 'http://localhost': None}}
2023-11-19 21:23:00,469 - _client.py[line:1013] - INFO: HTTP Request: POST http://127.0.0.1:20001/list_models "HTTP/1.1 200 OK"
INFO:     127.0.0.1:51544 - "POST /llm_model/list_running_models HTTP/1.1" 200 OK
2023-11-19 21:23:00,481 - _client.py[line:1013] - INFO: HTTP Request: POST http://127.0.0.1:7861/llm_model/list_running_models "HTTP/1.1 200 OK"
INFO:     127.0.0.1:51544 - "POST /llm_model/list_config_models HTTP/1.1" 200 OK
2023-11-19 21:23:00,503 - _client.py[line:1013] - INFO: HTTP Request: POST http://127.0.0.1:7861/llm_model/list_config_models "HTTP/1.1 200 OK"
{'timeout': 300.0, 'proxies': {'all://127.0.0.1': None, 'all://localhost': None, 'http://127.0.0.1': None, 'http://': None, 'https://': None, 'all://': None, 'http://localhost': None}}
2023-11-19 21:23:00,995 - _client.py[line:1013] - INFO: HTTP Request: POST http://127.0.0.1:20001/list_models "HTTP/1.1 200 OK"
INFO:     127.0.0.1:51544 - "POST /llm_model/list_running_models HTTP/1.1" 200 OK
2023-11-19 21:23:01,005 - _client.py[line:1013] - INFO: HTTP Request: POST http://127.0.0.1:7861/llm_model/list_running_models "HTTP/1.1 200 OK"
received input message: {'history': [], 'max_tokens': None, 'model_name': 'chatglm2-6b', 'prompt_name': 'default', 'query': 'hello', 'stream': True, 'temperature': 0.7}
INFO:     127.0.0.1:51544 - "POST /chat/chat HTTP/1.1" 200 OK
2023-11-19 21:23:01,065 - _client.py[line:1013] - INFO: HTTP Request: POST http://127.0.0.1:7861/chat/chat "HTTP/1.1 200 OK"
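The repeated `{'timeout': 300.0, 'proxies': {...: None}}` dumps above show the client settings used for these requests: a 300-second timeout, and every localhost pattern mapped to `None`, i.e. proxying explicitly disabled for loopback traffic so a system proxy cannot intercept the inter-service calls. A stdlib-only sketch of the same bypass idea (illustrative only, not the project's actual mechanism):

```python
import urllib.request

# The httpx proxy map in the log sets every localhost pattern to None,
# meaning "never route loopback traffic through a proxy". The stdlib
# expresses the same decision with a no-proxy list:
no_proxy = {"no": "127.0.0.1,localhost"}

# proxy_bypass_environment() returns truthy when the host should skip the proxy.
assert urllib.request.proxy_bypass_environment("127.0.0.1", no_proxy)
assert urllib.request.proxy_bypass_environment("localhost", no_proxy)
assert not urllib.request.proxy_bypass_environment("example.com", no_proxy)
```

Since the proxy bypass is already in place here, a misrouted system proxy is unlikely to be the cause; the requests below reach the local servers directly.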
2023-11-19 21:23:02 | INFO | stdout | INFO:     127.0.0.1:51547 - "POST /v1/chat/completions HTTP/1.1" 200 OK
2023-11-19 21:23:02,057 - util.py[line:67] - INFO: message='OpenAI API response' path=http://127.0.0.1:20000/v1/chat/completions processing_ms=None request_id=None response_code=200
2023-11-19 21:23:02 | INFO | httpx | HTTP Request: POST http://127.0.0.1:20002/worker_generate_stream "HTTP/1.1 200 OK"
2023-11-19 21:24:42 | ERROR | stderr | ERROR:    Exception in ASGI application

Traceback (most recent call last):
  File "C:\conda3\envs\chatchat\lib\site-packages\httpcore\_backends\anyio.py", line 34, in read
    return await self._stream.receive(max_bytes=max_bytes)
  File "C:\conda3\envs\chatchat\lib\site-packages\anyio\_backends\_asyncio.py", line 1203, in receive
    await self._protocol.read_event.wait()
  File "C:\conda3\envs\chatchat\lib\asyncio\locks.py", line 214, in wait
    await fut
asyncio.exceptions.CancelledError

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\conda3\envs\chatchat\lib\site-packages\httpcore\_exceptions.py", line 10, in map_exceptions
    yield
  File "C:\conda3\envs\chatchat\lib\site-packages\httpcore\_backends\anyio.py", line 32, in read
    with anyio.fail_after(timeout):
  File "C:\conda3\envs\chatchat\lib\site-packages\anyio\_core\_tasks.py", line 119, in __exit__
    raise TimeoutError
TimeoutError

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\conda3\envs\chatchat\lib\site-packages\httpx\_transports\default.py", line 66, in map_httpcore_exceptions
    yield
  File "C:\conda3\envs\chatchat\lib\site-packages\httpx\_transports\default.py", line 249, in __aiter__
    async for part in self._httpcore_stream:
  File "C:\conda3\envs\chatchat\lib\site-packages\httpcore\_async\connection_pool.py", line 361, in __aiter__
    async for part in self._stream:
  File "C:\conda3\envs\chatchat\lib\site-packages\httpcore\_async\http11.py", line 337, in __aiter__
    raise exc
  File "C:\conda3\envs\chatchat\lib\site-packages\httpcore\_async\http11.py", line 329, in __aiter__
    async for chunk in self._connection._receive_response_body(**kwargs):
  File "C:\conda3\envs\chatchat\lib\site-packages\httpcore\_async\http11.py", line 198, in _receive_response_body
    event = await self._receive_event(timeout=timeout)
  File "C:\conda3\envs\chatchat\lib\site-packages\httpcore\_async\http11.py", line 212, in _receive_event
    data = await self._network_stream.read(
  File "C:\conda3\envs\chatchat\lib\site-packages\httpcore\_backends\anyio.py", line 31, in read
    with map_exceptions(exc_map):
  File "C:\conda3\envs\chatchat\lib\contextlib.py", line 153, in __exit__
    self.gen.throw(typ, value, traceback)
  File "C:\conda3\envs\chatchat\lib\site-packages\httpcore\_exceptions.py", line 14, in map_exceptions
    raise to_exc(exc) from exc
httpcore.ReadTimeout

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\conda3\envs\chatchat\lib\site-packages\uvicorn\protocols\http\h11_impl.py", line 408, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
  File "C:\conda3\envs\chatchat\lib\site-packages\uvicorn\middleware\proxy_headers.py", line 84, in __call__
    return await self.app(scope, receive, send)
  File "C:\conda3\envs\chatchat\lib\site-packages\fastapi\applications.py", line 1115, in __call__
    await super().__call__(scope, receive, send)
  File "C:\conda3\envs\chatchat\lib\site-packages\starlette\applications.py", line 122, in __call__
    await self.middleware_stack(scope, receive, send)
  File "C:\conda3\envs\chatchat\lib\site-packages\starlette\middleware\errors.py", line 184, in __call__
    raise exc
  File "C:\conda3\envs\chatchat\lib\site-packages\starlette\middleware\errors.py", line 162, in __call__
    await self.app(scope, receive, _send)
  File "C:\conda3\envs\chatchat\lib\site-packages\starlette\middleware\cors.py", line 83, in __call__
    await self.app(scope, receive, send)
  File "C:\conda3\envs\chatchat\lib\site-packages\starlette\middleware\exceptions.py", line 79, in __call__
    raise exc
  File "C:\conda3\envs\chatchat\lib\site-packages\starlette\middleware\exceptions.py", line 68, in __call__
    await self.app(scope, receive, sender)
  File "C:\conda3\envs\chatchat\lib\site-packages\fastapi\middleware\asyncexitstack.py", line 20, in __call__
    raise e
  File "C:\conda3\envs\chatchat\lib\site-packages\fastapi\middleware\asyncexitstack.py", line 17, in __call__
    await self.app(scope, receive, send)
  File "C:\conda3\envs\chatchat\lib\site-packages\starlette\routing.py", line 718, in __call__
    await route.handle(scope, receive, send)
  File "C:\conda3\envs\chatchat\lib\site-packages\starlette\routing.py", line 276, in handle
    await self.app(scope, receive, send)
  File "C:\conda3\envs\chatchat\lib\site-packages\starlette\routing.py", line 69, in app
    await response(scope, receive, send)
  File "C:\conda3\envs\chatchat\lib\site-packages\starlette\responses.py", line 270, in __call__
    async with anyio.create_task_group() as task_group:
  File "C:\conda3\envs\chatchat\lib\site-packages\anyio\_backends\_asyncio.py", line 597, in __aexit__
    raise exceptions[0]
  File "C:\conda3\envs\chatchat\lib\site-packages\starlette\responses.py", line 273, in wrap
    await func()
  File "C:\conda3\envs\chatchat\lib\site-packages\starlette\responses.py", line 262, in stream_response
    async for chunk in self.body_iterator:
  File "C:\conda3\envs\chatchat\lib\site-packages\fastchat\serve\openai_api_server.py", line 458, in chat_completion_stream_generator
    async for content in generate_completion_stream(gen_params, worker_addr):
  File "C:\conda3\envs\chatchat\lib\site-packages\fastchat\serve\openai_api_server.py", line 638, in generate_completion_stream
    async for raw_chunk in response.aiter_raw():
  File "C:\conda3\envs\chatchat\lib\site-packages\httpx\_models.py", line 990, in aiter_raw
    async for raw_stream_bytes in self.stream:
  File "C:\conda3\envs\chatchat\lib\site-packages\httpx\_client.py", line 146, in __aiter__
    async for chunk in self._stream:
  File "C:\conda3\envs\chatchat\lib\site-packages\httpx\_transports\default.py", line 248, in __aiter__
    with map_httpcore_exceptions():
  File "C:\conda3\envs\chatchat\lib\contextlib.py", line 153, in __exit__
    self.gen.throw(typ, value, traceback)
  File "C:\conda3\envs\chatchat\lib\site-packages\httpx\_transports\default.py", line 83, in map_httpcore_exceptions
    raise mapped_exc(message) from exc
httpx.ReadTimeout

2023-11-19 21:24:43,208 - utils.py[line:25] - ERROR: ClientPayloadError: Caught exception: Response payload is not completed
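The chain of tracebacks bottoms out in a read timeout on the streaming response. Note the timing: the worker accepted `/worker_generate_stream` at 21:23:02 and the error fired at 21:24:42, almost exactly 100 seconds later. That suggests the CPU-only chatglm2-6b simply produced no output chunk within the streaming read timeout of the fastchat OpenAI API server (fastchat's worker timeout defaults to 100 seconds and, if memory serves, can be raised via the `FASTCHAT_WORKER_API_TIMEOUT` environment variable; treat that name as an assumption to verify against the installed fastchat version). The mechanism itself is easy to reproduce with the stdlib: a server that sends headers but then stalls on the body makes any short read timeout fire.

```python
import socket
import threading
import urllib.request

def stalling_server(listener: socket.socket) -> None:
    """Accept one request, send headers, then never deliver the body."""
    conn, _ = listener.accept()
    conn.recv(4096)  # read (and ignore) the request
    conn.sendall(b"HTTP/1.1 200 OK\r\nContent-Length: 5\r\n\r\n")
    threading.Event().wait(5)  # stall instead of sending the 5 promised bytes
    conn.close()

listener = socket.socket()
listener.bind(("127.0.0.1", 0))
listener.listen(1)
port = listener.getsockname()[1]
threading.Thread(target=stalling_server, args=(listener,), daemon=True).start()

# Headers arrive immediately, so urlopen succeeds...
resp = urllib.request.urlopen(f"http://127.0.0.1:{port}/", timeout=0.5)
try:
    resp.read()  # ...but the body never comes, so the read times out
    timed_out = False
except socket.timeout:  # alias of TimeoutError since Python 3.10
    timed_out = True
```

If that 100-second window is indeed what is being hit, raising the worker timeout (or using faster inference so the first token arrives sooner) would be the direction to investigate, since the CPU VM here takes minutes per reply.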
Expected Result

After initialization, the data should be written to the vector store successfully and persisted to disk.
Actual Result

The error shown in the log above.
Environment Information

Package Version
accelerate 0.24.1 aiohttp 3.8.6 aiolimiter 1.1.0 aiosignal 1.3.1 altair 5.1.2 antlr4-python3-runtime 4.9.3 anyio 3.7.1 async-timeout 4.0.3 attrs 23.1.0 backoff 2.2.1 bce-python-sdk 0.8.96 beautifulsoup4 4.12.2 blinker 1.7.0 blis 0.7.11 Brotli 1.1.0 cachetools 5.3.2 catalogue 2.0.10 certifi 2023.7.22 cffi 1.16.0 chardet 5.2.0 charset-normalizer 3.3.2 click 8.1.7 cloudpathlib 0.16.0 colorama 0.4.6 coloredlogs 15.0.1 confection 0.1.3 contourpy 1.2.0 cryptography 41.0.5 cycler 0.12.1 cymem 2.0.8 dashscope 1.13.2 dataclasses 0.6 dataclasses-json 0.6.2 distro 1.8.0 effdet 0.4.1 einops 0.7.0 emoji 2.8.0 et-xmlfile 1.1.0 exceptiongroup 1.1.3 faiss-cpu 1.7.4 fastapi 0.104.1 filelock 3.13.1 filetype 1.2.0 flatbuffers 23.5.26 fonttools 4.44.0 frozenlist 1.4.0 fschat 0.2.32 fsspec 2023.10.0 future 0.18.3 gitdb 4.0.11 GitPython 3.1.40 greenlet 3.0.1 grpcio 1.47.5 grpcio-tools 1.47.5 h11 0.14.0 h2 4.1.0 hpack 4.0.0 httpcore 1.0.2 httpx 0.25.1 huggingface-hub 0.17.3 humanfriendly 10.0 hyperframe 6.0.1 idna 3.4 importlib-metadata 6.8.0 iniconfig 2.0.0 iopath 0.1.10 Jinja2 3.1.2 joblib 1.3.2 jsonpatch 1.33 jsonpointer 2.4 jsonschema 4.19.2 jsonschema-specifications 2023.7.1 kiwisolver 1.4.5 langchain 0.0.335 langchain-experimental 0.0.40 langcodes 3.3.0 langdetect 1.0.9 langsmith 0.0.64 layoutparser 0.3.4 lxml 4.9.3 Markdown 3.5.1 markdown-it-py 3.0.0 markdown2 2.4.10 markdownify 0.11.6 MarkupSafe 2.1.3 marshmallow 3.20.1 matplotlib 3.8.1 mdurl 0.1.2 mmh3 3.0.0 mpmath 1.3.0 msg-parser 1.2.0 multidict 6.0.4 murmurhash 1.0.10 mypy-extensions 1.0.0 networkx 3.2.1 nh3 0.2.14 nltk 3.8.1 numexpr 2.8.7 numpy 1.24.4 olefile 0.46 omegaconf 2.3.0 onnx 1.15.0 onnxruntime 1.15.1 openai 0.28.0 opencv-python 4.8.1.78 openpyxl 3.1.2 packaging 23.2 pandas 2.0.3 pathlib 1.0.1 pdf2image 1.16.3 pdfminer.six 20221105 pdfplumber 0.10.3 peft 0.6.2 pgvector 0.2.3 Pillow 9.5.0 pip 23.3 pluggy 1.3.0 portalocker 2.8.2 preshed 3.0.9 prompt-toolkit 3.0.41 protobuf 3.20.3 psutil 5.9.6 psycopg2 2.9.9 pyarrow 
14.0.1 pyclipper 1.3.0.post5 pycocotools 2.0.7 pycparser 2.21 pycryptodome 3.19.0 pydantic 1.10.13 pydeck 0.8.1b0 Pygments 2.16.1 PyJWT 2.8.0 pymilvus 2.1.3 PyMuPDF 1.23.6 PyMuPDFb 1.23.6 pypandoc 1.12 pyparsing 3.1.1 pypdfium2 4.24.0 pyreadline3 3.4.1 pytesseract 0.3.10 pytest 7.4.3 python-dateutil 2.8.2 python-decouple 3.8 python-docx 1.1.0 python-iso639 2023.6.15 python-magic 0.4.27 python-magic-bin 0.4.14 python-multipart 0.0.6 python-pptx 0.6.23 pytz 2023.3.post1 pywin32 306 PyYAML 6.0.1 qianfan 0.1.1 rapidfuzz 3.5.2 rapidocr-onnxruntime 1.3.8 referencing 0.30.2 regex 2023.10.3 requests 2.31.0 rich 13.6.0 rpds-py 0.12.0 safetensors 0.4.0 scikit-learn 1.3.2 scipy 1.11.3 sentence-transformers 2.2.2 sentencepiece 0.1.99 setuptools 68.0.0 shapely 2.0.2 shortuuid 1.0.11 simplejson 3.19.2 six 1.16.0 smart-open 6.4.0 smmap 5.0.1 sniffio 1.3.0 socksio 1.0.0 soupsieve 2.5 spacy 3.7.2 spacy-legacy 3.0.12 spacy-loggers 1.0.5 SQLAlchemy 2.0.19 srsly 2.4.8 starlette 0.27.0 streamlit 1.27.2 streamlit-aggrid 0.3.4.post3 streamlit-antd-components 0.2.3 streamlit-chatbox 1.1.11 streamlit-feedback 0.1.2 streamlit-option-menu 0.3.6 strsimpy 0.2.1 svgwrite 1.4.3 sympy 1.12 tabulate 0.9.0 tenacity 8.2.3 thinc 8.2.1 threadpoolctl 3.2.0 tiktoken 0.5.1 timm 0.9.10 tokenizers 0.14.1 toml 0.10.2 tomli 2.0.1 toolz 0.12.0 torch 2.1.0 torchaudio 2.1.0 torchvision 0.16.0 tornado 6.3.3 tqdm 4.66.1 transformers 4.35.0 transformers-stream-generator 0.0.4 typer 0.9.0 typing_extensions 4.8.0 typing-inspect 0.9.0 tzdata 2023.3 tzlocal 5.2 ujson 5.4.0 unstructured 0.10.30 unstructured-inference 0.7.11 unstructured.pytesseract 0.3.12 urllib3 2.1.0 uvicorn 0.23.2 validators 0.22.0 wasabi 1.1.2 watchdog 3.0.0 wavedrom 2.0.3.post3 wcwidth 0.2.10 weasel 0.3.4 websockets 12.0 wheel 0.41.2 xformers 0.0.22.post7 xlrd 2.0.1 XlsxWriter 3.1.9 yarl 1.9.2 zhipuai 1.0.7 zipp 3.17.0