THUDM / ChatGLM-6B

ChatGLM-6B: An Open Bilingual Dialogue Language Model | 开源双语对话语言模型
Apache License 2.0

[BUG/Help] web_demo.py fails to start #1430

Closed · liusn0329 closed this issue 7 months ago

liusn0329 commented 7 months ago

Is there an existing issue for this?

Current Behavior

E:\code\LLM\ChatGLM2-6B>python web_demo.py
D:\Program Files\python3.8\lib\site-packages\requests\__init__.py:109: RequestsDependencyWarning: urllib3 (1.26.9) or chardet (5.0.0)/charset_normalizer (2.0.12) doesn't match a supported version!
  warnings.warn(
Traceback (most recent call last):
  File "D:\Program Files\python3.8\lib\site-packages\urllib3\connectionpool.py", line 700, in urlopen
    self._prepare_proxy(conn)
  File "D:\Program Files\python3.8\lib\site-packages\urllib3\connectionpool.py", line 994, in _prepare_proxy
    conn.connect()
  File "D:\Program Files\python3.8\lib\site-packages\urllib3\connection.py", line 364, in connect
    self.sock = conn = self._connect_tls_proxy(hostname, conn)
  File "D:\Program Files\python3.8\lib\site-packages\urllib3\connection.py", line 499, in _connect_tls_proxy
    socket = ssl_wrap_socket(
  File "D:\Program Files\python3.8\lib\site-packages\urllib3\util\ssl_.py", line 453, in ssl_wrap_socket
    ssl_sock = _ssl_wrap_socket_impl(sock, context, tls_in_tls)
  File "D:\Program Files\python3.8\lib\site-packages\urllib3\util\ssl_.py", line 495, in _ssl_wrap_socket_impl
    return ssl_context.wrap_socket(sock)
  File "D:\Program Files\python3.8\lib\ssl.py", line 500, in wrap_socket
    return self.sslsocket_class._create(
  File "D:\Program Files\python3.8\lib\ssl.py", line 1040, in _create
    self.do_handshake()
  File "D:\Program Files\python3.8\lib\ssl.py", line 1309, in do_handshake
    self._sslobj.do_handshake()
OSError: [Errno 0] Error

During handling of the above exception, another exception occurred:

Traceback (most recent call last): File "D:\Program Files\python3.8\lib\site-packages\requests\adapters.py", line 489, in send resp = conn.urlopen( File "D:\Program Files\python3.8\lib\site-packages\urllib3\connectionpool.py", line 785, in urlopen retries = retries.increment( File "D:\Program Files\python3.8\lib\site-packages\urllib3\util\retry.py", line 592, in increment raise MaxRetryError(_pool, url, error or ResponseError(cause)) urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /THUDM/chatglm2-6b/resolve/main/tokenizer_config.json (Caused by ProxyError('Cannot connect to proxy.', OSError(0, 'Error')))

During handling of the above exception, another exception occurred:

Traceback (most recent call last): File "web_demo.py", line 6, in tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm2-6b", trust_remote_code=True) File "D:\Program Files\python3.8\lib\site-packages\transformers\models\auto\tokenization_auto.py", line 643, in from_pretrained tokenizer_config = get_tokenizer_config(pretrained_model_name_or_path, kwargs) File "D:\Program Files\python3.8\lib\site-packages\transformers\models\auto\tokenization_auto.py", line 487, in get_tokenizer_config resolved_config_file = cached_file( File "D:\Program Files\python3.8\lib\site-packages\transformers\utils\hub.py", line 417, in cached_file resolved_file = hf_hub_download( File "D:\Program Files\python3.8\lib\site-packages\huggingface_hub\utils_validators.py", line 118, in _inner_fn return fn(*args, *kwargs) File "D:\Program Files\python3.8\lib\site-packages\huggingface_hub\file_download.py", line 1247, in hf_hub_download metadata = get_hf_file_metadata( File "D:\Program Files\python3.8\lib\site-packages\huggingface_hub\utils_validators.py", line 118, in _inner_fn return fn(args, kwargs) File "D:\Program Files\python3.8\lib\site-packages\huggingface_hub\file_download.py", line 1624, in get_hf_file_metadata r = _request_wrapper( File "D:\Program Files\python3.8\lib\site-packages\huggingface_hub\file_download.py", line 402, in _request_wrapper response = _request_wrapper( File "D:\Program Files\python3.8\lib\site-packages\huggingface_hub\file_download.py", line 425, in _request_wrapper response = get_session().request(method=method, url=url, params) File "D:\Program Files\python3.8\lib\site-packages\requests\sessions.py", line 587, in request resp = self.send(prep, send_kwargs) File "D:\Program Files\python3.8\lib\site-packages\requests\sessions.py", line 701, in send r = adapter.send(request, *kwargs) File "D:\Program Files\python3.8\lib\site-packages\huggingface_hub\utils_http.py", line 63, in send return super().send(request, args, **kwargs) File "D:\Program Files\python3.8\lib\site-packages\requests\adapters.py", line 559, in send raise ProxyError(e, request=request) requests.exceptions.ProxyError: (MaxRetryError("HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /THUDM/chatglm2-6b/resolve/main/tokenizer_config.json (Caused by ProxyError('Cannot connect to proxy.', OSError(0, 'Error')))"), '(Request ID: 793eb597-26da-4adf-9409-e1aa2b509d83)')

Expected Behavior

Successfully run web_demo.py

Steps To Reproduce

  1. Installed/updated the dependencies to the versions specified in requirements.txt
  2. python web_demo.py

Environment

- OS: Windows 10
- Python: 3.8
- PyTorch: 2.1.1
- Transformers: 4.30.2
- CUDA Support: True

Anything else?

No response
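
For reference, "Cannot connect to proxy" in the traceback usually means requests/urllib3 picked up a proxy (from HTTP_PROXY/HTTPS_PROXY environment variables or the Windows system proxy) that is not actually reachable. A quick way to see which proxy settings are in effect on the affected machine, making no assumptions about its configuration:

    # Sketch only: print the proxy configuration that requests/urllib3 would use.
    import os
    import urllib.request

    for var in ("HTTP_PROXY", "HTTPS_PROXY", "http_proxy", "https_proxy", "ALL_PROXY"):
        print(var, "=", os.environ.get(var))

    # On Windows, requests also honours the system (Internet Options) proxy settings:
    print(urllib.request.getproxies())

If a stale or unreachable proxy shows up here, either starting that proxy or clearing the corresponding settings before running python web_demo.py should avoid the ProxyError.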