shunyaoshih / TPA-LSTM

Temporal Pattern Attention for Multivariate Time Series Forecasting
699 stars 187 forks

Musedataset cannot download #2

Open Bingohong opened 5 years ago

Bingohong commented 5 years ago

urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='docs.google.com', port=443): Max retries exceeded with url: /uc?export=download&id=1a5361IfxxEY1mmTfqAviiIkq6u2OYFJ7 (Caused by NewConnectionError('<urllib3.connection.VerifiedHTTPSConnection object at 0x7f992b7dfa58>: Failed to establish a new connection: [Errno 111] Connection refused',))

I can't download the MuseData set from docs.google.com via this URL. How can I solve it?

hqy7777 commented 3 years ago

Downloading muse dataset from Google drive...

Traceback (most recent call last):
  File "C:\Users\Administrator\Desktop\TPA-LSTM-master\TPA-LSTM-master\venv\lib\site-packages\urllib3\connection.py", line 159, in _new_conn
    (self._dns_host, self.port), self.timeout, **extra_kw)
  File "C:\Users\Administrator\Desktop\TPA-LSTM-master\TPA-LSTM-master\venv\lib\site-packages\urllib3\util\connection.py", line 80, in create_connection
    raise err
  File "C:\Users\Administrator\Desktop\TPA-LSTM-master\TPA-LSTM-master\venv\lib\site-packages\urllib3\util\connection.py", line 70, in create_connection
    sock.connect(sa)
socket.timeout: timed out

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\Administrator\Desktop\TPA-LSTM-master\TPA-LSTM-master\venv\lib\site-packages\urllib3\connectionpool.py", line 600, in urlopen
    chunked=chunked)
  File "C:\Users\Administrator\Desktop\TPA-LSTM-master\TPA-LSTM-master\venv\lib\site-packages\urllib3\connectionpool.py", line 343, in _make_request
    self._validate_conn(conn)
  File "C:\Users\Administrator\Desktop\TPA-LSTM-master\TPA-LSTM-master\venv\lib\site-packages\urllib3\connectionpool.py", line 839, in _validate_conn
    conn.connect()
  File "C:\Users\Administrator\Desktop\TPA-LSTM-master\TPA-LSTM-master\venv\lib\site-packages\urllib3\connection.py", line 301, in connect
    conn = self._new_conn()
  File "C:\Users\Administrator\Desktop\TPA-LSTM-master\TPA-LSTM-master\venv\lib\site-packages\urllib3\connection.py", line 164, in _new_conn
    (self.host, self.timeout))
urllib3.exceptions.ConnectTimeoutError: (<urllib3.connection.VerifiedHTTPSConnection object at 0x00000000184CE4E0>, 'Connection to docs.google.com timed out. (connect timeout=3)')

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\Administrator\Desktop\TPA-LSTM-master\TPA-LSTM-master\venv\lib\site-packages\requests\adapters.py", line 449, in send
    timeout=timeout
  File "C:\Users\Administrator\Desktop\TPA-LSTM-master\TPA-LSTM-master\venv\lib\site-packages\urllib3\connectionpool.py", line 638, in urlopen
    _stacktrace=sys.exc_info()[2])
  File "C:\Users\Administrator\Desktop\TPA-LSTM-master\TPA-LSTM-master\venv\lib\site-packages\urllib3\util\retry.py", line 399, in increment
    raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='docs.google.com', port=443): Max retries exceeded with url: /uc?export=download&id=1a5361IfxxEY1mmTfqAviiIkq6u2OYFJ7 (Caused by ConnectTimeoutError(<urllib3.connection.VerifiedHTTPSConnection object at 0x00000000184CE4E0>, 'Connection to docs.google.com timed out. (connect timeout=3)'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:/Users/Administrator/Desktop/TPA-LSTM-master/TPA-LSTM-master/main.py", line 35, in <module>
    main()
  File "C:/Users/Administrator/Desktop/TPA-LSTM-master/TPA-LSTM-master/main.py", line 14, in main
    graph, model, data_generator = create_graph(para)
  File "C:\Users\Administrator\Desktop\TPA-LSTM-master\TPA-LSTM-master\lib\model_utils.py", line 28, in create_graph
    data_generator = create_data_generator(para)
  File "C:\Users\Administrator\Desktop\TPA-LSTM-master\TPA-LSTM-master\lib\model_utils.py", line 15, in create_data_generator
    return MuseDataGenerator(para)
  File "C:\Users\Administrator\Desktop\TPA-LSTM-master\TPA-LSTM-master\lib\data_generator.py", line 285, in __init__
    self._download_file()
  File "C:\Users\Administrator\Desktop\TPA-LSTM-master\TPA-LSTM-master\lib\data_generator.py", line 120, in _download_file
    self.DATA_FULL_PATH + ".tar")
  File "C:\Users\Administrator\Desktop\TPA-LSTM-master\TPA-LSTM-master\lib\utils.py", line 65, in download_file_from_google_drive
    response = session.get(URL, params={'id': id}, stream=True, timeout=(3,7))
  File "C:\Users\Administrator\Desktop\TPA-LSTM-master\TPA-LSTM-master\venv\lib\site-packages\requests\sessions.py", line 537, in get
    return self.request('GET', url, **kwargs)
  File "C:\Users\Administrator\Desktop\TPA-LSTM-master\TPA-LSTM-master\venv\lib\site-packages\requests\sessions.py", line 524, in request
    resp = self.send(prep, **send_kwargs)
  File "C:\Users\Administrator\Desktop\TPA-LSTM-master\TPA-LSTM-master\venv\lib\site-packages\requests\sessions.py", line 637, in send
    r = adapter.send(request, **kwargs)
  File "C:\Users\Administrator\Desktop\TPA-LSTM-master\TPA-LSTM-master\venv\lib\site-packages\requests\adapters.py", line 504, in send
    raise ConnectTimeout(e, request=request)
requests.exceptions.ConnectTimeout: HTTPSConnectionPool(host='docs.google.com', port=443): Max retries exceeded with url: /uc?export=download&id=1a5361IfxxEY1mmTfqAviiIkq6u2OYFJ7 (Caused by ConnectTimeoutError(<urllib3.connection.VerifiedHTTPSConnection object at 0x00000000184CE4E0>, 'Connection to docs.google.com timed out. (connect timeout=3)'))

I tried setting timeout=(3,7) for the request, but it didn't help; the same error occurs. Also, I can reach Google through a VPN. How can I solve this? Please help me. Thanks anyway.
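For what it's worth, the repeated ConnectTimeout suggests the connect timeout of 3 seconds in `lib/utils.py` is simply too aggressive when docs.google.com answers slowly. Below is a minimal sketch of a more forgiving downloader, using retries with backoff and a longer (connect, read) timeout; the `make_session`/`download` names are mine, not the repo's, and the file id is the one from the traceback:

```python
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

FILE_ID = "1a5361IfxxEY1mmTfqAviiIkq6u2OYFJ7"  # MuseData id from the traceback
URL = "https://docs.google.com/uc"


def make_session(retries=5, backoff=2.0):
    """Build a requests session that retries transient connection failures."""
    retry = Retry(
        total=retries,
        backoff_factor=backoff,  # 2s, 4s, 8s, ... between attempts
        status_forcelist=[429, 500, 502, 503, 504],
    )
    session = requests.Session()
    session.mount("https://", HTTPAdapter(max_retries=retry))
    return session


def download(path):
    """Stream the archive to `path`, with a far more forgiving timeout than (3, 7)."""
    session = make_session()
    response = session.get(
        URL,
        params={"export": "download", "id": FILE_ID},
        stream=True,
        timeout=(10, 60),  # (connect, read) in seconds
    )
    response.raise_for_status()
    with open(path, "wb") as f:
        for chunk in response.iter_content(32768):
            f.write(chunk)
```

Alternatively, you can open the `uc?export=download` URL in a browser and place the resulting `.tar` archive at the `DATA_FULL_PATH + ".tar"` location that `data_generator.py` expects, skipping the in-code download entirely.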

astraeus1258 commented 3 years ago

I met the same error once; maybe you can try restarting your computer. By the way, if you need it, I can send the dataset to you.

zhouzexufly commented 3 years ago

I ran into the same error once; maybe you can try restarting your computer. By the way, if you need it, I can send you the dataset.

If you can, can you send me a copy of the data? Thank you.

zorrozjr commented 3 years ago

I met the same error once; maybe you can try restarting your computer. By the way, if you need it, I can send the dataset to you.

I cannot download the dataset; could you send me a copy of the data, too?

Thank you very much.

RuiruiKang commented 3 years ago

I ran into the same error once; maybe you can try restarting your computer. By the way, if you need it, I can send you the dataset.

If you can, can you send me a copy of the data? Thank you.

Could you send me a copy of the dataset? I ran into the same error. Thank you.

RuiruiKang commented 3 years ago

ConnectionError: HTTPSConnectionPool(host='docs.google.com', port=443): Max retries exceeded with url: /uc?export=download&id=1a5361IfxxEY1mmTfqAviiIkq6u2OYFJ7 (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x00000226A7E4A0B8>: Failed to establish a new connection: [WinError 10060] A connection attempt failed because the connected party did not properly respond after a period of time, or the established connection failed because the connected host has failed to respond.',))

jiangjie17754834400 commented 2 years ago

I cannot download the dataset. Could you send me a copy of the data? My email is jiangjie552198227@163.com

Thank you very much.

jiangjie17754834400 commented 2 years ago

I ran into the same error once; maybe you can try restarting your computer. By the way, if you need it, I can send you the dataset.

If you can, can you send me a copy of the data? Thank you.

I cannot download the dataset. Could you send me a copy of the data? My email is jiangjie552198227@163.com

Thank you very much.

EveanLu commented 2 years ago

I met the same error once; maybe you can try restarting your computer. By the way, if you need it, I can send the dataset to you.

Could you send me a copy of the data? Thank you!

WKaiH123 commented 2 years ago

I met the same error once; maybe you can try restarting your computer. By the way, if you need it, I can send the dataset to you.

I cannot download the dataset. Could you send me a copy of the data? My email is 1574723418@qq.com. Thank you very much.

koi-boy commented 2 years ago

I cannot download the dataset. Could you send me a copy of the data? My email is 709818574@qq.com. Thank you very much.

ZERO-15 commented 1 year ago

I ran into the same error once; maybe you can try restarting your computer. By the way, if you need it, I can send you the dataset.

I cannot download the dataset. Could you send me a copy of the data? My email is m13940142960@163.com. Thank you very much.