aio-libs / aiohttp

Asynchronous HTTP client/server framework for asyncio and Python
https://docs.aiohttp.org

ConnectionAbortedError: SSL handshake is taking longer than 60.0 seconds: aborting the connection after increasing total timeout to 30 minutes from 10 minutes #6795

Open jigar07 opened 2 years ago

jigar07 commented 2 years ago

Describe the bug

After increasing the client timeout from 10 minutes to 30 minutes, I started getting `ConnectionAbortedError: SSL handshake is taking longer than 60.0 seconds: aborting the connection`. I create the client session as follows:

    async with ClientSession(
        timeout=ClientTimeout(total=60*30),
        headers=headers
    ) as session:
        ...

I checked https://github.com/aio-libs/aiohttp/issues/3651, where it is mentioned that "If SSL handshake takes longer than a minute that means something is going wrong with your code. Try to reduce the number of parallel handshakes if it helps". I checked the number of HTTP connections and it never exceeds 300 at any time, which I assume is acceptable.
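If parallel handshakes are the suspect, aiohttp can also cap concurrency at the connector level rather than only with an application-level semaphore, so connection attempts (and therefore TLS handshakes) never pile up past a hard limit. A minimal sketch, assuming aiohttp 3.x; the URL and the limit values are placeholders, not recommendations:

```python
import asyncio

import aiohttp


async def main():
    # TCPConnector.limit caps the total number of simultaneous
    # connections (and thus simultaneous TLS handshakes);
    # limit_per_host caps connections to any single endpoint.
    connector = aiohttp.TCPConnector(limit=100, limit_per_host=30)
    async with aiohttp.ClientSession(connector=connector) as session:
        async with session.get("https://example.com") as resp:
            return resp.status

# asyncio.run(main())
```

With this in place, requests beyond the limit wait inside the connector instead of opening new sockets, which keeps the number of in-flight handshakes bounded regardless of how many tasks are scheduled.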

Please let me know why I am getting this error only after increasing the timeout from 10 minutes to 30 minutes.

To Reproduce

    import asyncio
    import logging
    from dataclasses import dataclass

    import aiofiles
    from aiohttp import ClientSession, BasicAuth, ClientTimeout
    from django.conf import settings
    from jobs.utils import file

    logger = logging.getLogger(__name__)


    @dataclass
    class FetchErrorResponse:
        url: str = None
        status: str = None


    @dataclass
    class FetchResponse:
        url: str = None
        file_dir: str = None
        content: str = None


    async def fetch_and_download(url, session):
        retry = settings.RESULT_API_MAX_RETRIES
        backupoff = settings.RESULT_API_TIMEOUT
        try:
            async with session.request('GET', url["url"]) as response:
                if response.status == 200 or response.status == 207:
                    try:
                        file.create_non_executable_file(url["file_dir"])
                        async with aiofiles.open(url["file_dir"], mode="wb") as f:
                            read_content = await response.read()
                            await f.write(read_content)
                        logger.info(f"Downloaded file: {url['file_dir'].encode('utf-8')}, url: {url['url']}")
                        return True, len(read_content)
                    except Exception as ex:
                        logging.exception(ex)
                elif response.status < 500:
                    logger.error(
                        f"Failed downloading file. Error response: {response.status}, file: {url['file_dir'].encode('utf-8')}, url: {url['url']}"
                    )
                    return FetchErrorResponse(url["url"], response.status)
                else:
                    logger.error(
                        f"Failed downloading file. Error response: {response.status}, file: {url['file_dir'].encode('utf-8')}, url: {url['url']}"
                    )
                    return FetchErrorResponse(url["url"], response.status)
        except Exception as ex:
            logging.exception(ex)
            logger.error(
                f"Failed downloading file. Async exception: {ex}, file: {url['file_dir'].encode('utf-8')}, url: {url['url']}"
            )
            return FetchErrorResponse(url["url"], "TIMEOUT")


    async def fetch(url, session):
        try:
            async with session.request('PROPFIND', url["url"]) as response:
                if response.status == 200 or response.status == 207:
                    logger.info(f"Scanned directory: {url['file_dir'].encode('utf-8')}, url: {url['url']}, status: {response.status}")
                    return FetchResponse(url["url"], url["file_dir"], await response.read())
                elif response.status < 500:
                    logger.error(
                        f"Failed scanning directory. Error response: {response.status}, file: {url['file_dir'].encode('utf-8')}, url: {url['url']}"
                    )
                    return FetchErrorResponse(url["url"], response.status)
                else:
                    logger.error(
                        f"Failed scanning directory. Error response: {response.status}, file: {url['file_dir'].encode('utf-8')}, url: {url['url']}"
                    )
                    return FetchErrorResponse(url["url"], response.status)
        except Exception as ex:
            logging.exception(ex)
            logger.error(
                f"Failed scanning directory. Async exception: {ex}, file: {url['file_dir'].encode('utf-8')}, url: {url['url']}"
            )
            return FetchErrorResponse(url["url"], "TIMEOUT")


    async def bound_fetch(sem, url, session, download):
        async with sem:
            if download:
                return await fetch_and_download(url, session)
            return await fetch(url, session)


    async def process_async(urls, access_token, download):
        tasks = []
        sem = asyncio.Semaphore(500)
        headers = {"Authorization": "Bearer " + access_token, "depth": "1"}

        if download:
            headers = {"Authorization": "Bearer " + access_token}
        async with ClientSession(
            timeout=ClientTimeout(total=60*30),
            headers=headers
        ) as session:
            for i in urls:
                task = asyncio.ensure_future(bound_fetch(sem, i, session, download))
                tasks.append(task)
            # gather must be awaited before returning, otherwise callers
            # receive the future rather than the results
            responses = await asyncio.gather(*tasks, return_exceptions=True)
            return responses
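One alternative to a single large `total` timeout, assuming aiohttp 3.x: `ClientTimeout` accepts per-phase limits, so connection setup (where the handshake happens) can be bounded separately from long downloads. The values below are illustrative only:

```python
from aiohttp import ClientTimeout

# Bound each phase of a request instead of its whole lifetime.
timeout = ClientTimeout(
    total=None,       # no overall cap, so large downloads are not cut off
    connect=60,       # acquiring a connection from the pool, incl. handshake
    sock_connect=30,  # establishing the TCP connection itself
    sock_read=300,    # maximum gap between reads of the response body
)
# session = ClientSession(timeout=timeout, headers=headers)
```

This keeps slow connection setup from being masked by a generous total budget, while still allowing a 30-minute download to complete.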

Expected behavior

The requests should complete without the `ConnectionAbortedError`.

Logs/tracebacks

File "c:\program files\python37\lib\site-packages\aiohttp\connector.py", line 985, in _wrap_create_connection
    return await self._loop.create_connection(*args, **kwargs)  # type: ignore[return-value]  # noqa
  File "c:\program files\python37\lib\asyncio\base_events.py", line 989, in create_connection
    ssl_handshake_timeout=ssl_handshake_timeout)
  File "c:\program files\python37\lib\asyncio\base_events.py", line 1017, in _create_connection_transport
    await waiter
ConnectionAbortedError: SSL handshake is taking longer than 60.0 seconds: aborting the connection
Traceback (most recent call last):
  File "C:\Users\Administrator\App\src\jobs\utils\async_download.py", line 31, in fetch_and_download
    async with session.request('GET',url["url"]) as response:
  File "c:\program files\python37\lib\site-packages\aiohttp\client.py", line 1140, in __aenter__
    self._resp = await self._coro
  File "c:\program files\python37\lib\site-packages\aiohttp\client.py", line 536, in _request
    req, traces=traces, timeout=real_timeout
  File "c:\program files\python37\lib\site-packages\aiohttp\connector.py", line 543, in connect
    proto = await self._create_connection(req, traces, timeout)
  File "c:\program files\python37\lib\site-packages\aiohttp\connector.py", line 906, in _create_connection
    _, proto = await self._create_direct_connection(req, traces, timeout)
  File "c:\program files\python37\lib\site-packages\aiohttp\connector.py", line 1205, in _create_direct_connection
    raise last_exc
  File "c:\program files\python37\lib\site-packages\aiohttp\connector.py", line 1186, in _create_direct_connection
    client_error=client_error,
  File "c:\program files\python37\lib\site-packages\aiohttp\connector.py", line 991, in _wrap_create_connection
    raise client_error(req.connection_key, exc) from exc
aiohttp.client_exceptions.ClientConnectorError: Cannot connect to host <host> ssl:default [None]

Python Version

$ python --version
3.7.8

aiohttp Version

$ python -m pip show aiohttp
Name: aiohttp
Version: 3.8.0
Summary: Async http client/server framework (asyncio)
Home-page: https://github.com/aio-libs/aiohttp
Author:
Author-email:
License: Apache 2
Location: c:\users\admin\appdata\local\programs\python\python37\lib\site-packages
Requires: aiosignal, async-timeout, asynctest, attrs, charset-normalizer, frozenlist, multidict, typing-extensions, yarl
Required-by:

multidict Version

$ python -m pip show multidict

yarl Version

$ python -m pip show yarl
Name: yarl
Version: 1.4.2
Summary: Yet another URL library
Home-page: https://github.com/aio-libs/yarl/
Author: Andrew Svetlov
Author-email: andrew.svetlov@gmail.com
License: Apache 2
Location: c:\users\admin\appdata\local\programs\python\python37\lib\site-packages
Requires: idna, multidict
Required-by: aiohttp

OS

Windows

Related component

Client

Additional context

No response


Lurrobert commented 2 years ago

Same problem here. Have you found any solution?

theguly commented 2 years ago

I faced this issue a couple of weeks ago as well, and I found that the root cause was not aiohttp but a network problem. (The message can be a bit misleading: it is not really about SSL, it is just a different timeout surfacing there.)

Could you please run tcpdump (or similar) on your gateway, or on a target that you control that produces this error, and check whether there are messages about MTU issues? Something like this on your gateway:

    tcpdump -n host {target_ip} and port {target_port}

or on your target:

    tcpdump -n host {your_ip} and port {target_port}
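Another way to separate a network problem from an aiohttp problem is to time the raw TCP connect and TLS handshake with the standard library alone; if these are already slow to the same host, aiohttp is not the culprit. A minimal sketch (the host and port passed in are placeholders you would supply):

```python
import socket
import ssl
import time


def handshake_time(host, port=443, timeout=10):
    """Return (tcp_seconds, tls_seconds) for a plain connect + TLS handshake."""
    ctx = ssl.create_default_context()
    start = time.monotonic()
    with socket.create_connection((host, port), timeout=timeout) as sock:
        tcp = time.monotonic() - start          # time to complete TCP connect
        with ctx.wrap_socket(sock, server_hostname=host):
            tls = time.monotonic() - start      # time including TLS handshake
    return tcp, tls

# print(handshake_time("example.com"))
```

Running this in a loop while the aiohttp workload reproduces the error would show whether handshake latency to that host genuinely climbs toward the 60-second limit under load.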