Open jigar07 opened 2 years ago
Same problem here. Found any solution?
I faced this issue a couple of weeks ago as well, and I found that the root cause was not aiohttp itself but a network problem (the message can be a bit misleading: it is not really about SSL, just a different timeout surfacing there).

Could you please run tcpdump (or similar) on your gateway, or on a target you control that gives you this error, and see if there are any messages about MTU issues?

Something like:

```
tcpdump -n host {target_ip} and port {target_port}
```

or, on your gateway or on your target:

```
tcpdump -n host {your_ip} and port {target_port}
```
Describe the bug
After increasing the total timeout from 10 minutes to 30 minutes, I started getting:

`ConnectionAbortedError: SSL handshake is taking longer than 60.0 seconds: aborting the connection`

I am creating the client session as follows:

```python
async with ClientSession(
    timeout=ClientTimeout(total=60 * 30),
    headers=headers,
) as session:
    ...
```
I checked https://github.com/aio-libs/aiohttp/issues/3651, where it is mentioned that "If SSL handshake takes longer than a minute that means something is going wrong with your code. Try to reduce the number of parallel handshakes if it helps". I checked the number of HTTP connections, and it never goes beyond 300 at any time, which I would expect to be acceptable.

Please let me know why I am getting this error only after increasing the timeout from 10 minutes to 30 minutes.
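For what it's worth, rather than only raising `total`, aiohttp can bound the handshake pressure directly: `TCPConnector(limit=...)` caps how many connections (and therefore parallel SSL handshakes) are open at once, and `ClientTimeout` accepts granular fields such as `connect` and `sock_read`, so a stalled handshake fails fast instead of hiding behind the large overall budget. A minimal sketch — the specific limits and timeouts below are illustrative assumptions, not a recommendation:

```python
# Hedged sketch (not from the original report): capping parallel handshakes
# and using granular timeouts instead of only a large `total`.
import asyncio

import aiohttp


async def make_session() -> aiohttp.ClientSession:
    # `limit` caps simultaneously open connections, which also caps how many
    # SSL handshakes can be in flight at the same time.
    connector = aiohttp.TCPConnector(limit=100, limit_per_host=20)
    # Keep a generous overall budget, but fail a stalled connect/handshake
    # quickly instead of letting it run until `total` expires.
    timeout = aiohttp.ClientTimeout(total=60 * 30, connect=30, sock_read=60)
    return aiohttp.ClientSession(connector=connector, timeout=timeout)


async def main() -> None:
    session = await make_session()
    try:
        print(session.timeout.connect, session.connector.limit)
    finally:
        await session.close()


if __name__ == "__main__":
    asyncio.run(main())
```

With a connector limit in place, the semaphore in the reproduction code becomes a second, coarser bound on top of the connection pool.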
To Reproduce
```python
import asyncio
import logging
from dataclasses import dataclass

import aiofiles
from aiohttp import ClientSession, BasicAuth, ClientTimeout
from django.conf import settings
from jobs.utils import file

logger = logging.getLogger(__name__)


@dataclass
class FetchErrorResponse:
    url: str = None
    status: str = None


@dataclass
class FetchResponse:
    url: str = None
    file_dir: str = None
    content: str = None


async def fetch_and_download(url, session):
    retry = settings.RESULT_API_MAX_RETRIES
    backupoff = settings.RESULT_API_TIMEOUT
    try:
        async with session.request('GET', url["url"]) as response:
            if response.status == 200 or response.status == 207:
                try:
                    file.create_non_executable_file(url["file_dir"])
                    async with aiofiles.open(url["file_dir"], mode="wb") as f:
                        read_content = await response.read()
                        await f.write(read_content)
                    logger.info(f"Downloaded file: {url['file_dir'].encode('utf-8')}, url: {url['url']}")
                    return True, len(read_content)
                except Exception as ex:
                    logging.exception(ex)
            elif response.status < 500:
                logger.error(
                    f"Failed downloading file. Error response: {response.status}, "
                    f"file: {url['file_dir'].encode('utf-8')}, url: {url['url']}"
                )
                return FetchErrorResponse(url["url"], response.status)
            else:
                logger.error(
                    f"Failed downloading file. Error response: {response.status}, "
                    f"file: {url['file_dir'].encode('utf-8')}, url: {url['url']}"
                )
                return FetchErrorResponse(url["url"], response.status)
    except Exception as ex:
        logging.exception(ex)
        logger.error(
            f"Failed downloading file. Async exception: {ex}, "
            f"file: {url['file_dir'].encode('utf-8')}, url: {url['url']}"
        )
        return FetchErrorResponse(url["url"], "TIMEOUT")


async def fetch(url, session):
    try:
        async with session.request('PROPFIND', url["url"]) as response:
            if response.status == 200 or response.status == 207:
                logger.info(f"Scanned directory: {url['file_dir'].encode('utf-8')}, url: {url['url']}, status: {response.status}")
                return FetchResponse(url["url"], url["file_dir"], await response.read())
            elif response.status < 500:
                logger.error(
                    f"Failed scanning directory. Error response: {response.status}, "
                    f"file: {url['file_dir'].encode('utf-8')}, url: {url['url']}"
                )
                return FetchErrorResponse(url["url"], response.status)
            else:
                logger.error(
                    f"Failed scanning directory. Error response: {response.status}, "
                    f"file: {url['file_dir'].encode('utf-8')}, url: {url['url']}"
                )
                return FetchErrorResponse(url["url"], response.status)
    except Exception as ex:
        logging.exception(ex)
        logger.error(
            f"Failed scanning directory. Async exception: {ex}, "
            f"file: {url['file_dir'].encode('utf-8')}, url: {url['url']}"
        )
        return FetchErrorResponse(url["url"], "TIMEOUT")


async def bound_fetch(sem, url, session, download):
    async with sem:
        if download:
            return await fetch_and_download(url, session)
        return await fetch(url, session)


async def process_async(urls, access_token, download):
    tasks = []
    sem = asyncio.Semaphore(500)
    headers = {"Authorization": "Bearer " + access_token, "depth": "1"}
    ...  # rest of the function was truncated in the original report
```
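For reference, the fan-out pattern in the reproduction above can be reduced to a self-contained sketch: one task per URL, an `asyncio.Semaphore` capping how many run concurrently, and `asyncio.gather` collecting results in order. This is not the reporter's actual code — `fake_fetch` is a stand-in with no network I/O:

```python
# Minimal sketch of the semaphore-bounded fan-out pattern used above.
import asyncio


async def fake_fetch(url: str) -> str:
    # Stand-in for the real HTTP request; yields control like real I/O would.
    await asyncio.sleep(0)
    return f"ok:{url}"


async def bounded(sem: asyncio.Semaphore, url: str) -> str:
    # At most `sem`'s initial value of these bodies run at the same time.
    async with sem:
        return await fake_fetch(url)


async def run_all(urls):
    sem = asyncio.Semaphore(500)  # same bound the reporter uses
    tasks = [asyncio.create_task(bounded(sem, u)) for u in urls]
    # gather preserves the order of `urls` in its result list
    return await asyncio.gather(*tasks)


results = asyncio.run(run_all(["a", "b", "c"]))
print(results)  # ['ok:a', 'ok:b', 'ok:c']
```

Note that the semaphore only bounds application-level concurrency; the number of open sockets is ultimately governed by the session's connector.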
Expected behavior
The requests should complete without this error.
Logs/tracebacks
Python Version
aiohttp Version
multidict Version
yarl Version
OS
Windows
Related component
Client
Additional context
No response
Code of Conduct