Closed: dschepler closed this issue 4 months ago
Please review the documentation on using Requests with respect to timeouts.
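For reference, here is a minimal sketch of the documented timeout usage (the URL and values are taken from this report; the (connect, read) tuple form is how Requests lets you bound each phase separately):

import requests

try:
    # A single number bounds both the connect and the read phase;
    # a (connect, read) tuple bounds them separately.
    x = requests.get("https://openssl.org/source/", timeout=(5, 30))
except requests.exceptions.Timeout:
    # Covers both ConnectTimeout and ReadTimeout.
    print("request timed out")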
OK, x = requests.get("https://openssl.org/source/", timeout=30)
does seem to time out instead of hanging indefinitely. The question remains, though, why it frequently times out when none of the command-line tools or web browsers I've tested have any such issues fetching that page.
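One way to see where the stall happens is to turn on debug logging before making the request. This is only a diagnostic sketch, but urllib3 (which Requests uses underneath) then logs each connection attempt, so you can tell whether it is stuck connecting or stuck waiting for the response:

import logging
import requests

# Root logger at DEBUG makes urllib3 print connection-level details.
logging.basicConfig(level=logging.DEBUG)

x = requests.get("https://openssl.org/source/", timeout=30)
print(x.status_code)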
It never hangs or times out for me. There is no way for us to debug this for you or provide you an answer.
As part of a software version webscraper I use Requests. Recently, however, fetching one particular page, https://openssl.org/source/ , has started to frequently hang or time out.
I was able to reproduce this simply by running:
python3 -c 'import requests; x = requests.get("https://openssl.org/source/", timeout=30)'
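As a stopgap for the scraper (a sketch only; the retry count and backoff factor are arbitrary assumptions, not a fix for the underlying stall), a Session with urllib3's Retry mounted on an HTTPAdapter will reattempt after connection errors:

import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

session = requests.Session()
# Retry a few times, backing off between attempts.
retries = Retry(total=3, backoff_factor=1, status_forcelist=[502, 503, 504])
session.mount("https://", HTTPAdapter(max_retries=retries))

x = session.get("https://openssl.org/source/", timeout=30)
print(x.status_code)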
Expected Result
The request should finish promptly (unless, of course, there's some connectivity issue; that doesn't seem to be the case, though, since wget, curl, and Firefox all fetch the same page without issue).
Actual Result
Frequently, the fetch fails due to a timeout. Traceback:
(If no timeout is specified, it seems to hang indefinitely, at least for several hours.)
Reproduction Steps
System Information
(Note that I'm not sure whether this is actually an issue with requests, or maybe just the openssl.org website doing something weird. However, I haven't been able to reproduce similar problems using curl, wget, or Firefox.)
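One way to narrow that down (a sketch assuming nothing beyond the standard library): fetch the same page with urllib, which bypasses Requests and urllib3 entirely. If this also hangs or times out, the problem is more likely network- or server-side than a Requests bug:

import urllib.request

# Same fetch without Requests/urllib3 in the loop.
with urllib.request.urlopen("https://openssl.org/source/", timeout=30) as resp:
    print(resp.status, len(resp.read()))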