Thanks for the excellent tool! Just letting you know that this sometimes happens:
File "twitch-dl/twitchdl/http.py", line 129, in download_all
File "twitch-dl/twitchdl/http.py", line 106, in download_with_retries
File "twitch-dl/twitchdl/http.py", line 78, in download
TypeError: int() argument must be a string, a bytes-like object or a number, not 'NoneType'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/lib/python3.8/runpy.py", line 194, in _run_module_as_main
return _run_code(code, main_globals, None,
File "/usr/lib/python3.8/runpy.py", line 87, in _run_code
exec(code, run_globals)
File "twitch-dl/__main__.py", line 3, in <module>
File "twitch-dl/twitchdl/console.py", line 321, in main
File "twitch-dl/twitchdl/commands/download.py", line 170, in download
File "twitch-dl/twitchdl/commands/download.py", line 176, in download_one
File "twitch-dl/twitchdl/commands/download.py", line 316, in _download_video
File "/usr/lib/python3.8/asyncio/runners.py", line 44, in run
return loop.run_until_complete(main)
File "/usr/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
return future.result()
File "twitch-dl/twitchdl/http.py", line 129, in download_all
File "twitch-dl/httpx/_client.py", line 2003, in __aexit__
File "twitch-dl/httpx/_transports/default.py", line 332, in __aexit__
File "twitch-dl/httpcore/_async/connection_pool.py", line 326, in __aexit__
File "twitch-dl/httpcore/_async/connection_pool.py", line 312, in aclose
RuntimeError: The connection pool was closed while 5 HTTP requests/responses were still in-flight.
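Reading the first traceback, the `TypeError: int() argument must be ... not 'NoneType'` at `http.py` line 78 looks like `int()` being called on a header (or similar value) that can be absent. I don't know twitch-dl's actual code, so this is just a hypothetical sketch of the kind of guard that would avoid it (`content_length` and its argument are made-up names, not the real implementation):

```python
# Hypothetical guard: tolerate a missing Content-Length header instead of
# passing None to int(), which raises the TypeError shown above.
# All names here are illustrative, not twitch-dl's actual code.

def content_length(headers: dict) -> int:
    """Return the declared body size, or 0 when the header is absent."""
    value = headers.get("content-length")
    return int(value) if value is not None else 0
```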
Not a huge issue, since I can just re-run the command and it picks up where it left off 😌 (important, because I'm downloading ~15 GB files). Still, I figured I'd report it, since this looks like an error that should have been caught and retried.
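To illustrate what I mean by "caught and retried": something along these lines around each segment download would let a transient failure on one segment be retried instead of tearing down the whole run. This is only a sketch under my assumptions; `with_retries` and its signature are invented for the example, not twitch-dl's API:

```python
import time

def with_retries(func, attempts=3, delay=0.0):
    """Call func(), retrying on TypeError/RuntimeError up to `attempts` times.

    Re-raises the last exception once the attempts are exhausted.
    (Illustrative only; exception types and backoff are assumptions.)
    """
    for attempt in range(attempts):
        try:
            return func()
        except (TypeError, RuntimeError):
            if attempt == attempts - 1:
                raise
            time.sleep(delay)
```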