iw4p / proxy-scraper

scrape proxies from more than 5 different sources and check which ones are still alive
MIT License

TypeError: AsyncClient.__init__() got an unexpected keyword argument 'follow_redirects' #38

Traceback (most recent call last):
  File "C:\Users\acer\Downloads\iw4p-proxy-scraper-44df425\proxyScraper.py", line 192, in <module>
    loop.run_until_complete(scrape(args.proxy, args.output, args.verbose))
  File "D:\Programs\Python311\Lib\asyncio\base_events.py", line 653, in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
  File "C:\Users\acer\Downloads\Telegram Desktop\iw4p-proxy-scraper-44df425\proxyScraper.py", line 146, in scrape
    client = httpx.AsyncClient(follow_redirects=True)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: AsyncClient.__init__() got an unexpected keyword argument 'follow_redirects'

Open iiAmeer opened 3 months ago
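
The follow_redirects keyword only exists in newer httpx releases (it replaced allow_redirects in httpx 0.20.0), so this TypeError usually means an older httpx is installed. Below is a minimal sketch of a version-tolerant way to build the client, assuming that is the cause; make_async_client is a hypothetical helper, not part of proxyScraper.py.

```python
import httpx


def make_async_client() -> httpx.AsyncClient:
    """Create an AsyncClient that follows redirects on both old and new httpx."""
    try:
        # httpx >= 0.20.0 accepts the follow_redirects keyword.
        return httpx.AsyncClient(follow_redirects=True)
    except TypeError:
        # Older httpx releases followed redirects by default at request time,
        # so a plain client should behave the same for this scraper.
        return httpx.AsyncClient()
```

Upgrading httpx should also resolve the error without any code changes.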

sharifulgeo commented 1 month ago

Have a look at https://github.com/mxrch/GHunt/issues/276 if you are modifying the default code.
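
For triage, a quick way to confirm which httpx version is installed (if I recall correctly, follow_redirects was introduced in httpx 0.20.0, so older releases raise exactly this TypeError):

```python
# Print the installed httpx version; follow_redirects requires httpx >= 0.20.0.
import httpx

print(httpx.__version__)
```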