I am making a ton of requests at one time, say 100k for example. I am using a semaphore to limit concurrent requests, but I'm still getting the "address already in use" error.
I am pretty sure the answer lies in setting SO_REUSEPORT or SO_REUSEADDR somewhere, but I'm not 100% sure where to do this. I'm using asyncio and uvloop to make these requests with async_dns. If you can help me I'd appreciate it! THANKS!
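For context, on a plain socket this is the kind of thing I mean, I just don't know where the equivalent hook is in asyncio/async_dns (this snippet is a standalone illustration, not part of my script):

    import socket

    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # Allow binding a local address/port that is still in TIME_WAIT
    s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    s.bind(("0.0.0.0", 0))  # port 0 = let the OS pick an ephemeral port
    print(s.getsockname()[1])
    s.close()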
Here is the (very rough) code:
import asyncio
import uvloop
from async_dns.resolver import ProxyResolver
from async_dns import types

tasks = []

asyncio.set_event_loop_policy(uvloop.EventLoopPolicy())
resolver = ProxyResolver()

async def lookup(name):
    with (await sem):
        response = await resolver.query(name, types.A)
        try:
            if response.__dict__['an']:
                print(response)
        except AttributeError:
            pass

with open('subdomains.txt') as f:
    names = f.read().splitlines()

print("Looking up {} subdomains...".format(len(names)))

for n in names:
    task = asyncio.ensure_future(lookup('{}.testdomain.com'.format(n)))
    tasks.append(task)

sem = asyncio.Semaphore(500)

loop = asyncio.get_event_loop()
loop.run_until_complete(asyncio.wait(tasks))
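For what it's worth, here is a stripped-down version of the semaphore pattern I'm trying to use, with a dummy coroutine (`limited`, using `asyncio.sleep(0)`) standing in for `resolver.query(...)`, so the concurrency limiting can be seen on its own:

    import asyncio

    async def limited(i, sem):
        # "async with" acquires the semaphore before the work and releases it
        # afterwards, so at most 500 coroutines run the body at any moment.
        async with sem:
            await asyncio.sleep(0)  # stand-in for resolver.query(name, types.A)
            return i

    async def main():
        sem = asyncio.Semaphore(500)  # same cap as in the real script
        return await asyncio.gather(*(limited(i, sem) for i in range(1000)))

    results = asyncio.run(main())
    print(len(results))  # 1000

(`asyncio.gather` preserves submission order, so `results` comes back as `[0, 1, ..., 999]` even though the coroutines are interleaved.)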