Closed: Ycirn closed this issue 2 years ago
Sites that are down (or geoblocked) are still being requested, which is a waste of resources. You can keep requesting them, but the requests to them should be slowed down.
Down sites do need to keep getting hit. Is there a way to know which ones are geoblocked without manually testing them?
No, there isn't a way. Down sites have to be pinged, let's say every 10 s, to see if they are still down, and if they are not, added back to the full target list. I would suggest that if 90% of responses are 200, they should be returned to full attack mode.
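A minimal sketch of that probing idea, assuming a simple `requests`-based prober. All names here (`down_targets`, `full_targets`, `PROBE_INTERVAL`, the example URL) are hypothetical, not from this repo:

```python
# Probe each "down" target every 10 s; once 90% of the last 10 probes
# return HTTP 200, promote it back to the full target list.
import time
from collections import deque

import requests

PROBE_INTERVAL = 10   # seconds between probe rounds
WINDOW = 10           # number of recent probes to consider
SUCCESS_RATIO = 0.9   # promote when 90% of the window is HTTP 200

down_targets = {"https://example-down-site.ru": deque(maxlen=WINDOW)}
full_targets = set()

while down_targets:
    for url, history in list(down_targets.items()):
        try:
            resp = requests.get(url, timeout=5)
            history.append(resp.status_code == 200)
        except requests.RequestException:
            history.append(False)
        # Promote once the window is full and mostly successful.
        if len(history) == WINDOW and sum(history) / WINDOW >= SUCCESS_RATIO:
            full_targets.add(url)
            del down_targets[url]
    time.sleep(PROBE_INTERVAL)
```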
I notice a lot of the traffic is HTTP/1.1; forcing HTTP/2 in the browser would improve it only marginally. Is there a way to force HTTP/2.0 from the site? I doubt it.
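For what it's worth: a client can't force HTTP/2 on a server that only speaks HTTP/1.1; it can only offer HTTP/2 during TLS negotiation and see what the server picks. A quick sketch using the `httpx` library (an assumption, this repo may use a different HTTP client), installed with `pip install httpx[http2]`:

```python
import httpx

# Offer HTTP/2; the server decides whether it is actually used.
with httpx.Client(http2=True) as client:
    resp = client.get("https://example.com")
    print(resp.http_version)  # "HTTP/2" if negotiated, else "HTTP/1.1"
```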
Speed will be determined by the number of concurrent requests and the timeout. As people keep adding sites via pull requests, there will be fewer requests going to each individual site, so those parameters will have to be adjusted.
However, that requires a hefty amount of testing to ensure effectiveness.
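To make the two knobs concrete, here is a minimal sketch with `asyncio`/`aiohttp` (my choice of library, not necessarily what this project uses); `CONCURRENCY`, `TIMEOUT`, and the URL list are hypothetical placeholders to tune:

```python
import asyncio

import aiohttp

CONCURRENCY = 100                        # simultaneous in-flight requests
TIMEOUT = aiohttp.ClientTimeout(total=5) # per-request total timeout, seconds

async def hit(session, sem, url):
    # The semaphore caps how many requests are in flight at once.
    async with sem:
        try:
            async with session.get(url) as resp:
                return resp.status
        except (aiohttp.ClientError, asyncio.TimeoutError):
            return None

async def main(urls):
    sem = asyncio.Semaphore(CONCURRENCY)
    async with aiohttp.ClientSession(timeout=TIMEOUT) as session:
        return await asyncio.gather(*(hit(session, sem, u) for u in urls))

print(asyncio.run(main(["https://example.com"] * 10)))
```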
It's at 2 requests per second now @ajax-lives
What if somebody just goes through the list and removes all the less-important sites? I'm certain that lenta.ru and other news sites are a bit less important than bank websites and government record sites like gosuslugi.ru.
Removed useless and duplicate links #51
Can anyone try to speed it up?