gielAdformatic opened this issue 7 years ago
I would suggest the rate of checking is too high, so the websites in question are returning 405s to protect against what they perceive to be a denial-of-service attack.
In the configuration spreadsheet, you can control both the number of requests issued in parallel and the time between requests. Both settings are on the "App Engine Performance" tab:

Max number of concurrent URL checking tasks: set this to 1 to begin with (the default is 6 in parallel).
Max number of URLs to check per minute, in each task: set this to something like 60 to begin with.
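For illustration, here is a minimal Apps Script sketch of what the per-minute limit amounts to. The function and parameter names are made up for this example and this is not the checker's actual code; it just spaces out UrlFetchApp calls so a task stays under a configured rate:

```javascript
// Illustrative sketch only, not the Large Scale Link Checker's real code.
// Checks each URL and sleeps between fetches so the task issues at most
// urlsPerMinute requests per minute, reducing the chance of being rate-limited.
function checkUrlsThrottled(urls, urlsPerMinute) {
  var delayMs = Math.ceil(60000 / urlsPerMinute); // e.g. 60/min -> 1000 ms gap
  var results = [];
  urls.forEach(function (url) {
    var response = UrlFetchApp.fetch(url, {
      muteHttpExceptions: true, // return the response object even for 4xx/5xx
      followRedirects: true
    });
    results.push([url, response.getResponseCode()]);
    Utilities.sleep(delayMs); // space requests out over the minute
  });
  return results;
}
```

With one concurrent task and 60 URLs per minute, the target sites see roughly one request per second instead of a burst, which is usually enough to stop them treating the checker as an attack.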
Hi,
I'm trying to use the Large Scale Link Checker, and so this solution, for a client. With lower-volume tests everything seemed to work fine, but when I try to evaluate more URLs (over 2,000), I get a lot of 405 responses; in some cases more than 50% of the responses returned are 405s. When I later recheck the same URLs, or use UrlFetch directly from the script, I get a 200 or 404 or whatever the actual state is. Any idea what could cause this, or what a solution might be? Many thanks in advance for your help.
Best regards, Giel