MustafaSky closed this issue 4 days ago.
This one seems to be a limitation of your internet speed: uploading thousands of chunks takes a long time when the input file is this large. I was able to run a file with 2M+ lines and the same Nuclei template within 30 minutes on the default 512 MB memory and 300 s timeout configuration.
As a workaround, you can keep using the split-file approach, but make sure each chunk is under 4 MB.
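If it helps, here is a minimal sketch of that workaround in Python. It assumes the input is a plain-text file with one subdomain per line; the `subdomains.txt` name and `chunk_` prefix are placeholders, and the 4 MB ceiling is the limit suggested above, not something enforced by ShadowClone itself.

```python
# Minimal sketch: split a newline-delimited host list into chunks,
# each staying under ~4 MB, without ever breaking a line in half.
MAX_CHUNK_BYTES = 4 * 1024 * 1024  # the <4 MB ceiling suggested above

def split_file(path: str, out_prefix: str = "chunk_") -> int:
    chunk_index, written = 0, 0
    out = open(f"{out_prefix}{chunk_index:04d}.txt", "w")
    with open(path) as src:
        for line in src:
            size = len(line.encode("utf-8"))
            # start a new chunk before this line would push us over the cap
            if written + size > MAX_CHUNK_BYTES and written > 0:
                out.close()
                chunk_index += 1
                written = 0
                out = open(f"{out_prefix}{chunk_index:04d}.txt", "w")
            out.write(line)
            written += size
    out.close()
    return chunk_index + 1  # number of chunks written

if __name__ == "__main__":
    print(f"wrote {split_file('subdomains.txt')} chunks")
```

Each chunk can then be fed to ShadowClone one at a time, exactly as in the split-file approach described below.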
Description: I encountered an issue while running the ShadowClone tool with a large list of subdomains and a Nuclei template. The list contains approximately 2.6 million subdomains, and the Nuclei template sends 4 requests per URL. The commands used and the errors encountered are detailed below.
Commands Executed:

1. Initial Command: failed with the error `runtime function exceeded maximum time of 295`.
2. Updated Runtime Timeout: increased `runtime_timeout` to 900, but the problem stayed.
3. Split File Approach: still failed with `could not execute the runtime`, sometimes near the final instance (e.g. 299/300).

Observations:

- The error occurred even though `runtime_timeout` was set to 900.
- A `runtime_timeout` of 300 did not resolve the issue either (see the back-of-envelope calculation below).

Request for Help: I am seeking guidance on whether there is an error in my approach, or whether there are optimization steps I can take to run this process without upgrading memory and incurring high costs.
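For perspective, here is a quick back-of-envelope calculation using the numbers quoted in this issue: ~2.6 million subdomains, 4 requests per URL, and a fan-out of roughly 300 instances (implied by the "299/300" progress above). The per-instance throughput is an assumed figure for illustration, not a measurement.

```python
# Rough scale of the workload, from the figures quoted in this issue.
subdomains = 2_600_000
requests_per_url = 4      # the Nuclei template sends 4 requests per URL
instances = 300           # fan-out implied by the "299/300" progress
assumed_rps = 100         # HYPOTHETICAL per-instance request throughput

total_requests = subdomains * requests_per_url            # 10,400,000
urls_per_instance = subdomains // instances               # ~8,666
reqs_per_instance = urls_per_instance * requests_per_url  # ~34,664
seconds_per_instance = reqs_per_instance / assumed_rps    # ~347 s

print(f"{total_requests=:,}")
print(f"{reqs_per_instance=:,}")
print(f"{seconds_per_instance=:.0f}")
```

Under these assumptions each worker would need roughly 347 s of scan time, already past a 300 s limit, so the original timeout error is unsurprising at this fan-out; smaller chunks (a higher fan-out) reduce the per-instance workload accordingly.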
Environment:
Note: The issue occurs when the list is large; splitting it into smaller chunks still results in the runtime error near the final instances.