utiso / dorkbot

Command-line tool to scan Google search results for vulnerabilities
http://dorkbot.io

slow if scanning in bulk #5

Closed ThibeauM closed 5 years ago

ThibeauM commented 5 years ago

Hello, sometimes each dork takes 2-5 seconds to scan, which is fairly slow if I want to scan 100 dorks. Is there any way to fix this?

jgor commented 5 years ago

I would expect most scans to take minutes at least, not seconds. Which scanner module are you using, arachni or wapiti?

Or are you talking about an indexer module instead of a scanner module? Feel free to include the command you're using and I can perhaps comment further.

ThibeauM commented 5 years ago

I have the following tools installed: arachni, phantomjs

I use the following code to TRY to run multiple dork scans at once (I'm a Python noob, don't blame me :/) ...

import subprocess
from colorama import Fore, Style

# load API keys, one per line, preserving file order
keylist = []
with open("keys.txt") as k:
    for line in k:
        keylist.append(line.rstrip('\n'))

print(keylist)
print("Found " + str(len(keylist)) + " keys.")

procs = []  # currently running dorkbot processes
currentkeynum = 0
with open("input.txt", 'r', encoding="ISO-8859-1") as f:
    for line in f:
        # cap at 40 concurrent dorkbots: wait for the oldest one to finish
        if len(procs) >= 40:
            procs.pop(0).wait()
        currentkey = keylist[currentkeynum % len(keylist)]
        dork = line.rstrip('\n')
        print(Fore.LIGHTGREEN_EX + "Using key " + currentkey + Style.RESET_ALL)
        print(Fore.LIGHTGREEN_EX + "Started searching for " + dork + Style.RESET_ALL)
        p = subprocess.Popen(["python", "dorkbot.py", "-i", "google",
                              "-o", "engine=" + currentkey + ",query=" + dork])
        procs.append(p)
        currentkeynum += 1

# wait for any processes still running
for p in procs:
    p.wait()
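For reference, the same concurrency cap can be expressed more tidily with the standard library's thread pool; this is only a sketch, with the dorkbot invocation copied from the snippet above, and the `runner` parameter is a hypothetical hook added so the function can be exercised without actually launching dorkbot:

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

def scan_all(dorks, keys, max_workers=40, runner=subprocess.call):
    """Run one dorkbot index per dork, at most max_workers at a time,
    cycling through the available API keys. 'runner' defaults to
    subprocess.call and is injectable for testing."""
    def run_one(pair):
        key, dork = pair
        # each worker thread blocks on its own dorkbot subprocess
        return runner(["python", "dorkbot.py", "-i", "google",
                       "-o", "engine=" + key + ",query=" + dork])

    # pair each dork with a key, wrapping around the key list
    pairs = [(keys[i % len(keys)], dork) for i, dork in enumerate(dorks)]
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(run_one, pairs))
```

The pool enforces the cap for you, and `pool.map` returns exit codes in input order, which avoids hand-maintaining a process list and counter.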
ThibeauM commented 5 years ago

I wonder if I'd be better off using https://github.com/jgor/dork-cli for what I'm trying to do?

ThibeauM commented 5 years ago

OK, I got multithreading working on my side, although if I run too many threads I start getting errors.

ThibeauM commented 5 years ago

Probably because the main program isn't meant to have many processes writing to the database at once. I don't know, it's going too fast to spot the error.
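If that guess is right, concurrent writers to the same SQLite file would raise "database is locked" errors; a common workaround is to retry the write with a short backoff. A minimal sketch (the `targets` table and `url` column names are assumptions for illustration, not dorkbot's actual schema):

```python
import sqlite3
import time

def insert_with_retry(db_path, url, attempts=5, delay=0.5):
    """Retry a write when another process holds the SQLite lock.
    Table/column names here are illustrative assumptions."""
    for attempt in range(attempts):
        try:
            con = sqlite3.connect(db_path, timeout=10)
            with con:  # commits on success, rolls back on error
                con.execute("INSERT OR IGNORE INTO targets (url) VALUES (?)",
                            (url,))
            con.close()
            return True
        except sqlite3.OperationalError:  # e.g. "database is locked"
            time.sleep(delay * (attempt + 1))  # simple linear backoff
    return False
```

The `timeout=10` on `sqlite3.connect` already makes SQLite wait for a busy lock; the retry loop on top handles the case where the lock outlasts that window.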

ThibeauM commented 5 years ago

Traceback (most recent call last):
  File "dorkbot.py", line 272, in <module>
    main()
  File "dorkbot.py", line 32, in main
    index(db, args.indexer, parse_options(args.indexer_options))
  File "dorkbot.py", line 124, in index
    results = module.run(options)
  File "/root/dorkbot/indexers/google.py", line 40, in run
    results = get_results(index_cmd)
  File "/root/dorkbot/indexers/google.py", line 49, in get_results
    output = subprocess.check_output(index_cmd)
  File "/usr/lib/python2.7/subprocess.py", line 574, in check_output
    raise CalledProcessError(retcode, cmd, output=output)
subprocess.CalledProcessError: Command '['/root/dorkbot/indexers/../tools/phantomjs/bin/phantomjs', '--ignore-ssl-errors=true', '/root/dorkbot/indexers/google.js', 'myhotkey69', 'hotdork69']' returned non-zero exit status 1

This is one of the errors I often get.

ThibeauM commented 5 years ago

Could this be because the key has a daily limit? I use many keys, and some work while others stop working after some time.

jgor commented 5 years ago

Yeah, I think this is some sort of (unpublished?) limit on Google's side. I get similar errors when running too many instances, and it's because the website has stopped returning results for that engine.
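Since a non-zero dorkbot exit seems to signal that an engine has stopped returning results, one option is to rotate to the next key on failure. A hedged sketch, not part of dorkbot itself; the invocation matches the earlier snippet, and `runner` is a hypothetical hook for testing:

```python
import subprocess

def run_with_key_rotation(dork, keys, start=0, runner=subprocess.call):
    """Try each key in turn, skipping any whose engine appears
    exhausted (dorkbot exits non-zero). Returns the index of the
    key that worked, or None if every key failed."""
    n = len(keys)
    for offset in range(n):
        idx = (start + offset) % n
        rc = runner(["python", "dorkbot.py", "-i", "google",
                     "-o", "engine=" + keys[idx] + ",query=" + dork])
        if rc == 0:
            return idx  # remember this key for the next dork
    return None  # every key looks rate-limited; back off and retry later
```

Passing the returned index back in as `start` for the next dork spreads load across keys instead of hammering the first one until it dies.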

ThibeauM commented 5 years ago

Do you think multiple keys / GCSEs would help fix the problem?