
OWASP D4N155

Made with Bash · GPLv3 license · OWASP project

D4N155 is an information security audit tool that uses OSINT to build an intelligent, dynamic wordlist from the content of the target page.

Project page: https://owasp.org/www-project-d4n155/
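In rough terms, the approach is to harvest the words that actually appear on the target's pages and rank them into candidate entries for fuzzing or password guessing. A minimal, hedged sketch of that idea in plain shell (this is not D4N155's real pipeline, which also applies OSINT dorking and its own scoring; the URL is only an example):

# Toy illustration only: fetch a page, split it into words,
# and keep the 100 most frequent words of length >= 4.
curl -fsSL "http://scanme.nmap.org" \
  | tr -cs '[:alnum:]' '\n' \
  | tr '[:upper:]' '[:lower:]' \
  | awk 'length($0) >= 4' \
  | sort | uniq -c | sort -rn \
  | awk '{print $2}' | head -n 100 > toy-wordlist.txt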

(Demo: asciicast recording)

Help us improve the project, and see some of the calculations used.

Ongoing projects :construction_worker:: D4N155 in Docker :gift:, D4N155 Web API :cloud:, Telegram bot :robot:

Install

Requires: Python 3.6, Bash (GNU Bourne-Again SHell), Go

Optional: Git
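A quick way to check that the prerequisites are available before installing (the versions reported are only indicative):

# Confirm the required interpreters and toolchains are installed.
python3 --version        # 3.6 or newer
bash --version | head -n 1
go version
pip3 --version
git --version            # optional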

Source

git clone https://github.com/owasp/D4N155.git
cd D4N155
pip3 install -r requirements.txt
bash main

Or without Git

wget -qO- https://github.com/owasp/D4N155/archive/master.zip | bsdtar -xf-
cd D4N155-master
pip3 install -r requirements.txt
bash main

Docker

As a base image (in a Dockerfile):

FROM docker.pkg.github.com/owasp/d4n155/d4n155:latest

CLI:

docker pull docker.pkg.github.com/owasp/d4n155/d4n155:latest
docker run -it docker.pkg.github.com/owasp/d4n155/d4n155:latest
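A hedged example of a full run inside the container, assuming the image's entrypoint wraps bash main and forwards its arguments (not verified here); the volume is only there so the generated wordlist outlives the container:

docker run -it -v "$PWD:/data" \
  docker.pkg.github.com/owasp/d4n155/d4n155:latest \
  -w scanme.nmap.org -o /data/wordlist.txt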

Manual

    D4N155: tool for smart security audits

    Usage: bash main <option> <value>
    All options are optional

    Options:
    -w, --wordlist  <url|ip>    Build the smart wordlist from information
                    found on the website.
    -t, --targets   <file>      Build the smart wordlist from the URLs
                    listed in the given file.
    -b, --based     <file>      Analyze the given text to generate the
                    custom wordlist.
    -r, --rate      <time>      Define the time interval between requests.
    -o, --output    <file>      File in which to store the whole wordlist.
    -a, --aggressive            Aggressive reading with a headless browser.
    -h, --help                  Show this message.

    Value: <url | ip | source | file | time>
    URL             Target URL, example: scanme.nmap.org
    IP              IP address
    TIME            Time, example: 2.5 (i.e. 00:00:02:30). Default: 0
    FILE            File used to save the result, to read target URLs from,
                    or to build the wordlist from
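
Putting these options together, a typical run might look like this (targets.txt and wordlist.txt are just example file names):

# Single target: build the wordlist from one site and save it.
bash main -w scanme.nmap.org -o wordlist.txt

# Several targets: read URLs from a file, rate-limited with -r (see TIME above).
bash main -t targets.txt -r 2.5 -o wordlist.txt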