epi052 / feroxbuster

A fast, simple, recursive content discovery tool written in Rust.
https://epi052.github.io/feroxbuster/
MIT License

[BUG] Redirect loop at /robots.txt causes application to hang #1084

Closed. n0kovo closed this issue 4 months ago.

n0kovo commented 4 months ago

The logic flow somewhere along these lines leads to what I assume is an infinite loop when a redirect loop occurs at /robots.txt, stalling feroxbuster completely. I wanted to do a PR, but it turns out I'm not a Rust programmer.

Version: 2.10.1

Steps to reproduce:

Server on http://127.0.0.1:5000:

from flask import Flask, redirect

app = Flask(__name__)

# /robots.txt redirects to itself, creating an infinite redirect loop
@app.route('/robots.txt')
def robots():
    return redirect('/robots.txt')

if __name__ == '__main__':
    app.run(debug=True)
$ echo -e "foo\nbar" > test.txt
$ feroxbuster -u 'http://127.0.0.1:5000' -w test.txt
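
For what it's worth, the loop also reproduces outside feroxbuster: reqwest (the HTTP client feroxbuster is built on) follows at most 10 redirects by default and then errors out, which matches the "too many redirects" message in the trace below. A minimal sketch of that check, assuming the Flask server above is running and reqwest/tokio as dependencies:

// Cargo.toml (assumed): reqwest = "0.11", tokio = { version = "1", features = ["macros", "rt"] }
#[tokio::main(flavor = "current_thread")]
async fn main() {
    // reqwest's default redirect policy gives up after 10 hops
    match reqwest::get("http://127.0.0.1:5000/robots.txt").await {
        Ok(resp) => println!("status: {}", resp.status()),
        Err(e) => println!("error: {e}"), // prints "... too many redirects"
    }
}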

Trace output:

TRC      1.003 feroxbuster::scanner::ferox_scanner enter: scan_url
INF      1.003 feroxbuster::scanner::ferox_scanner Starting scan against: http://127.0.0.1:5000
TRC      1.004 feroxbuster::extractor::container enter: extract(RobotsTxt) (this fn has no associated trace exit msg)
TRC      1.004 feroxbuster::extractor::container enter: extract_robots_txt
TRC      1.004 feroxbuster::extractor::container enter: make_extract_request
TRC      1.004 feroxbuster::utils enter: parse_url_with_raw_path(http://127.0.0.1:5000)
TRC      1.004 feroxbuster::utils enter: make_request(Configuration::Client, http://127.0.0.1:5000/robots.txt, Default, UnboundedSender { chan: Tx { inner: Chan { tx: Tx { block_tail: 0x7ff04600a000, tail_position: 6 }, semaphore: Semaphore(0), rx_waker: AtomicWaker, tx_count: 3, rx_fields: "..." } } })
TRC      1.015 feroxbuster::utils exit: make_request -> error following redirect for url (http://127.0.0.1:5000/robots.txt): too many redirects
ERR      GET       -1l       -1w       -1c http://127.0.0.1:5000/robots.txt !=> http://127.0.0.1:5000/robots.txt (too many redirects)
WRN      1.015 feroxbuster::utils Error while making request: error following redirect for url (http://127.0.0.1:5000/robots.txt): too many redirects
WRN      1.015 feroxbuster::event_handlers::scans error following redirect for url (http://127.0.0.1:5000/robots.txt): too many redirects
[######>-------------] - 0s         1/3       0s      found:0       errors:1

frozen ^ (the scan hangs here indefinitely)

Thanks for an awesome tool!

epi052 commented 4 months ago

yea, looking through the code, it makes sense that this would hang things (though that's not intended/desirable, lol).

the robots request happens very early in the scan. when this errors out, it exits the scan event handler, making it so that no other directories can be queued/processed.
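
roughly, the shape of the problem (a sketch only, with made-up names, not the real feroxbuster code): the handler fires the robots.txt request before entering its receive loop, and propagating the error with `?` returns out of the handler entirely, so anything queued afterwards never gets picked up. the fix is along the lines of logging the error and keeping the loop alive:

use tokio::sync::mpsc;

// hypothetical handler illustrating the bug, not feroxbuster's real code
async fn scan_handler(mut rx: mpsc::UnboundedReceiver<String>) -> Result<(), reqwest::Error> {
    // early robots.txt request; on a redirect loop this is
    // Err("too many redirects"), and `?` exits the whole handler
    let _robots = reqwest::get("http://127.0.0.1:5000/robots.txt").await?;

    while let Some(url) = rx.recv().await {
        println!("would scan {url}"); // never reached after the early return
    }
    Ok(())
}

// fix shape: log the error and keep the receive loop alive
async fn scan_handler_fixed(mut rx: mpsc::UnboundedReceiver<String>) -> Result<(), reqwest::Error> {
    if let Err(e) = reqwest::get("http://127.0.0.1:5000/robots.txt").await {
        eprintln!("robots.txt extraction failed: {e}"); // scan continues
    }
    while let Some(url) = rx.recv().await {
        println!("would scan {url}");
    }
    Ok(())
}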

really appreciate the repro code! I'll let you know when it's fixed

epi052 commented 4 months ago

give the build here a try and let me know how it goes

https://github.com/epi052/feroxbuster/actions/runs/8019002224

epi052 commented 4 months ago

@all-contributors add @n0kovo for bug

allcontributors[bot] commented 4 months ago

@epi052

I've put up a pull request to add @n0kovo! :tada:

epi052 commented 4 months ago

merged into main, dropping new release soon