bitquark / shortscan

An IIS short filename enumeration tool
MIT License

[Feature Request] Run shortscan on multiple folders #13

Open nigawtester opened 9 months ago

nigawtester commented 9 months ago

When I use shortscan on the webroot, I sometimes get only a few results and the tool misses certain folders unless you already know their names. For example, even though I have the folder "handlers" in my wordlist, shortscan does not find it. If I point shortscan at site/handlers directly, though, it sees the directory as vulnerable and finds files/folders inside it.

I was wondering if it would be possible to add a flag so that shortscan itself works through a wordlist of folders, so that instead of me scripting it via bash, shortscan tries its magic against: site/admin site/js site/docs site/upload site/...

You could run a short check to find vulnerable folders first and then do a complete check only on the folders that are vulnerable, to reduce the number of requests.
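For illustration, this is roughly the bash wrapper I end up writing by hand today (the base URL https://site/ and the wordlist folders.txt are just placeholders):

# Run shortscan once per candidate folder from a wordlist
while read -r dir; do
  shortscan "https://site/$dir/"
done < folders.txt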

bitquark commented 9 months ago

Yup "handlers" won't have an 8.3 filename created by Windows (it's too short) so it won't get picked up by shortscan. I've just added support for multiple URLs in v0.8.0 so you can pass in several paths to try at once which would work here, but I'll have a think about how best to support loading multiple paths from a file.

nigawtester commented 9 months ago

How do you pass multiple URLs? What's the flag? Also, you could use the list to check which folders return 403 (I know that has nothing to do with short names) and then scan the folders that return 403 with shortscan. The list used could be the default rainbow table.
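Something like this is what I mean, as a rough sketch (the base URL https://site/ and the wordlist folders.txt are placeholders):

# Keep only folders that answer 403, then point shortscan at those
while read -r dir; do
  code=$(curl -s -o /dev/null -w '%{http_code}' "https://site/$dir/")
  [ "$code" = "403" ] && echo "https://site/$dir/"
done < folders.txt > candidates.txt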

bitquark commented 9 months ago

To pass in multiple URLs just pass them in the same way as the first:

shortscan https://example.org/ https://example.com/ https://example.net/

Regarding the directory check, normal wordlist parsing skips anything that wouldn't produce a short filename, so it'll need its own mechanism. For now, though, you can just pass in multiple paths as above.
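As a stopgap you can also expand a file of URLs into positional arguments, assuming one URL per line with no spaces (paths.txt is just a placeholder name):

shortscan $(cat paths.txt)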

thezakman commented 4 months ago

Wouldn't it be better if it were an argument, e.g. --list FILE / -l FILE, taking a list of URLs?

bitquark commented 4 months ago

Whether it's better depends on what you're doing, but if you have a large number of URLs to scan it could certainly be a useful option. It's on the list!

thezakman commented 4 months ago

I tried to add it myself and opened a pull request (tested, and it's all working as expected); check whether you think it fits in with the rest of your code.

Thanks in advance for such a great tool!

boringthegod commented 2 months ago

It would be nice if you accepted his PR; I have the same need to scan a lot of URLs, and since the work is already done it would be cool :D

bitquark commented 5 days ago

I've now added functionality to read a list of URLs from a file!

I handled things slightly differently to @thezakman's commit (nice code, thank you!). Rather than using a separate option, you can now specify files of URLs using @-syntax, for example to read from urls.txt:

$ shortscan @urls.txt

Or even specify multiple files, or combine the two syntaxes:

$ shortscan http://example.org/ @urls1.txt @urls2.txt
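And for the original multi-folder case, urls.txt could for example be generated from a folder wordlist first (the base URL https://site/ and folders.txt are placeholders):

# Prefix each folder with the base URL, append a trailing slash, then scan in one run
sed 's|^|https://site/|; s|$|/|' folders.txt > urls.txt
shortscan @urls.txt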

That should do the trick, but let me know if you have any ideas for improvement 🙂

Regarding the original (slightly different) ask, I'll leave this issue open until that functionality has been added.