wpscanteam / wpscan

WPScan WordPress security scanner. Written for security professionals and blog maintainers to test the security of their WordPress websites. Contact us via contact@wpscan.com
https://wpscan.com/wordpress-cli-scanner
8.54k stars, 1.26k forks

Valums uploader detection in theme #98

Closed: erwanlr closed this issue 11 years ago

erwanlr commented 11 years ago

Some themes are vulnerable to arbitrary file upload: http://packetstormsecurity.com/files/119241/wpvalums-shell.txt

Any ideas how we could get all the vulnerable themes? (Most of them are premium ones.)

firefart commented 11 years ago

Google dork inurl:valums_uploader

reveals exactly the 3 themes mentioned in the Packet Storm advisory.

I think there is no way of getting the theme names without a search engine.

BTW: how is the WordPress Vuln DB project going? It would be a great help for these problems.

firefart commented 11 years ago

Oh, and for the public themes: crawl the WordPress theme SVN repo. I think it would be great to add a "create file db" option to wptools which grabs all filenames from the WordPress SVN repos and puts them in a DB (upgradeable), so we can search the local DB for occurrences of this folder without enumerating the whole SVN repo again.
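A minimal sketch of what such a local file DB could look like, assuming a plain JSON store (the `FileDb` class and file layout are hypothetical; the SVN crawl itself and wptools integration are left out):

```ruby
require 'json'

# Hypothetical local file DB: maps each theme/plugin to the file paths it
# contains, so "which themes ship valums_uploader?" becomes a local lookup
# instead of another full SVN crawl.
class FileDb
  def initialize(path)
    @path = path
    @db = File.exist?(path) ? JSON.parse(File.read(path)) : {}
  end

  # Record the file paths found for one theme (upgradeable: overwrites the old entry)
  def record(theme, files)
    @db[theme] = files
    File.write(@path, JSON.pretty_generate(@db))
  end

  # Return the names of all themes containing a path that matches the pattern
  def search(pattern)
    @db.select { |_, files| files.any? { |f| f.include?(pattern) } }.keys
  end
end
```

Usage would be e.g. `FileDb.new('files.json').search('valums_uploader')` after a crawl has populated the store.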

erwanlr commented 11 years ago

Yep, I've done a dork, but I'm sure there are a lot of other themes which use it :p

As far as the WP Vuln DB is concerned, we unfortunately lack the time :/. If you want to help (with ideas, code, comments, etc.), ask @gbrindisi to add you to the repo :)

I had not thought about the WordPress theme SVN repo.

firefart commented 11 years ago

OK, since there are no private messages on GitHub: @gbrindisi can you please add me to the repo? :P

erwanlr commented 11 years ago

I've just tested the anemone gem with the following code:

require 'anemone'

# Crawl the themes SVN repo with 20 threads; only handle directory listings
# (URLs ending in /) and print the links found on each one
Anemone.crawl("http://themes.svn.wordpress.org/", :threads => 20) do |anemone|
  anemone.on_pages_like(/\/$/) do |page|
    puts "#{page.url} : "
    puts page.links
    puts
  end
end

The main problem is the order in which the links are followed: it does not retrieve all links for the first directory, then for the second, and so on :/

i.e. you have the following tree of links:

L1
  L11
    L111
  L12
L2
  L21
  L22
  L23
    L231

Anemone will do this (breadth-first): L1, L2, L11, L12, L21, L22, L23, L111, L231. I do not know if we could get this (depth-first): L1, L11, L111, L12, L2, L21, L22, L23, L231 with Anemone :(
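To make the two orderings concrete, here is a small sketch on a toy version of the tree above (plain Ruby, not Anemone):

```ruby
# Toy link tree from the example above; Anemone yields pages breadth-first,
# while a depth-first walk would keep each directory's subtree together.
TREE = {
  'L1' => ['L11', 'L12'], 'L11' => ['L111'], 'L111' => [],
  'L12' => [], 'L2' => ['L21', 'L22', 'L23'], 'L21' => [],
  'L22' => [], 'L23' => ['L231'], 'L231' => []
}
ROOTS = ['L1', 'L2']

# Breadth-first: drain a queue, appending each node's children at the back
def bfs(tree, roots)
  order, queue = [], roots.dup
  until queue.empty?
    node = queue.shift
    order << node
    queue.concat(tree[node])
  end
  order
end

# Depth-first: visit each node, then recurse into its children immediately
def dfs(tree, roots)
  roots.flat_map { |node| [node] + dfs(tree, tree[node]) }
end
```

`bfs` reproduces the order Anemone gives; `dfs` gives the directory-by-directory order we would want for the SVN listing.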

firefart commented 11 years ago

why not put page.links into an array/hash and sort it afterwards?

ethicalhack3r commented 11 years ago

@gbrindisi wrote a script to spider the themes svn repo

ethicalhack3r commented 11 years ago

Will private themes be in the WordPress SVN repo? Quick search seems to point to no. :(

gbrindisi commented 11 years ago

Hey @FireFart, I've DMed you on Twitter; I need your email.

@erwanlr for crawling the SVN, the bot I wrote is in this post; it's really basic but should be a valid starting point. But it's in Python!

erwanlr commented 11 years ago

Yep, I saw your crawler; that's really impressive. However, as @FireFart said, having a local DB of all those files could be great.

I tried to do the following:

for each theme in the repo
  enumerate all files/dirs & compute their md5sum or sha1sum
  put the result into a JSON file

But it was unsuccessful with Anemone.
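Against a local checkout the pseudocode above does not need Anemone at all; a minimal sketch (the output filename and checkout path in the commented line are hypothetical):

```ruby
require 'digest'
require 'find'
require 'json'

# Walk a local checkout of a theme and record the MD5 of every file,
# keyed by its path relative to the checkout root.
def hash_tree(root)
  result = {}
  Find.find(root) do |path|
    next unless File.file?(path)
    relative = path.sub(/\A#{Regexp.escape(root)}\/?/, '')
    result[relative] = Digest::MD5.file(path).hexdigest
  end
  result
end

# e.g. File.write('theme.json', JSON.pretty_generate(hash_tree('/tmp/theme-checkout')))
```

Switching `Digest::MD5` for `Digest::SHA1` gives the sha1sum variant.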

gbrindisi commented 11 years ago

I am missing the point, I guess. Why would having a local DB of the SVN repos be useful?

erwanlr commented 11 years ago

To be able to quickly search for a file or a directory w/o recrawling the whole repo.

And, with the file hashes, to be able to detect vulnerable files (like some versions of swfupload.swf, etc.).

gbrindisi commented 11 years ago

I don't know, Erwan. First, the repositories change every day, so a local DB will become outdated very quickly; second, they are huge, and a local index/db/whatever might be a big waste of space.

Also, why should a user care to crawl them in the first place? If someone finds a vulnerable script, it's not within WPScan's scope to let them find every plugin/theme which includes it.

I think it would be better (and easier) to build a general-purpose crawler and use it when we need it.

erwanlr commented 11 years ago

It's not designed for a user but for us :p

For the local DB, of course it will quickly become outdated; however, when you do a lot of research on files in themes/plugins in the same day, you do not want to recrawl the whole repo. It will also take space, you are right; however, if it can save me a few hours, I'll take it :)

gbrindisi commented 11 years ago

Ha! For us, OK, got it. So you want something like an incremental crawler which crawls once, saves url/filename/md5, and on the next iterations skips those files already scanned.

It might work... have a try!
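A sketch of the skip logic such an incremental crawler could use, assuming the seen set is persisted as one URL per line (the `SeenStore` class and filename are hypothetical):

```ruby
require 'set'

# Tracks which URLs have already been processed across runs, so a re-crawl
# only touches entries added since the last iteration.
class SeenStore
  def initialize(path)
    @path = path
    @seen = File.exist?(path) ? Set.new(File.readlines(path, chomp: true)) : Set.new
  end

  # Returns true (and records the URL) only the first time it is seen;
  # appends to the store file so the set survives between runs.
  def new?(url)
    return false if @seen.include?(url)
    @seen << url
    File.open(@path, 'a') { |f| f.puts(url) }
    true
  end
end
```

The crawler would then call `store.new?(page.url)` and skip hashing/saving when it returns false.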

firefart commented 11 years ago

Yeah, or we implement this in the wpscan.org database with a public search form. What do you think about that?

gbrindisi commented 11 years ago

@FireFart A search engine for the repos, hmm... it would be awesome to build! But first we need the basic things up and running, i.e. the API. :)

gbrindisi commented 11 years ago

Something we can include right now in wpscan is the exact opposite: given the hash of a file, brute the target installation searching for it.

If I am not wrong, we discussed this a while ago? I can't remember how it ended.
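A minimal sketch of how such a check could look once the candidate file has been fetched from the target (the fingerprint table and the hash in it are placeholders, not real vulnerability data; the hash shown is just the MD5 of an empty string):

```ruby
require 'digest'

# Placeholder fingerprints of known-vulnerable file versions; a real list
# would come from the vuln DB. The hash below is md5(""), used only as a
# stand-in value.
KNOWN_BAD = {
  'swfupload.swf' => 'd41d8cd98f00b204e9800998ecf8427e'
}

# Given a filename and the bytes fetched from the target installation,
# report whether it matches a known-vulnerable version.
def vulnerable_file?(name, body)
  expected = KNOWN_BAD[name]
  !expected.nil? && Digest::MD5.hexdigest(body) == expected
end
```

The brute-force part would just be requesting each candidate path on the target and feeding the response body through this check.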

erwanlr commented 11 years ago

A search engine for the repo, implemented in the wpscan.org DB, is a good idea.

However, I think we will have to clone the repo for best performance: this kind of command takes too long:

svn ls -R http://themes.svn.wordpress.org | grep swfupload.swf

Then, once the repo is cloned, we will just have to update it and run some find or svnlook commands server-side.

Something we can include right now in wpscan is the exact opposite: given the hash of a file, brute the target installation searching for it.

I've thought about that

If I am not wrong we discussed about this a while ago?

I do not remember it :/

PS: How long did it take you to crawl the WP plugins repo with your bot?

gbrindisi commented 11 years ago

PS: How long did it take you to crawl the WP plugins repo with your bot?

If I remember correctly, it took a good night to scan the plugins repo, but it was a dumb scan since I included every possible file type.

For the themes I instructed the crawler to search only for SWF files, and it was quite fast (just a couple of hours, more or less). And I did everything from my laptop with no more than 10 threads in a single process.

Why would you still want to replicate the repositories? We can just build a bunch of spiders to collect filenames, paths, and md5s, report back to a central DB, and then build a frontend on top of it. No need to index the code too, right?

erwanlr commented 11 years ago

I want to clone the repo because I can't find a way to quickly update the crawl results otherwise :/