andresriancho / w3af

w3af: web application attack and audit framework, the open source web vulnerability scanner.
http://w3af.org/

'NoneType' object has no attribute 'makefile' - pool.py hiding real exception #229

Closed ericsesterhenn closed 10 years ago

ericsesterhenn commented 11 years ago

Hi,

Another issue where one of the plugins seems to cause a problem :( I am currently testing against the Metasploitable test setup with a current git checkout of the threading2 branch.

New URL found by web_spider plugin: "http://192.168.122.80/mutillidae/javascript/ddsmoothmenu/"

An exception was found while running audit.format_string on "http://domain/dvwa/ | Method: GET | Parameters: (view="phpinfo")". The exception was: "'NoneType' object has no attribute 'makefile'" at pool.py:next():653. The full traceback is:

File "/home/user/arbeit/tools/w3af/core/controllers/core_helpers/consumers/audit.py", line 111, in _audit
    plugin.audit_with_copy(fuzzable_request, orig_resp)
File "/home/user/arbeit/tools/w3af/core/controllers/plugins/audit_plugin.py", line 126, in audit_with_copy
    return self.audit(fuzzable_request.copy(), orig_resp)
File "/home/user/arbeit/tools/w3af/plugins/audit/format_string.py", line 56, in audit
    self._analyze_result)
File "/home/user/arbeit/tools/w3af/core/controllers/plugins/plugin.py", line 199, in _send_mutants_in_threads
    for (mutant,), http_response in self.worker_pool.imap_unordered(func, iterable):
File "/usr/lib/python2.7/multiprocessing/pool.py", line 653, in next
    raise value

The scan will continue but some vulnerabilities might not be identified.

Cross Site Request Forgery has been found at: http://192.168.122.80/dvwa/. This vulnerability was found in the request with id 12674.
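
A note on the pool.py part, since it is what the issue title is about: under Python 2.7, multiprocessing's worker catches the task's exception and only the exception instance crosses the result queue; the parent's bare "raise value" at pool.py:next() then starts a brand-new traceback, which is why the real failure site is hidden. A minimal sketch reproducing the effect (the task function is hypothetical, not w3af code):

import multiprocessing.dummy  # thread-backed Pool, same pool.py code path
import traceback

def task(_):
    # Hypothetical stand-in for the failing worker: any exception raised
    # here is caught inside pool.py's worker() and only the exception
    # instance travels back through the result queue.
    return None.makefile('rb')

pool = multiprocessing.dummy.Pool(2)
try:
    for _ in pool.imap_unordered(task, range(4)):
        pass
except AttributeError:
    # On Python 2 the printed traceback bottoms out at pool.py's
    # "raise value", not at task(): the original failure site is gone.
    traceback.print_exc()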

andresriancho commented 11 years ago

That's an ugly one. Thanks for reporting it! Are you able to reproduce it every time you scan the site?

ericsesterhenn commented 11 years ago

I will try to reproduce it. At the moment I am testing with a stripped-down script, since the scan stopped at some point with the python process consuming 100% CPU, no longer sending requests or responding to Enter presses (unfortunately I was not able to get any information which might help debug this). I will let you know if I can trigger the pool.py issue again.

andresriancho commented 11 years ago

Thanks, your help is very much appreciated.

This bug will be hard to fix because it seems that it's not in w3af's source:

foo@bar:~/workspace/w3af$ find . -name '*.py' | xargs grep "makefile"
./core/controllers/daemons/proxy.py:    This is a wrapper around an SSL connection which also implements a makefile
./core/controllers/daemons/proxy.py:    def makefile(self, perm, buf):

Maybe it's in Python, maybe in the way we use some Python... we'll see. Being able to reproduce it is the first step.

ericsesterhenn commented 11 years ago

I am currently trying to reproduce this and have hit this issue:

--------8<-------------------------
New URL found by phpinfo plugin: "http://192.168.122.80/phpMyAdmin/themes/"

Cross Site Request Forgery has been found at: http://192.168.122.80/dvwa/index.php. This vulnerability was found in the request with id 12437.

An exception was found while running audit.blind_sqli on "http://domain/dvwa/ | Method: GET | Parameters: (view="phpinfo")". The exception was: "local variable 'resp' referenced before assignment" at logHandler.py:log_req_resp():76. The full traceback is:

File "/home/user/arbeit/tools/w3af/core/controllers/core_helpers/consumers/audit.py", line 111, in _audit
    plugin.audit_with_copy(fuzzable_request, orig_resp)
File "/home/user/arbeit/tools/w3af/core/controllers/plugins/audit_plugin.py", line 126, in audit_with_copy
    return self.audit(fuzzable_request.copy(), orig_resp)
File "/home/user/arbeit/tools/w3af/plugins/audit/blind_sqli.py", line 80, in audit
    found_vuln = method.is_injectable(mutant)
File "/home/user/arbeit/tools/w3af/core/controllers/sql_tools/blind_sqli_time_delay.py", line 52, in is_injectable
    success, responses = ed.delay_is_controlled()
File "/home/user/arbeit/tools/w3af/core/controllers/delay_detection/exact_delay_controller.py", line 83, in delay_is_controlled
    original_wait_time = self.get_original_time()
File "/home/user/arbeit/tools/w3af/core/controllers/delay_detection/delay_mixin.py", line 28, in get_original_time
    cache=False).get_wait_time()
File "/home/user/arbeit/tools/w3af/core/controllers/plugins/plugin.py", line 215, in meth
    return attr(*args, **kwargs)
File "/home/user/arbeit/tools/w3af/core/data/url/extended_urllib.py", line 270, in send_mutant
    res = functor(*args, **kwargs)
File "/home/user/arbeit/tools/w3af/core/data/url/extended_urllib.py", line 323, in GET
    return self._send(req, grep=grep)
File "/home/user/arbeit/tools/w3af/core/data/url/extended_urllib.py", line 618, in _send
    return self._new_no_content_resp(original_url_inst, log_it=True)
File "/home/user/arbeit/tools/w3af/core/data/url/extended_urllib.py", line 350, in _new_no_content_resp
    LogHandler.log_req_resp(req, no_content_response)
File "/home/user/arbeit/tools/w3af/core/data/url/handlers/logHandler.py", line 76, in log_req_resp
    om.out.log_http(request, resp)

The scan will continue but some vulnerabilities might not be identified.
--------8<-------------------------

The script I am using is:

--------8<-------------------------
plugins

output !all, xml_file, console
output config xml_file
set output_file myoutfile3.xml
back

output config console
set verbose True
back

grep !all, click_jacking, code_disclosure, cross_domain_js, csp, directory_indexing, dom_xss, dot_net_event_validation, error_500, error_pages, feeds, file_upload, form_autocomplete, get_emails, hash_analysis, http_auth_detect, http_in_body, oracle, strange_headers, strange_http_codes, strange_parameters, strange_reason, svn_users, url_session, wsdl_greper, xss_protection_header
grep

infrastructure !all, afd, detect_reverse_proxy, detect_transparent_proxy, dot_net_errors, favicon_identification, find_jboss, fingerprint_WAF, fingerprint_os, frontpage_version, hmap, php_eggs, server_header, server_status
infrastructure

crawl !all, content_negotiation, digit_sum, dir_file_bruter, dot_listing, find_backdoors, find_captchas, phpinfo, ria_enumerator, robotstxt, sitemap_xml, urllist_txt, web_spider, wordnet, wordpress_enumerate_users, wordpress_fingerprint, wordpress_fullpathdisclosure, wsdl_finder
crawl

bruteforce !all
bruteforce

mangle !all
mangle

evasion !all
evasion

auth !all
auth

audit !all, blind_sqli, buffer_overflow, cors_origin, csrf, dav, eval, file_upload, format_string, frontpage, generic, global_redirect, htaccess_methods, ldapi, lfi, mx_injection, os_commanding, phishing_vector, preg_replace, redos, response_splitting, sqli, ssi, ssl_certificate, xpath, xss, xst
audit
back

target
set target http://192.168.122.80/
back
start
exit
--------8<-------------------------

I am currently hitting the 100% CPU issue again and will restart the scan.

ericsesterhenn commented 11 years ago

I am beginning to suspect something is broken on my end; another run, another error:

--------8<-------------------------
New URL found by dir_file_bruter plugin: "http://192.168.122.80/index/"

The remote Web server sent a strange HTTP response code: "405" with the message: "Method Not Allowed", manual inspection is advised. This information was found in the request with id 58.

An exception was found while running crawl.ria_enumerator on "http://domain/ | Method: GET". The exception was: "The ria_enumerator plugin did NOT return None." at exception_handler.py:__init__():225. The full traceback is:

File "/usr/lib/python2.7/multiprocessing/pool.py", line 113, in worker
    result = (True, func(*args, **kwds))
File "/home/user/arbeit/tools/w3af/core/controllers/threads/threadpool.py", line 54, in __call__
    return args, self.func(*args, **kwds)
File "/home/user/arbeit/tools/w3af/core/controllers/core_helpers/consumers/crawl_infrastructure.py", line 411, in _discover_worker
    fuzzable_request, ve)
File "/home/user/arbeit/tools/w3af/core/controllers/core_helpers/consumers/base_consumer.py", line 217, in handle_exception
    enabled_plugins)
File "/home/user/arbeit/tools/w3af/core/controllers/core_helpers/exception_handler.py", line 225, in __init__
    filepath = traceback.extract_tb(tb)[-1][0]

The scan will continue but some vulnerabilities might not be identified.

dir_file_brute plugin found "http://192.168.122.80/icons/" with HTTP response code 200 and Content-Length: 69404.
dir_file_brute plugin found "http://192.168.122.80/test/" with HTTP response code 200 and Content-Length: 885.
Your ISP has no transparent proxy.
--------8<-------------------------
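
As an aside, the filepath = traceback.extract_tb(tb)[-1][0] line in the last frame above pulls the filename of the deepest frame out of a traceback object. A minimal, self-contained illustration of that call:

import sys
import traceback

try:
    1 / 0
except ZeroDivisionError:
    tb = sys.exc_info()[2]
    # extract_tb() returns (filename, line number, function name, text)
    # tuples, outermost frame first, so [-1][0] is the file in which
    # the exception was actually raised.
    filepath = traceback.extract_tb(tb)[-1][0]
    print(filepath)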

ericsesterhenn commented 11 years ago

A scan later, I was able to reproduce this issue:

--------8<-------------------------
New URL found by phpinfo plugin: "http://192.168.122.80/twiki/bin/upload/"
New URL found by web_spider plugin: "http://192.168.122.80/twiki/bin/viewfile/Main/WebHome"

An exception was found while running crawl.phpinfo on "http://domain/twiki/bin/attach/Main/WebHome | Method: GET | Parameters: (filename="clXotWK.Tr...", amp="", revInfo="1")". The exception was: "'NoneType' object has no attribute 'makefile'" at pool.py:get():554. The full traceback is:

File "/home/user/arbeit/tools/w3af/core/controllers/core_helpers/consumers/crawl_infrastructure.py", line 389, in _discover_worker
    result = plugin.discover_wrapper(fuzzable_request)
File "/home/user/arbeit/tools/w3af/core/controllers/plugins/crawl_plugin.py", line 47, in crawl_wrapper
    return self.crawl(fuzzable_request_copy)
File "/home/user/arbeit/tools/w3af/plugins/crawl/phpinfo.py", line 74, in crawl
    self.worker_pool.map_multi_args(self._check_and_analyze, args)
File "/home/user/arbeit/tools/w3af/core/controllers/threads/threadpool.py", line 76, in map_multi_args
    return self.map_async(one_to_many(func), iterable, chunksize).get()
File "/usr/lib/python2.7/multiprocessing/pool.py", line 554, in get
    raise self._value

The scan will continue but some vulnerabilities might not be identified.
--------8<-------------------------

andresriancho commented 11 years ago

Sorry for the late reply, some comments on your issues:

An exception was found while running audit.blind_sqli on "http://domain/dvwa/ | Method: GET | Parameters: (view="phpinfo")". The exception was: "local variable 'resp' referenced before assignment" at logHandler.py:log_req_resp():76. The full traceback is:

This was a bug, identified and fixed a couple of days ago.
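
"local variable 'resp' referenced before assignment" is Python's UnboundLocalError: once a name is assigned anywhere inside a function, Python treats it as local throughout that function, so any code path that reads the name before the assignment fails. A hypothetical sketch of the pattern (illustrative only, not the actual logHandler.py code):

def log_req_resp(request, response):
    if response is not None:
        resp = response
    # On the response-is-None path 'resp' was never bound, but the
    # assignment above makes it a local name for the whole function,
    # so Python raises UnboundLocalError instead of falling back to
    # a global lookup.
    print(request, resp)

log_req_resp('GET /', None)  # UnboundLocalError: local variable 'resp' ...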

An exception was found while running crawl.ria_enumerator on "http://domain/ | Method: GET". The exception was: "The ria_enumerator plugin did NOT return None." at exception_handler.py:__init__():225. The full traceback is:

This was a bug, fixed today/yesterday.

Regarding your initial bug report:

An exception was found while running audit.format_string on "http://domain/dvwa/ | Method: GET | Parameters: (view="phpinfo")". The exception was: "'NoneType' object has no attribute 'makefile'" at pool.py:next():653. The full traceback is:

I don't have good news for you yet, but I will try to spend a couple of hours on this today.

andresriancho commented 11 years ago

An exception was found while running crawl.phpinfo on "http://domain/twiki/bin/attach/Main/WebHome | Method: GET | Parameters: (filename="clXotWK.Tr...", amp="", revInfo="1")". The exception was: "'NoneType' object has no attribute 'makefile'" at pool.py:get():554. The full traceback is:

Seems to be related to #265.

andresriancho commented 11 years ago

Created a ticket in the Python bug tracker for this issue: http://bugs.python.org/issue17836

@ericsesterhenn any chance you could modify your /usr/lib/python2.7/multiprocessing/pool.py file using the patch in the reported issue and try to reproduce again? The real traceback, the one we want to read and understand, will appear in the console once you're able to see the exception in the GUI.

I'm adding a modified pool.py file to reduce your work. Just download it and copy it to the right path. Make sure to make a backup first!

https://gist.github.com/andresriancho/5456482/download
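
For anyone curious about the approach: a traceback object cannot be pickled and shipped through the result queue, so the idea (sketched below with illustrative names; see the CPython ticket above for the real patch) is to format the worker-side traceback as a string and attach it to the exception before re-raising, letting the parent print the real failure site even after pool.py's bare "raise value":

import traceback

def run_task(func, *args, **kwds):
    # Wrapper that would run inside the pool worker. The formatted
    # traceback is a plain string, so unlike a traceback object it
    # survives the trip through the result queue to the parent.
    try:
        return func(*args, **kwds)
    except Exception as e:
        e.original_traceback = traceback.format_exc()
        raise

try:
    run_task(lambda: None.makefile('rb'))
except AttributeError as e:
    # The stashed text still points at the real failure site, even when
    # the re-raised exception's own traceback has been reset.
    print(e.original_traceback)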

ericsesterhenn commented 11 years ago

Using the pool.py file you provided, the following error triggered:

New URL found by phpinfo plugin: "http://192.168.122.80/index/"

Traceback (most recent call last):
  File "/usr/lib/python2.7/multiprocessing/pool.py", line 98, in worker
    result = (True, func(*args, **kwds))
  File "/usr/lib/python2.7/multiprocessing/pool.py", line 67, in mapstar
    return map(*args)
  File "/home/snakebyte/arbeit/tools/tmp/w3af/core/controllers/threads/threadpool.py", line 42, in __call__
    return self.func(*args)
  File "/home/snakebyte/arbeit/tools/tmp/w3af/plugins/crawl/web_spider.py", line 282, in _verify_reference
    headers=headers)
  File "/home/snakebyte/arbeit/tools/tmp/w3af/core/controllers/plugins/plugin.py", line 215, in meth
    return attr(*args, **kwargs)
  File "/home/snakebyte/arbeit/tools/tmp/w3af/core/data/url/extended_urllib.py", line 295, in GET
    return self._send(req, grep=grep)
  File "/home/snakebyte/arbeit/tools/tmp/w3af/core/data/url/extended_urllib.py", line 454, in _send
    res = self._opener.open(req)
  File "/usr/lib/python2.7/urllib2.py", line 407, in open
    response = meth(req, response)
  File "/usr/lib/python2.7/urllib2.py", line 520, in http_response
    'http', request, response, code, msg, hdrs)
  File "/usr/lib/python2.7/urllib2.py", line 439, in error
    result = self._call_chain(*args)
  File "/usr/lib/python2.7/urllib2.py", line 379, in _call_chain
    result = func(*args)
  File "/usr/lib/python2.7/urllib2.py", line 626, in http_error_302
    return self.parent.open(new, timeout=req.timeout)
  File "/usr/lib/python2.7/urllib2.py", line 399, in open
    req = meth(req)
  File "/home/snakebyte/arbeit/tools/tmp/w3af/core/data/url/handlers/cookie_handler.py", line 33, in http_request
    if request.cookies:
  File "/usr/lib/python2.7/urllib2.py", line 226, in __getattr__
    raise AttributeError, attr
AttributeError: cookies
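
The last two frames show the mechanics: urllib2.Request defines a __getattr__ that raises AttributeError for any attribute that was never set on the instance, so a handler reading a custom attribute such as cookies has to probe defensively. A small illustration (Python 2.7; the URL is a placeholder):

import urllib2

req = urllib2.Request('http://example.com/')
# Direct access would raise "AttributeError: cookies" because the
# attribute was never set on this instance and Request.__getattr__
# raises for unknown names. A defensive lookup avoids the crash:
cookies = getattr(req, 'cookies', None)
print(cookies)  # None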

This is not with the threading2 branch (it seems to have disappeared), but with origin/master.

andresriancho commented 11 years ago

AttributeError: cookies

Sorry about this issue, my mistake. Added that bug while trying to fix another one. Please git pull and try again until you can reproduce the original bug.

This is not with the threading2 branch (it seems to have disappeared), but with origin/master.

Yes, threading2 disappeared and master is where you should be.

ericsesterhenn commented 11 years ago

After a pull, I get the following traceback from pool.py:

Cross Site Request Forgery has been found at: http://192.168.122.80/mutillidae/. This vulnerability was found in the request with id 13100.

Traceback (most recent call last):
  File "/usr/lib/python2.7/multiprocessing/pool.py", line 98, in worker
    result = (True, func(*args, **kwds))
  File "/usr/lib/python2.7/multiprocessing/pool.py", line 67, in mapstar
    return map(*args)
  File "/home/snakebyte/arbeit/tools/tmp/w3af/core/controllers/threads/threadpool.py", line 42, in __call__
    return self.func(*args)
  File "/home/snakebyte/arbeit/tools/tmp/w3af/plugins/crawl/web_spider.py", line 282, in _verify_reference
    headers=headers)
  File "/home/snakebyte/arbeit/tools/tmp/w3af/core/controllers/plugins/plugin.py", line 215, in meth
    return attr(*args, **kwargs)
  File "/home/snakebyte/arbeit/tools/tmp/w3af/core/data/url/extended_urllib.py", line 295, in GET
    return self._send(req, grep=grep)
  File "/home/snakebyte/arbeit/tools/tmp/w3af/core/data/url/extended_urllib.py", line 552, in _send
    CacheClass.store_in_cache(req, resp)
  File "/home/snakebyte/arbeit/tools/tmp/w3af/core/data/url/handlers/cache.py", line 364, in store_in_cache
    original_url=request.url_object)
  File "/home/snakebyte/arbeit/tools/tmp/w3af/core/data/url/HTTPResponse.py", line 130, in from_httplib_resp
    code, msg, hdrs, body = (resp.code, resp.msg, resp.info(), resp.read())
AttributeError: 'HTTPResponse' object has no attribute 'code'

Traceback (most recent call last):
  File "/usr/lib/python2.7/multiprocessing/pool.py", line 98, in worker
    result = (True, func(*args, **kwds))
  File "/usr/lib/python2.7/multiprocessing/pool.py", line 67, in mapstar
    return map(*args)
  File "/home/snakebyte/arbeit/tools/tmp/w3af/core/controllers/threads/threadpool.py", line 42, in __call__
    return self.func(*args)
  File "/home/snakebyte/arbeit/tools/tmp/w3af/plugins/crawl/web_spider.py", line 282, in _verify_reference
    headers=headers)
  File "/home/snakebyte/arbeit/tools/tmp/w3af/core/controllers/plugins/plugin.py", line 215, in meth
    return attr(*args, **kwargs)
  File "/home/snakebyte/arbeit/tools/tmp/w3af/core/data/url/extended_urllib.py", line 295, in GET
    return self._send(req, grep=grep)
  File "/home/snakebyte/arbeit/tools/tmp/w3af/core/data/url/extended_urllib.py", line 552, in _send
    CacheClass.store_in_cache(req, resp)
  File "/home/snakebyte/arbeit/tools/tmp/w3af/core/data/url/handlers/cache.py", line 364, in store_in_cache
    original_url=request.url_object)
  File "/home/snakebyte/arbeit/tools/tmp/w3af/core/data/url/HTTPResponse.py", line 130, in from_httplib_resp
    code, msg, hdrs, body = (resp.code, resp.msg, resp.info(), resp.read())
AttributeError: 'HTTPResponse' object has no attribute 'code'

New URL found by phpinfo plugin: "http://192.168.122.80/twiki/bin/search/TWiki/"
New URL found by phpinfo plugin: "http://192.168.122.80/twiki/bin/search/TWiki/"
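
One plausible reading of this traceback (an assumption, not confirmed anywhere in the thread): a raw httplib.HTTPResponse exposes the status code as .status, while the urllib2 wrapper around it re-exposes it as .code, and from_httplib_resp() received the raw flavour here. An illustration (Python 2.7; the host is a placeholder):

import httplib

conn = httplib.HTTPConnection('example.com')
conn.request('GET', '/')
resp = conn.getresponse()           # raw httplib.HTTPResponse
print(resp.status)                  # e.g. 200: the raw attribute
print(getattr(resp, 'code', None))  # None: .code exists only on the
                                    # urllib2 (addinfourl) wrapper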

andresriancho commented 11 years ago

I'm totally embarrassed at this point. Each time I ask you to do something, you find a bug that shouldn't be there. Sorry for wasting your time. I'll work on this and maybe ask you again in a couple of days.

andresriancho commented 11 years ago

PS: I swear I'm running the unit tests before asking you to test these things.

ericsesterhenn commented 11 years ago

Hehe, no problem. Software is just complex, and I know how hard it can be to debug problems you can't reproduce yourself. Just drop me a note if there is something I can test.

andresriancho commented 10 years ago

Lowering severity since this is the only time this bug has come up; no other issues turn up when searching for "makefile".

andresriancho commented 10 years ago

@ericsesterhenn I've released a new version of w3af, and almost a year has gone by since your bug report. That makes it almost impossible to reproduce in the same or a similar environment as before, but if you've got some time, please run w3af against a couple of different targets and report any issues you find. I'll gladly fix them.

Closing since there were no more bug reports mentioning "makefile".