
===========================================================================
ratproxy - passive web application security assessment tool
===========================================================================

http://code.google.com/p/ratproxy


What is ratproxy?

Ratproxy is a semi-automated, largely passive web application security audit tool. It is meant to complement active crawlers and manual proxies more commonly used for this task, and is optimized specifically for an accurate and sensitive detection, and automatic annotation, of potential problems and security-relevant design patterns based on the observation of existing, user-initiated traffic in complex web 2.0 environments. The approach taken with ratproxy offers several important advantages over more traditional methods.


Is it worth trying out?

There are numerous alternative proxy tools meant to aid security auditors - most notably WebScarab, Paros, Burp, and ProxMon. Stick with whatever suits your needs, as long as you get the data you need in the format you like.

That said, ratproxy is there for a reason. It is designed specifically to deliver concise reports that focus on prioritized issues of clear relevance to contemporary web 2.0 applications, and to do so in a hands-off, repeatable manner. It should not overwhelm you with raw HTTP traffic dumps, and it goes far beyond simply providing a framework to tamper with the application by hand.

Ratproxy implements a number of fairly advanced and unique checks based on our experience with these applications, as well as all the related browser quirks and content handling oddities. It features sophisticated content-sniffing functionality capable of distinguishing between stylesheets and Javascript code snippets, supports SSL man-in-the-middle inspection and on-the-fly Flash ActionScript decompilation, and even offers an option to confirm high-likelihood flaw candidates with a very lightweight, built-in active testing module.

Last but not least, if you are undecided, the proxy may be easily chained with third-party security testing proxies of your choice.


How does it avoid false positives?

Operating in a non-disruptive mode makes the process of discovering security flaws particularly challenging, as the presence of some vulnerabilities must be deduced based on very subtle, not always reliable cues - and even in active testing modes, ratproxy strives to minimize the amount of rogue traffic generated, and side effects caused.

The set of checks implemented by ratproxy is outlined later on - but just as importantly, underneath all the individual check logic, the proxy uses a number of passively or semi-passively gathered signals to more accurately prioritize reported problems and reduce the number of false alarms as much as possible. Five core properties are examined for a large number of checks.

In addition to this, several places employ check-specific logic to further fine-tune the results.


What specific tests are implemented?

Ratproxy implements a number of key low-level check groups.

For a full list of individual issues reported, please see messages.list in the source tarball.


What is the accuracy of reported findings?

Ratproxy usually fares very well with typical, rich, modern web applications - that said, by virtue of operating in passive mode most of the time, all the findings reported merely highlight areas of concern and are not necessarily indicative of actual security flaws. The information gathered during a testing session should then be interpreted by a security professional with a good understanding of the common problems and security models employed in web applications.

Please keep in mind that the tool is still in beta, and you may run into problems with technologies we had no chance to examine, or that were not a priority at this time. Please contact the author to report any issues encountered.


How to run the proxy?

NOTE: Please do not be evil. Use ratproxy only against services you own, or have permission to test. Keep in mind that although the proxy is mostly passive and unlikely to cause disruptions, it is not stealthy. Furthermore, the proxy is not designed for dealing with rogue and misbehaving HTTP servers and clients - and offers no guarantees of safe (or sane) behavior there.

Initiating ratproxy sessions is fairly straightforward, once an appropriate set of runtime options is decided upon. Please familiarize yourself with these settings, as they have a very significant impact on the quality of the produced reports.

The main binary, ./ratproxy, takes the following arguments:

-w logfile - this option causes raw, machine-readable proxy logs to be written to the specified file. By default, all data is written to stdout only. The log produced this way is not meant for human consumption - it can, however, be postprocessed with third-party utilities, or pretty-printed using 'ratproxy-report.sh'.

-v logdir - prompts ratproxy to store full HTTP traces of all requests featured in the logfile, writing them to a specified directory. In most cases, it is advisable to enable this option, as it provides useful hints for further analysis.

-p port - causes ratproxy to listen for browser connections on a TCP port different than the default 8080.

-r - instructs ratproxy to accept remote connections. By default, the proxy listens on loopback interfaces only. This option enables remote access to the service.

              WARNING: Ratproxy does not feature any specific access
              control mechanisms, and may be abused if exposed to the
              Internet. Please make sure to use proper firewall controls
              whenever using the -r option to prevent this.
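
              For instance - as an illustration only, assuming the default
              port 8080 and a single trusted tester machine at 192.0.2.10
              (a placeholder address) - access can be restricted with
              host-based filtering along these lines:

              iptables -A INPUT -p tcp --dport 8080 -s 192.0.2.10 -j ACCEPT
              iptables -A INPUT -p tcp --dport 8080 -j DROP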

-d domain - specifies a domain name suffix used to distinguish between the audited infrastructure and third-party sites. Host names that match -d values will be subjected to analysis, and ones that do not will be considered the outside world. Interactions between these two classes will be subjected to additional checks.

              NOTE: This feature is extremely important for several of the
              checks implemented by ratproxy. If the -d option is missing,
              ratproxy will treat all URLs as being part of the audited
              service, and cross-domain interaction checks will not be
              carried out at all. If it is set incorrectly, report coverage
              may decrease.

              Multiple -d options may and often should be combined to 
              define the perimeter for testing and flow analysis (e.g., -d 
              example.com -d example-ad-service.com -d example-ng.com).

-P host:port - causes ratproxy to talk to an upstream proxy instead of directly routing requests to target services. Useful for testing systems behind corporate proxies, or chaining multiple proxy-type security testing tools together.
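
              For example - purely illustrative, assuming a second testing
              proxy (such as Burp) is already listening on 127.0.0.1:8081
              and that example.com stands in for the audited domain - the
              two can be chained like this:

              ./ratproxy -v out -w ratproxy.log -d example.com -P 127.0.0.1:8081 -lfscm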

-l - ratproxy sometimes needs to tell if a page has substantially changed between two requests to better qualify the risks associated with some observations. By default, this is achieved through strict page checksum comparison (MD5). This option enables an alternative, relaxed checking mode that relies on page length comparison instead.

              Since some services tend to place dynamically generated 
              tokens on rendered pages, it is generally advisable to enable
              this mode most of the time.

-2 - several services are known to render the same page with dynamic content of variable length in response to two subsequent, otherwise identical requests. This might be a result of inline ad rendering, or other content randomization.

              When dealing with such services, ratproxy might be instructed 
              to acquire three, not two, samples for page comparison for some 
              checks, to further minimize the number of false positives.

-e - enables pedantic caching header validation. Security problems may arise when documents clearly not meant to be cached are served in a way that permits public proxies to store them. By default, ratproxy detects poorly chosen HTTP/1.1 caching directives that are most likely to affect the general population.

              Some additional issues may appear with users behind legacy 
              proxies that support HTTP/1.0 only, however - as is the case
              with several commercial solutions. These proxies may ignore 
              HTTP/1.1 directives and interpret HTTP/1.0 cues only. In -e
              mode, ratproxy will complain about all cases where there 
              appears to be a mismatch between HTTP/1.0 and HTTP/1.1 caching
              intents.

              This tends to generate a large number of warnings for many 
              services; if you prefer to focus on more pressing issues first,
              you might want to keep it off at first.
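
              As a quick, illustrative way to review the caching intents
              this check evaluates, the response headers of a sensitive
              page can be inspected by hand (the URL below is just a
              placeholder); a document meant to stay private should carry
              consistent HTTP/1.1 and HTTP/1.0 directives (Cache-Control,
              Expires, Pragma):

              curl -sI https://example.com/account | grep -iE 'cache-control|expires|pragma'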

-x - tells the proxy to log all URLs that seem to be particularly well-suited for further, external XSS testing (by virtue of being echoed on the page in a particular manner). By default, ratproxy will not actually attempt to confirm these vectors (the -X option enables disruptive checking, however) - but you will be able to use the data for manual testing or as input to third-party software.

              Generally recommended, unless it proves to be too noisy.

-t - by default, ratproxy logs some of the most likely directory traversal candidates. This option tells the proxy to log less probable guesses, too. These are good leads for manual testing or as input to an external application.

              Generally recommended, unless it proves to be too noisy.

-i - with this option supplied, ratproxy will log all PNG files served inline. PNG files are a cross-site scripting vector in some legacy browsers. The default behavior is to log only those images that require authentication, based on the assumption that such images are most likely to be user-controlled.

              This option should be enabled when auditing applications
              that permit picture uploads and sharing; otherwise, it may 
              just generate noise.

-f - with this option enabled, the proxy will log all Flash applications encountered for further analysis. This is particularly useful when combined with -v, in which case, Flash files will be automatically disassembled and conveniently included in 'ratproxy-report.sh' output.

              Since recent Flash vulnerabilities make the platform a major 
              potential cross-site scripting vector, it is advisable to 
              enable this feature.

-s - tells ratproxy to log all POST requests for further analysis and processing, in a separate section of the final report. This is useful for bookkeeping and manual review, since POST features are particularly likely to expose certain security design flaws.

-c - enables logging of all URLs that seem to set cookies, regardless of their presumed security impact. Again, useful for manual design analysis and bookkeeping. Not expected to contribute much noise to the report.

-g - extends XSRF token validation checks to GET requests. By default, the proxy requires anti-XSRF protection on POST requests and cookie setters only. Some applications tend to perform state-changing operations via GET requests, too, and so with this option enabled, additional data will be collected and analyzed.

              This feature is verbose, but useful for certain application 
              designs.

-j - enables detection of discouraged Javascript syntax, such as eval() calls or .innerHTML operations. Javascript code that makes use of these will be tagged for manual inspection.

-m - enables logging of "active" content referenced across domain boundaries to detect patterns such as remote image inclusion or remote linking (note that logging of remote script or stylesheet inclusion is enabled at all times).

              This option has an effect only when a proper set of domains
              is specified with the -d command-line parameter - and is
              recommended for sites where careful control of cross-domain
              trust relationships needs to be ensured.

-X - enables active testing. When this option is provided, ratproxy will attempt to actively, disruptively validate the robustness of XSS and XSRF defenses whenever such a check is deemed necessary.

              By virtue of performing passive preselection, this does not
              generate excessive traffic and maintains the same level of
              coverage as afforded in passive mode.

              The downside is that these additional requests may disrupt 
              the application or even trigger persistent problems; as such,
              please exercise caution when using it against mission-critical
              production systems.

-C - in disruptive testing mode, ratproxy will replay some requests with modified parameters. This may disrupt the state of some applications and make them difficult to navigate. To mitigate this, the -C option enables additional replaying of the unmodified request at the end of the process, in hopes of restoring the original server-side state.

              This option is generally recommended in -X mode.

-k - instructs ratproxy that the application is expected to use HTTPS exclusively; any downgrades to HTTP will be reported and prioritized depending on potential impact.

              This option obviously makes sense only if the application is 
              indeed meant to use HTTPS and HTTPS only.

-a - tells ratproxy to indiscriminately log all visited URLs. Useful for assessing the coverage achieved.

In practice, for low verbosity reporting that looks for high-probability issues only, a good starting point is:

./ratproxy -v <outdir> -w <outfile> -d <domain> -lfscm 

To increase verbosity and include output from some less specific checks, the following set of options is a good idea:

./ratproxy -v <outdir> -w <outfile> -d <domain> -lextifscgjm 

For active testing, simply add -XC options as needed.
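
As a complete, purely illustrative invocation - with out/, ratproxy.log, and example.com standing in for your own output directory, log file, and audited domain - a verbose, active-testing run could look like this:

./ratproxy -v out -w ratproxy.log -d example.com -lextifscgjm -XC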

Once the proxy is running, you need to configure your web browser to point to the appropriate machine and port (a simple Firefox extension such as QuickProxy may come in handy in the long run); it is advisable to close any non-essential browser windows and purge the browser cache, so as to maximize coverage and minimize noise.
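
To quickly confirm that the proxy is up before you start browsing, a single manual request can be routed through it (illustrative only - adjust host, port, and URL as needed; this is merely a connectivity check, not a substitute for real, browser-driven interaction):

curl -s -o /dev/null -w '%{http_code}\n' -x http://127.0.0.1:8080 http://example.com/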

The next step is to open the tested service in your browser, log in if necessary, then interact with it in a regular, reasonably exhaustive manner: try all available views, features, upload and download files, add and delete data, and so forth - then log out gracefully and terminate ratproxy with Ctrl-C.

NOTE: Do not be tempted to tunnel automated spider traffic (e.g. wget -r or active scanners) via ratproxy. This will not have the desired effect. The tool depends strictly on being able to observe well-behaved, valid user-application interaction.

SECURITY WARNING: When interacting with SSL applications, ratproxy will substitute its own, dummy, self-signed certificate in place of that legitimately returned by the service. This is expected to generate browser warnings - click through them to accept the key temporarily for the site. Do not add the key permanently to your browser configuration - the key is known to anyone who ever downloaded the tool. Furthermore, please note that ratproxy will also forego any server certificate validation steps - so while interacting with the service in this mode, you can have no expectation of server identity, transmission integrity, or data privacy. Do not use important accounts and do not enter sensitive data while running ratproxy tests.

Once the proxy is terminated, you may further process its pipe-delimited (|), machine-readable, greppable output with third-party tools if so desired, then generate a human-readable HTML report:

./ratproxy-report.sh ratproxy.log >report.html 

This will produce an annotated, prioritized report with all the identified issues. When opened in a browser, you will have an opportunity to replay GET and POST requests, tweak their parameters, view traces, and inspect Flash disassemblies, too.
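
Since the log is greppable, ordinary text tools also work well for quick triage before opening the report. For example - the exact message strings vary, see messages.list - the following counts entries mentioning particular issue classes:

grep -ci 'xss' ratproxy.log
grep -ci 'caching' ratproxy.log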

Enjoy :-)


Credits, contributions, suggestions

If you are interested in contributing to the project, a list of features and improvements for the proxy can be found in doc/TODO in the source tarball.

If you have any questions, suggestions, or concerns regarding the application, the author can be reached at lcamtuf@google.com.

Ratproxy was made possible by the contributions of, and valuable feedback from, Google's information security engineering team.