Closed Zapotek closed 8 years ago
Hi Tasos, have you been able to make any progress? We've been looking at the scan logs more closely and noticed that even when providing a manual sitemap in the configuration, the parameters Arachni is hitting aren't valid. It doesn't seem to fuzz any JSON or AJAX key/values at all. We believe we may be getting no coverage at this point, based on some cross-comparisons with AppSpider (the other scan engine we are using).
Thanks and if there is anything I can do to help (logs, configs, etc.) please let me know.
I haven't started work on this yet, I'll be keeping this issue updated when I do.
This is new though right? Was it working in the past? I'd like to make sure it's not a regression on Arachni's part but if I recall correctly you had some issues logging in with the stable release so we can't compare the two, right?
I'll run a stable scan and see if something has changed. It definitely seems that the latest builds have had issues crawling more links, but I think both stable and latest have had issues with fuzzing the JSON/AJAX components. Will update this ticket once I get results.
@sbehrens Btw, I updated the way the cookiejar operates to more closely resemble dumb storage: it now just sends cookies back in the encoding in which they were received. Let me know if you notice any login/session issues.
Working on this now, should be in the nightlies in a few days if all goes well.
Nightlies now include this feature, although I didn't try it yet against Netflix. Anyhow, give them a try and let me know if you see an improvement or if something explodes.
I didn't really observe any difference with the nightly. Pretty much every request to netflix.com fuzzes the following:
`raw_query, ac_category_type, ac_rel_posn, ac_abs_posn, v1, search_submit, search-input-toggle`
I'm not sure what these parameters are. It seems to find `search-input-toggle` on almost every request (I've never seen this parameter).
They aren't valid, and they don't result in any findings. Most requests via our API happen via Ajax calls to JSON endpoints.
Here are some findings on different pages with the same parameters that it fuzzes:
[+] 1 | Cross-Site Request Forgery at https://www.netflix.com/mylistorder in form with inputs `search-input-toggle`.
[+] 2 | Cross-Site Request Forgery at https://www.netflix.com/pin in form with inputs `search-input-toggle`.
[+] 3 | Cross-Site Request Forgery at https://www.netflix.com/WiSearch in form with inputs `raw_query, ac_category_type, ac_rel_posn, ac_abs_posn, v1, search_submit`.
[+] 4 | Cross-Site Request Forgery at https://www.netflix.com/managedevices in form with inputs `search-input-toggle`.
[+] 5 | Cross-Site Request Forgery at https://www.netflix.com/languagepreferences in form with inputs `language, languagePrefsSave, languagePrefsCancel`.
[+] 6 | Cross-Site Request Forgery at https://www.netflix.com/languagepreferences in form with inputs `search-input-toggle`.
[+] 7 | Cross-Site Request Forgery at https://www.netflix.com/hdtoggle in form with inputs `videoQuality, autoplay, playbackPrefsSave, playbackPrefsCancel`.
[+] 8 | Cross-Site Request Forgery at https://www.netflix.com/hdtoggle in form with inputs `search-input-toggle`.
[+] 9 | Cross-Site Request Forgery at https://www.netflix.com/email in form with inputs `search-input-toggle`.
[+] 10 | Cross-Site Request Forgery at https://www.netflix.com/changeplan in form with inputs `search-input-toggle`.
[+] 11 | Cross-Site Request Forgery at https://www.netflix.com/emailpreferences in form with inputs `NetflixNews, MemberNews, NetflixOffersViaEmail, NetflixSurveys, SMS_TRANSACTIONAL, uncheckAll`.
[+] 12 | Cross-Site Request Forgery at https://www.netflix.com/emailpreferences in form with inputs `search-input-toggle`.
[+] 13 | Cross-Site Request Forgery at https://www.netflix.com/donottest in form with inputs `search-input-toggle`.
[+] 14 | Cross-Site Request Forgery at https://www.netflix.com/billingactivity in form with inputs `search-input-toggle`.
[+] 15 | Cross-Site Request Forgery at https://www.netflix.com/activate in form with inputs `search-input-toggle`.
[+] 16 | Cross-Site Request Forgery at http://www.netflix.com/Kids/search in form with inputs `raw_query, ac_category_type, ac_rel_posn, ac_abs_posn, v1, search_submit`.
It seems Arachni isn't scanning any of the Ajax/JSON calls, and/or it's having some issue finding the parameters to fuzz.
As an example, let's look at /activate using Burp Suite to see what we should be seeing. A GET request to /activate returns a form expecting a device number. When clicking submit, an Ajax call is made to:
https://www.netflix.com/api/shakti/723370cb/account/devices
With the following POST body:
{"activationCode":"112345","authURL":"REDACTED+tj45taRZQ8iE="}
I can't even find the parameters it is trying to fuzz for /activate, so I'm not sure where they come from; they obviously aren't being used (no bugs have ever been found through them).
When running Arachni through a proxy it seems none of the Ajax calls are being parsed or checked.
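For context, auditing a JSON endpoint like the /activate call above generally means treating each JSON key as its own input vector and mutating it while leaving the rest of the body intact. A minimal sketch of that idea (this is illustrative code, not Arachni's implementation; the function name and payload are made up):

```javascript
// Treat every key of a JSON request body as an input vector: produce one
// mutated copy of the body per key, with a payload appended to that key's
// value. (Sketch only; not Arachni's actual auditing code.)
function fuzzJsonBody(body, payload) {
  const original = JSON.parse(body);
  const mutations = [];
  for (const key of Object.keys(original)) {
    const mutated = { ...original, [key]: String(original[key]) + payload };
    mutations.push({ vector: key, body: JSON.stringify(mutated) });
  }
  return mutations;
}

// Example using the shape of the /activate POST body (values are placeholders):
const mutations = fuzzJsonBody(
  '{"activationCode":"112345","authURL":"REDACTED"}',
  '<injected>'
);
// One mutation per key: activationCode, authURL
```

A scanner that covers these endpoints would show each key being audited individually in its logs, which is exactly what the proxy traffic here does not show.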
The button is missing the `click` inherited event, although it did inherit others.
I'm on the right track at least, I'll keep looking into it and keep you posted.
My bad, forgot to include `window` and `document` in the inheritance process; it was only dealing with nodes.
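The inheritance fix can be sketched like this (illustrative JS on toy objects, not Arachni's internals): merge each element's own events with those of every ancestor, continuing up through `document` and `window` instead of stopping at plain nodes, since a handler on an ancestor may be delegating for its children.

```javascript
// Collect the events an element effectively responds to: its own handlers
// plus those inherited from every ancestor, all the way up through document
// and window. The bug described above stopped at plain nodes and so missed
// handlers attached at the document/window level. (Sketch; names made up.)
function inheritedEvents(element) {
  const events = new Set(element.events || []);
  for (let ancestor = element.parent; ancestor; ancestor = ancestor.parent) {
    for (const e of ancestor.events || []) events.add(e);
  }
  return [...events];
}

// Toy tree: window -> document -> button.
const windowNode   = { events: ['click'], parent: null };
const documentNode = { events: ['mouseover'], parent: windowNode };
const button       = { events: [], parent: documentNode };
// button has no handlers of its own, yet inherits click and mouseover.
```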
@sbehrens I'm still optimizing because there's more workload than before, I'll let you know once I have some nightlies for you to test.
Your existing configuration should work with the fresh nightlies if you also add the following:
--scope-exclude-pattern=ichnaea --scope-redundant-path-pattern=browse:10 --scope-redundant-path-pattern=WiMovie:10 --scope-redundant-path-pattern=/title/:10
Also, the seed URL will need to be something other than `/` or `/browse` (like https://www.netflix.com/EmailPreferences). I think it's because `/browse` creates infinite snapshots, and the redundancy rules necessary to avoid an infinite loop prevent the system from triggering the drop-down menu for the user account, which leads to the other resources. I'll recheck that though to make sure.
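As a rough illustration of what those `--scope-redundant-path-pattern` rules do (a sketch of the concept, not Arachni's source): each pattern carries a counter, and once that many matching URLs have been followed, further matches are skipped, which is what keeps `/browse` from looping forever.

```javascript
// Redundancy rules from the configuration above: a pattern may only be
// followed N times before matching URLs are skipped. (Illustrative only.)
const redundancyRules = new Map([
  [/browse/,    10],
  [/WiMovie/,   10],
  [/\/title\//, 10],
]);

function shouldFollow(url) {
  for (const [pattern, remaining] of redundancyRules) {
    if (!pattern.test(url)) continue;
    if (remaining <= 0) return false;   // budget exhausted, skip the URL
    redundancyRules.set(pattern, remaining - 1);
  }
  return true;                          // no exhausted rule matched
}
```

The trade-off described in this thread follows directly: a limit low enough to stop the infinite snapshots can also cut off the crawl before it reaches the account drop-down menu.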
In general, the system managed to crawl the user settings and extract and audit the JSON input vectors.
@sbehrens Spotted the issue for `/browse`, will let you know once I'm done.
Thanks Tasos, I ran the scan from Ubuntu and it looked a bit better. I wanted to run it locally so I could proxy the traffic, but something is messed up with the OSX package (see below):
Could not open library '/usr/lib/libc.dyliblibc.dylib': dlopen(/usr/lib/libc.dyliblibc.dylib, 5): image not found
from /Users/sbehrens/scan_engines/arachni-2.0dev-1.0dev/system/gems/bundler/gems/ffi-856d674420fe/lib/ffi/library.rb:100:in `map'
from /Users/sbehrens/scan_engines/arachni-2.0dev-1.0dev/system/gems/bundler/gems/ffi-856d674420fe/lib/ffi/library.rb:100:in `ffi_lib'
from /Users/sbehrens/scan_engines/arachni-2.0dev-1.0dev/system/gems/gems/ethon-0.8.1/lib/ethon/libc.rb:8:in `<module:Libc>'
from /Users/sbehrens/scan_engines/arachni-2.0dev-1.0dev/system/gems/gems/ethon-0.8.1/lib/ethon/libc.rb:6:in `<module:Ethon>'
from /Users/sbehrens/scan_engines/arachni-2.0dev-1.0dev/system/gems/gems/ethon-0.8.1/lib/ethon/libc.rb:1:in `<top (required)>'
from /Users/sbehrens/scan_engines/arachni-2.0dev-1.0dev/system/gems/gems/ethon-0.8.1/lib/ethon.rb:14:in `require'
from /Users/sbehrens/scan_engines/arachni-2.0dev-1.0dev/system/gems/gems/ethon-0.8.1/lib/ethon.rb:14:in `<top (required)>'
from /Users/sbehrens/scan_engines/arachni-2.0dev-1.0dev/system/gems/gems/typhoeus-1.0.1/lib/typhoeus.rb:2:in `require'
from /Users/sbehrens/scan_engines/arachni-2.0dev-1.0dev/system/gems/gems/typhoeus-1.0.1/lib/typhoeus.rb:2:in `<top (required)>'
from /Users/sbehrens/scan_engines/arachni-2.0dev-1.0dev/system/gems/bundler/gems/arachni-43fe2221c230/lib/arachni/http/client.rb:9:in `require'
from /Users/sbehrens/scan_engines/arachni-2.0dev-1.0dev/system/gems/bundler/gems/arachni-43fe2221c230/lib/arachni/http/client.rb:9:in `<top (required)>'
from /Users/sbehrens/scan_engines/arachni-2.0dev-1.0dev/system/gems/bundler/gems/arachni-43fe2221c230/lib/arachni/http.rb:9:in `require_relative'
from /Users/sbehrens/scan_engines/arachni-2.0dev-1.0dev/system/gems/bundler/gems/arachni-43fe2221c230/lib/arachni/http.rb:9:in `<top (required)>'
from /Users/sbehrens/scan_engines/arachni-2.0dev-1.0dev/system/gems/bundler/gems/arachni-43fe2221c230/lib/arachni/framework.rb:29:in `require'
from /Users/sbehrens/scan_engines/arachni-2.0dev-1.0dev/system/gems/bundler/gems/arachni-43fe2221c230/lib/arachni/framework.rb:29:in `<module:Arachni>'
from /Users/sbehrens/scan_engines/arachni-2.0dev-1.0dev/system/gems/bundler/gems/arachni-43fe2221c230/lib/arachni/framework.rb:17:in `<top (required)>'
from /Users/sbehrens/scan_engines/arachni-2.0dev-1.0dev/system/gems/bundler/gems/arachni-43fe2221c230/lib/arachni.rb:95:in `require_relative'
from /Users/sbehrens/scan_engines/arachni-2.0dev-1.0dev/system/gems/bundler/gems/arachni-43fe2221c230/lib/arachni.rb:95:in `<top (required)>'
from /Users/sbehrens/scan_engines/arachni-2.0dev-1.0dev/system/gems/bundler/gems/arachni-43fe2221c230/ui/cli/framework.rb:9:in `require_relative'
from /Users/sbehrens/scan_engines/arachni-2.0dev-1.0dev/system/gems/bundler/gems/arachni-43fe2221c230/ui/cli/framework.rb:9:in `<top (required)>'
from /Users/sbehrens/scan_engines/arachni-2.0dev-1.0dev/system/gems/bundler/gems/arachni-43fe2221c230/bin/arachni:10:in `require_relative'
from /Users/sbehrens/scan_engines/arachni-2.0dev-1.0dev/system/gems/bundler/gems/arachni-43fe2221c230/bin/arachni:10:in `<top (required)>'
from /Users/sbehrens/scan_engines/arachni-2.0dev-1.0dev/bin/../system/arachni-ui-web/bin/arachni:16:in `load'
from /Users/sbehrens/scan_engines/arachni-2.0dev-1.0dev/bin/../system/arachni-ui-web/bin/arachni:16:in `<main>'
Not sure if it's a bad build or dependency.
Dependency issue for OSX 10.11: https://github.com/ffi/ffi/issues/461#issuecomment-174352401
Got it, thanks.
So, the necessary events are now seen, but the nodes for the drop-down user menu in `/browse` are created on `click` or `mouseover`, and that operation is delayed due to the fade-in effect.
Unfortunately, there's no way for the system to know how long to wait, since the operation is asynchronous (I think), so when it checks for changes immediately after the event is triggered there are none.
In the rest of the pages the drop-down nodes for the user-menu are there but just not visible, which is why the paths are extracted and followed.
Got it. My workaround for the time being is I have added those paths from the drop down manually to the scan configuration, which seems to work.
Yeah, or just provide an alternative seed URL. I also just noticed an issue that went away after I deleted the browser cache: the crawl was finishing almost immediately, which sounds similar to something you had mentioned in the past. Let me know if that happens again.
I've got some good news: by tracking changes in `setTimeout` timers caused by event triggers, I can identify effects/transitions and wait for them to complete before capturing a page snapshot.
This worked in another test case, not yet for your menu though, but I'll keep looking into it.
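The idea can be sketched like this (illustrative JS, not the actual Arachni/PhantomJS code; all names here are made up): wrap `setTimeout` so that timers started by event handlers can be counted as pending, then delay the snapshot until they have all fired.

```javascript
// Track timers that are still pending so we can tell when effects or
// transitions triggered by an event have settled. (Sketch only.)
const pendingTimers = new Set();
const realSetTimeout = setTimeout;

function trackedSetTimeout(fn, delay, ...args) {
  const entry = {};
  pendingTimers.add(entry);
  return realSetTimeout(() => {
    pendingTimers.delete(entry);   // timer fired, no longer pending
    fn(...args);
  }, delay);
}

// Poll until no timers are pending, then capture the snapshot.
function snapshotWhenQuiet(takeSnapshot) {
  if (pendingTimers.size === 0) return takeSnapshot();
  realSetTimeout(() => snapshotWhenQuiet(takeSnapshot), 10);
}
```

A page would install `trackedSetTimeout` in place of the native one before triggering events; anything a fade-in schedules then shows up in `pendingTimers`, so the snapshot naturally waits out the transition.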
Excellent news, let me know what you find out!
What an idiot, I forgot about the `/browse` redundancy rules I had set; they were preventing my change from working. Still, I'm not sure whether the redundancy rules can be avoided in this case, or what the magic limit is, but I'm running a crawl to see what happens if I remove them completely.
Anyhow, I'll let you know once nightlies are up, since it's up to the config from now on, hopefully.
Nightlies are up, but I think you should stick with strict redundancy rules for `/browse` and explicitly specify the paths like you do now; it's the fastest approach.
Hi @Zapotek, any luck enhancing the crawler? I'm still seeing Arachni have trouble parsing and clicking links/buttons on the site (based on proxy logs).
I don't think that has to do with delegated events; that functionality has been added. It could be something else, maybe a deduplication issue. Have you tried the nightlies, btw?
No luck with the nightlies; it seems to really struggle with the buttons/links. As an example, it thinks pretty much every page has this parameter `search-input-toggle`, which isn't valid. When it starts fuzzing parameters for pages, it only seems to fuzz this `search-input-toggle` and a few other invalid parameters. The sitemap looks okay though, so it's at least able to click some links (although buttons/events seem more difficult). Truncated sitemap below (with some of the weird parameters it thinks it found):
...snip
https://www.netflix.com/phonenumber?search-input-toggle=
https://www.netflix.com/phonenumber?search-input-toggle=1
https://www.netflix.com/pin
https://www.netflix.com/pin?search-input-toggle=
https://www.netflix.com/pin?search-input-toggle=1
https://www.netflix.com/popupdetails
https://www.netflix.com/privacypolicychanges
https://www.netflix.com/redeem
https://www.netflix.com/rememberme
https://www.netflix.com/reviews
https://www.netflix.com/reviews?search-input-toggle=
https://www.netflix.com/search
https://www.netflix.com/watch
https://www.netflix.com/whysecure
https://www.netflix.com/wiviewingactivity
https://www.netflix.com/youraccount
https://www.netflix.com/youraccount?search-input-toggle=
https://www.netflix.com/youraccount?search-input-toggle=1
I do remember everything working fine when I closed this issue, I'll have another look at it. Are you setting a timeout for the scan?
I set a 6 hour timeout, which is always triggered during scans.
Another thing that played a part last time was that DOM checks, and browser jobs in general, run in parallel with everything else. The jobs are pushed to a queue and there's no way to say when they will be executed. So it could be that things pushed to the queue early on don't get a chance to run within 6 hours, although at least some probably will. Do you see a total miss of a specific kind of input, or do some inputs get checked while others don't?
I'm running a scan now to give more context. Basically, it finishes its attack portion on a page but never fuzzes any valid parameters. I'll try to give you a more concrete example.
If the parameters need to be triggered by a DOM event they won't appear within the framework until later, and in complex webapps much, much later.
Hmm, I can let it run for 24 hours and see if that helps. As an example, here are some of the 'phantom' parameters the scanner tries on every single Netflix page:
[*] XSS in script context: Auditing link input 'lnkctr' pointing to: 'https://www.netflix.com/YourAccount'
[*] XSS in script context: Auditing form input 'search-input-toggle' pointing to: 'https://www.netflix.com/YourAccount'
Neither `lnkctr` nor `search-input-toggle` is a valid parameter. I see these particular parameters fuzzed on most requests. I never see any of the JSON parameters fuzzed when proxying or viewing the log output.
I remember `search-input-toggle` being a hidden form parameter that appears in the HTML. I don't remember `lnkctr`.
If I browse the site using Burp Suite, I do not see any requests that include that parameter. I can send you a Burp log showing the differences, if that would be helpful.
If Arachni is auditing it, it must be there in some way or another, regardless of whether or not it ends up being transmitted to the webapp over HTTP. The interface could be treating it as something transient, just to hold a value, and not actually send it over HTTP in the end. Still, Arachni has to audit it to make sure.
Got it, that makes sense. I'm just not sure it ever gets to the JSON calls. I'll scope a scan to just the account page, let it run to completion, and see if it finally gets to the JSON calls. I'll report back.
I distinctly remember it getting to the JSON requests last time and I'm not aware of any regressions in the nightlies. Let me know how it goes please.
The scanner is slowing down quite a bit, with these error messages:
[-] [browser_cluster/worker#browser_respawn:250] BrowserCluster Worker#40474280: Please try increasing the maximum open files limit of your OS.
[-] [browser_cluster/worker#browser_respawn:248] BrowserCluster Worker#40474280: Could not respawn the browser, will try again at the next job. (Could not start the browser process.)
[-] [browser_cluster/worker#browser_respawn:250] BrowserCluster Worker#40474280: Please try increasing the maximum open files limit of your OS.
[*] File Inclusion: Analyzing response #11327 for link input 'search-input-toggle' pointing to: 'https://www.netflix.com/password'
[~] AutoThrottle: Decreasing HTTP request concurrency to 19.
[~] AutoThrottle: Average response time for this burst: 13.271647476190475
[*] File Inclusion: Analyzing response #11335 for link input 'search-input-toggle' pointing to: 'https://www.netflix.com/password'
[-] [browser_cluster/worker#browser_respawn:248] BrowserCluster Worker#40474280: Could not respawn the browser, will try again at the next job. (Could not start the browser process.)
[-] [browser_cluster/worker#browser_respawn:250] BrowserCluster Worker#40474280: Please try increasing the maximum open files limit of your OS.
Using an AWS M3 XL instance, ulimit shouldn't be an issue. I'm not sure the scan will finish, so any advice would be appreciated.
Can you restart the scan with `--output-debug`? It could help show what's going on if that issue occurs again.
Also, why wouldn't `ulimit` be an issue? It's a default setting on many *nix systems and doesn't actually have to do with real resource availability. Can you show me the output of `ulimit -a`, please?
Sure, I'll try with debug.
hostname:~$ ulimit -a
core file size (blocks, -c) 0
data seg size (kbytes, -d) unlimited
scheduling priority (-e) 0
file size (blocks, -f) unlimited
pending signals (-i) 29518
max locked memory (kbytes, -l) 64
max memory size (kbytes, -m) unlimited
open files (-n) 32768
pipe size (512 bytes, -p) 8
POSIX message queues (bytes, -q) 819200
real-time priority (-r) 0
stack size (kbytes, -s) 8192
cpu time (seconds, -t) unlimited
max user processes (-u) 29518
virtual memory (kbytes, -v) unlimited
file locks (-x) unlimited
My bad, the default is indeed pretty high.
Starting to hit that issue; here's the debug output:
11126: Working
11126: Working
11126: Working
[2016-05-30 18:57:54 +0000 - 0.1] [!] [browser#spawn_phantomjs:1370] BrowserCluster Worker#31332520: Killing process.
[-] [browser_cluster/worker#browser_respawn:248] BrowserCluster Worker#31332520: Could not respawn the browser, will try again at the next job. (Could not start the browser process.)
[-] [browser_cluster/worker#browser_respawn:250] BrowserCluster Worker#31332520: Please try increasing the maximum open files limit of your OS.
[2016-05-30 18:57:54 +0000 - 0.4] [!] [browser_cluster#job_done:213] BrowserCluster: Job done: #<Arachni::BrowserCluster::Jobs::BrowserProvider:19042680 callback= time= timed_out=false>
[*] File Inclusion: Analyzing response #5310 for form input 'search-input-toggle' pointing to: 'https://www.netflix.com/phonenumber'
[2016-05-30 18:57:54 +0000 - 0.3] [!] [browser#spawn_phantomjs:1315] BrowserCluster Worker#31332520: Attempt #0, chose port number 51175
[2016-05-30 18:57:54 +0000 - 0.2] [!] [browser_cluster/worker#run_job:71] BrowserCluster Worker#31332520: Started: #<Arachni::BrowserCluster::Jobs::BrowserProvider:12989820 callback= time= timed_out=false>
[2016-05-30 18:57:54 +0000 - 0.0] [!] [browser#spawn_phantomjs:1302] BrowserCluster Worker#31332520: Spawning PhantomJS...
[2016-05-30 18:57:54 +0000 - 0.0] [!] [browser#spawn_phantomjs:1319] BrowserCluster Worker#31332520: Spawning process: /apps/monterey/monterey/plugins/monterey-arachni/arachni/arachni/bin/../system/usr/bin/phantomjs
[2016-05-30 18:57:54 +0000 - 0.0] [!] [browser#spawn_phantomjs:1340] BrowserCluster Worker#31332520: Process spawned, waiting for it to boot-up...
[2016-05-30 18:57:54 +0000 - 0.2] [!] [browser#spawn_phantomjs:1315] BrowserCluster Worker#31332520: Attempt #0, chose port number 21569
[2016-05-30 18:57:54 +0000 - 0.0] [!] [browser#spawn_phantomjs:1319] BrowserCluster Worker#31332520: Spawning process: /apps/monterey/monterey/plugins/monterey-arachni/arachni/arachni/bin/../system/usr/bin/phantomjs
[2016-05-30 18:57:54 +0000 - 0.0] [!] [browser#spawn_phantomjs:1340] BrowserCluster Worker#31332520: Process spawned, waiting for it to boot-up...
[2016-05-30 18:58:47 +0000 - 0.0] [!] [browser#spawn_phantomjs:1370] BrowserCluster Worker#31332520: Killing process.
[2016-05-30 18:58:47 +0000 - 0.0] [!] [browser#spawn_phantomjs:1370] BrowserCluster Worker#31332520: Killing process.
[2016-05-30 18:58:47 +0000 - 0.1] [!] [browser#spawn_phantomjs:1315] BrowserCluster Worker#31332520: Attempt #5, chose port number 52710
[2016-05-30 18:58:47 +0000 - 0.1] [!] [browser#spawn_phantomjs:1319] BrowserCluster Worker#31332520: Spawning process: /apps/monterey/monterey/plugins/monterey-arachni/arachni/arachni/bin/../system/usr/bin/phantomjs
[2016-05-30 18:58:47 +0000 - 0.3] [!] [browser#spawn_phantomjs:1315] BrowserCluster Worker#31332520: Attempt #5, chose port number 25664
[2016-05-30 18:58:47 +0000 - 0.0] [!] [browser#spawn_phantomjs:1319] BrowserCluster Worker#31332520: Spawning process: /apps/monterey/monterey/plugins/monterey-arachni/arachni/arachni/bin/../system/usr/bin/phantomjs
[2016-05-30 18:58:47 +0000 - 0.1] [!] [browser#spawn_phantomjs:1340] BrowserCluster Worker#31332520: Process spawned, waiting for it to boot-up...
[2016-05-30 18:58:47 +0000 - 0.1] [!] [browser#spawn_phantomjs:1340] BrowserCluster Worker#31332520: Process spawned, waiting for it to boot-up...
I then see it do the following for the next 3 minutes or so:
[2016-05-30 18:59:18 +0000 - 0.0] [!] [browser#spawn_phantomjs:1370] BrowserCluster Worker#31332520: Killing process.
[2016-05-30 18:59:18 +0000 - 0.1] [!] [browser#spawn_phantomjs:1340] BrowserCluster Worker#31332520: Process spawned, waiting for it to boot-up...
[2016-05-30 18:59:18 +0000 - 0.2] [!] [browser#spawn_phantomjs:1370] BrowserCluster Worker#31332520: Killing process.
[2016-05-30 18:59:18 +0000 - 0.2] [!] [browser#spawn_phantomjs:1370] BrowserCluster Worker#31332520: Killing process.
[2016-05-30 18:59:18 +0000 - 0.2] [!] [browser#spawn_phantomjs:1315] BrowserCluster Worker#31332520: Attempt #9, chose port number 26941
[2016-05-30 18:59:18 +0000 - 0.0] [!] [browser#spawn_phantomjs:1319] BrowserCluster Worker#31332520: Spawning process: /apps/monterey/monterey/plugins/monterey-arachni/arachni/arachni/bin/../system/usr/bin/phantomjs
[2016-05-30 18:59:18 +0000 - 0.0] [!] [browser#spawn_phantomjs:1340] BrowserCluster Worker#31332520: Process spawned, waiting for it to boot-up...
[2016-05-30 18:59:18 +0000 - 0.2] [!] [browser#spawn_phantomjs:1315] BrowserCluster Worker#31332520: Attempt #8, chose port number 13450
[2016-05-30 18:59:18 +0000 - 0.1] [!] [browser#spawn_phantomjs:1319] BrowserCluster Worker#31332520: Spawning process: /apps/monterey/monterey/plugins/monterey-arachni/arachni/arachni/bin/../system/usr/bin/phantomjs
[2016-05-30 18:59:18 +0000 - 0.2] [!] [browser#spawn_phantomjs:1340] BrowserCluster Worker#31332520: Process spawned, waiting for it to boot-up...
[2016-05-30 18:59:18 +0000 - 0.4] [!] [browser#spawn_phantomjs:1315] BrowserCluster Worker#31332520: Attempt #8, chose port number 14003
[2016-05-30 18:59:18 +0000 - 0.0] [!] [browser#spawn_phantomjs:1319] BrowserCluster Worker#31332520: Spawning process: /apps/monterey/monterey/plugins/monterey-arachni/arachni/arachni/bin/../system/usr/bin/phantomjs
[2016-05-30 18:59:18 +0000 - 0.0] [!] [browser#spawn_phantomjs:1340] BrowserCluster Worker#31332520: Process spawned, waiting for it to boot-up...
Eventually, at some point, it restarts at Attempt #0.
I don't think that you're using the nightlies, at least not current ones; the `Working` message isn't printed any more.
I'll try the newest nightly, and report back.
It's about 4 hours in, and the summary of findings reflects what I was saying before about it fuzzing strange parameters:
[+] 2 | Cross-Site Request Forgery at https://www.netflix.com/youraccount in form with inputs `search-input-toggle`.
[+] 3 | Cross-Site Request Forgery at https://www.netflix.com/reviews in form with inputs `search-input-toggle`.
[+] 4 | Cross-Site Request Forgery at https://www.netflix.com/subtitlepreferences in form with inputs `search-input-toggle`.
[+] 5 | Cross-Site Request Forgery at https://www.netflix.com/pin in form with inputs `search-input-toggle`.
[+] 6 | Cross-Site Request Forgery at https://www.netflix.com/phonenumber in form with inputs `search-input-toggle`.
[+] 7 | Cross-Site Request Forgery at https://www.netflix.com/mylistorder in form with inputs `search-input-toggle`.
[+] 8 | Cross-Site Request Forgery at https://www.netflix.com/password in form with inputs `search-input-toggle`.
[+] 9 | Cross-Site Request Forgery at https://www.netflix.com/managedevices in form with inputs `search-input-toggle`.
I do see some parameters which seem valid (I redacted them out), so I'll continue to let the scanner run.
Have elements inherit their ancestors' DOM events when building the list in `DOMMonitor#elements_with_events`, in order to cover cases where events of child nodes are handled by an ancestor. This will create a lot more workload which will yield no results most of the time, so consider the following optimizations: