Closed GuilloOme closed 7 years ago
the recursion is still not working as expected. From htcap.org/scanme/ng/ the "steps" should be crawled in this order: 1, 2, 20, 3, 4, 5, 21, 22. Your code produces 1, 2, 20, 3, 21, 4, 22, 5. Also, there are still duplicate requests in the database.
The order is different because the crawl runs asynchronously: it calls the next step as soon as one is ready (it gets 2 and 20, then requests 3 and 21…). In fact, this could be a problem in some edge cases.
As for the duplicates: since the referrer URLs* are different, every request made from those URLs is considered a different request.
* index.php, index.php?document-location-href=1, index.php?document-location=1, index.php?openurl=undefined and index.php?window-location=1
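To illustrate the point about referrers, here is a minimal sketch of a dedup key that ignores the referrer, so the five requests above would collapse into one. The `requestKey` and `dedupe` helpers are hypothetical names for this example, not htcap's actual schema:

```javascript
// Hypothetical dedup key: two requests count as "the same" if method and
// URL match, regardless of which referrer page triggered them.
function requestKey(request) {
  return request.method.toUpperCase() + ' ' + request.url;
}

function dedupe(requests) {
  const seen = new Set();
  return requests.filter(function (request) {
    const key = requestKey(request);
    if (seen.has(key)) return false; // already recorded from another referrer
    seen.add(key);
    return true;
  });
}
```

Whether that is the right behaviour is debatable: keying on the referrer too preserves the crawl graph, at the cost of duplicate rows.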
Fixes #22 and the issues from #26. There are a lot of changes here; it's some pretty advanced low-level JavaScript (as low as JavaScript can go ;) ). If you need more information about it, ask me!
What has been done
Removed all the logic around waiting/syncing on events (based on timing) and replaced it with logic based on the event-loop cycle (more here: Concurrency model and Event Loop, and this very good talk). Everything happening on the analysed page is now placed in a queue and processed when the stack is ready to take more.
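The event-loop-based queue can be sketched roughly like this (names like `enqueue`/`drain` are illustrative, not the probe's actual internals): tasks are queued and the loop yields between them with `setTimeout(fn, 0)`, so DOM events and XHR callbacks scheduled by one task get to run before the next one starts.

```javascript
// Illustrative sketch of an event-loop-driven task queue replacing
// timing-based waits.
const taskQueue = [];
let running = false;

function enqueue(task) {
  taskQueue.push(task);
  if (!running) drain();
}

function drain() {
  running = true;
  // setTimeout(fn, 0) yields to the event loop between tasks, so anything
  // a task scheduled (events, XHR callbacks) runs before the next task.
  setTimeout(function next() {
    const task = taskQueue.shift();
    if (task) {
      task();
      setTimeout(next, 0);
    } else {
      running = false;
    }
  }, 0);
}
```

The key property is that the queue drains one macrotask at a time instead of sleeping for a fixed interval and hoping the page has settled.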
Used the browser's XHR event API to get the most up-to-date status of a request (no more synchronous waits)
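A minimal sketch of what listening to XHR events (instead of polling) looks like; `pendingXhrs` and `trackXhr` are assumed names for this example only:

```javascript
// Track in-flight XHRs via the XHR event API rather than a sync wait.
let pendingXhrs = 0;

function trackXhr(xhr) {
  pendingXhrs += 1;
  // 'loadend' fires on success, error and abort alike, so the counter
  // always returns to zero once the request settles.
  xhr.addEventListener('loadend', function () {
    pendingXhrs -= 1;
  });
  return xhr;
}

// In a browser, a request would be wrapped like:
//   const xhr = trackXhr(new XMLHttpRequest());
//   xhr.open('GET', '/scanme/ng/');
//   xhr.send();
```

The crawler can then wait for `pendingXhrs` to reach zero before considering the page idle.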
Used the browser's MutationObserver API to catch any change to the DOM (much faster and more precise than storing an array of elements)
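The MutationObserver approach can be sketched as below (illustrative, browser-only; `watchDom` and `collectAddedElements` are names made up for this example):

```javascript
// Extract newly added element nodes from a batch of mutation records.
function collectAddedElements(mutations) {
  const elements = [];
  for (const mutation of mutations) {
    for (const node of mutation.addedNodes) {
      if (node.nodeType === 1) elements.push(node); // element nodes only
    }
  }
  return elements;
}

// Watch the whole document for new elements instead of diffing a stored
// array of elements on a timer.
function watchDom(onNewElement) {
  const observer = new MutationObserver(function (mutations) {
    collectAddedElements(mutations).forEach(onNewElement);
  });
  observer.observe(document.documentElement, { childList: true, subtree: true });
  return observer; // call observer.disconnect() when the probe finishes
}
```

The observer delivers batched records on the microtask queue, which fits the event-loop-based design above.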
The whole probe is now fully asynchronous
Cleaned up a lot of code ;)
Benefits
Detect window.location (navigation calls)
Drawback
Test
Got 85% coverage on wivet (was 35% before). Got more results from http://htcap.org/scanme:
index.php?document-location-href=1
index.php?document-location=1
index.php?openurl=undefined
index.php?window-location=1