bedirhan / wivet

Web Input Vector Extractor Teaser

Using db to store results #10

Open · skhakimov opened this issue 10 years ago

skhakimov commented 10 years ago

Hello,

I was able to use wivet successfully when storing results in .dat files, but I ran into some issues when using a SQL database to store results. I have set up the wivet database, configured the credentials, and set the following:

define('DATASTORE', 'db');

However, crawling results are not being stored, and the home page with the main table is blank. Is there anything else that must be done to make wivet work with a database?
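Roughly, the relevant part of my config looks like this (the credential constant names below are placeholders, not necessarily the exact ones wivet uses):

<?php
// Sketch of the datastore config; DATASTORE is the real setting,
// the DB_* constant names are placeholders for the credentials.
define('DATASTORE', 'db');        // use the database instead of .dat files
define('DB_HOST', 'localhost');   // placeholder name
define('DB_USER', 'wivet');       // placeholder name
define('DB_PASS', 'secret');      // placeholder name
define('DB_NAME', 'wivet');       // placeholder name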

The reason I am trying to use a SQL database instead of .dat files is that I think there might be a race condition when crawling wivet. Using ZAP's built-in Spider, I see GET requests being sent successfully to wivet/innerpages/test#, but not all of them are recorded as a pass in the wivet statistics.

Thank you

bedirhan commented 10 years ago

Hi, the race condition is a valid concern, and others have probably run into it too. However, since a crawler usually sends multiple requests to a single URL, the issue hasn't come up very often.

One way to verify this might be to repeat a request that you see being sent but that doesn't show up in the WIVET results, using the same session id. Although it's unlikely, it may be that some of the requests generated by the spider are sent without the cookie.
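For example, replaying one of those requests with the spider's session cookie could look roughly like this in PHP (the URL is a placeholder for the exact inner page you saw, and I'm assuming PHP's default PHPSESSID cookie name):

<?php
// Replay a spider request with the same session id and then check whether
// the pass shows up in the WIVET results.
$url    = 'http://localhost/wivet/innerpages/REPLACE_ME'; // exact page the spider hit
$sessid = 'PASTE_SESSION_ID_HERE';                        // the spider's session id

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_COOKIE, 'PHPSESSID=' . $sessid); // cookie name assumed
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_exec($ch);
echo 'HTTP ' . curl_getinfo($ch, CURLINFO_HTTP_CODE) . "\n";
curl_close($ch);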

About the db connectivity, I'm out of ideas. It may be overkill, but you could write a simple PHP script using WIVET's DB connection functions and see whether that works.
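Even something as simple as the following could tell you whether the connection itself works (this is a sketch using plain mysqli with placeholder credentials, not WIVET's own helper functions, and it assumes a MySQL backend):

<?php
// Standalone connectivity check: connect with the same credentials as in
// the wivet config and list the tables in the schema.
$conn = mysqli_connect('localhost', 'wivet', 'secret', 'wivet'); // placeholders
if (!$conn) {
    die('Connection failed: ' . mysqli_connect_error() . "\n");
}
echo "Connected OK\n";

if ($result = mysqli_query($conn, 'SHOW TABLES')) {
    while ($row = mysqli_fetch_row($result)) {
        echo $row[0], "\n";
    }
    mysqli_free_result($result);
}
mysqli_close($conn);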

Let me know about the results, and I'll add the race condition concern to the GitHub README.