JosephHewitt / wardriver_rev3

A portable ESP32-based WiFi/Bluetooth scanner for Wigle.net.
https://wardriver.uk
GNU General Public License v3.0

adjust dwell time? #177

Closed: ZeroChaos- closed this issue 2 months ago

ZeroChaos- commented 2 months ago

I can easily open a PR for this if you like it, or you can make the trivial change, but I wanted to discuss this line of code and felt an issue was most appropriate: https://github.com/JosephHewitt/wardriver_rev3/blob/main/A/A.ino#L2481
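For anyone reading along without the source open: in the arduino-esp32 core the per-channel dwell is the `max_ms_per_chan` argument to `WiFi.scanNetworks`, so a minimal standalone sketch of the idea looks something like this (illustrative only, not the actual wardriver code):

```cpp
// Illustrative only; not the exact line referenced above. In the arduino-esp32
// core, the per-channel dwell is the max_ms_per_chan argument to
// WiFi.scanNetworks(async, show_hidden, passive, max_ms_per_chan).
#include <WiFi.h>

void setup() {
  Serial.begin(115200);
  WiFi.mode(WIFI_STA);
  WiFi.disconnect();
}

void loop() {
  // Dwell 110 ms on each channel, which is the value under discussion.
  int n = WiFi.scanNetworks(false /*async*/, true /*show_hidden*/,
                            false /*passive*/, 110 /*max_ms_per_chan*/);
  for (int i = 0; i < n; i++) {
    Serial.printf("%s (ch %d, %d dBm)\n", WiFi.SSID(i).c_str(),
                  (int)WiFi.channel(i), (int)WiFi.RSSI(i));
  }
  WiFi.scanDelete();
}
```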

As I understand it, this dwells on a channel for 110ms looking for beacons. Other tools, notably Kismet, default to 200ms. With the default (but configurable) beacon interval of roughly one beacon per 100ms, dwelling for 200ms means you can miss about 50% of the beacons on a channel and still catch one, which increases your chances of picking up a given AP. While 110ms isn't a critically low dwell value, it does slightly lower the chances of picking up a given AP on each pass. Dragorn did some math on this which I'm sharing because it's interesting: https://kismetwireless.net/posts/2022-07-71-wardriving/
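To put rough numbers on that: if beacons arrive every ~102.4ms and each one is received with some probability p (my own simplified model, not taken from the repo or the linked post), the chance of catching at least one beacon during a dwell of d milliseconds is about 1 - (1 - p)^(d/102.4):

```cpp
// Back-of-envelope model (my assumptions, not taken from the repo or the
// Kismet post): beacons every ~102.4 ms, each received independently with
// probability p_rx, so P(catch >= 1) ~= 1 - (1 - p_rx)^(dwell / 102.4).
#include <cmath>
#include <cstdio>

double p_catch(double dwell_ms, double p_rx, double interval_ms = 102.4) {
  return 1.0 - std::pow(1.0 - p_rx, dwell_ms / interval_ms);
}

int main() {
  const double p_values[] = {0.9, 0.5};
  for (double p_rx : p_values) {
    std::printf("p_rx=%.1f: 110 ms dwell -> %.2f, 200 ms dwell -> %.2f\n",
                p_rx, p_catch(110, p_rx), p_catch(200, p_rx));
  }
  return 0;
}
```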

Have you had a chance to test with values other than 110ms? Is this where you personally found the optimum? I'm opening this issue to discuss the possibility of changing the value to 200ms, but I'm certainly interested in discussion/testing/math to back up other values.

JosephHewitt commented 2 months ago

Thanks for bringing this up. I'm not an expert when it comes to RF physics so the value of 110ms didn't have much scientific basis, but it did have logic which I'll explain below:

Board A scans channels 1-13 at 110ms per channel, meaning a full scan takes 1430ms. I'll use a value of 1500ms for the math since a scan can suffer delays. We can easily calculate how much distance a wardriver will move in 1500ms; here are some key figures:
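Those figures are just speed multiplied by the ~1.5 second pass time, for example:

```cpp
// Distance covered during one ~1.5 s full scan pass at a few common speeds.
// Plain speed-times-time arithmetic; the speed list is just an example.
#include <cstdio>

int main() {
  const double pass_s = 1.5;          // 13 channels x 110 ms, plus some slack
  const double mph_to_mps = 0.44704;  // 1 mph in metres per second
  const double speeds_mph[] = {20.0, 30.0, 40.0, 60.0};
  for (double mph : speeds_mph) {
    std::printf("%2.0f mph -> %4.1f m per scan pass\n",
                mph, mph * mph_to_mps * pass_s);
  }
  return 0;  // e.g. 40 mph works out to roughly 27 m per pass
}
```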

My logic was that if a full scan is completed every x meters, where x is the approximate WiFi range, very few average networks should be missed due to speed alone. Based on a few online sources, I used a conservative estimate of 30 meters of average range for 2.4GHz WiFi (this takes into account the obstacles affecting the WiFi signal, e.g. the walls of the building between you and the AP). This means that, assuming my logic isn't completely flawed, traveling at around 40mph should keep the wardriver physically capable of capturing a beacon from every station in range on each scan (i.e. it won't drive past a network so quickly that it can't complete a full scan in the time it takes to pass). Naturally, there are a lot more factors at play, so it will definitely still miss beacons, and this is where increased scan time per channel might become relevant.

If we increase the per-channel scan time to 205ms (WiFi beacons are commonly sent every 102.4ms, so 205ms is 2x the beacon interval), fewer beacons per channel scan will be missed, but the speeds calculated earlier would all be approximately halved, meaning 20mph would be the new maximum optimal speed. I would therefore expect the wardriver to have worse overall results at speeds greater than 20mph. In the UK, where I first started the project, 30mph is a common driving speed, and I wanted to ensure the wardriver could realistically pick up all networks when traveling at that speed; 110ms is a nice value for that situation (again, unless my logic is bad).
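For concreteness, the same back-of-envelope arithmetic for both dwell values, using the ~30 meter range estimate from above (my rounding, so the exact mph figures are approximate):

```cpp
// Maximum speed at which a full 13-channel pass still fits inside the assumed
// ~30 m of 2.4 GHz range, for the two dwell values being compared.
// Rough arithmetic only; real-world results depend on many other factors.
#include <cstdio>

int main() {
  const double range_m = 30.0;
  const double mps_to_mph = 2.23694;
  const double dwells_ms[] = {110.0, 205.0};
  for (double dwell_ms : dwells_ms) {
    double pass_s = 13.0 * dwell_ms / 1000.0 + 0.07;  // small allowance for delays
    double max_mph = (range_m / pass_s) * mps_to_mph;
    std::printf("%3.0f ms dwell -> %.2f s per pass -> ~%2.0f mph max\n",
                dwell_ms, pass_s, max_mph);
  }
  return 0;  // roughly 45 mph vs roughly 25 mph: about half
}
```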

The (closed source) rev2 version of this project had the per-channel scan time set to 300ms and it performed noticeably worse at high(er) speeds. It was quite normal to drive down a 40mph road twice in a day and get completely different results each time. Performance in this regard improved with the rev3 at 110ms, but a lot of other factors changed too, so it's not an entirely fair comparison.

Again, I'd like to highlight that I'm not an expert at physics, so my logic could be flawed, but this is at least where the 110ms value originally came from. I disagree with the Kismet post you linked where it suggests the Nyquist frequency is relevant; I believe that only applies to sampling continuous analog signals, not sporadic digital transmissions like those encountered when wardriving.

However, I do agree that spending double the time on a channel doubles the opportunities to capture a beacon. The issue I have with this is that the wardriver scans constantly, so the total time spent scanning each channel never changes regardless of the per-channel value; the value only divides that time differently. E.g. 2x 100ms scans take the same total time as 1x 200ms scan, just split differently.

I'd definitely be happy to be educated on exactly how the optimum values can be mathematically calculated, and would also be happy for people to test different values. However, I don't know of a good way to reliably benchmark changes to this value, since there are too many factors that affect scan performance.

I'll add a way to make this value adjustable so it's easier for people to experiment with, but I expect increasing it will make performance worse in most situations. For that reason I'm against changing the default until I have more data.

ZeroChaos- commented 2 months ago

I'm not able to read all of this code as well as I'd like, so I'll sum up what I think is happening. You run primary_scan_loop all the time in the background, pinned to one of the CPUs. There is minimal transformation happening in that task, and its output is handled by a second, concurrently running task. As such, I agree that in the case of your code a longer dwell likely has minimal return on investment: staying on a channel longer to maybe catch the second beacon after missing the first is balanced out by not hopping channels as fast and thus missing different beacons. If I understood your code correctly, then I agree your current dwell time is properly optimized. Thanks for taking the time to discuss it.
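For anyone else trying to follow the architecture described above (a scan task pinned to one core handing results to a second task via a queue), a minimal sketch of that pattern on ESP32 might look like the following. The names and structure here are my own illustration, not the actual wardriver_rev3 code:

```cpp
// Illustrative sketch of the pattern described above: a scan task pinned to
// one core feeds results through a queue to a second task that handles
// output. Names and structure are assumptions, not the wardriver_rev3 code.
#include <Arduino.h>
#include <WiFi.h>
#include <cstring>

struct ScanResult {
  char ssid[33];
  int32_t rssi;
  int32_t channel;
};

static QueueHandle_t resultQueue;

// Scan task: hops channels continuously and pushes results to the queue.
static void primary_scan_task(void *) {
  for (;;) {
    int n = WiFi.scanNetworks(false, true, false, 110);  // 110 ms per channel
    for (int i = 0; i < n; i++) {
      ScanResult r{};
      strncpy(r.ssid, WiFi.SSID(i).c_str(), sizeof(r.ssid) - 1);
      r.rssi = WiFi.RSSI(i);
      r.channel = WiFi.channel(i);
      xQueueSend(resultQueue, &r, 0);  // hand off without blocking the scan
    }
    WiFi.scanDelete();
  }
}

// Output task: drains the queue and does the heavier work (logging, etc.).
static void output_task(void *) {
  ScanResult r;
  for (;;) {
    if (xQueueReceive(resultQueue, &r, portMAX_DELAY) == pdTRUE) {
      Serial.printf("%s ch%d %ddBm\n", r.ssid, (int)r.channel, (int)r.rssi);
    }
  }
}

void setup() {
  Serial.begin(115200);
  WiFi.mode(WIFI_STA);
  WiFi.disconnect();
  resultQueue = xQueueCreate(32, sizeof(ScanResult));
  xTaskCreatePinnedToCore(primary_scan_task, "scan", 4096, nullptr, 1, nullptr, 0);
  xTaskCreatePinnedToCore(output_task, "output", 4096, nullptr, 1, nullptr, 1);
}

void loop() { vTaskDelay(pdMS_TO_TICKS(1000)); }
```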