Assume a link of bandwidth Bw, shards of size S, a simulation time step td, and a total transfer time t.
We will have a maximum error of (as a % of the data transferred): err = ceil(Bw * td / S) * S / (Bw * t)
Plugging in the numbers from the example: err = 1 * 4Mb / (4Mb * 30) = 1/30 ≈ 3.33% of the total data transferred.
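A quick sanity check of the formula above, as a minimal Python sketch; the concrete values (4Mb shards, a 30-time-unit transfer) are just the numbers plugged in above, not fixed parameters of the simulator:

```python
import math

def step_error(bw, td, s, t):
    """Maximum per-shard rounding error as a fraction of the data transferred:
    in each simulation step of length td we can only move whole shards of size s."""
    return math.ceil(bw * td / s) * s / (bw * t)

# Values taken from the worked example above (units assumed: Mb and time units).
bw = 4.0   # bandwidth per time unit
s  = 4.0   # shard size
td = 1.0   # simulation time step
t  = 30.0  # total transfer time

print(step_error(bw, td, s, t) * 100)  # ≈ 3.33 (% of the total data)
```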
Supposing we fix the typical size of a torrent to N = 1000 shards, we will have a maximum total error of 1 - (1 - err)^N. For example, if we use td = 1µs as our time unit for the simulation, the resulting error in the data to be transferred for all the shards should be (1 - (1 - 1/30/1000)^1000) * 100 ≈ 3.27% of the whole file size. [? is this error multiplicative or additive ?]
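To put numbers on the multiplicative vs. additive question, a small sketch comparing the two ways of combining the per-shard errors (using the 1/30/1000 per-shard error assumed in the example above):

```python
N = 1000              # shards per torrent
err = 1 / 30 / 1000   # per-shard error from the example above

multiplicative = 1 - (1 - err) ** N   # compounding: 1 - (1 - err)^N
additive = N * err                    # simple sum: N * err

print(multiplicative * 100)  # ≈ 3.28
print(additive * 100)        # ≈ 3.33
# For errors this small the two combinations are nearly identical,
# so the choice barely moves the estimate.
```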
To that we need to add the total error resulting from latencies, which should be small, since a typical delay should be under 200ms, which is about 200ms / 3s = 6.66% of the total data. Adding this to the error above gives a total of 9.93%. [this is clearly an upper bound - maybe take the mean]
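And the latency term, under the same rough assumptions (a worst-case delay of 200ms against the 3s transfer time used above):

```python
delay = 0.2          # worst-case latency, seconds (assumed 200ms)
transfer_time = 3.0  # per-transfer time used in the estimate above, seconds

latency_err = delay / transfer_time  # ≈ 6.7% of the total data
total_err = 0.0327 + latency_err     # added to the ~3.27% computed above

print(latency_err * 100)  # ≈ 6.67
print(total_err * 100)    # ≈ 9.9, the upper bound quoted above
```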
To account for the bandwidth separation, we can just consider equal bandwidth links.
^ The comment above is wrong! The error is additive, and we need to calculate it in terms of time (i.e. download time).
We can model bandwidth similarly to paper [29] (see measures).