Mitchellpkt opened this issue 6 years ago
Ballpark thoughts, some small approximations made for simplicity: Suppose X(R) is the fraction of transactions in the recent zone that use a ringsize > R (e.g. from the charts, X(40) ~ 0.02, i.e. ~98% of txns use ≤ 40 ring members)
Define P(R,M) as the probability that a new transaction with M ring members includes an extant output created by a transaction with ringsize > R. To first order (roughly a union bound over the M decoys, fine while X(R)*M is small), P(R,M) ~ X(R)*M
The likelihood of an N-long chain of sequential transactions, each including an output made by a transaction with > R ring members, would be P(R,M)^N = [X(R)*M]^N, right? In other words, vanishingly small chance of finding 3 txns in a row with ring size > 40…
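To put rough numbers on that (plugging in X(40) ~ 0.02 from the charts and a standard M = 7 ring; both values are just illustrative):

```python
# Back-of-the-envelope plug-in of the estimate above.
X_R = 0.02   # X(40): fraction of recent txns with ringsize > 40 (illustrative)
M = 7        # ring members in each new transaction (standard size at the time)
N = 3        # length of the hypothetical chain

P_single = X_R * M        # ~ chance one ring pulls in an output from a >40-ring txn
P_chain = P_single ** N   # ~ chance of an N-long chain of such rings by accident

print(P_single)  # 0.14
print(P_chain)   # ~0.0027, i.e. well under 1% per candidate chain
```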
Probably missing some small corrective terms, but does this back-of-the-envelope approach seem right?
(turn this into a wiki post with example chains)
I think that the necessity of a fixed ringsize is relatively self-evident from a statistical perspective (I fully support a fixed ringsize). However, it'd be fun to pull together some supporting evidence for good measure. (Only looking at transactions since January 2017, for relevance.)
Fishing around for signatures of anomalous behaviors falls right in the ballpark of #noncesense-research-lab :-) … Seems like a straightforward project: iterate over transactions with non-standard ringsizes and check whether any of their ring members were outputs generated by txns that had unusual ringsizes themselves.
I wouldn’t be surprised if we locate a few chains of transactions with a string of unusually-sized rings surrounded by 7-member decoys. (Of course, this should be compared against the background likelihood of selecting a decoy that was generated by a txn with > 7 ring members.)
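A minimal sketch of that check, assuming we have already built a lookup from each output's global index to the ring size of the transaction that created it (the `output_ring_size` dict and the `parsed_txns` tuple format below are hypothetical data structures, not something that exists yet):

```python
STANDARD_RING_SIZE = 7  # typical/enforced-minimum ring size in the window being scanned


def flag_suspicious_chain_links(parsed_txns, output_ring_size, threshold=STANDARD_RING_SIZE):
    """Return txns with a non-standard ring size that also reference at least one
    ring member created by a txn with a non-standard ring size.

    parsed_txns:      iterable of (txid, member_output_indices, ring_size) tuples (hypothetical format)
    output_ring_size: dict mapping output global index -> ring size of the creating txn (hypothetical)
    """
    flagged = []
    for txid, member_indices, ring_size in parsed_txns:
        if ring_size <= threshold:
            continue  # only start chains at unusually-sized rings
        odd_parents = [idx for idx in member_indices
                       if output_ring_size.get(idx, threshold) > threshold]
        if odd_parents:
            flagged.append((txid, ring_size, odd_parents))
    return flagged


def background_rate(output_ring_size, threshold=STANDARD_RING_SIZE):
    """Fraction of known outputs created by txns with ring size > threshold,
    i.e. the X(R) baseline that chance decoy selection should be compared against."""
    sizes = list(output_ring_size.values())
    return sum(s > threshold for s in sizes) / len(sizes) if sizes else 0.0
```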
Wanted to check whether somebody else is already working on this? If not, any suggestions for tackling it? My default approach would be to use the RPC (via a Python wrapper?) to scan through the [2017-present] transaction tree, memoize ring size info, then analyze that. Let me know if you have advice on starting points/libraries, better approaches, or prior art.
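Roughly the kind of scan I have in mind, hitting monerod's `get_block` / `get_transactions` endpoints directly with `requests` rather than a wrapper. This assumes a local daemon on 127.0.0.1:18081 and the usual `decode_as_json` layout, so treat it as a starting-point sketch rather than a tested tool:

```python
import json
import requests

DAEMON = "http://127.0.0.1:18081"  # assumes a local monerod with its RPC server enabled


def get_block_tx_hashes(height):
    """Return the tx hashes in the block at `height` via the daemon's JSON-RPC."""
    resp = requests.post(DAEMON + "/json_rpc", json={
        "jsonrpc": "2.0", "id": "0",
        "method": "get_block", "params": {"height": height},
    }).json()
    return resp["result"].get("tx_hashes", [])


def get_ring_sizes(tx_hashes):
    """Map txid -> list of ring sizes (one per input) by decoding each tx as JSON."""
    if not tx_hashes:
        return {}
    resp = requests.post(DAEMON + "/get_transactions", json={
        "txs_hashes": tx_hashes, "decode_as_json": True,
    }).json()
    ring_sizes = {}
    for tx in resp.get("txs", []):
        body = json.loads(tx["as_json"])
        sizes = [len(vin["key"]["key_offsets"])
                 for vin in body.get("vin", []) if "key" in vin]  # skip coinbase inputs
        ring_sizes[tx["tx_hash"]] = sizes
    return ring_sizes


# Memoize ring-size info over a range of heights
# (January 2017 onwards starts somewhere around height ~1.22M).
ring_size_by_txid = {}
for height in range(1220000, 1220010):  # tiny illustrative range
    ring_size_by_txid.update(get_ring_sizes(get_block_tx_hashes(height)))
```

From that memoized table it should be straightforward to build the output-index -> ring-size lookup and run the chain check sketched above.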