Closed: siddontang closed this issue 6 years ago
Hmm, so linearizability checking is NP-complete. Porcupine does a (somewhat optimized) brute-force search to try to find a linearization of the history, but there is exponential blowup, so for really large histories Porcupine can't check the history in a reasonable amount of time or memory. There may be ways to make Porcupine more efficient, but beyond a certain point, you can't avoid the combinatorial explosion.
I added a timeout feature; I don't think we can do much better than that (worst-case execution time is going to be bad).
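For reference, a minimal sketch of how the timeout-bounded check can be invoked, assuming Porcupine's `CheckEventsTimeout`/`CheckResult` API; the single-register model, the tiny history, and the 10-second timeout below are illustrative, not taken from the issue:

```go
package main

import (
	"fmt"
	"time"

	"github.com/anishathalye/porcupine"
)

func main() {
	// A trivial single-register model, just to show the timeout API;
	// this is not the bank model from the issue.
	registerModel := porcupine.Model{
		Partition:      porcupine.NoPartition,
		PartitionEvent: porcupine.NoPartitionEvent,
		Init:           func() interface{} { return 0 },
		Step: func(state, input, output interface{}) (bool, interface{}) {
			if v, isWrite := input.(int); isWrite {
				// Input present: a write of v; always succeeds.
				return true, v
			}
			// No input: a read; it must return the current state.
			return output.(int) == state.(int), state
		},
		Equal: porcupine.ShallowEqual,
	}

	// Call/return events paired by Id: one write of 42, then a read of 42.
	history := []porcupine.Event{
		{Kind: porcupine.CallEvent, Value: 42, Id: 0},
		{Kind: porcupine.ReturnEvent, Value: 0, Id: 0},
		{Kind: porcupine.CallEvent, Value: nil, Id: 1},
		{Kind: porcupine.ReturnEvent, Value: 42, Id: 1},
	}

	// Bound the search: the result is Ok, Illegal, or Unknown if the
	// timeout expired before the checker finished.
	res := porcupine.CheckEventsTimeout(registerModel, history, 10*time.Second)
	fmt.Println(res == porcupine.Ok)
}
```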
Hi @anishathalye
I wrote a simple model to check bank transfers; see https://github.com/siddontang/chaos/blob/master/tidb/bank.go#L170
I generated 5000 events, but to my surprise, the linearizability check was killed with an OOM, even though my server has 128 GB of memory.
Here is the history log (history.log.zip); I use https://github.com/siddontang/chaos/blob/master/tidb/bank.go#L246 to parse the history and verify it.
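For context, a rough sketch of the shape of the bank-transfer model referenced above; the types, field names, and starting balances here are illustrative assumptions, not the actual code in bank.go, and it ignores indeterminate ("result unknown") outcomes that a real model against a faulty system has to handle:

```go
// Illustrative bank-transfer model for Porcupine; the real one lives in
// chaos/tidb/bank.go. The model state is the slice of account balances.
package bank

import (
	"reflect"

	"github.com/anishathalye/porcupine"
)

// bankOp and bankResult are hypothetical request/response types.
type bankOp struct {
	Transfer bool // false: read all balances; true: transfer Amount From -> To
	From     int
	To       int
	Amount   int
}

type bankResult struct {
	Balances []int // filled for reads
	Ok       bool  // whether a transfer was applied
}

var bankModel = porcupine.Model{
	Partition:      porcupine.NoPartition,
	PartitionEvent: porcupine.NoPartitionEvent,
	Init:           func() interface{} { return []int{1000, 1000, 1000, 1000, 1000} },
	Step: func(state, input, output interface{}) (bool, interface{}) {
		balances := state.([]int)
		op := input.(bankOp)
		res := output.(bankResult)
		if !op.Transfer {
			// A read is legal iff it observes exactly the current balances.
			return reflect.DeepEqual(res.Balances, balances), balances
		}
		if !res.Ok {
			// A failed transfer leaves the state unchanged.
			return true, balances
		}
		// Apply the transfer to a copy of the state.
		next := append([]int(nil), balances...)
		next[op.From] -= op.Amount
		next[op.To] += op.Amount
		return true, next
	},
	// Balances are slices, so compare states with DeepEqual.
	Equal: func(a, b interface{}) bool { return reflect.DeepEqual(a, b) },
}
```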