defuse / juggler-pow

A memory-but-not-time asymmetric proof-of-work function.

The upper bounds on indices and the extra nonce. #13

Open defuse opened 9 years ago

defuse commented 9 years ago

The prover could run the work algorithm on two different extra nonces in parallel, possibly finding a solution faster?

defuse commented 9 years ago

Is there even any benefit to limiting the maximum selector? Why force the prover to re-create all of that memory when they could just keep searching?

Same for the buckets not filling up. Just keep going, being sure to handle overflowing integers properly.

My original rationale was that giving the prover more selector space to work with would let them brute-force better selections more easily, but that might not actually be a problem.

defuse commented 9 years ago

The maximum selector limit is crucial for security. Imagine there were no limit. Then the following attack would be possible:

  1. Only keep 1/4 or 1/8th of the buckets (or even less).
  2. Iterate through 4-tuples of them until you find a solution to the hashcash PoW.
  3. Now preimage the selector hash function to find a selector that selects the 4-tuple solution you just found. This is fast, since the selector hash function has a small output.

The maximum selector limit prevents this attack by ensuring that, while the 4-tuple found in step 2 solves the PoW, it's extremely unlikely to be one of the "valid" solutions allowed by the maximum limit. The space of all 4-tuples is so much larger than the image of the selectors under the hash function that any 4-tuple found before consulting the selector hash is unlikely to be selectable by any valid selector.
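A toy sketch of the attack above (all names, sizes, and the selector/PoW hashes here are illustrative stand-ins, not juggler-pow's actual functions; parameters are shrunk so it runs in seconds):

```python
import hashlib
from itertools import product

N_BUCKETS = 16  # toy parameter, not juggler-pow's real bucket count

def select_tuple(selector: int):
    """Small-output selector hash: maps a selector to a 4-tuple of buckets."""
    d = hashlib.sha256(b"sel" + selector.to_bytes(3, "big")).digest()
    return tuple(b % N_BUCKETS for b in d[:4])

def pow_ok(tup) -> bool:
    """Toy hashcash check on a 4-tuple (difficulty ~1/16)."""
    return hashlib.sha256(bytes(tup)).digest()[0] < 16

# Step 1: keep only a quarter of the buckets.
kept = set(range(N_BUCKETS // 4))

# Step 2: search 4-tuples of kept buckets for a hashcash solution,
# without ever consulting the selector hash.
solution = next(t for t in product(sorted(kept), repeat=4) if pow_ok(t))

# Step 3: preimage the small-output selector hash to "justify" the tuple.
# This is cheap because the selector hash's output space is tiny compared
# to a full cryptographic hash.
selector = next(s for s in range(1 << 20) if select_tuple(s) == solution)
```

With a cap on valid selectors, the `selector` found in step 3 lands outside the allowed range with overwhelming probability, so the forged proof is rejected.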

defuse commented 9 years ago

The upper limit on the index for filling the buckets is important for DoS protection, too. Suppose there were no limit; then the following attack would be possible:

  1. The attacker starts at 2^64 instead of 0 and fills all of the buckets with indices ≥ 2^64.
  2. The attacker finds a valid solution to the PoW (the extra 2^64 offset does not slow them down very much; it only makes their integer math a bit bigger).
  3. The client must now verify all indices up to 2^64 and beyond, because the client needs to know there were no preimages in that space.

The verifier will stop when it finds the first invalid preimage, but the prover could select the four buckets whose lowest preimage is highest, forcing the verifier to scan as far as possible. How bad could the prover make it for the client?

Assume the prover doesn't care whether the proof checks out and just wants to DoS the verifier. Assume also that the prover is computationally more powerful than the client: they can iterate the extra nonce to find "extra-hard-to-verify" solutions. This makes sense, because the prover could build one very-hard-to-verify puzzle solution and then send it a million times to carry out all of their DoS attacks (it could even be produced once by a public effort, and then re-used by millions of attackers).
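A minimal sketch of why verification cost tracks the largest index the prover used, and why an upper bound fixes it. The bucket assignment and verifier here are hypothetical stand-ins, not juggler-pow's real code:

```python
import hashlib

def bucket_of(nonce: bytes, i: int) -> int:
    """Toy bucket assignment: hash the index into one of 2^16 buckets."""
    d = hashlib.sha256(nonce + i.to_bytes(16, "big")).digest()
    return int.from_bytes(d[:2], "big")

def verify_bucket(nonce: bytes, bucket: int, index: int, limit: int) -> bool:
    """Accept `index` as the first preimage landing in `bucket` only if it
    is below `limit` and no smaller index lands in `bucket` first."""
    if index >= limit:
        return False            # the upper bound rejects 2**64-style indices
    for i in range(index):      # verification work is proportional to `index`
        if bucket_of(nonce, i) == bucket:
            return False        # the prover skipped an earlier preimage
    return bucket_of(nonce, index) == bucket
```

Without the `limit` check, a prover claiming indices near 2^64 forces the verifier's loop through roughly 2^64 hashes; with it, such a claim is rejected in constant time.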

defuse commented 9 years ago

We've established that the extra nonce is necessary in order for each puzzle to have a solution; otherwise we admit attacks. The next steps are to:

  1. Compute the probabilities of a solution existing, given different upper bounds. Set the upper bounds so that there's a good chance of the solution existing, but without admitting strong attacks.
  2. Find the size of the extra nonce that makes it overwhelmingly likely for a solution to exist, i.e. the probability of a puzzle not having a solution should be below 1/2^128.
  3. Explore the parallelization enabled by the extra nonce, and possibly limit it to the values discovered in (2) to reduce that threat.
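For step 2, if we assume each extra-nonce value independently yields a solvable puzzle with probability p (an approximation, not something the design guarantees), then the chance that all n values fail is (1-p)^n, and we need (1-p)^n < 2^-128. A hedged sketch of that calculation, with hypothetical names:

```python
import math

def nonce_bits_needed(p_solution: float, target_bits: int = 128) -> int:
    """Extra-nonce bits b so that Pr[no solution across all 2**b values]
    = (1 - p)**(2**b) falls below 2**-target_bits, assuming each value
    independently yields a solvable puzzle with probability p."""
    # (1-p)**n < 2**-t  <=>  n > t * ln 2 / (-ln(1-p))
    n = target_bits * math.log(2) / -math.log1p(-p_solution)
    return math.ceil(math.log2(n))

# e.g. if half of all extra-nonce values give a solvable puzzle (p = 0.5),
# 128 trials suffice, i.e. a 7-bit extra nonce.
```

The independence assumption is the part that would need to be justified (or replaced) by the probability analysis in step 1.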

defuse commented 9 years ago

This is intimately related to #5.